What’s also funny is that between "Internet" and "social media", there’s a huge layer called "World Wide Web".
That layer (which is now often confused with the Internet itself) is a European invention. (A Brit and a Belgian, sitting in an office straddling the border between France and Switzerland, could not be more European than that.)
The point is not to be pro-European, just to note that if you list inventions and then remove the components not invented in the US, you will of course end up with a list of inventions from the US.
I think the point is that defining authorship of intellectual products is a bit like assigning authorship to words (even common ones): they are always part of a larger shared discourse. That is not to take merit away from individuals, but to acknowledge that merit is not quantitative and can be stretched to fit political motives.
To be fair, they did leave out the highest per-capita consumption and CO2 emissions, freedumb cars for everyone at the expense of public transport, efficiency, and equitable urban planning, using guns as can openers and children's toys, etc.
>freedumb cars for everyone at the expense of public transport, efficiency and equitable urban planning
While America is the poster child for car culture, Europe isn't a utopia either: lots of cities built up post-WWII are car-bound hellscapes too. Plenty of other places have bowed to worship at the altar of the automobile; it's not just the US.
As for equitable urban planning, Europe is going through a huge housing crisis now too.
I have to tell you, of those three things, one is not like the others. But I'll be enjoying Barbie and Oppenheimer this weekend instead of getting into the typical nationalistic pissing contests that Americans usually start and non-Americans can't help but get into a frenzy debating.
Very interesting article. It somewhat reminds me of current divisions in academic philosophy, where some distinguish between "analytic" and "continental" schools.
Experimental particle physicist here. What you say about Higgs particles, that "they don't exist under normal conditions", is loosely true of all particles in nature, in the sense that a particle is nothing but a "quantum" of a "field". Fields pervade all physical space and can vary in time. Particles (or quanta) simply represent a local state of observable things. A field can only do certain things to certain physical states, and only at a probabilistic level. Notice that fields do things even with the vacuum, which is just another state from which particles can be "extracted".
The peculiar experimental challenge with the Higgs field is that it can extract its quanta from certain physical states (certain initial conditions in a particle-physics reaction) only at very high energy and with low probability, but that is also true of other particles. What is truly peculiar is that the presence of the Higgs field, in addition to the fields of all the other particles we know of, explains why quanta in general have mass (although this is not yet clear for neutrinos), through a mechanism where the Higgs field interacts with the quanta of the other fields.
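If it helps to see it on paper, here is a schematic sketch of the textbook version of that mechanism, a Yukawa coupling between the Higgs field and a generic fermion field (the full Standard Model doublet and gauge structure is suppressed, so take it as an illustration rather than the complete story):

```latex
% Schematic Yukawa coupling between a fermion field \psi and the Higgs
% field \phi (doublet structure and gauge details suppressed):
\mathcal{L}_{\text{Yukawa}} = -\, y \, \bar{\psi} \, \phi \, \psi
% Expanding \phi around its vacuum expectation value v,
%   \phi = (v + h)/\sqrt{2},
% splits this into a mass term and a fermion--Higgs interaction:
\mathcal{L}_{\text{Yukawa}}
  = -\underbrace{\tfrac{y v}{\sqrt{2}}}_{m_\psi} \bar{\psi}\psi
    \;-\; \tfrac{y}{\sqrt{2}} \, h \, \bar{\psi}\psi
```

The mass just falls out as the coupling strength times the vacuum expectation value; the leftover term is the interaction between the fermion and the Higgs boson itself.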
Solitons are self-reinforcing wave packets in some medium, where the dispersive effects get canceled out by the nonlinearity. I like to think of field quanta as being similar, at a very handwavy level. Of course, the trick with quanta is that only certain energy levels are allowed.
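For anyone who wants a concrete version of that picture, the canonical soliton-bearing equation is Korteweg–de Vries (standard textbook material, not anything specific to field quanta):

```latex
% Korteweg--de Vries equation, the canonical soliton-bearing PDE:
u_t + 6\, u\, u_x + u_{xxx} = 0
% One-soliton solution: the nonlinear term (6 u u_x) exactly balances the
% dispersive term (u_{xxx}), so the packet travels at speed c without
% spreading out:
u(x, t) = \frac{c}{2} \, \operatorname{sech}^2\!\left( \frac{\sqrt{c}}{2} \,(x - c t) \right)
```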
That excellent writeup reinforced something I've become convinced of: decades of "shut up and calculate" have created gobs of contradictory analogies and false intuitions that pedagogy hasn't caught up to. When I hear:
"In the jargon of field theory, physicists often say that “virtual particles” can briefly and spontaneously appear from the vacuum and then disappear again, even when no one has put enough energy into the field to create a real particle. But what they really mean is that the vacuum itself has random and indelible fluctuations, and sometimes their influence can be felt by the way they kick around real particles."
I can't help but immediately question every jargon word I see, especially "random", "particle" and "wave".
Metaphors have limited applicability. The applicability of the metaphors in quantum field theory that the public is ever exposed to is pretty much limited to sounding cool and inspiring some awe in the face of all the mystery and complexity.
In my experience, all terms in physics (like "particle", "wave", "energy") are highly context-dependent, and most PhDs could spend hours debating what is actually meant in a given case. Such discussions almost never lead to publishable results and are thus considered a waste of time, or leisure at best.
Usually, you just "shut up and calculate". Meanwhile, the calculations are motivated by "intuition", which involves combining known or unknown reasonable approximations with a basic theory. This process is never explained systematically; rather, the hope is that the brighter students will absorb it via osmosis.
It turns out, you actually don't need to have a coherent concept of "what a particle is" to perform particle physics experiments and evaluate the data. Sometimes, when pressed, operational definitions can be offered. For example, I've heard a professor say: "a particle is defined as a bump at a given energy in this plot". I'm not sure how ironic that was supposed to sound...
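For what it's worth, here is a toy sketch of that operational definition in code: fit a peak-plus-smooth-background model to a synthetic invariant-mass histogram and call the fitted peak "the particle". Everything here (the numbers, the Gaussian-plus-exponential shape, the variable names) is invented for illustration, not any experiment's actual analysis:

```python
# Toy illustration of "a particle is a bump in this plot": fit a Gaussian
# resonance on top of a smooth background in a synthetic mass spectrum.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

# Synthetic data: exponential background plus a narrow resonance near 125
# (arbitrary "GeV"), binned into a histogram.
background = rng.exponential(scale=50.0, size=20000) + 100.0
signal = rng.normal(loc=125.0, scale=2.0, size=500)
counts, edges = np.histogram(np.concatenate([background, signal]),
                             bins=60, range=(100.0, 160.0))
centers = 0.5 * (edges[:-1] + edges[1:])

def model(m, n_bkg, slope, n_sig, mass, width):
    """Exponential background + Gaussian peak."""
    bkg = n_bkg * np.exp(-slope * (m - 100.0))
    sig = n_sig * np.exp(-0.5 * ((m - mass) / width) ** 2)
    return bkg + sig

# The "particle" is whatever the peak parameters converge to.
popt, _ = curve_fit(model, centers, counts,
                    p0=[counts[0], 0.02, 50.0, 125.0, 2.0])
print(f"fitted bump: mass ~ {popt[3]:.1f}, width ~ {popt[4]:.1f}")
```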
In a certain sense I've become pessimistic about the possibility of ever assigning a meaningful ontology to objects at that scale. We have models which give results. Some ways of calculating scattering amplitudes don't make reference to virtual particles at all. I've begun to think that there is no way to understand physics _except_ to become acquainted with the mathematics of it. The analogies are pointless.
I usually think of the Higgs mechanism (which gives mass) as something like the surface tension on a pond, with the Higgs boson being the ripples. It's not perfect (like any analogy).
The laws of thermodynamics, for example, just like all the laws of physics, are just consistencies we have observed not breaking after decades of trying every corner case we can conjure up. They are quite literally the definition of induction: you see that something holds for a solid number of cases, so you assume that it holds for everything else too.
Of course, any scientist who hasn't just memorized a bunch of formulas but has actually grasped the reasoning behind science will tell you that even the most fundamental law of the universe has a possibility of being false, just like the Riemann Hypothesis can't be considered proven just because it holds for an unfathomable number of computed values.
Science is about creating models and then studying them and making predictions from them. Building the model is an inductive, empirical endeavor, whereas working within the model is a matter of deduction.