Slobodan Perovic just gave a talk on a project he’s working on: the reasons behind the alleged crisis in fundamental physics. His general strategy is to compare the situation in particle physics with that of quantum mechanics in its early stages, and to draw lessons from the comparison. One of the problems he diagnoses in experimental particle physics, which he claims did not exist for early quantum mechanics, is a lack of diversity in experimental apparatus. The idea is that a less diverse set of apparatus decreases the chances of new discoveries, since it drastically narrows the space of possible discoveries; a more diverse array would cover more of the search space.
Now, I have serious doubts about whether there is any lack of progress in experimental particle physics to be explained, but even supposing there is, I’m rather skeptical that a lack of diversity in the apparatus is responsible for it. To the extent that there is a lack of methodological diversity in experimental particle physics, I do not think it is due to uniformity in apparatus.
A preliminary consideration supporting that last claim is the following. We can agree that condensed matter physics is in an extremely healthy state, and has been for the last 50 years. But I’m not sure that condensed matter physicists have any more diversity in their experimental apparatus than particle physicists do. It is more common in condensed matter physics to order entire machines from a vendor, whereas particle physicists tend to have to build their own machines. And there just aren’t that many different vendors offering STEMs or whatever.
I think that the following factors do much more to reduce methodological diversity in experimental particle physics:
- Recycling of personnel across different labs. With the Tevatron winding down, physicists who used to work there will go (or have gone) either to Wall Street or to the LHC. This leads to a certain homogenizing of methods even across different experiments. This happens to a much smaller extent in condensed matter physics, since budding researchers are expected to start their own labs rather than join an established collaboration.
- Institutional factors like the Particle Data Group’s publications. Their Review of Particle Physics is often referred to as the ‘Bible’ by particle physicists. Every particle physicist has a copy on his/her bookshelf. The Review lays out, among other things, data analysis methods that are ‘standard’ for the field. To my knowledge, there is no equivalent publication in condensed matter.
- Compared to condensed matter labs, the institutional setup of particle physics collaborations is less amenable to the introduction of new methods. Publications under a collaboration’s authorship must be approved not just by one’s immediate supervisor but by a hierarchy of scientists within the collaboration before they receive the collaboration’s collective blessing and are allowed to be made public. Now, if one wants to publish a new general strategy for statistical analysis, something not specific to the particular setup of one’s experiment, independent publication would probably be OK. So the barriers don’t exist for some kinds of methodological innovation. But a new method tailored to the conditions of one’s experiment will, I think, have to go through the administrative hierarchy for approval. In condensed matter, there are fewer layers of approval to go through.