In recent years, physicists have been watching the data coming in from the Large Hadron Collider (LHC) with a growing sense of unease. We’ve spent decades devising elaborate accounts for the behaviour of the quantum zoo of subatomic particles, the most basic building blocks of the known universe. The Standard Model is the high-water mark of our achievements to date, with some of its theoretical predictions verified to within a one-in-ten-billion chance of error – a simply astounding degree of accuracy. But it leaves many questions unanswered. For one, where does gravity come from? Why do matter particles come in three successively heavier copies, with peculiar patterns in their masses? What is dark matter, and why does the universe contain more matter than antimatter?
In the hope of solving some of these mysteries, physicists have been grafting elegant and exciting new mathematical structures onto the Standard Model. The programme follows an arc traced by fundamental physics since the time of Isaac Newton: the pursuit of unification, in which science strives to explain seemingly disparate ‘surface’ phenomena by identifying, theorising and ultimately proving their shared ‘bedrock’ origin. This top-down, reductive style of thinking has yielded many notable discoveries. Newton perceived that both an apple falling to the ground and the planets orbiting the sun could be explained by a single force: gravity. The physicist Paul Dirac predicted antimatter in 1928 by marrying quantum mechanics with Einstein’s special theory of relativity. And since the late 20th century, string theorists have been trying to reconcile gravity and quantum physics by conceiving of particles as tiny vibrating loops of string that exist in somewhere between 10 and 26 dimensions.
So when the European Organization for Nuclear Research (CERN) cranked up the LHC just outside Geneva for a second time in 2015, hopes for empirical validation were running high.