Jesper Grimstrup’s The Ant Mill: could his anti-string-theory rant do string theorists a favour?

Imagine you had a bad breakup in college. Your ex-partner is furious and self-publishes a book that names you in its title. You’re so humiliated that you only dimly remember this ex, though the book’s details and anecdotes ring true.

According to the book, you used to be inventive, perceptive and dashing. Then you started hanging out with the wrong crowd, and became competitive, self-involved and incapable of true friendship. Your ex struggles to turn you around; failing, they leave. The book, though, is so over-the-top that by the end you stop cringing and find it a hoot.

That’s how I think most Physics World readers will react to The Ant Mill: How Theoretical High-energy Physics Descended into Groupthink, Tribalism and Mass Production of Research. Its author and self-publisher is the Danish mathematician-physicist Jesper Grimstrup, whose previous book was Shell Beach: the Search for the Final Theory.

After receiving his PhD in theoretical physics at the Technical University of Vienna in 2002, Grimstrup writes, he was “one of the young rebels” embarking on “a completely unexplored area” of theoretical physics, combining elements of loop quantum gravity and noncommutative geometry. But there followed a decade of rejected articles and lack of opportunities.

Grimstrup became “disillusioned, disheartened, and indignant” and in 2012 left the field, selling his flat in Copenhagen to finance his work. Grimstrup says he is now a “self-employed researcher and writer” who lives somewhere near the Danish capital. You can support him through either Ko-fi or PayPal.

Fomenting fear

The Ant Mill opens with a copy of the first page of the letter that Grimstrup’s fellow Dane Niels Bohr sent in 1917 to the University of Copenhagen successfully requesting a four-storey building for his physics institute. Grimstrup juxtaposes this incident with the rejection of his funding request, almost a century later, by the Danish Council for Independent Research.

Today, he writes, theoretical physics faces a situation “like the one it faced at the time of Niels Bohr”, but structural and cultural factors have severely hampered it, making it impossible to pursue promising new ideas. These include Grimstrup’s own “quantum holonomy theory, which is a candidate for a fundamental theory”. The Ant Mill is his diagnosis of how this came about.

A major culprit, in Grimstrup’s eyes, was the Standard Model of particle physics. Its completion finished the structure that theorists had been trained to build, and should have led to the flourishing of a new crop of theoretical ideas. Instead, it had the opposite effect: the field, according to Grimstrup, is now dominated by influential groups that squeeze out other approaches.

The biggest and most powerful is string theory, with loop quantum gravity its chief rival. Neither member of the coterie can make testable predictions, yet because they control jobs, publications and grants they intimidate young researchers and create what Grimstrup calls an “undercurrent of fear”. (I leave assessment of this claim to young theorists.)

Half the chapters begin with an anecdote in which Grimstrup describes an instance of rejection by a colleague, editor or funding agency. In the book’s longest chapter Grimstrup talks about his various rejections – by the Carlsberg Foundation, the European Physical Journal C, International Journal of Modern Physics A, Classical and Quantum Gravity, Reports on Mathematical Physics, Journal of Geometry and Physics, and the Journal of Noncommutative Geometry.

Grimstrup says that the reviewers and editors of these journals told him that his papers variously lacked concrete physical results, were exercises in mathematics, seemed the same as other papers, or lacked “relevance and significance”. Grimstrup sees this as the coterie’s handiwork, for such journals are full of string theory papers open to the same criticism.

“Science is many things,” Grimstrup writes at the end. “[S]imultaneously boring and scary, it is both Indiana Jones and anonymous bureaucrats, and it is precisely this diversity that is missing in the modern version of science”. What the field needs is “courage…hunger…ambition…unwillingness to compromise…anarchy”.

Grimstrup hopes that his book will have an impact, helping to inspire young researchers to revolt, and to make all the scientific bureaucrats and apparatchiks and bookkeepers and accountants “wake up and remember who they truly are”.

The critical point

The Ant Mill is an example of what I have called “rant literature” or rant-lit. Evangelical, convinced that exposing truth will make sinners come to their senses and change their evil ways, rant-lit can be fun to read, for it is passionate and full of florid metaphors.

Theoretical physicists, Grimstrup writes, have become “obedient idiots” and “technicians”. He slams theoretical physics for becoming a “kingdom”, a “cult”, a “hamster wheel” and an “ant mill”, in which the ants march around in a pre-programmed “death spiral”.

An attentive reader, however, may come away with a different lesson. Grimstrup calls falsifiability the “crown jewel of the natural sciences” and hammers away at theories lacking it. But his vehemence invites you to ask: “Is falsifiability really the sole criterion for deciding whether to accept or fail to pursue a theory?”

In his 2013 book String Theory and the Scientific Method, for instance, the Stockholm University philosopher of science Richard Dawid suggested rescuing the scientific status of string theory by adding non-empirical criteria, such as clarity, coherence and lack of alternatives, to the evaluation of theories. It’s an approach that both rescues the formalistic approach to the scientific method and undermines it.

Dawid, you see, is making the formalism follow the practice rather than the other way around. In other words, he is able to reformulate how we make theories because he already knows how theorizing works – not because he only truly knows what it is to theorize after he gets the formalism right.

Grimstrup’s rant, too, might remind you of the birth of Yang–Mills theory in 1954. Developed by Chen Ning Yang and Robert Mills, it was a theory of nuclear binding that integrated much of what was known about elementary particle theory but implied the existence of massless force-carrying particles that were then known not to exist. In fact, at one seminar Wolfgang Pauli unleashed a tirade against Yang for proposing so obviously flawed a theory.

The theory, however, became central to theoretical physics two decades later, after theorists learned more about the structure of the world. The Yang–Mills story, in other words, reveals that theory-making does not always conform to formal strictures and does not always require a testable prediction. Sometimes it just articulates the best way to make sense of the world apart from proof or evidence.

The lesson I draw is that becoming the target of a rant might not always make you feel repentant and ashamed. It might inspire deep reflection on who you are, in a way that is insightful and vindicating. It might even make you more, rather than less, confident about why you’re doing what you’re doing.

Your ex, of course, would be horrified.

The post Jesper Grimstrup’s The Ant Mill: could his anti-string-theory rant do string theorists a favour? appeared first on Physics World.


Hints of a boundary between phases of nuclear matter found at RHIC

In a major advance for nuclear physics, scientists working on the STAR detector at the Relativistic Heavy Ion Collider (RHIC) in the US have spotted subtle but striking fluctuations in the number of protons emerging from high-energy gold–gold collisions. The observation might be the most compelling sign yet of the long-sought “critical point” marking a boundary separating different phases of nuclear matter, much as water can exist in liquid or vapour phases depending on temperature and pressure.

Team member Frank Geurts at Rice University in the US tells Physics World that these findings could confirm that the “generic physics properties of phase diagrams that we know for many chemical substances apply to our most fundamental understanding of nuclear matter, too.”

A phase diagram maps how a substance transforms between solid, liquid, and gas. For everyday materials like water, the diagram is familiar, but the behaviour of nuclear matter under extreme heat and pressure remains a mystery.

Atomic nuclei are made of protons and neutrons tightly bound together. These protons and neutrons are themselves made of quarks that are held together by gluons. When nuclei are smashed together at high energies, the protons and neutrons “melt” into a fluid of quarks and gluons called a quark–gluon plasma. This exotic high-temperature state is thought to have filled the universe just microseconds after the Big Bang.

Smashing gold ions

The quark–gluon plasma is studied by accelerating heavy ions like gold nuclei to nearly the speed of light and smashing them together. “The advantage of using heavy-ion collisions in colliders such as RHIC is that we can repeat the experiment many millions, if not billions, of times,” Geurts explains.

By adjusting the collision energy, researchers can control the temperature and density of the fleeting quark–gluon plasma they create. This allows physicists to explore the transition between ordinary nuclear matter and the quark–gluon plasma. Within this transition, theory predicts the existence of a critical point where gradual change becomes abrupt.

Now, the STAR Collaboration has focused on measuring the minute fluctuations in the number of protons produced in each collision. These “proton cumulants,” says Geurts, are statistical quantities that “help quantify the shape of a distribution – here, the distribution of the number of protons that we measure”.

In simple terms, the first two cumulants correspond to the average and width of that distribution, while higher-order cumulants describe its asymmetry and sharpness. Ratios of these cumulants are tied to fundamental properties known as susceptibilities, which become highly sensitive near a critical point.
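The cumulants described above are easy to compute from a sample of event-by-event proton counts. Below is a minimal Python sketch using an invented toy event sample (not STAR data); the conventions C1 = mean, C2 = variance, C3 = third central moment and C4 = fourth central moment minus 3C2² are standard.

```python
# Proton cumulants: C1 = mean, C2 = variance, C3 = third central
# moment, C4 = fourth central moment minus 3*C2^2. Ratios such as
# C3/C2 and C4/C2 are the quantities tied to susceptibilities that
# become sensitive near a critical point.

def cumulants(counts):
    """Return (C1, C2, C3, C4) for a sample of per-event proton counts."""
    n = len(counts)
    mean = sum(counts) / n

    def mu(k):  # k-th central moment
        return sum((x - mean) ** k for x in counts) / n

    c2 = mu(2)
    return mean, c2, mu(3), mu(4) - 3 * c2 ** 2

# Toy event sample, invented for illustration only.
events = [1, 2, 3, 4, 5]
c1, c2, c3, c4 = cumulants(events)
print(c1, c2, c3, round(c4, 6))    # 3.0 2.0 0.0 -5.2
print(c3 / c2, round(c4 / c2, 6))  # the ratios compared across energies
```

In a real analysis these ratios are tracked as a function of collision energy; the non-monotonic trend STAR reports is a change in how such ratios behave as the energy is lowered.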

Unexpected discovery

Over three years of experiments, the STAR team studied gold–gold collisions at a wide range of energies, using sophisticated detectors to track and identify the protons and antiprotons created in each event. By comparing how the number of these particles changed with energy, the researchers discovered something unexpected.

As the collision energy decreased, the fluctuations in proton numbers did not follow a smooth trend. “STAR observed what it calls non-monotonic behaviour,” Geurts explains. “While at higher energies the ratios appear to be suppressed, STAR observes an enhancement at lower energies.” Such irregular changes, he says, are consistent with what might happen if the collisions pass near the critical point, the boundary separating different phases of nuclear matter.

For Volodymyr Vovchenko, a physicist at the University of Houston who was not involved in the research, the new measurements represent “a major step forward”. He says that “the STAR Collaboration has delivered the most precise proton-fluctuation data to date across several collision energies”.

Still, interpretation remains delicate. The corrections required to extract pure physical signals from the raw data are complex, and theoretical calculations lag behind in providing precise predictions for what should happen near the critical point.

“The necessary experimental corrections are intricate,” Vovchenko says, and some theoretical models “do not yet implement these corrections in a fully consistent way.” That mismatch, he cautions, “can blur apples-to-apples comparisons.”

The path forward

The STAR team is now studying new data from lower-energy collisions, focusing on the range where the signal appears strongest. The results could reveal whether the observed pattern marks the presence of a nuclear matter critical point or stems from more conventional effects.

Meanwhile, theorists are racing to catch up. “The ball now moves largely to theory’s court,” Vovchenko says. He emphasizes the need for “quantitative predictions across energies and cumulants of various order that are appropriate for apples-to-apples comparisons with these data.”

Future experiments, including RHIC’s fixed-target program and new facilities such as the FAIR accelerator in Germany, will extend the search even further. By probing lower energies and producing vastly larger datasets, they aim to map the transition between ordinary nuclear matter and quark–gluon plasma with unprecedented precision.

Whether or not the critical point is finally revealed, the new data are a milestone in the exploration of the strong force and the early universe. As Geurts puts it, these findings trace “landmark properties of the most fundamental phase diagram of nuclear matter”, bringing physicists one step closer to charting how everything – from protons to stars – first came to be.

The research is described in Physical Review Letters.


Cosmic muons monitor river sediments surrounding Shanghai tunnel

Trundling along: a portable version of the team’s muon detector was used along the length of the tunnel. (Courtesy: Kim Siang Khaw et al/Journal of Applied Physics/CC BY 4.0)

Researchers in China say that they are the first to use cosmic-ray muography to monitor the region surrounding a tunnel. Described as a lightweight, robust and affordable scintillator setup, the technology was developed by Kim Siang Khaw at Shanghai Jiao Tong University and colleagues. They hope that their approach could provide a reliable and non-invasive method for the real-time monitoring of subterranean infrastructure.

Monitoring the structural health of tunnels and other underground infrastructure is challenging because of the lack of access. Inspection often relies on techniques such as borehole drilling, sonar scanning, and multibeam echo sounders to determine when maintenance is needed. These methods can be invasive, low resolution and involve costly and disruptive shutdowns. As a result there is often a trade-off between the quality of inspections and the frequency at which they are done.

This applies to the Shanghai Outer Ring Tunnel: a major travel artery in China’s largest city, which runs for almost 3 km beneath the Huangpu River. Completed in 2023, the submerged section of the tunnel is immersed in water-saturated sediment, creating a unique set of challenges for structural inspection.

Time-varying stresses

In particular, different layers of sediment surrounding the tunnel can vary widely in their density, permeability, and cohesion. As they build up above the tunnel, they can impart uneven, time-varying stresses, making it incredibly challenging for existing techniques to accurately assess when maintenance is needed.

To address these challenges, a multi-disciplinary team was formed to explore possible solutions. “During these talks, the [Shanghai Municipal Bureau of Planning and Natural Resources] emphasized the practical challenges of monitoring sediment build-up around critical infrastructure, such as the Shanghai Outer Ring Tunnel, without causing disruptive and costly shutdowns,” Khaw describes.

Among the most promising solutions they discussed was muography, which involves detecting the muons created when high-energy cosmic rays interact with Earth’s upper atmosphere. These muons can penetrate deep beneath Earth’s surface and are absorbed at highly predictable rates depending on the density of the material they pass through.

A simple version of muography involves placing a muon detector on the surface of an object and another detector beneath the object. By comparing the muon fluxes in the two detectors, the density of the object can be determined. By measuring the flux attenuation along different paths through the object, an image of the interior density of the object can be obtained.
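The flux-to-density step above can be sketched with a deliberately simplified model: assume the transmitted muon flux falls off exponentially with the opacity (density times path length), governed by a single effective attenuation coefficient. Real analyses integrate over the cosmic-muon energy spectrum rather than using one exponential, and the coefficient and geometry below are invented for illustration.

```python
import math

# Toy attenuation model (an illustration, not the team's analysis):
# transmitted flux fraction = exp(-K * rho * L), with K an invented
# effective attenuation coefficient.
K = 0.002  # cm^2/g, hypothetical

def transmitted_fraction(rho, length_cm, k=K):
    """Fraction of incident muons surviving a path through material
    of density rho (g/cm^3) and length length_cm (cm)."""
    return math.exp(-k * rho * length_cm)

def inferred_density(flux_ratio, length_cm, k=K):
    """Invert the model: mean density along the path from the
    measured ratio of transmitted to incident flux."""
    return -math.log(flux_ratio) / (k * length_cm)

rho_sediment = 1.8   # g/cm^3, typical water-saturated sediment
overburden = 1500.0  # 15 m of cover, in cm
f = transmitted_fraction(rho_sediment, overburden)
print(round(f, 4))                                # surviving fraction
print(round(inferred_density(f, overburden), 6))  # recovers 1.8
```

Comparing such inferred densities along many sightlines through the overburden is what turns a set of flux measurements into a density image.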

Muography has been used for several decades in areas as diverse as archaeology, volcanology and monitoring riverbanks. So far, however, its potential for monitoring underground infrastructure has gone largely untapped.

“We took this ‘old-school’ technique and pioneered its use in a completely new scenario: dynamically monitoring low-density, watery sediment build-up above a submerged, operational tunnel,” Khaw explains. “Our approach was not just in the hardware, but in integrating the detector data with a simplified tunnel model and validating it against environmental factors like river tides.”

With its durable, lightweight, and affordable design, the scintillator features a dual-layer configuration that suppresses background noise while capturing cosmic muons over a broad range of angles. Crucially, it is portable and could be discreetly positioned inside an underground tunnel to carry out real-time measurements, even as traffic flows.

Sediment profiles

To test the design, Khaw’s team took measurements along the full length of the Shanghai Outer Ring Tunnel while it was undergoing maintenance, allowing them to map out a profile of the sediment surrounding the tunnel. They then compared their muon-flux measurements with model predictions based on sediment profiles of the Huangpu River measured in previous years, and were pleased to obtain results that were better than anticipated.

“The most surprising and exciting discovery was a clear anti-correlation between muon flux and the tidal height of the Huangpu River,” Khaw describes. “We didn’t know the actual tidal height until we completed the measurement and checked tidal gauge data.” Unexpectedly, the detector proved highly effective at measuring the real-time height of water above the tunnel, with its detected flux closely following the ebb and flow of the tides.

Reassuringly, the team’s measurements showed that there are no as-yet unmapped obstructions or gaps in the sediment above the tunnel, thereby confirming the structure’s safety.

“Additionally, we have effectively shown a dual-purpose technology: it offers a reliable, non-invasive method for sediment monitoring and also reveals a new technique for tidal monitoring,” says Khaw. “This opens the possibility of using muon detectors as multi-functional sensors for comprehensive urban infrastructure and environmental oversight.”

The research is described in the Journal of Applied Physics.


Discovery of the Higgs boson at CERN inspires new stained-glass artwork

London-based artist Oksana Kondratyeva has created a new stained-glass artwork – entitled Discovery – that is inspired by the detection of the Higgs boson at CERN’s Large Hadron Collider (LHC) in 2012.

Born in Ukraine, Kondratyeva has a PhD in the theory of architecture and has an artist residency at the Romont Glass Museum (Vitromusée Romont) in Switzerland, where Discovery is currently exhibited.

In 2023 Kondratyeva travelled to visit the LHC at CERN, which she notes represents “more than a laboratory [but] a gateway to the unknown”.

“Discovery draws inspiration from the awe I felt standing at the frontier of human knowledge, where particles collide at unimaginable energies and new forms of matter are revealed,” Kondratyeva told Physics World.

Kondratyeva says that the focal point of the artwork – a circle structured with geometric precision – represents the collision of two high-energy protons.

The surrounding lead lines in the panel trace the trajectories of particle decays as they move through a magnetic field: right-curved lines represent positively charged particles, left-curved lines indicate negatively charged ones, while straight lines signify neutral particles unaffected by the magnetic field.
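The rule behind those lead lines is standard accelerator physics: a particle of momentum p (in GeV/c), charge q (in units of e) and field B (in tesla) bends with radius r = p/(0.3|q|B) metres, and the bend direction flips with the sign of the charge. A small sketch, with the left/right assignment mirroring the artwork’s convention (physically it depends on field and viewing orientation) and illustrative numbers of my choosing:

```python
# r = p / (0.3 * |q| * B), the standard rule of thumb with p in
# GeV/c, B in tesla, r in metres. Neutral particles go straight.

def bend_radius_m(p_gev, charge_e, b_tesla):
    """Radius of curvature in metres; infinite (straight) if neutral."""
    if charge_e == 0:
        return float("inf")
    return p_gev / (0.3 * abs(charge_e) * b_tesla)

def bend_direction(charge_e):
    # The left/right convention here follows the artwork's description.
    if charge_e > 0:
        return "right"
    if charge_e < 0:
        return "left"
    return "straight"

# Illustrative example: a 1 GeV/c track in a 3.8 T field.
print(bend_direction(+1), round(bend_radius_m(1.0, +1, 3.8), 3))  # right 0.877
print(bend_direction(0))                                          # straight
```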

The geometric composition within the central circle reflects the hidden symmetries of physical laws – patterns that only emerge when studying the behaviour of particle interactions.

Kondratyeva says that the use of mouth-blown flashed glass adds further depth to the piece, with colours and subtle shades moving from hot and luminous at the centre to cooler, more subdued tones toward the edges.

“Through glass, light and colour I sought to express the invisible forces and delicate symmetries that define our universe – ideas born in the realm of physics, yet deeply resonant in artistic expression,” notes Kondratyeva. “The work also continues a long tradition of stained glass as a medium of storytelling, reflecting the deep symmetries of nature and the human drive to find order in chaos.”

In 2022 Kondratyeva teamed up with Rigetti Computing to create a piece of art inspired by the packaging for a quantum chip. Entitled Per scientiam ad astra (through science to the stars), the artwork was displayed at the 2024 British Glass Biennale at the Ruskin Glass Centre in Stourbridge, UK.


The pros and cons of reinforcement learning in physical science

Today’s artificial intelligence (AI) systems are built on data generated by humans. They’re trained on huge repositories of writing, images and videos, most of which have been scraped from the Internet without the knowledge or consent of their creators. It’s a vast and sometimes ill-gotten treasure trove of information – but for machine-learning pioneer David Silver, it’s nowhere near enough.

“I think if you provide the knowledge that humans already have, it doesn’t really answer the deepest question for AI, which is how it can learn for itself to solve problems,” Silver told an audience at the 12th Heidelberg Laureate Forum (HLF) in Heidelberg, Germany, on Monday.

Silver’s proposed solution is to move from the “era of human data”, in which AI passively ingests information like a student cramming for an exam, into what he calls the “era of experience” in which it learns like a baby exploring its world. In his HLF talk on Monday, Silver played a sped-up video of a baby repeatedly picking up toys, manipulating them and putting them down while crawling and rolling around a room. To murmurs of appreciation from the audience, he declared, “I think that provides a different perspective of how a system might learn.”

Silver, a computer scientist at University College London, UK, has been instrumental in making this experiential learning happen in the virtual worlds of computer science and mathematics. As head of reinforcement learning at Google DeepMind, he led the development of AlphaZero, an AI system that taught itself to play the ancient stones-and-grid game of Go. It did this via a so-called “reward function” that pushed it to improve over many iterations, without ever being taught the game’s rules or strategy.
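The bare reward-driven loop behind this idea can be shown in miniature. The sketch below is a two-armed bandit with epsilon-greedy exploration: the agent is told nothing about the arms and learns purely from rewards. AlphaZero itself is vastly more sophisticated, and the payout probabilities here are invented for illustration.

```python
import random

random.seed(0)
PAYOUT = {0: 0.3, 1: 0.7}  # true win probabilities, hidden from the agent

value = {0: 0.0, 1: 0.0}   # running estimate of each arm's reward
count = {0: 0, 1: 0}
EPSILON = 0.1              # fraction of pulls spent exploring

for _ in range(5000):
    if random.random() < EPSILON:
        arm = random.choice([0, 1])       # explore at random
    else:
        arm = max(value, key=value.get)   # exploit the best estimate
    reward = 1.0 if random.random() < PAYOUT[arm] else 0.0
    count[arm] += 1
    value[arm] += (reward - value[arm]) / count[arm]  # incremental mean

best = max(value, key=value.get)
print(best, round(value[best], 2))  # the agent settles on the better arm
```

Nothing about the environment is supplied up front; the reward signal alone is enough for the estimates to converge on the better choice, which is the point Silver makes about learning from experience.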

More recently, Silver coordinated a follow-up project called AlphaProof that treats formal mathematics as a game. In this case, AlphaZero’s reward is based on getting correct proofs. While it isn’t yet outperforming the best human mathematicians, in 2024 it achieved silver-medal standard on problems at the International Mathematical Olympiad.

Learning in the physics playroom

Could a similar experiential learning approach work in the physical sciences? At an HLF panel discussion on Tuesday afternoon, particle physicist Thea Klaeboe Åarrestad began by outlining one possible application. Whenever CERN’s Large Hadron Collider (LHC) is running, Åarrestad explained, she and her colleagues in the CMS experiment must control the magnets that keep protons on the right path as they zoom around the collider. Currently, this task is performed by a person, working in real time.

Up for discussion: a panel discussion on machine learning in physical sciences at the Heidelberg Laureate Forum. l-r: moderator George Musser, Kyle Cranmer, Thea Klaeboe Åarrestad, David Silver and Maia Fraser. (Courtesy: Bernhard Kreutzer/HLFF)

In principle, Åarrestad continued, a reinforcement-learning AI could take over that job after learning by experience what works and what doesn’t. There’s just one problem: if it got anything wrong, the protons would smash into a wall and melt the beam pipe. “You don’t really want to do that mistake twice,” Åarrestad deadpanned.

For Åarrestad’s fellow panellist Kyle Cranmer, a particle physicist who works on data science and machine learning at the University of Wisconsin-Madison, US, this nightmare scenario symbolizes the challenge with using reinforcement learning in physical sciences. In situations where you’re able to do many experiments very quickly and essentially for free – as is the case with AlphaGo and its descendants – you can expect reinforcement learning to work well, Cranmer explained. But once you’re interacting with a real, physical system, even non-destructive experiments require finite amounts of time and money.

Another challenge, Cranmer continued, is that particle physics already has good theories that predict some quantities to multiple decimal places. “It’s not low-hanging fruit for getting an AI to come up with a replacement framework de novo,” Cranmer said. A better option, he suggested, might be to put AI to work on modelling atmospheric fluid dynamics, which are emergent phenomena without first-principles descriptions. “Those are super-exciting places to use ideas from machine learning,” he said.

Not for nuclear arsenals

Silver, who was also on Tuesday’s panel, agreed that reinforcement learning isn’t always the right solution. “We should do this in areas where mistakes are small and it can learn from those small mistakes to avoid making big mistakes,” he said. To general laughter, he added that he would not recommend “letting an AI loose on nuclear arsenals”, either.

Reinforcement learning aside, both Åarrestad and Cranmer are highly enthusiastic about AI. For Cranmer, one of the most exciting aspects of the technology is the way it gets scientists from different disciplines talking to each other. The HLF, which aims to connect early-career researchers with senior figures in mathematics and computer science, is itself a good example, with many talks in the weeklong schedule devoted to AI in one form or another.

For Åarrestad, though, AI’s most exciting possibility relates to physics itself. Because the LHC produces far more data than humans and present-day algorithms can handle, Åarrestad explained, much of it is currently discarded. The idea that, as a result, she and her colleagues could be throwing away major discoveries sometimes keeps her up at night. “Is there new physics below 1 TeV?” Åarrestad wondered.

Someday, maybe, an AI might be able to tell us.


Relive the two decades when physicists basked in the afterglow of the Standard Model

Tunnel vision: the successful consolidation of particle physics in the 1980s and 1990s, typified by work at the Large Electron–Positron collider, is the theme of a symposium held at CERN from 10–13 November 2025. (Courtesy: CERN)

Call it millennial, generation Y or fin de siècle, high-energy physics during the last two decades of the 20th century had a special flavour. The principal pieces of the Standard Model of particle physics had come together remarkably tightly – so tightly, in fact, that physicists had to rethink what instruments to build, what experiments to plan, and what theories to develop to move forward. But it was also an era when the hub of particle physics moved from the US to Europe.

The momentous events of the 1980s and 1990s will be the focus of the 4th International Symposium on the History of Particle Physics, which is being held on 10–13 November at CERN. The meeting will take place more than four decades after the first symposium in the series was held at Fermilab near Chicago in 1980. Entitled The Birth of Particle Physics, that initial meeting covered the years 1930 to 1950.

Speakers back then included trailblazers such as Paul Dirac, Julian Schwinger and Victor Weisskopf. They reviewed discoveries such as the neutron and the positron and the development of relativistic quantum field theory. Those two decades before 1950 were a time when particle physicists “constructed the room”, so to speak, in which the discipline would be based.

The second symposium – Pions to Quarks – was also held at Fermilab and covered the 1950s. Accelerators could now create particles seen in cosmic-ray collisions, populating what Robert Oppenheimer called the “particle zoo”. Certain discoveries of this era, such as parity violation in the weak interaction, were so shocking that C N Yang likened it to having a blackout and not knowing if the room would look the same when the lights came back on. Speakers at that 1985 event included Luis Alvarez, Val Fitch, Abdus Salam, Robert Wilson and Yang himself.

The third symposium, The Rise of the Standard Model, was held in Stanford, California, in 1992 and covered the 1960s and 1970s. It was a time not of blackouts but of disruptions that dimmed the lights. Charge-parity violation and the existence of two types of neutrino were found in the 1960s, followed in the 1970s by deep inelastic electron scattering and quarks, neutral currents, a fourth quark and gluon jets.

These discoveries decimated alternative approaches to quantum field theory, which was duly established for good as the skeleton of high-energy physics. The era culminated with Sheldon Glashow, Abdus Salam and Steven Weinberg winning the 1979 Nobel Prize for Physics for their part in establishing the Standard Model. Speakers at that third symposium included Murray Gell-Mann, Leon Lederman and Weinberg himself.

Changing times

The upcoming CERN event, on whose programme committee I serve, will start exactly where the previous symposium ended. “1980 is a natural historical break,” says conference co-organizer Michael Riordan, who won the 2025 Abraham Pais Prize for History of Physics. “It begins a period of the consolidation of the Standard Model. Colliders became the main instruments, and were built with specific standard-model targets in mind. And the centre of gravity of the discipline moved across the Atlantic to Europe.”

The conference will address physics that took place at CERN’s Super Proton Synchrotron (SPS), where the W and Z particles were discovered in 1983. It will also examine the SPS’s successor, the Large Electron–Positron (LEP) collider. Opened in 1989, it was used to make precise measurements of these particles and other predictions of the Standard Model until it was controversially shut down in 2000 to make way for the Large Hadron Collider (LHC).

There will be coverage as well of failed accelerator projects, which – perhaps perversely – can be equally interesting and revealing as successful facilities

Speakers at the meeting will also discuss Fermilab’s Tevatron, where the top quark – another Standard Model component – was found in 1995. Work at the Stanford Linear Accelerator Center, DESY in Germany, and Tsukuba, Japan, will be tackled too. There will be coverage as well of failed accelerator projects, which – perhaps perversely – can be equally interesting and revealing as successful facilities.

In particular, I will speak about ISABELLE, a planned and partially built proton–proton collider at Brookhaven National Laboratory, which was terminated in 1983 to make way for the far more ambitious Superconducting Super Collider (SSC). ISABELLE was then transformed into the Relativistic Heavy Ion Collider (RHIC), which was completed in 1999 and took nuclear physics into the high-energy regime.

Riordan will talk about the fate of the SSC, which was supposed to discover the Higgs boson or whatever else plays its mass-generating role. But in 1993 the US Congress terminated that project, a traumatic episode for US physics, about which Riordan co-authored the book Tunnel Visions. Its cancellation signalled the end of the glory years for US particle physics and the realization of the need for international collaborations in ever-costlier accelerator projects.

The CERN meeting will also explore more positive developments such as the growing convergence of particle physics and cosmology during the 1980s and 1990s. During that time, researchers stepped up their studies of dark matter, neutrino oscillations and supernovas. It was a period that saw the construction of underground detectors at Gran Sasso in Italy and Kamiokande in Japan.

Other themes to be explored include the development of the Web – which transformed the world – as well as the impact of globalization, the end of the Cold War, the rise of high-energy physics in China, and the fate of physics in Russia, other former Soviet republics and former Eastern Bloc countries. While particle physics became more global, it also grew more dependent on, and vulnerable to, changing political ambitions, economic realities and international collaborations. The growing importance of diversity, communication and knowledge transfer will be looked at too.

The critical point

The years between 1980 and 2000 were a distinct period in the history of particle physics, one that unfolded in the afterglow of the triumph of the Standard Model. The lights in high-energy physics did not go out or even dim, to use Yang’s metaphor. Instead, the Standard Model shed so much light on the field that effort and excitement focused on consolidating the model.

Particle physics, during those years, was all about finding the deeply hidden outstanding pieces, developing the theory, and connecting with other areas of physics. The triumph was so complete that physicists began to wonder what bigger and more comprehensive structure the Standard Model’s “room” might be embedded in – what was “beyond the Standard Model”. A quarter of a century on, that attempt to make out the larger structure is still ongoing.

The post Relive the two decades when physicists basked in the afterglow of the Standard Model appeared first on Physics World.

Top quarks embrace in quasi-bound toponium

For decades, physicists believed that the top quark, the heaviest known subatomic particle, was too short-lived to form a temporary pair with its antimatter partner. Unlike lighter quarks, which can combine to form protons, neutrons, or longer-lived quark–antiquark pairs, the top quark decays almost instantly. This made the idea of a top–antitop bound state – a fleeting association held together by the strong force – seem impossible. But now, the CMS collaboration at the Large Hadron Collider (LHC) has found the first evidence of such a state, which is dubbed toponium.

Gautier Hamel de Monchenault, spokesperson for CMS, explains, “Many physicists long believed this was impossible. That’s why this result is so significant — it challenges assumptions that have been around for decades, and particle physics textbooks will likely need to be updated because of it.”

Protons and neutrons are formed from quarks, which are fundamental particles that cannot be broken down into smaller constituents.

“There are six types of quark,” explains the German physicist Christian Schwanenberger, who is at DESY and the University of Hamburg and was not involved in the study. “Five of them form bound states thanks to the strong force, one of the four fundamental forces of nature. The top quark, however, is somehow different. It is the heaviest fundamental particle we know, but so far we have not observed it forming bound states in the same way the others do.”

Quasi-bound state

The top quark’s extreme mass makes it decay almost immediately after it is produced. “The top and antitop quarks just have time to exchange a few gluons, the carriers of the strong force, before one of them decays, hence the appellation ‘quasi-bound state’,” Hamel de Monchenault explains.

By detecting these ephemeral interactions, physicists can observe the strong force in a new regime – and the CMS team developed a clever new method to do so. The breakthrough came when the team examined how the spins of the top quark and antitop quark influence each other to create a subtle signature in the particles produced when the quarks decay.

Top quarks are produced in proton–proton collisions at the LHC, where they quickly decay. Each top quark decays into a bottom quark, which forms a jet of particles that can be detected, and a W boson, which itself decays into lighter particles (leptons) such as electrons or muons, accompanied by neutrinos.

“We can detect the charged leptons directly and measure their energy very precisely, but we have to infer the presence of the neutrinos indirectly, through an imbalance of the total energy measured,” says Hamel de Monchenault. By studying the pattern and energy of the leptons and jets, the CMS team deduced the existence of top–antitop pairs and spotted the subtle signature of the fleeting quasi-bound state.
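The energy-imbalance trick can be illustrated with a toy calculation: in the plane transverse to the beam, the momenta of everything detected should sum to zero, so any deficit is attributed to the escaping neutrinos. The momentum values below are invented for illustration and are not from the CMS analysis:

```python
# Toy sketch of inferring missing transverse momentum (all values invented):
# the neutrinos' transverse momentum must balance everything that was detected.
visible = [(-25.0, 40.0), (60.0, -10.0), (-15.0, -20.0)]  # (px, py) of jets and leptons, in GeV

px_miss = -sum(px for px, _ in visible)
py_miss = -sum(py for _, py in visible)
met = (px_miss**2 + py_miss**2) ** 0.5  # magnitude of the missing transverse momentum

print(px_miss, py_miss, round(met, 1))  # -20.0 -10.0 22.4
```

The real analysis reconstructs this imbalance from every calorimeter deposit and track, but the balancing principle is the same.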

Statistical significance

The CMS researchers observed an excess of events in which the top and antitop quarks were produced almost at rest relative to each other – the precise condition needed for a quasi-bound state to form. “The signal has a statistical significance above 5σ, which means the chance it’s just a statistical fluctuation is less than one in a few million,” Hamel de Monchenault says.
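The quoted odds can be checked directly: a 5σ excess corresponds to the upper-tail probability of a standard normal distribution, which the Python standard library can compute via the complementary error function:

```python
from math import erfc, sqrt

def one_sided_p_value(n_sigma: float) -> float:
    # Probability of a standard normal fluctuation beyond n_sigma
    return 0.5 * erfc(n_sigma / sqrt(2.0))

p = one_sided_p_value(5.0)
print(f"p = {p:.2e}, i.e. about 1 in {1 / p:,.0f}")  # p = 2.87e-07, about 1 in 3.5 million
```

This matches the “less than one in a few million” figure quoted above.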

While this excess accounts for only about 1% of top quark pair production, it aligns with predictions for toponium formation and offers insights into the strong force.

“Within the achieved precision, the result matches the predictions of advanced calculations involving the strong force,” explains Hamel de Monchenault. “An effect once thought too subtle to detect with current technology has now been observed. It’s comforting in a way: even the heaviest known quarks are not always alone – they can briefly embrace their opposites.”

Future directions

The discovery has energized the particle physics community. “Scientists are excited to explore the strong force in a completely new regime,” says Schwanenberger. Researchers will refine theoretical models, simulate toponium more precisely, and study its decay patterns and excited states. Much of this work will rely on the High-Luminosity LHC, expected to start operations around 2030, and potentially on future electron–positron colliders capable of studying top quarks with unprecedented precision.

“The present results are based on LHC data recorded between 2015 and 2018 [Run 2]. Since 2022, ATLAS and CMS are recording data at a slightly higher energy, which is favourable for top quark production. The amount of data already surpasses that of Run 2, and we expect that with such huge amounts of data, the properties of this new signal can be studied in detail,” Hamel de Monchenault says.

This research could ultimately answer a fundamental question: is the top quark simply another quark like its lighter siblings, or could it hold the key to physics beyond the Standard Model? “Investigating different toponium states will be a key part of the top quark research programme,” Schwanenberger says. “It could reshape our understanding of matter itself and reveal whatever holds the world together in its inmost folds.”

The results are published in Reports on Progress in Physics.

The post Top quarks embrace in quasi-bound toponium appeared first on Physics World.

Tritium and helium targets shed light on three-nucleon interactions

An experiment that scattered high-energy electrons from helium-3 and tritium nuclei has provided the first evidence for three-nucleon short-range correlations. The data were taken in 2018 at Jefferson Lab in the US and further studies of these correlations could improve our understanding of both atomic nuclei and neutron stars.

Atomic nuclei contain nucleons (protons and neutrons) that are bound together by the strong force. These nucleons are not static and they can move rapidly about the nucleus. While nucleons can move independently, they can also move as correlated pairs, trios and larger groupings. Studying this correlated motion can provide important insights into interactions between nucleons – interactions that define the structures of tiny nuclei and huge neutron stars.

The momenta of nucleons can be measured by scattering a beam of high-energy electrons from nuclei. This is because the de Broglie wavelength of these electrons is smaller than the size of the nucleons – allowing individual nucleons to be isolated. During the scattering process, momentum is exchanged between a nucleon and an electron, and how this occurs provides important insights into the correlations between nucleons.
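As a rough back-of-envelope check (not from the article), the de Broglie wavelength of an ultrarelativistic electron is λ = h/p ≈ hc/E, so a beam of a few GeV already probes below the roughly 1 fm size of a nucleon:

```python
# Back-of-envelope resolution estimate (illustrative beam energy, not from the study)
HC_GEV_FM = 1.2398  # the conversion constant hc, in GeV*fm

def de_broglie_fm(energy_gev: float) -> float:
    """Wavelength of an ultrarelativistic electron, for which E is approximately pc."""
    return HC_GEV_FM / energy_gev

# A nucleon is roughly 1 fm across, so a few-GeV beam can resolve individual nucleons
print(f"{de_broglie_fm(4.0):.2f} fm")  # 0.31 fm
```

Jefferson Lab’s multi-GeV electron beam therefore sits comfortably in the regime where single nucleons can be picked out.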

Electron scattering has already revealed that most of the momentum in nuclei is associated with single nucleons, with some also assigned to correlated pairs. These experiments also suggested that nuclei have additional momenta that had not been accounted for.

Small but important

“We know that the three-nucleon interaction is important in the description of nuclear properties, even though it’s a very small contribution,” explains John Arrington at the Lawrence Berkeley National Laboratory in the US. “Until now, there’s never really been any indication that we’d observed them at all. This work provides a first glimpse at them.”

In 2018, Arrington and others did a series of electron-scattering experiments at Jefferson Lab with helium-3 and tritium targets. Now Arrington and an international team of physicists has scoured this scattering data for evidence of short-range, three-nucleon correlations.

Studying these correlations in nuclei with just three nucleons is advantageous because there are no correlations among four or more nucleons. Such higher-order correlations would make it more difficult to isolate three-nucleon effects in the scattering data.

A further benefit of looking at tritium and helium-3 is that they are “mirror nuclei”. Tritium comprises one proton and two neutrons, while helium-3 comprises two protons and a neutron. The strong force that binds nucleons together acts equally on protons and neutrons. However, there are subtle differences in how protons and neutrons interact with each other – and these differences can be studied by comparing tritium and helium-3 electron scattering experiments.

A clean picture

“We’re trying to show that it’s possible to study three-nucleon correlations at Jefferson Lab even though we can’t get the energies necessary to do these studies in heavy nuclei,” says principal investigator Shujie Li, at Lawrence Berkeley. “These light systems give us a clean picture — that’s the reason we put in the effort of getting a radioactive target material.”

Both helium-3 and tritium are rare isotopes of their respective elements. Helium-3 is produced from the radioactive decay of tritium, which itself is produced in nuclear reactors. Tritium is a difficult isotope to work with because it is used to make nuclear weapons; has a half-life of about 12 years; and is toxic when ingested or inhaled. To succeed, the team had to create a special cryogenic chamber to contain their target of tritium gas.

Analysis of the scattering experiments revealed tantalizing hints of three-nucleon short-range correlations. Further investigation is needed to determine exactly how the correlations occur. Three nucleons could become correlated simultaneously, for example, or an existing correlated pair could become correlated to a third nucleon.

Three-nucleon interactions are believed to play an important role in the properties of neutron stars, so further investigation into some of the smallest of nuclei could shed light on the inner workings of much more massive objects. “It’s much easier to study a three-nucleon correlation in the lab than in a neutron star,” says Arrington.

The research is described in Physics Letters B.

The post Tritium and helium targets shed light on three-nucleon interactions appeared first on Physics World.
