The pros and cons of reinforcement learning in physical science

Today’s artificial intelligence (AI) systems are built on data generated by humans. They’re trained on huge repositories of writing, images and videos, most of which have been scraped from the Internet without the knowledge or consent of their creators. It’s a vast and sometimes ill-gotten treasure trove of information – but for machine-learning pioneer David Silver, it’s nowhere near enough.

“I think if you provide the knowledge that humans already have, it doesn’t really answer the deepest question for AI, which is how it can learn for itself to solve problems,” Silver told an audience at the 12th Heidelberg Laureate Forum (HLF) in Heidelberg, Germany, on Monday.

Silver’s proposed solution is to move from the “era of human data”, in which AI passively ingests information like a student cramming for an exam, into what he calls the “era of experience” in which it learns like a baby exploring its world. In his HLF talk on Monday, Silver played a sped-up video of a baby repeatedly picking up toys, manipulating them and putting them down while crawling and rolling around a room. To murmurs of appreciation from the audience, he declared, “I think that provides a different perspective of how a system might learn.”

Silver, a computer scientist at University College London, UK, has been instrumental in making this experiential learning happen in the virtual worlds of computer science and mathematics. As head of reinforcement learning at Google DeepMind, he led the development of AlphaZero, an AI system that taught itself to play the ancient stones-and-grid game of Go. It did this via a so-called “reward function” that pushed it to improve over many iterations, without ever being taught the game’s rules or strategy.
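The core loop behind such systems can be sketched in a few lines. What follows is a minimal, illustrative example of tabular Q-learning – a textbook reinforcement-learning algorithm, not the far more sophisticated self-play, neural-network and tree-search machinery inside AlphaZero – showing how a reward signal alone, with no rules or strategy supplied, can shape behaviour:

```python
import random

# A toy "game": states 0..4 on a line; reaching state 4 wins (+1 reward).
# Action 0 moves left, action 1 moves right. The agent is never told this --
# it discovers the winning strategy purely from the reward signal.
N_STATES, ACTIONS = 5, (0, 1)
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.1  # learning rate, discount, exploration

for episode in range(500):
    s = 0
    while s != N_STATES - 1:
        # Epsilon-greedy: mostly exploit current knowledge, sometimes explore
        a = random.choice(ACTIONS) if random.random() < epsilon \
            else max(ACTIONS, key=lambda act: Q[(s, act)])
        s_next = max(0, s - 1) if a == 0 else s + 1
        reward = 1.0 if s_next == N_STATES - 1 else 0.0  # the reward function
        # Q-learning update: nudge the estimate toward reward + best future value
        best_next = max(Q[(s_next, act)] for act in ACTIONS)
        Q[(s, a)] += alpha * (reward + gamma * best_next - Q[(s, a)])
        s = s_next

# The learned policy: in every state the best action is "move right"
print({s: max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(N_STATES - 1)})
```

The same act–observe–update cycle, scaled up enormously, is what lets a system like AlphaZero improve from nothing but its reward function.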

More recently, Silver coordinated a follow-up project called AlphaProof that treats formal mathematics as a game. In this case, AlphaZero’s reward is based on getting correct proofs. While it isn’t yet outperforming the best human mathematicians, in 2024 it achieved silver-medal standard on problems at the International Mathematical Olympiad.

Learning in the physics playroom

Could a similar experiential learning approach work in the physical sciences? At an HLF panel discussion on Tuesday afternoon, particle physicist Thea Klaeboe Åarrestad began by outlining one possible application. Whenever CERN’s Large Hadron Collider (LHC) is running, Åarrestad explained, she and her colleagues in the CMS experiment must control the magnets that keep protons on the right path as they zoom around the collider. Currently, this task is performed by a person, working in real time.

Up for discussion: A panel discussion on machine learning in physical sciences at the Heidelberg Laureate Forum. l-r: Moderator George Musser, Kyle Cranmer, Thea Klaeboe Åarrestad, David Silver and Maia Fraser. (Courtesy: Bernhard Kreutzer/HLFF)

In principle, Åarrestad continued, a reinforcement-learning AI could take over that job after learning by experience what works and what doesn’t. There’s just one problem: if it got anything wrong, the protons would smash into a wall and melt the beam pipe. “You don’t really want to do that mistake twice,” Åarrestad deadpanned.

For Åarrestad’s fellow panellist Kyle Cranmer, a particle physicist who works on data science and machine learning at the University of Wisconsin-Madison, US, this nightmare scenario sums up the challenge of using reinforcement learning in the physical sciences. In situations where you’re able to do many experiments very quickly and essentially for free – as is the case with AlphaGo and its descendants – you can expect reinforcement learning to work well, Cranmer explained. But once you’re interacting with a real, physical system, even non-destructive experiments require finite amounts of time and money.

Another challenge, Cranmer continued, is that particle physics already has good theories that predict some quantities to multiple decimal places. “It’s not low-hanging fruit for getting an AI to come up with a replacement framework de novo,” Cranmer said. A better option, he suggested, might be to put AI to work on modelling atmospheric fluid dynamics, which are emergent phenomena without first-principles descriptions. “Those are super-exciting places to use ideas from machine learning,” he said.

Not for nuclear arsenals

Silver, who was also on Tuesday’s panel, agreed that reinforcement learning isn’t always the right solution. “We should do this in areas where mistakes are small and it can learn from those small mistakes to avoid making big mistakes,” he said. To general laughter, he added that he would not recommend “letting an AI loose on nuclear arsenals”, either.

Reinforcement learning aside, both Åarrestad and Cranmer are highly enthusiastic about AI. For Cranmer, one of the most exciting aspects of the technology is the way it gets scientists from different disciplines talking to each other. The HLF, which aims to connect early-career researchers with senior figures in mathematics and computer science, is itself a good example, with many talks in the weeklong schedule devoted to AI in one form or another.

For Åarrestad, though, AI’s most exciting possibility relates to physics itself. Because the LHC produces far more data than humans and present-day algorithms can handle, Åarrestad explained, much of it is currently discarded. The idea that, as a result, she and her colleagues could be throwing away major discoveries sometimes keeps her up at night. “Is there new physics below 1 TeV?” Åarrestad wondered.

Someday, maybe, an AI might be able to tell us.


  •  

Relive the two decades when physicists basked in the afterglow of the Standard Model

Tunnel vision The successful consolidation of particle physics in the 1980s and 1990s, typified by work at the Large Electron–Positron collider, is the theme of a symposium held at CERN from 10–13 November 2025. (Courtesy: CERN)

Call it millennial, generation Y or fin de siècle, high-energy physics during the last two decades of the 20th century had a special flavour. The principal pieces of the Standard Model of particle physics had come together remarkably tightly – so tightly, in fact, that physicists had to rethink what instruments to build, what experiments to plan, and what theories to develop to move forward. But it was also an era when the hub of particle physics moved from the US to Europe.

The momentous events of the 1980s and 1990s will be the focus of the 4th International Symposium on the History of Particle Physics, which is being held on 10–13 November at CERN. The meeting will take place more than four decades after the first symposium in the series was held at Fermilab near Chicago in 1980. Entitled The Birth of Particle Physics, that initial meeting covered the years 1930 to 1950.

Speakers back then included trailblazers such as Paul Dirac, Julian Schwinger and Victor Weisskopf. They reviewed discoveries such as the neutron and the positron and the development of relativistic quantum field theory. Those two decades before 1950 were a time when particle physicists “constructed the room”, so to speak, in which the discipline would be based.

The second symposium – Pions to Quarks – was also held at Fermilab and covered the 1950s. Accelerators could now create particles seen in cosmic-ray collisions, populating what Robert Oppenheimer called the “particle zoo”. Certain discoveries of this era, such as parity violation in the weak interaction, were so shocking that C N Yang likened it to having a blackout and not knowing if the room would look the same when the lights came back on. Speakers at that 1985 event included Luis Alvarez, Val Fitch, Abdus Salam, Robert Wilson and Yang himself.

The third symposium, The Rise of the Standard Model, was held in Stanford, California, in 1992 and covered the 1960s and 1970s. It was a time not of blackouts but of disruptions that dimmed the lights. Charge-parity violation and the existence of two types of neutrino were found in the 1960s, followed in the 1970s by deep inelastic electron scattering and quarks, neutral currents, a fourth quark and gluon jets.

These discoveries decimated alternative approaches to quantum field theory, which was duly established for good as the skeleton of high-energy physics. The era culminated with Sheldon Glashow, Abdus Salam and Steven Weinberg winning the 1979 Nobel Prize for Physics for their part in establishing the Standard Model. Speakers at that third symposium included Murray Gell-Mann, Leon Lederman and Weinberg himself.

Changing times

The upcoming CERN event, on whose programme committee I serve, will start exactly where the previous symposium ended. “1980 is a natural historical break,” says conference co-organizer Michael Riordan, who won the 2025 Abraham Pais Prize for History of Physics. “It begins a period of the consolidation of the Standard Model. Colliders became the main instruments, and were built with specific standard-model targets in mind. And the centre of gravity of the discipline moved across the Atlantic to Europe.”

The conference will address physics that took place at CERN’s Super Proton Synchrotron (SPS), where the W and Z particles were discovered in 1983. It will also examine the SPS’s successor – the Large Electron–Positron (LEP) collider. Opened in 1989, it was used to make precise measurements of these particles and other predictions of the Standard Model until it was controversially shut down in 2000 to make way for the Large Hadron Collider (LHC).

Speakers at the meeting will also discuss Fermilab’s Tevatron, where the top quark – another Standard Model component – was found in 1995. Work at the Stanford Linear Accelerator Center, DESY in Germany, and KEK in Tsukuba, Japan, will be tackled too. There will be coverage as well of failed accelerator projects, which – perhaps perversely – can be just as interesting and revealing as successful facilities.

In particular, I will speak about ISABELLE, a planned and partially built proton–proton collider at Brookhaven National Laboratory, which was terminated in 1983 to make way for the far more ambitious Superconducting Super Collider (SSC). ISABELLE was then transformed into the Relativistic Heavy Ion Collider (RHIC), which was completed in 1999 and took nuclear physics into the high-energy regime.

Riordan will talk about the fate of the SSC, which was supposed to discover the Higgs boson or whatever else plays its mass-generating role. But in 1993 the US Congress terminated that project, a traumatic episode for US physics, about which Riordan co-authored the book Tunnel Visions. Its cancellation signalled the end of the glory years for US particle physics and the realization of the need for international collaborations in ever-costlier accelerator projects.

The CERN meeting will also explore more positive developments such as the growing convergence of particle physics and cosmology during the 1980s and 1990s. During that time, researchers stepped up their studies of dark matter, neutrino oscillations and supernovas. It was a period that saw the construction of underground detectors at Gran Sasso in Italy and Kamiokande in Japan.

Other themes to be explored include the development of the Web – which transformed the world – the impact of globalization and the end of the Cold War, the rise of high-energy physics in China, and physics in Russia, former Soviet republics and former Eastern Bloc countries. While particle physics became more global, it also grew more dependent on, and vulnerable to, changing political ambitions, economic realities and international collaborations. The growing importance of diversity, communication and knowledge transfer will be looked at too.

The critical point

The years between 1980 and 2000 were a distinct period in the history of particle physics, one that unfolded in the afterglow of the triumph of the Standard Model. The lights in high-energy physics did not go out or even dim, to use Yang’s metaphor. Instead, the Standard Model shed so much light on the field that the effort and excitement focused on consolidating the model.

Particle physics, during those years, was all about finding the deeply hidden outstanding pieces, developing the theory, and connecting with other areas of physics. The triumph was so complete that physicists began to wonder what bigger and more comprehensive structure the Standard Model’s “room” might be embedded in – what was “beyond the Standard Model”. A quarter of a century on, our attempts to make out that structure are still ongoing.


  •  

Top quarks embrace in quasi-bound toponium

For decades, physicists believed that the top quark, the heaviest known subatomic particle, was too short-lived to form a temporary pair with its antimatter partner. Unlike lighter quarks, which can combine to form protons, neutrons, or longer-lived quark–antiquark pairs, the top quark decays almost instantly. This made the idea of a top–antitop bound state – a fleeting association held together by the strong force – seem impossible. But now, the CMS collaboration at the Large Hadron Collider (LHC) has found the first evidence of such a state, which is dubbed toponium.

Gautier Hamel de Monchenault, spokesperson for CMS, explains, “Many physicists long believed this was impossible. That’s why this result is so significant — it challenges assumptions that have been around for decades, and particle physics textbooks will likely need to be updated because of it.”

Protons and neutrons are formed from quarks, which are fundamental particles that cannot be broken down into smaller constituents.

“There are six types of quark,” explains the German physicist Christian Schwanenberger, who is at DESY and the University of Hamburg and was not involved in the study. “Five of them form bound states thanks to the strong force, one of the four fundamental forces of nature. The top quark, however, is somehow different. It is the heaviest fundamental particle we know, but so far we have not observed it forming bound states in the same way the others do.”

Quasi-bound state

The top quark’s extreme mass makes it decay almost immediately after it is produced. “The top and antitop quarks just have time to exchange a few gluons, the carriers of the strong force, before one of them decays, hence the appellation ‘quasi-bound state’,” Hamel de Monchenault explains.

By detecting these ephemeral interactions, physicists can observe the strong force in a new regime – and the CMS team developed a clever new method to do so. The breakthrough came when the team examined how the spins of the top quark and antitop quark influence each other to create a subtle signature in the particles produced when the quarks decay.

Top quarks are produced in proton–proton collisions at the LHC, where each quickly decays into a bottom quark and a W boson. The bottom quarks form jets of particles, which can be detected, while the W bosons decay into lighter particles (leptons) such as electrons or muons, accompanied by neutrinos.

“We can detect the charged leptons directly and measure their energy very precisely, but we have to infer the presence of the neutrinos indirectly, through an imbalance of the total energy measured,” says Hamel de Monchenault. By studying the pattern and energy of the leptons and jets, the CMS team deduced the existence of top–antitop pairs and spotted the subtle signature of the fleeting quasi-bound state.

Statistical significance

The CMS researchers observed an excess of events in which the top and antitop quarks were produced almost at rest relative to each other – the precise condition needed for a quasi-bound state to form. “The signal has a statistical significance above 5σ, which means the chance it’s just a statistical fluctuation is less than one in a few million,” Hamel de Monchenault says.
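To unpack the 5σ convention: the quoted probability is just the one-sided tail of a standard normal distribution, which a one-line calculation reproduces. A minimal sketch using scipy:

```python
from scipy.stats import norm

# One-sided p-value for a 5-sigma excess: P(Z > 5) for a standard normal
p = norm.sf(5.0)                   # survival function, 1 - CDF
print(p)                           # ~2.9e-7
print(f"about 1 in {1 / p:,.0f}")  # roughly 1 in 3.5 million
```

This matches Hamel de Monchenault’s “less than one in a few million”.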

While this excess accounts for only about 1% of top quark pair production, it aligns with predictions for toponium formation and offers insights into the strong force.

“Within the achieved precision, the result matches the predictions of advanced calculations involving the strong force,” explains Hamel de Monchenault. “An effect once thought too subtle to detect with current technology has now been observed. It’s comforting in a way: even the heaviest known quarks are not always alone – they can briefly embrace their opposites.”

Future directions

The discovery has energized the particle physics community. “Scientists are excited to explore the strong force in a completely new regime,” says Schwanenberger. Researchers will refine theoretical models, simulate toponium more precisely, and study its decay patterns and excited states. Much of this work will rely on the High-Luminosity LHC, expected to start operations around 2030, and potentially on future electron–positron colliders capable of studying top quarks with unprecedented precision.

“The present results are based on LHC data recorded between 2015 and 2018 [Run 2]. Since 2022, ATLAS and CMS are recording data at a slightly higher energy, which is favourable for top quark production. The amount of data already surpasses that of Run 2, and we expect that with such huge amounts of data, the properties of this new signal can be studied in detail,” Hamel de Monchenault says.

This research could ultimately answer a fundamental question: is the top quark simply another quark like its lighter siblings, or could it hold the key to physics beyond the Standard Model? “Investigating different toponium states will be a key part of the top quark research programme,” Schwanenberger says. “It could reshape our understanding of matter itself and reveal whatever holds the world together in its inmost folds.”

The results are published in Reports on Progress in Physics.


  •  

Tritium and helium targets shed light on three-nucleon interactions

An experiment that scattered high-energy electrons from helium-3 and tritium nuclei has provided the first evidence for three-nucleon short-range correlations. The data were taken in 2018 at Jefferson Lab in the US and further studies of these correlations could improve our understanding of both atomic nuclei and neutron stars.

Atomic nuclei contain nucleons (protons and neutrons) that are bound together by the strong force. These nucleons are not static and they can move rapidly about the nucleus. While nucleons can move independently, they can also move as correlated pairs, trios and larger groupings. Studying this correlated motion can provide important insights into interactions between nucleons – interactions that define the structures of tiny nuclei and huge neutron stars.

The momenta of nucleons can be measured by scattering a beam of high-energy electrons from nuclei. This is because the de Broglie wavelength of these electrons is smaller than the size of the nucleons – allowing individual nucleons to be isolated. During the scattering process, momentum is exchanged between a nucleon and an electron, and how this occurs provides important insights into the correlations between nucleons.
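The numbers here are worth a quick back-of-the-envelope check. For ultrarelativistic electrons the de Broglie wavelength is λ ≈ hc/E; the sketch below uses an illustrative beam energy of 10 GeV (Jefferson Lab’s CEBAF accelerator delivers electron beams of up to 12 GeV):

```python
# de Broglie wavelength of an ultrarelativistic electron: lambda = h*c / E
HC_GEV_FM = 1.2398   # Planck's constant times c, in GeV*femtometres
E_BEAM = 10.0        # illustrative beam energy in GeV (CEBAF reaches 12 GeV)

wavelength = HC_GEV_FM / E_BEAM
print(f"{wavelength:.2f} fm")  # ~0.12 fm, well below a nucleon's ~1.7 fm diameter
```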

Electron scattering has already revealed that most of the momentum in nuclei is associated with single nucleons, with some also assigned to correlated pairs. These experiments also suggested that nuclei have additional momenta that had not been accounted for.

Small but important

“We know that the three-nucleon interaction is important in the description of nuclear properties, even though it’s a very small contribution,” explains John Arrington at the Lawrence Berkeley National Laboratory in the US. “Until now, there’s never really been any indication that we’d observed them at all. This work provides a first glimpse at them.”

In 2018, Arrington and others did a series of electron-scattering experiments at Jefferson Lab with helium-3 and tritium targets. Now Arrington and an international team of physicists has scoured this scattering data for evidence of short-range, three-nucleon correlations.

Studying these correlations in nuclei with just three nucleons is advantageous because there are no correlations involving four or more nucleons, which would otherwise make it more difficult to isolate three-nucleon effects in the scattering data.

A further benefit of looking at tritium and helium-3 is that they are “mirror nuclei”. Tritium comprises one proton and two neutrons, while helium-3 comprises two protons and a neutron. The strong force that binds nucleons together acts equally on protons and neutrons. However, there are subtle differences in how protons and neutrons interact with each other – and these differences can be studied by comparing tritium and helium-3 electron scattering experiments.

A clean picture

“We’re trying to show that it’s possible to study three-nucleon correlations at Jefferson Lab even though we can’t get the energies necessary to do these studies in heavy nuclei,” says principal investigator Shujie Li, at Lawrence Berkeley. “These light systems give us a clean picture — that’s the reason we put in the effort of getting a radioactive target material.”

Both helium-3 and tritium are rare isotopes of their respective elements. Helium-3 is produced from the radioactive decay of tritium, which itself is produced in nuclear reactors. Tritium is a difficult isotope to work with because it is used to make nuclear weapons; has a half-life of about 12 years; and is toxic when ingested or inhaled. To succeed, the team had to create a special cryogenic chamber to contain their target of tritium gas.

Analysis of the scattering experiments revealed tantalizing hints of three-nucleon short-range correlations. Further investigation is needed to determine exactly how the correlations occur. Three nucleons could become correlated simultaneously, for example, or an existing correlated pair could become correlated with a third nucleon.

Three-nucleon interactions are believed to play an important role in the properties of neutron stars, so further investigation into some of the smallest of nuclei could shed light on the inner workings of much more massive objects. “It’s much easier to study a three-nucleon correlation in the lab than in a neutron star,” says Arrington.

The research is described in Physics Letters B.


  •  

Quantum control of individual antiprotons puts the Standard Model to the test

Physicists have taken a major step toward unlocking the mysteries of antimatter by performing the first coherent spin spectroscopy on a single antiproton. Done by researchers in CERN’s BASE collaboration, the experiment measures the magnetic properties of antimatter with record-breaking precision. As a result, it could help us understand why there is much more matter than antimatter in the universe.

“The level of control the authors have achieved over an individual antimatter particle is unprecedented,” says Dmitry Budker, a physicist at the University of California, Berkeley, who was not involved in the study. “This opens the path to much more precise tests of fundamental symmetries of nature.”

In theory, the universe should have been born with equal amounts of matter and antimatter. Yet all the visible structures we see today – including stars, galaxies, planets and people – are made almost entirely of matter. This cosmic imbalance remains one of the biggest open questions in physics and is known as the baryon asymmetry problem.

“The general motivation for studying antiprotons is to test fundamental symmetries and our understanding of them,” says Stefan Ulmer, a senior member of BASE and head of the Ulmer Fundamental Symmetries Laboratory at RIKEN in Japan. “What we know about antimatter is that it appears as a symmetric solution to quantum mechanical equations – there’s no obvious reason why the universe should not contain equal amounts of matter and antimatter.”

This mystery can be probed by doing very precise comparisons of properties of matter and antimatter particles – in this case, the proton and the antiproton. For example, the Standard Model says that protons and antiprotons should have identical masses but equal and opposite electrical charges. Any deviations from the Standard Model description could shed light on baryon asymmetry.

Leap in precision

Now, the BASE (Baryon Antibaryon Symmetry Experiment) team has focused on coherent spectroscopy, which is a quantum technique that uses microwave pulses to manipulate the spin states of a single antiproton.

“We were doing spectroscopy on the spin of a single trapped antiproton, stored in a cryogenic Penning trap system,” Ulmer explains. “It is significant because this is of highest importance in studying the fundamental properties of the particle.”

By applying microwave radiation at just the right frequency, the team induced Rabi oscillations – periodic flipping of the antiproton’s spin – and observed the resulting resonances. The key result was a resonance peak 16 times narrower than in any previous antiproton measurements, meaning the team could pinpoint the transition frequency with much greater accuracy. Combined with a 1.5-fold improvement in signal-to-noise ratio, the measurement paves the way for at least a tenfold increase in the precision of antiproton magnetic moment measurements. “In principle, we could reduce the linewidth by another factor of ten if additional technology is developed,” says Ulmer.
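The physics of such a resonance is captured by the textbook Rabi formula: a pulse of the right length flips the spin with certainty on resonance, while detuning the microwaves suppresses the flip, tracing out the resonance line. A minimal sketch with illustrative parameter values (not BASE’s actual experimental settings):

```python
import numpy as np

def spin_flip_probability(detuning, omega_rabi, t):
    """Rabi formula: probability that a spin has flipped after a pulse of
    duration t, at a given detuning from resonance and Rabi frequency."""
    omega_eff = np.sqrt(omega_rabi**2 + detuning**2)  # generalized Rabi frequency
    return (omega_rabi / omega_eff)**2 * np.sin(omega_eff * t / 2)**2

# A resonant pi-pulse (omega_rabi * t = pi) flips the spin with certainty...
print(spin_flip_probability(detuning=0.0, omega_rabi=1.0, t=np.pi))  # -> 1.0
# ...while detuning suppresses the flip, mapping out the resonance peak
# whose width the BASE team narrowed 16-fold
print(spin_flip_probability(detuning=2.0, omega_rabi=1.0, t=np.pi))  # -> ~0.03
```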

Budker described the measurement as unprecedented, adding, “This is a key to future precise tests of CPT invariance and other fundamental-physics experiments”.

Deeply held principle

CPT symmetry – the idea that the laws of physics remain unchanged if charge, parity, and time are simultaneously reversed – is one of the most deeply held principles in physics. Testing it to higher and higher precision is essential for identifying any cracks in the Standard Model.

Ulmer says the team observed antiproton spin coherence times of up to 50 s. Coherence here refers to the ability of the antiproton’s quantum spin state to remain stable and unperturbed over time, which is essential for achieving high-precision measurements.

Measuring magnetic moments of nuclear particles is already notoriously difficult, but doing so for antimatter pushes the limits of experimental physics.

“These measurements require the development of experiments that are about three orders of magnitude more sensitive than any other apparatus developed before,” says Ulmer. “You need to build the world’s most sensitive detectors for single particles, the smallest Penning traps, and superimpose ultra-extreme magnetic gradients.”

The BASE team started development in 2005 and had early successes in proton measurements by 2011. Antiproton studies began in earnest in 2017, but achieving coherent spin control – as in the current work – required further innovations including ultra-homogeneous magnetic fields, cryogenic temperatures, and the exquisite control of noise.

Toward a deeper understanding

These improvements could also make other experiments possible. “This will also allow more precise measurements of other nuclear magnetic moments, and paves a path to better measurements in proton–antiproton mass comparisons,” Ulmer notes.

There may even be distant connections to quantum computing. “If coherence times for matter and antimatter are identical – something we aim to test – then the antimatter qubit might have applications in quantum information,” he says. “But honestly, operating an antimatter quantum computer, if you could do the same with matter, would be inefficient.”

More realistically, the team hopes to use their transportable trap system, BASE STEP, to bring antiprotons to a dedicated offline laboratory for even higher-resolution studies.

“The BASE collaboration keeps a steady course on increasing the precision of fundamental symmetry tests,” says Budker. “This is an important step in that direction.”

The research is described in Nature.


  •  

Exographer: a scientific odyssey in pixel form

In an era where video games often prioritize fast-paced action and instant gratification, Exographer offers a refreshing change. With a contemplative journey that intertwines the realms of particle physics and interactive storytelling, this beautifully pixelated game invites players to explore a decaying alien civilization through the lens of scientific discovery while challenging their dexterity and intellect.

Exographer was developed by particle physicist and science-fiction author Raphaël Granier de Cassagnac and his video-game studio SciFunGames. At its core, it is a puzzle-platformer – where the player’s character has to move around an environment using platforms while solving puzzles. The character in question is Ini, an alien explorer who discovers a multifunctional camera in the opening scenes of the game’s narrative. Stranded on a seemingly deserted planet, Ini is tasked with unlocking the mystery of the world’s fallen civilization.

The camera quickly becomes central to gameplay, allowing for environmental analysis, teleportation to previously visited locations and, most intriguingly, the discovery of subatomic particles through puzzles inspired by Feynman diagrams. These challenges require players to match particle trajectories using various analytical tools, mirroring the investigative processes of real-world physicists.

It is in these puzzles that the particle physics really shines through. Beamlines have to be tracked and redirected to reveal more about the particles that make up this strange land and, with that, to expand Ini’s ability to understand the world.

As you crack one puzzle, a door opens and off you pootle to another blockage or locked door. Players will doubtless, as I did, find themselves wandering around areas pondering how to progress. A tip for those a little stuck: use the camera wherever a background seems a little different. In most circumstances, clues and cues will be waiting there.

Pixels and particles

The game’s environments are meticulously crafted, drawing inspiration from actual laboratories and observatories. I played the game on Nintendo Switch, but it is also available on several other platforms – including PS5, Xbox and Steam – and it looks pretty much identical on each. The pixel art style is not merely a visual choice but a thematic one, symbolizing the fundamental “pixels” of the universe: elementary particles. As players delve deeper, they encounter representations of particles including electrons, gluons and muons, each unlocking new abilities that alter gameplay and exploration.

Meanwhile, the character of Ini moves in a smooth and – for those gamers among us with a love of physics – realistic way. There is even a hint of lighter gravity as you hold down the button to activate a longer jump.

Game with depth An undersea puzzle in Exographer features a KM3NeT-inspired neutrino observatory. (Courtesy: SciFunGames)

What sets Exographer apart is its ability to educate without compromising entertainment. The integration of scientific concepts is seamless, offering players a glimpse into the world of particle physics without overwhelming them. However, it’s worth noting that some puzzles may present a steep learning curve, potentially posing challenges for those less familiar with scientific reasoning.

Complementing the game’s visual and intellectual appeal is its atmospheric soundtrack, composed by Yann Van Der Cruyssen, known for his work on the game Stray. As with Stray – where you take the role of a stray cat with a backpack – the music enhances the sense of wonder and discovery, underscoring the game’s themes of exploration and scientific inquiry.

Exographer is more than just a game; it’s an experience that bridges the gap between science and (pixelated) art. It challenges players to think critically, to explore patiently, and to appreciate the intricate beauty of the universe’s building blocks. For those willing to engage with its depth, Exographer offers a rewarding journey that lingers after the console is turned off.


  •  

CP violation in baryons is seen for the first time at CERN

The first experimental evidence of the breaking of charge–parity (CP) symmetry in baryons has been obtained by CERN’s LHCb Collaboration. The result is consistent with the Standard Model of particle physics and could lead to constraints on theoretical attempts to extend the Standard Model to explain the excess of matter over antimatter in the universe.

Current models of cosmology say that the Big Bang produced a giant burst of matter and antimatter, the vast majority of which recombined and annihilated shortly afterwards. Today, however, the universe appears to be made almost exclusively of matter with very little antimatter in evidence. This excess of matter is not explained by the Standard Model and its existence is an important mystery in physics.

In 1964, James Cronin, Valentine Fitch and colleagues at Princeton University in the US conducted an experiment on the decay of neutral K mesons. This showed that the weak interaction violated CP symmetry, indicating that matter and antimatter could behave differently. Fitch and Cronin bagged the 1980 Nobel Prize for Physics and the Soviet physicist Andrei Sakharov subsequently suggested that, if amplified at very high mass scales in the early universe, CP violation could have induced the matter–antimatter asymmetry shortly after the Big Bang.

Numerous observations of CP violation have subsequently been made in other mesonic systems. The phenomenon is now an accepted part of the Standard Model and is parametrized by the Cabibbo–Kobayashi–Maskawa (CKM) matrix. This describes the various probabilities of quarks of different generations changing into each other through the weak interaction – a process called mixing.

Tiny effect

However, the CP violation produced through the CKM mechanism is a much smaller effect than would have been required to create the matter left over by the Big Bang, as Xueting Yang of China’s Peking University explains.

“The number of baryons remaining divided by the number of photons produced when the baryons and antibaryons met and produced two photons is required to be about 10⁻¹⁰ in Big Bang theory…whereas this kind of quantity is only 10⁻¹⁸ in the Standard Model prediction.”

What is more, CP violation had never been observed in baryons. “Theoretically the prediction for baryon decay is very imprecise,” says Yang, who is a member of the LHCb collaboration. “It’s much more difficult to calculate it than the meson decays because there’s some interaction with the strong force.” Baryons (mostly protons and neutrons) make up almost all the hadronic matter in the universe, so this left open the slight possibility that the explanation might lie in some inconsistency between baryonic CP violation and the Standard Model prediction.

In the new work, Yang and colleagues at LHCb looked at the decays of beauty (or bottom) baryons and antibaryons. These heavy cousins of neutrons contain an up quark, a down quark and a beauty quark and were produced in proton–proton collisions at the Large Hadron Collider in 2011–2018. These baryons and antibaryons can decay via multiple channels. In one, a baryon decays to a proton, a positive K-meson and a pair of pions – or, conversely, an antibaryon decays to an antiproton, a negative K-meson and a pair of pions. CP violation should create an asymmetry between these processes, and the researchers looked for evidence of this asymmetry in the numbers of particles detected at different energies from all the collisions.
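Concretely, the asymmetry being sought is a simple difference of counts, A_CP = (N − N̄)/(N + N̄), where N and N̄ are the numbers of baryon and antibaryon decays in the chosen channel. The sketch below illustrates the idea with invented numbers – not LHCb’s data, and a real analysis must also correct for production and detection asymmetries:

```python
import math

def cp_asymmetry(n_baryon, n_antibaryon):
    """Raw CP asymmetry of two decay-count samples, with the approximate
    (binomial) statistical uncertainty on that asymmetry."""
    total = n_baryon + n_antibaryon
    asym = (n_baryon - n_antibaryon) / total
    sigma = math.sqrt((1 - asym**2) / total)
    return asym, sigma

# Invented counts chosen to give a ~2.5% asymmetry at ~5 sigma significance
asym, sigma = cp_asymmetry(20_500, 19_500)
print(f"A_CP = {asym:.3f} +/- {sigma:.3f} ({asym / sigma:.1f} sigma from zero)")
```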

Standard Model prevails

The team found that the CP violation seen was consistent with the Standard Model and inconsistent with zero by 5.2σ. “The experimental result is more precise than what we can get from theory,” says Yang. Other LHCb researchers scrutinized alternative decay channels of the beauty baryon: “Their measurement results are still consistent with CP symmetry…There should be CP violation also in their decay channels, but we don’t have enough statistics to claim that the deviation is more than 5σ.”

The current data do not rule out any extensions of the Standard Model, says Yang, simply because none of those extensions make precise predictions about the overall degree of CP violation expected in baryons. However, the LHC is now in its third run, and the researchers hope to acquire information on, for example, the intermediate particles involved in the decay: “We may be able to provide some measurements that are more comparable for theories and which can provide some constraints on the Standard Model predictions for CP violation,” says Yang.

The research is described in a paper in Nature.

“It’s an important paper – an old type of CP violation in a new system,” says Tom Browder of the University of Hawaii. “Theorists will try to interpret this within the context of the Standard Model, and there have already been some attempts, but there are some uncertainties due to the strong interaction that preclude making a precise test.” He says the results could nevertheless potentially help to constrain extensions of the Standard Model, such as CP-violating decays involving dark matter proposed by the late Ann Nelson at the University of Washington in Seattle and her colleagues.


  •  

Muon g-2 achieves record precision, but theoretical tensions remain

In 2018, the Muon g-2 Experiment at Fermilab near Chicago set out to measure the muon’s anomalous magnetic moment to a precision of 140 parts per billion (ppb). This component of the muon’s magnetic moment is the result of several subtle quantum effects and is also known as the muon g-2 – which reflects how the gyromagnetic ratio of the muon deviates from the simple value of two.
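In symbols (the standard definition, consistent with the description above), the measured quantity is the anomaly

```latex
a_\mu = \frac{g_\mu - 2}{2},
```

where g_μ is the muon’s gyromagnetic ratio; a structureless Dirac particle would have g = 2 exactly, so a_μ collects all the subtle quantum corrections.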

After six years of producing, storing, and measuring more than a trillion muons, the collaboration released its long-anticipated final result in June, achieving an unprecedented precision of 127 ppb. This landmark measurement not only solidifies confidence in the experimental value of muon g-2 but also sets a new benchmark as the most precise accelerator-based measurement of a fundamental particle to date.

Studies of the muon g-2 have served as a rigorous test of the Standard Model – physicists’ leading theory describing known particles and forces – for much of the last century. Theoretically, the muon’s anomalous magnetic moment can be predicted from the Standard Model to a similar precision as the experiment. For decades, a persistent discrepancy between prediction and measurement hinted at the possibility of new physics, with experimental results favouring a higher value than the theory. Such a difference, if confirmed, could point to phenomena not accounted for in the Standard Model – potentially explaining unresolved mysteries like the existence of dark matter.

However, extraordinary claims require extraordinary scrutiny. To address the experimental side, Fermilab launched the Muon g-2 Experiment. On the theoretical side, the Muon g-2 Theory Initiative was established as a global collaboration of theorists working to refine the Standard Model prediction using state-of-the-art methods, techniques, and input data.

Problematic contribution

One of the most problematic contributions to the theoretical value is the hadronic vacuum polarization (HVP), historically determined using experimental data as input to complex calculations. While the Theory Initiative has improved these methods, progress has remained limited due to discrepancies in the available experimental data. Crucially, a recent input from the CMD-3 Experiment diverged significantly from previous results, suggesting a larger HVP contribution (see figure below). This, in turn, yields a Standard Model prediction that aligns with the new Fermilab measurement – apparently eliminating the discrepancy and, with it, any evidence of new physics.

Evolving results Summary of the four values of the anomalous magnetic moment of the muon aμ that have been obtained from different experiments and models. The most recent (2025) theory and experiment values are in agreement. (Courtesy: Alex Keshavarzi)

Despite years of investigation, the origin of the CMD-3 tension remains unknown. Its result stands in contrast to a vast catalogue of earlier data from multiple experiments over decades. As a result, the traditional, data-driven approach to estimating the HVP is deemed currently unable to produce a reliable estimate.

Thanks to the efforts of the Theory Initiative, however, the HVP can now also be calculated using lattice QCD (quantum chromodynamics) simulations on supercomputers, reaching a precision comparable to that of the data-driven methods. Multiple independent lattice QCD groups have arrived at consistent values, which also agree with the Fermilab measurement, indicating no discrepancy and thus no sign of new physics. This computational feat, once considered out of reach, marks a major breakthrough. Yet, the tension remains unresolved: Why do lattice QCD and CMD-3 agree, while both conflict with decades of experimental data?

No physics beyond the Standard Model

Given the improved control in lattice QCD, the Theory Initiative has also recently updated its recommended Standard Model prediction with the HVP fully based on lattice results. The resulting value agrees with the Fermilab measurement and currently implies no evidence for physics beyond the Standard Model. However, the Initiative has emphasized that this is far from conclusive. Future predictions are intended to incorporate data-driven estimates again – once the inconsistencies in the experimental input are resolved.

The field now faces two possibilities. One is that the CMD-3 result and lattice QCD are correct. In this case, there is no new physics – but an impressive validation of the Standard Model. The other scenario is that new experimental HVP input data align with the older results, supporting a smaller HVP contribution. This would reintroduce the discrepancy with the Fermilab result, reviving the exciting possibility of new physics. In either case, the inconsistencies between CMD-3, lattice QCD, and the existing data must be explained.

So, is there new physics or not? We know there must be. The Standard Model cannot explain dark matter, the accelerating expansion of the universe, the absence of antimatter, or the quantum nature of gravity. Precision tests like muon g-2 offer a window into this unknown. That window has not closed – for now it’s propped open.

Where we’ll be in five years is uncertain. The Muon g-2 Theory Initiative will continue to refine predictions and resolve open questions. For now, one thing is clear: the Muon g-2 Experiment at Fermilab has delivered an historic achievement and its legacy will continue to contribute to our understanding of fundamental physics for decades to come.

 


  •  

Inside ATLAS: Sara Alderweireldt explains how the CERN experiment homes in on new physics

This podcast features an interview with Sara Alderweireldt, who is a physicist working on the ATLAS experiment at CERN – the world-famous physics lab that straddles the Swiss-French border and is home to the Large Hadron Collider (LHC).

Based at the UK’s University of Edinburgh, Alderweireldt is in conversation with Physics World’s Margaret Harris and explains how physicists sift through the vast amount of information produced by ATLAS’ myriad detectors in search of new physics.

They also chat about the ongoing high-luminosity upgrade to the LHC and its experiments – which will be finished in 2030 – and the challenges and rewards of working on a very long-term project.

