
Top 10 Breakthroughs of the Year in physics for 2025 revealed

11 December 2025 at 15:27

Physics World is delighted to announce its Top 10 Breakthroughs of the Year for 2025, which include research in astronomy, antimatter, atomic and molecular physics and more. The Top 10 is the shortlist for the Physics World Breakthrough of the Year, which will be revealed on Thursday 18 December.

Our editorial team has looked back at all the scientific discoveries we have reported on since 1 January and has picked 10 that we think are the most important. In addition to being reported in Physics World in 2025, the breakthroughs must meet the following criteria:

  • Significant advance in knowledge or understanding
  • Importance of work for scientific progress and/or development of real-world applications
  • Of general interest to Physics World readers

Here, then, are the Physics World Top 10 Breakthroughs for 2025, listed in no particular order. You can listen to Physics World editors make the case for each of our nominees in the Physics World Weekly podcast. And, come back next week to discover who has bagged the 2025 Breakthrough of the Year.

Finding the stuff of life on an asteroid

Analysing returned samples Tim McCoy (right), curator of meteorites at the Smithsonian’s National Museum of Natural History, and research geologist Cari Corrigan examine scanning electron microscope (SEM) images of a Bennu sample. (Courtesy: James Di Loreto, Smithsonian)

To Tim McCoy, Danny Glavin, Jason Dworkin, Yoshihiro Furukawa, Ann Nguyen, Scott Sandford, Zack Gainsforth and an international team of collaborators for identifying salt, ammonia, sugar, nitrogen- and oxygen-rich organic materials, and traces of metal-rich supernova dust, in samples returned from the near-Earth asteroid 101955 Bennu. The incredible chemical richness of this asteroid, which NASA’s OSIRIS-REx spacecraft sampled in 2020, lends support to the longstanding hypothesis that asteroid impacts could have “seeded” the early Earth with the raw ingredients needed for life to form. The discoveries also enhance our understanding of how Bennu and other objects in the solar system formed out of the disc of material that coalesced around the young Sun.

The first superfluid molecule

To Takamasa Momose of the University of British Columbia, Canada, and Susumu Kuma of the RIKEN Atomic, Molecular and Optical Physics Laboratory, Japan, for observing superfluidity in a molecule for the first time. Molecular hydrogen is the simplest and lightest of all molecules, and theorists predicted that it would enter a superfluid state at a temperature between 1 and 2 K. But this is well below the molecule’s freezing point of 13.8 K, so Momose, Kuma and colleagues first had to develop a way to keep the hydrogen in a liquid state. Once they did that, they then had to work out how to detect the onset of superfluidity. It took them nearly 20 years, but by confining clusters of hydrogen molecules inside helium nanodroplets, embedding a methane molecule within the clusters, and monitoring the methane’s rotation, they were finally able to do it. They now plan to study larger clusters of hydrogen, with the aim of exploring the boundary between classical and quantum behaviour in this system.

Hollow-core fibres break 40-year limit on light transmission

To researchers at the University of Southampton and Microsoft Azure Fiber in the UK, for developing a new type of optical fibre that reduces signal loss, boosts bandwidth and promises faster, greener communications. The team, led by Francesco Poletti, achieved this feat by replacing the glass core of a conventional fibre with air and using glass membranes that reflect light at certain frequencies back into the core to trap the light and keep it moving through the fibre’s hollow centre. Their results show that the hollow-core fibres exhibit 35% less attenuation than standard glass fibres – implying that fewer amplifiers would be needed in long cables – and increase transmission speeds by 45%. Microsoft has begun testing the new fibres in real systems, installing segments in its network and sending live traffic through them. These trials open the door to gradual rollout and Poletti suggests that the hollow-core fibres could one day replace existing undersea cables.
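To see where those headline numbers come from, here is a quick back-of-envelope sketch in Python. The 0.17 dB/km solid-core loss, the group indices and the 20 dB amplifier budget are illustrative textbook values, not figures from the Southampton/Microsoft work; the point is that light in air travels roughly 1.46 times faster than in silica (the ~45% speed gain) and that lower attenuation stretches the distance between amplifiers.

```python
# Back-of-envelope comparison of hollow-core and solid-core fibre.
# The 0.17 dB/km loss, group indices and 20 dB span budget are typical
# textbook values, not figures from the Southampton/Microsoft study.

n_silica = 1.46   # approximate group index of a solid silica core
n_air = 1.0003    # approximate group index of air

loss_solid = 0.17                      # dB/km, typical solid-core telecom fibre
loss_hollow = loss_solid * (1 - 0.35)  # "35% less attenuation", per the article

# Latency: light travels at c/n, so an air core is faster.
speedup = n_silica / n_air - 1
print(f"propagation speedup in an air core: {speedup:.0%}")  # ~46%, cf. the ~45% quoted

# Amplifier spacing for a fixed loss budget between repeaters:
budget_db = 20.0
print(f"solid core:  amplifier every {budget_db / loss_solid:.0f} km")   # ~118 km
print(f"hollow core: amplifier every {budget_db / loss_hollow:.0f} km")  # ~181 km
```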

First patient treatments delivered with proton arc therapy

PAT pioneers The research team in the proton therapy gantry room. (Courtesy: UO Fisica Sanitaria and UO Protonterapia, APSS, Trento)

To Francesco Fracchiolla and colleagues at the Trento Proton Therapy Centre in Italy for delivering the first clinical treatments using proton arc therapy (PAT). Proton therapy – a precision cancer treatment – is usually performed using pencil-beam scanning to precisely paint the dose onto the tumour. But this approach can be limited by the small number of beam directions deliverable in an acceptable treatment time. PAT overcomes this by moving to an arc trajectory with protons delivered over a large number of beam angles and the potential to optimize the number of energies used for each beam direction. Working with researchers at RaySearch Laboratories in Sweden, the team performed successful dosimetric comparisons with clinical proton therapy plans. Following a feasibility test that confirmed the viability of clinical PAT delivery, the researchers used PAT to treat nine cancer patients. Importantly, all treatments were performed using the centre’s existing proton therapy system and clinical workflow.

A protein qubit for quantum biosensing

To Peter Maurer and David Awschalom at the University of Chicago Pritzker School of Molecular Engineering and colleagues for designing a protein quantum bit (qubit) that can be produced directly inside living cells and used as a magnetic field sensor. While many of today’s quantum sensors are based on nitrogen–vacancy (NV) centres in diamond, they are large and hard to position inside living cells. Instead, the team used fluorescent proteins, which are just 3 nm in diameter and can be produced by cells at a desired location with atomic precision. These proteins possess similar optical and spin properties to those of NV centre-based qubits – namely that they have a metastable triplet state. The researchers used a near-infrared laser pulse to optically address a yellow fluorescent protein and read out its triplet spin state with up to 20% spin contrast. They then genetically modified the protein to be expressed in bacterial cells and measured signals with a contrast of up to 8%. They note that although this performance does not match that of NV quantum sensors, it could enable magnetic resonance measurements directly inside living cells, which NV centres cannot do.
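The quoted spin contrasts matter because, in a shot-noise-limited optical readout, the photon budget scales as the inverse square of the contrast. The following minimal sketch uses that standard optically detected magnetic resonance (ODMR) relation with illustrative numbers; it is not an analysis from the Chicago study.

```python
# Shot-noise view of readout contrast in an optically read spin sensor.
# Standard ODMR reasoning with illustrative numbers, not an analysis from
# the Chicago study: distinguishing "bright" and "dark" spin states at
# signal-to-noise ratio S from N detected photons requires
#   C * N / sqrt(N) >= S,   i.e.   N >= (S / C)**2.

def photons_needed(contrast: float, snr: float = 1.0) -> float:
    """Mean number of detected photons per shot to reach the target SNR."""
    return (snr / contrast) ** 2

for label, c in [("purified protein, 20% contrast", 0.20),
                 ("inside bacteria,   8% contrast", 0.08)]:
    print(f"{label}: ~{photons_needed(c):.0f} photons per shot at SNR = 1")
# Halving the contrast costs roughly four times the photons (or the
# integration time), which is why an in-cell contrast of 8% is still
# a workable sensor.
```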

First two-dimensional sheets of metal

To Guangyu Zhang, Luojun Du and colleagues at the Institute of Physics of the Chinese Academy of Sciences for producing the first 2D sheets of metal. Since the discovery of graphene – a sheet of carbon just one atom thick – in 2004, hundreds of other 2D materials have been fabricated and studied. In most of these, layers of covalently bonded atoms are separated by gaps where neighbouring layers are held together only by weak van der Waals (vdW) interactions, making it relatively easy to “shave off” single layers to make 2D sheets. Many thought that making atomically thin metals, however, would be impossible given that each atom in a metal is strongly bonded to surrounding atoms in all directions. The technique developed by Zhang, Du and colleagues involves heating powders of pure metals between two monolayer-MoS2/sapphire vdW anvils. Once the metal powders were melted into a droplet, the researchers applied a pressure of 200 MPa and continued this “vdW squeezing” until the opposite sides of the anvils cooled to room temperature and 2D sheets of metal were formed. The team produced five atomically thin 2D metals – bismuth, tin, lead, indium and gallium – with the thinnest being around 6.3 Å. The researchers say their work is just the “tip of the iceberg” and now aim to study fundamental physics with the new materials.

Quantum control of individual antiprotons

Exquisite control Physicist Barbara Latacz at the BASE experiment at CERN. (Courtesy: CERN)

To CERN’s BASE collaboration for being the first to perform coherent spin spectroscopy on a single antiproton – the antimatter counterpart of the proton. Their breakthrough is the most precise measurement yet of the antiproton’s magnetic properties, and could be used to test the Standard Model of particle physics. The experiment begins with the creation of high-energy antiprotons in an accelerator. These must be cooled (slowed down) to cryogenic temperatures without being lost to annihilation. Then, a single antiproton is held in an ultracold electromagnetic trap, where microwave pulses manipulate its spin state. The resulting resonance peak was 16 times narrower than previous measurements, enabling a significant leap in precision. This level of quantum control opens the door to highly sensitive comparisons of the properties of matter (protons) and antimatter (antiprotons). Unexpected differences could point to new physics beyond the Standard Model and may also reveal why there is much more matter than antimatter in the visible universe.

A smartphone-based early warning system for earthquakes

To Richard Allen, director of the Berkeley Seismological Laboratory at the University of California, Berkeley, and Google’s Marc Stogaitis and colleagues for creating a global network of Android smartphones that acts as an earthquake early warning system. Traditional early warning systems use networks of seismic sensors that rapidly detect earthquakes in areas close to the epicentre and issue warnings across the affected region. Building such seismic networks, however, is expensive, and many earthquake-prone regions do not have them. The researchers utilized the accelerometers in millions of phones in 98 countries to create the Android Earthquake Alerts (AEA) system. Testing the system between 2021 and 2024 led to the detection of an average of 312 earthquakes a month, with magnitudes ranging from 1.9 to 7.8. For earthquakes of magnitude 4.5 or higher, the system sent “TakeAction” alerts to users; these went out, on average, 60 times per month, amounting to an average of 18 million individual alerts per month. The system also delivered lesser “BeAware” alerts to regions expected to experience a shaking intensity of 3 or 4. The team now aims to produce maps of ground shaking, which could assist emergency response services following an earthquake.
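The article does not spell out Google’s detection algorithm, but a classic way to spot sudden shaking in a noisy accelerometer stream is a short-term-average/long-term-average (STA/LTA) trigger, sketched below in Python on synthetic data. All rates and thresholds here are illustrative assumptions, not details of the AEA system.

```python
import numpy as np

def sta_lta_trigger(accel, fs, sta_s=1.0, lta_s=20.0, threshold=4.0):
    """Return sample indices where the STA/LTA energy ratio exceeds threshold.

    accel: 1-D array of acceleration magnitudes (m/s^2), gravity removed
    fs: sample rate in Hz
    """
    sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
    energy = accel ** 2
    csum = np.cumsum(energy)
    sta = (csum[sta_n:] - csum[:-sta_n]) / sta_n   # short-window mean energy
    lta = (csum[lta_n:] - csum[:-lta_n]) / lta_n   # long-window mean energy
    m = min(len(sta), len(lta))                    # align both to the trace end
    ratio = sta[-m:] / np.maximum(lta[-m:], 1e-12)
    return np.flatnonzero(ratio > threshold) + (len(accel) - m)

# Synthetic test: 60 s of quiet sensor noise, then 5 s of shaking.
rng = np.random.default_rng(0)
fs = 50  # Hz, a plausible phone accelerometer rate
quiet = 0.005 * rng.standard_normal(60 * fs)
shake = 0.2 * rng.standard_normal(5 * fs)
trace = np.concatenate([quiet, shake])

hits = sta_lta_trigger(trace, fs)
print("first trigger at t =", hits[0] / fs if len(hits) else None, "s")
```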

A “weather map” for a gas giant exoplanet

To Lisa Nortmann at Germany’s University of Göttingen and colleagues for creating the first detailed “weather map” of an exoplanet. The forecast for exoplanet WASP-127b is brutal, with winds reaching 33,000 km/h – much faster than winds found anywhere in the Solar System. WASP-127b is a gas giant located about 520 light-years from Earth, and the team used the CRIRES+ instrument on the European Southern Observatory’s Very Large Telescope to observe the exoplanet as it transited across its star in less than 7 h. Spectral analysis of the starlight that filtered through WASP-127b’s atmosphere revealed Doppler shifts caused by supersonic equatorial winds. By analysing the range of Doppler shifts, the team created a rough weather map of WASP-127b, even though they could not resolve light coming from specific locations on the exoplanet. Nortmann and colleagues concluded that the exoplanet’s poles are cooler than the rest of WASP-127b, where temperatures can exceed 1000 °C. Water vapour was detected in the atmosphere, raising the possibility of exotic forms of rain.
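A rough worked example shows why such winds are detectable: 33,000 km/h is about 9.2 km/s, giving a fractional Doppler shift of about 3 parts in 100,000. The sketch below assumes a 2.3 µm CO absorption band and a resolving power of about 100,000 for CRIRES+; both are typical values, not numbers quoted in the article.

```python
# Doppler shift from a 33,000 km/h wind. The 2.3 micron wavelength is a
# typical CO band probed by CRIRES+, and R = 100,000 a typical resolving
# power -- illustrative values, not numbers from the study.

C = 299_792_458.0        # speed of light, m/s
v_wind = 33_000 / 3.6    # km/h -> m/s, ~9.2 km/s

frac = v_wind / C        # fractional Doppler shift, ~3e-5
lam = 2.3e-6             # m, near-infrared CO absorption band
print(f"dλ/λ = {frac:.2e}")
print(f"dλ   = {frac * lam * 1e12:.0f} pm at 2.3 µm")   # ~70 pm

# An instrument with resolving power R distinguishes dλ/λ ~ 1/R:
R = 100_000
print(f"instrument limit 1/R = {1 / R:.1e} -> wind shift is ~{frac * R:.0f}x the limit")
```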

Highest-resolution images ever taken of a single atom

To the team led by Yichao Zhang at the University of Maryland and Pinshane Huang of the University of Illinois at Urbana-Champaign for capturing the highest-resolution images ever taken of individual atoms in a material. The team used an electron-microscopy technique called electron ptychography to achieve a resolution of 15 pm, which is about 10 times smaller than the size of an atom. They studied a stack of two atomically thin layers of tungsten diselenide, which were rotated relative to each other to create a moiré superlattice. These twisted 2D materials are of great interest to physicists because their electronic properties can change dramatically with small changes in rotation angle. The extraordinary resolution of their microscope allowed them to visualize collective vibrations in the material called moiré phasons. These are similar to phonons, but had never been observed directly until now. The team’s observations align with theoretical predictions for moiré phasons. Their microscopy technique should boost our understanding of the role that moiré phasons and other lattice vibrations play in the physics of solids. This could lead to the engineering of new and useful materials.


Physics World’s coverage of the Breakthrough of the Year is supported by Reports on Progress in Physics, which offers unparalleled visibility for your ground-breaking research.


Exploring this year’s best physics research in our Top 10 Breakthroughs of 2025

11 December 2025 at 15:27

This episode of the Physics World Weekly podcast features a lively discussion about our Top 10 Breakthroughs of 2025, which include important research in quantum sensing, planetary science, medical physics, 2D materials and more. Physics World editors explain why we have made our selections and look at the broader implications of this impressive body of research.

The top 10 serves as the shortlist for the Physics World Breakthrough of the Year award, the winner of which will be announced on 18 December.

Links to all the nominees, more about their research and the selection criteria can be found here.


Physics World’s coverage of the Breakthrough of the Year is supported by Reports on Progress in Physics, which offers unparalleled visibility for your ground-breaking research.


Astronomers observe a coronal mass ejection from a distant star

11 December 2025 at 10:00

The Sun regularly produces energetic outbursts of electromagnetic radiation called solar flares. When these flares are accompanied by flows of plasma, they are known as coronal mass ejections (CMEs). Now, astronomers at the Netherlands Institute for Radio Astronomy (ASTRON) have spotted a similar event occurring on a star other than our Sun – the first unambiguous detection of a CME outside our solar system.

Astronomers have long predicted that the radio emissions associated with CMEs from other stars should be detectable. However, Joseph Callingham, who led the ASTRON study, says that he and his colleagues needed the highly sensitive low-frequency radio telescope LOFAR – plus ESA’s XMM-Newton space observatory and “some smart software” developed by Cyril Tasse and Philippe Zarka at the Observatoire de Paris-PSL, France – to find one.

A short, intense radio signal from StKM 1-1262

Using these tools, the team detected short, intense radio signals from a star located around 40 light-years away from Earth. This star, called StKM 1-1262, is very different from our Sun. At only around half of the Sun’s mass, it is classed as an M-dwarf star. It also rotates 20 times faster and boasts a magnetic field 300 times stronger. Nevertheless, the burst it produced had the same frequency, timing and polarization properties as the plasma emission from a solar type II burst – an event that astronomers identify as the signature of a fast CME when it comes from the Sun.

“This work opens up a new observational frontier for studying and understanding eruptions and space weather around other stars,” says Henrik Eklund, an ESA research fellow working at the European Space Research and Technology Centre (ESTEC) in Noordwijk, Netherlands, who was not involved in the study. “We’re no longer limited to extrapolating our understanding of the Sun’s CMEs to other stars.”

Implications for life on exoplanets

The high speed of this burst – around 2400 km/s – would be atypical for our own Sun, with only around 1 in every 20 solar CMEs reaching that level. However, the ASTRON team says that M-dwarfs like StKM 1-1262 could emit CMEs of this type as often as once a day.
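For context, shock speeds like this are typically inferred from how fast a type II burst drifts to lower frequencies: the emission tracks the local plasma frequency, which falls as the ejecta climb into thinner plasma. The Python sketch below illustrates the idea with a toy exponential corona and made-up numbers; it is not the LOFAR team’s actual analysis.

```python
import numpy as np

# Sketch: converting a type II radio burst's frequency drift into a shock
# speed. Plasma emission occurs near f_p ~ 8.98 kHz * sqrt(n_e / cm^-3);
# as the CME front climbs into thinner plasma, f_p falls. The density
# model, frequencies, timing and stellar radius below are illustrative
# assumptions -- this is NOT the LOFAR team's analysis.

def n_e(r):
    """Toy corona: electron density (cm^-3) vs height r in stellar radii."""
    n0, H = 1e9, 0.3   # base density and scale height (made up)
    return n0 * np.exp(-(r - 1.0) / H)

def f_plasma_MHz(r):
    return 8.98e-3 * np.sqrt(n_e(r))

# At what heights were 150 MHz and 120 MHz emitted? Invert numerically.
rs = np.linspace(1.0, 3.0, 20_000)
fs = f_plasma_MHz(rs)
r1 = rs[np.argmin(np.abs(fs - 150.0))]
r2 = rs[np.argmin(np.abs(fs - 120.0))]

R_star = 3.5e8   # m, roughly half a solar radius, typical for an M dwarf
dt = 30.0        # s between observing the two frequencies (illustrative)
v = (r2 - r1) * R_star / dt
print(f"emission heights {r1:.2f} -> {r2:.2f} R*, implied speed ~ {v/1e3:.0f} km/s")
```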

Spotting a distant coronal mass ejection: An artist’s impression of XMM-Newton. (Courtesy: ESA/C Carreau)

According to Eklund, this has implications for extraterrestrial life, as most of the known planets in the Milky Way are thought to orbit stars of this type, and such bursts could be powerful enough to strip their atmospheres. “It seems that intense space weather may be even more extreme around smaller stars – the primary hosts of potentially habitable exoplanets,” he says. “This has important implications for how these planets keep hold of their atmospheres and possibly remain habitable over time.”

Erik Kuulkers, a project scientist at XMM-Newton who was also not directly involved in the study, suggests that this atmosphere-stripping ability could modify the way we hunt for life in stellar systems akin to our Solar System. “A planet’s habitability for life as we know it is defined by its distance from its parent star – whether or not it sits within the star’s ‘habitable zone’, a region where liquid water can exist on the surface of planets with suitable atmospheres,” Kuulkers says. “What if that star was especially active, regularly producing CMEs, however? A planet regularly bombarded by these ejections might lose its atmosphere entirely, leaving behind a barren uninhabitable world, despite its orbit being ‘just right’.”

Kuulkers adds that the study’s results also contain lessons for our own Solar System. “Why is there still life on Earth despite the violent material being thrown at us?” he asks. “It is because we are safeguarded by our atmosphere.”

Seeking more data

The ASTRON team’s next step will be to look for more stars like StKM 1-1262, which Kuulkers agrees is a good idea. “The more events we can find, the more we learn about CMEs and their impact on a star’s environment,” he says. Additional observations at other wavelengths “would help”, he adds, “but we have to admit that events like the strong one reported on in this work don’t happen too often, so we also need to be lucky enough to be looking at the right star at the right time.”

For now, the ASTRON researchers, who report their work in Nature, say they have reached the limit of what they can detect with LOFAR. “The next step is to use the next generation Square Kilometre Array, which will let us find many more such stars since it is so much more sensitive,” Callingham tells Physics World.

The post Astronomers observe a coronal mass ejection from a distant star appeared first on Physics World.


Sterile neutrinos: KATRIN and MicroBooNE come up empty-handed

10 December 2025 at 17:49

Two major experiments have found no evidence for sterile neutrinos – hypothetical particles that could help explain some puzzling observations in particle physics. The KATRIN experiment searched for sterile neutrinos that could be produced during the radioactive decay of tritium, whereas the MicroBooNE experiment looked for the effect of sterile neutrinos on the transformation of muon neutrinos into electron neutrinos.

Neutrinos are low-mass subatomic particles with zero electric charge that interact with matter only via the weak nuclear force and gravity. This makes neutrinos difficult to detect, despite the fact that the particles are produced in copious numbers by the Sun, nuclear reactors and collisions in particle accelerators.

Neutrinos were first proposed in 1930 to explain the apparent missing momentum, spin and energy in the radioactive beta decay of nuclei. They were first observed in 1956, and by 1975 physicists were confident that three types (flavours) of neutrino existed – electron, muon and tau – along with their respective antiparticles. At the same time, however, it was becoming apparent that something was amiss with the Standard Model description of neutrinos because the observed neutrino flux from sources like the Sun did not tally with theoretical predictions.

Gaping holes

Then, in the late 1990s, experiments in Canada and Japan revealed that neutrinos of one flavour transform into other flavours as they propagate through space. This quantum phenomenon is called neutrino oscillation and requires that neutrinos have both flavour and mass. Takaaki Kajita and Art McDonald shared the 2015 Nobel Prize for Physics for this discovery – but that is not the end of the story.

One gaping hole in our knowledge is that physicists do not know the neutrino masses – having only measured upper limits for the three flavours. Furthermore, there is some experimental evidence that the current Standard-Model description of neutrino oscillation is not quite right. This includes lower-than-expected neutrino fluxes from some beta-decaying nuclei and some anomalous oscillations in neutrino beams.

One possible explanation for these oscillation anomalies is the existence of a fourth type of neutrino. Because we have yet to detect this particle, the assumption is that it does not interact via the weak interaction – which is why these hypothetical particles are called sterile neutrinos.

Electron energy curve

Now, two very different neutrino experiments have both reported no evidence of sterile neutrinos. One is KATRIN, which is located at the Karlsruhe Institute of Technology (KIT) in Germany. It has the prime mission of making a very precise measurement of the mass of the electron antineutrino. The idea is to measure the energy spectrum of electrons emitted in the beta decay of tritium and infer an upper limit on the mass of the electron antineutrino from the shape of the curve.

If sterile neutrinos exist, then they could sometimes be emitted in place of electron antineutrinos during beta decay. This would change the electron energy spectrum – but this was not observed at KATRIN.
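Concretely, in a “3+1” picture the tritium spectrum becomes a weighted sum of two beta branches, and the heavy branch switching on below the endpoint produces a kink. The toy Python model below keeps only the neutrino phase-space factor and uses a deliberately exaggerated mass and mixing so the effect is visible; the real KATRIN analysis fits far subtler distortions.

```python
import numpy as np

# Toy tritium beta spectrum with a sterile-neutrino "kink".
# In a 3+1 picture the spectrum is a weighted sum of a light-mass branch
# and a branch, weighted by the mixing, whose reach is cut off by the
# heavy mass m4. Only the neutrino phase-space factor is kept (Fermi
# function and other smooth factors omitted); the mass and mixing are
# deliberately exaggerated so the kink is visible. Illustrative only.

E0 = 18_574.0   # eV, tritium beta-decay endpoint
m4 = 10_000.0   # eV, hypothetical sterile-neutrino mass
sin2 = 0.1      # hypothetical (and exaggerated) mixing probability

def branch(E, m):
    """Neutrino phase space (E0 - E) * sqrt((E0 - E)^2 - m^2), zero below threshold."""
    eps = E0 - E
    out = np.zeros_like(E)
    ok = eps > m
    out[ok] = eps[ok] * np.sqrt(eps[ok] ** 2 - m ** 2)
    return out

E = np.linspace(0.0, E0, 6)
spectrum = (1 - sin2) * branch(E, 0.0) + sin2 * branch(E, m4)
for e, s in zip(E, spectrum):
    print(f"E = {e:8.0f} eV   dN/dE ∝ {s:.3e}")
# The heavy branch switches off above E0 - m4, kinking the spectrum's
# slope there -- the signature KATRIN searched for and did not see.
```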

“In the measurement campaigns underlying this analysis, we recorded over 36 million electrons and compared the measured spectrum with theoretical models. We found no indication of sterile neutrinos,” says Kathrin Valerius of the Institute for Astroparticle Physics at KIT and co-spokesperson of the KATRIN collaboration.

Meanwhile, physicists on the MicroBooNE experiment at Fermilab in the US have looked for evidence for sterile neutrinos in how muon neutrinos oscillate into electron neutrinos. Beams of muon neutrinos are created by firing a proton beam at a solid target. The neutrinos at Fermilab then travel several hundred metres (in part through solid ground) to MicroBooNE’s liquid-argon time projection chamber. This detects electron neutrinos with high spatial and energy resolution, allowing detailed studies of neutrino oscillations.

If sterile neutrinos exist, they would be involved in the oscillation process and would therefore affect the number of electron neutrinos detected by MicroBooNE. Neutrino beams from two different sources were used in the experiments, but no evidence for sterile neutrinos was found.
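In the simplest two-flavour approximation, the appearance probability such searches constrain is P(νμ→νe) ≈ sin²(2θ) sin²(1.27 Δm²L/E), with Δm² in eV², L in km and E in GeV. The sketch below plugs in MicroBooNE-like numbers; the oscillation parameters are illustrative values near the region of the older anomalies, not fitted results.

```python
import numpy as np

# Two-flavour short-baseline appearance via a sterile state:
#   P(nu_mu -> nu_e) = sin^2(2*theta) * sin^2(1.27 * dm2 * L / E)
# with dm2 in eV^2, L in km, E in GeV. Baseline and energy are typical
# MicroBooNE-like values; the oscillation parameters are illustrative.

def p_appear(dm2, sin2_2theta, L_km, E_GeV):
    return sin2_2theta * np.sin(1.27 * dm2 * L_km / E_GeV) ** 2

L = 0.47   # km, roughly the MicroBooNE baseline at Fermilab
E = 0.8    # GeV, typical Booster-beam neutrino energy

for dm2, s22 in [(1.0, 0.003), (2.0, 0.003), (10.0, 0.003)]:
    print(f"dm2 = {dm2:5.1f} eV^2 :  P(nu_mu -> nu_e) ~ {p_appear(dm2, s22, L, E):.2e}")
# Sub-percent appearance probabilities like these set the scale of the
# electron-neutrino excess the experiment was looking for -- and did not find.
```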

Together, these two experiments rule out sterile neutrinos as an explanation for some – but not all – previously observed oscillation anomalies. So more work is needed to fully understand neutrino physics. Indeed, current and future neutrino experiments are well placed to discover physics beyond the Standard Model, which could lead to solutions to some of the greatest mysteries of physics.

“Any time you rule out one place where physics beyond the Standard Model could be, that makes you look in other places,” says Justin Evans at the UK’s University of Manchester, who is co-spokesperson for MicroBooNE. “This is a result that is going to really spur a creative push in the neutrino physics community to come up with yet more exciting ways of looking for new physics.”

Both groups report their results in papers in Nature: KATRIN paper; MicroBooNE paper.


Bridging borders in medical physics: guidance, challenges and opportunities

10 December 2025 at 15:00
Educational aid Global Medical Physics: A Guide for International Collaboration explores the increasing role of medical physicists in international collaborations. The book comes in paperback, hardback and ebook format. An open-access ebook will be available in the near future. (Courtesy: CRC Press/Taylor & Francis)

As the world population ages and the incidence of cancer and cardiac disease grows with it, there’s an ever-increasing need for reliable and effective diagnostics and treatments. Medical physics plays a central role in both of these areas – from the development of a suite of advanced diagnostic imaging modalities to the ongoing evolution of high-precision radiotherapy techniques.

But access to medical physics resources – whether equipment and infrastructure, education and training programmes, or the medical physicists themselves – is massively imbalanced around the world. In low- and middle-income countries (LMICs), fewer than 50% of patients have access to radiotherapy, with similar shortfalls in the availability of medical imaging equipment. Lower-income countries also have the least number of medical physicists per capita.

This disparity has led to an increasing interest in global health initiatives, with professional organizations looking to provide support to medical physicists in lower-income regions. At the same time, medical physicists and other healthcare professionals seek to collaborate internationally in clinical, educational and research settings.

Successful multicultural collaborations, however, can be hindered by cultural, language and ethical barriers, as well as issues such as poor access to the internet and the latest technology advances. And medical physicists trained in high-income contexts may not always understand the circumstances and limitations of those working within lower income environments.

Aiming to overcome these obstacles, a new book entitled Global Medical Physics: A Guide for International Collaboration provides essential guidance for those looking to participate in such initiatives. The text addresses the various complexities of partnering with colleagues in different countries and working within diverse healthcare environments, encompassing clinical and educational medical physics circles, as well as research and academic environments.

“I have been involved in providing support to medical physicists in lower income contexts for a number of years, especially through the International Atomic Energy Agency (IAEA), but also through professional organizations like the American Association of Physicists in Medicine (AAPM),” explains the book’s editor Jacob Van Dyk, emeritus professor at Western University in Canada. “It is out of these experiences that I felt it might be appropriate and helpful to provide some educational materials that address these issues. The outcome was this book, with input from those with these collaborative experiences.”

Shared experience

The book brings together contributions from 34 authors across 21 countries, including both high- and low-resource settings. The authors – selected for their expertise and experience in global health and medical physics activities – provide guidelines for success, as well as noting potential barriers and concerns, on a wide range of themes targeted at multiple levels of expertise.

This guidance includes, for example: advice on how medical physicists can contribute to educational, clinical and research-based global collaborations and the associated challenges; recommendations on building global inter-institutional collaborations, covering administrative, clinical and technical challenges and ethical issues; and a case study on the Radiation Planning Assistant project, which aims to use automated contouring and treatment planning to assist radiation oncologists in LMICs.

In another chapter, the author describes the various career paths available to medical physicists, highlighting how they can help address the disparity in healthcare resources through their careers. There’s also a chapter focusing on CERN as an example of a successful collaboration engaging a worldwide community, including a discussion of CERN’s involvement in collaborative medical physics projects.

With the rapid emergence of artificial intelligence (AI) in healthcare, the book takes a look at the role of information and communication technologies and AI within global collaborations. Elsewhere, authors highlight the need for data sharing in medical physics, describing example data sharing applications and technologies.

Other chapters consider the benefits of cross-sector collaborations with industry, sustainability within global collaborations, the development of effective mentoring programmes – including a look at challenges faced by LMICs in providing effective medical physics education and training – and equity, diversity and inclusion and ethical considerations in the context of global medical physics.

The book rounds off by summarizing the key topics discussed in the earlier chapters. This information is divided into six categories: personal factors, collaboration details, project preparation, project planning, project execution, and post-project considerations.

“Hopefully, the book will provide an awareness of factors to consider when involved in global international collaborations, not only from a high-income perspective but also from a resource-constrained perspective,” says Van Dyk. “It was for this reason that when I invited authors to develop chapters on specific topics, they were encouraged to invite a co-author from another part of the world, so that it would broaden the depth of experience.”


Can we compare Donald Trump’s health chief to Soviet science boss Trofim Lysenko?

10 December 2025 at 12:00

The US has turned Trofim Lysenko into a hero.

Born in 1898, Lysenko was a Ukrainian plant breeder, who in 1927 found he could make pea and grain plants develop at different rates by applying the right temperatures to their seeds. The Soviet news organ Pravda was enthusiastic, saying his discovery could make crops grow in winter, turn barren fields green, feed starving cattle and end famine.

Despite having trained as a horticulturist, Lysenko rejected the then-emerging science of genetics in favour of Lamarckism, according to which organisms can pass on acquired traits to offspring. This meshed well with the Soviet philosophy of “dialectical materialism”, which sees both the natural and human worlds as evolving not through mechanisms but environment.

Stalin took note of Lysenko’s activities and had him installed as head of key Soviet science agencies. Once in power, Lysenko dismissed scientists who opposed his views, cancelled their meetings, funded studies of discredited theories, and stocked committees with loyalists. Although Lysenko had lost his influence by the time Stalin died in 1953 – with even Pravda having turned against him – Soviet agricultural science had been destroyed.

A modern parallel

Lysenko’s views and actions have a resonance today when considering the activities of Robert F Kennedy Jr, who was appointed by Donald Trump as secretary of the US Department of Health and Human Services in February 2025. Of course, Trump has repeatedly sought to impose his own agenda on US science, with his destructive impact outlined in a detailed report published by the Union of Concerned Scientists in July 2025.

Last May, Trump signed executive order 14303, “Restoring Gold Standard Science”, which blasts scientists for not acting “in the best interests of the public”. He has withdrawn the US from the World Health Organization (WHO), ordered that federally sponsored research fund his own priorities, redefined the hazards of global warming, and cancelled the US National Climate Assessment (NCA), which had been running since 2000.

But after Trump appointed Kennedy, the assault on science continued into US medicine, health and human services. In what might be called a philosophy of “political materialism”, Kennedy fired all 17 members of the Advisory Committee on Immunization Practices of the US Centers for Disease Control and Prevention (CDC), cancelled nearly $500m in mRNA vaccine contracts, hired a vaccine sceptic to study its connection with autism despite numerous studies that show no connection, and ordered the CDC to revise its website to reflect his own views on the cause of autism.

In his 2021 book The Real Anthony Fauci: Bill Gates, Big Pharma, and the Global War on Democracy and Public Health, Kennedy promotes not germ theory but what he calls “miasma theory”, according to which diseases are prevented by nutrition and lifestyle.

Divergent stories

Of course, there are fundamental differences between the 1930s Soviet Union and the 2020s United States. Stalin murdered and imprisoned his opponents, while the US administration only defunds and fires them. Stalin and Lysenko were not voted in, while Trump came democratically to power, with elected representatives confirming Kennedy. Kennedy has also apologized for his most inflammatory remarks, though Stalin and Lysenko never did (nor does Trump for that matter).

What’s more, Stalin’s and Lysenko’s actions were more grounded in apparent scientific realities and social vision than Trump’s or Kennedy’s. Stalin substantially built up much of the Soviet science and technology infrastructure, whose dramatic successes include launching the first Earth satellite, Sputnik, in 1957. Though it strains credulity to praise Stalin, his vision to expand Soviet agricultural production during a famine was at least plausible and its intention could be portrayed as humanitarian. Lysenko was a scientist; Kennedy is not.

As for Lysenko, his findings seemed to carry on those of his scientific predecessors. Experimentally, he expanded the work of Russian botanist Ivan Michurin, who bred new kinds of plants able to grow in different regions. Theoretically, his work connected not only with dialectical materialism but also with that of the French naturalist Jean-Baptiste Lamarck, who claimed that acquired traits can be inherited.

Trump and Kennedy are off-the-wall by comparison. Trump has called climate change a con job and hoax and seeks to stop research that says otherwise. In 2019 he falsely stated that Hurricane Dorian was predicted to hit Alabama, then ordered the National Oceanic and Atmospheric Administration to issue a statement supporting him. Trump has said he wants the US birth rate to rise and that he will be the “fertilization president”, but later fired fertility and IVF researchers at the CDC.

As for Kennedy, he has said that COVID-19 “is targeted to attack Caucasians and Black people” and that Ashkenazi Jews and Chinese are the most immune (he disputed the remark, but it’s on video). He has also sought to retract a 2025 vaccine study from the Annals of Internal Medicine (178 1369) that directly refuted his views on autism.

The critical point

US Presidents often have pet scientific projects. Harry Truman created the National Science Foundation, Dwight D Eisenhower set up NASA, John F Kennedy started the Apollo programme, while Richard Nixon launched the Environmental Protection Agency (EPA) and the War on Cancer. But it’s one thing to support science that might promote a political agenda and another to quash science that will not.

One ought to be able to take comfort in the fact that if you fight nature, you lose – except that the rest of us lose as well. Thanks to Lysenko’s actions, the Soviet Union lost millions of tons of grain and hundreds of herds of cattle. The promise of his work evaporated and Stalin’s dreams vanished.

Lysenko, at least, was motivated by seeming scientific promise and social vision; the US has none. Trump has damaged the most important US scientific agencies, destroyed databases and eliminated the EPA’s research arm, while Kennedy has replaced health advisory committees with party loyalists.

While Kennedy may not last his term – most Trump Cabinet officials don’t – the paths he has sent science policy on surely will. For Trump and Kennedy, the policy seems to consist only of supporting pet projects. Meanwhile, cases of measles in the US have reached their highest level in three decades, the seas continue to rise and the climate is changing. It is hard to imagine how enemy agents could damage US science more effectively.


Diagnosing brain cancer without a biopsy

10 December 2025 at 10:19

Early diagnosis of primary central nervous system lymphoma (PCNSL) remains challenging because brain biopsies are invasive and imaging often lacks molecular specificity. A team led by researchers at Shenzhen University has now developed a minimally invasive fibre-optic plasmonic sensor capable of detecting PCNSL-associated microRNAs in the eye’s aqueous humor with attomolar sensitivity.

At the heart of the approach is a black phosphorus (BP)–engineered surface plasmon resonance (SPR) interface. An ultrathin BP layer is deposited on a gold-coated fibre tip. Because of the work-function difference between BP and gold, electrons transfer from BP into the Au film, creating a strongly enhanced local electric field at the metal–semiconductor interface. This BP–Au charge-transfer nano-interface amplifies refractive-index changes at the surface far more efficiently than conventional metal-only SPR chips, enabling the detection of molecular interactions that would otherwise be too subtle to resolve and pushing the limit of detection down to 21 attomolar without nucleic-acid amplification. The BP layer also provides a large-area, biocompatible surface for immobilizing RNA reporters.

To achieve sequence specificity, the researchers integrated CRISPR-Cas13a, an RNA-guided nuclease that becomes catalytically active only when its target sequence is perfectly matched to a designed CRISPR RNA (crRNA). When the target microRNA (miR-21) is present, activated Cas13a cleaves RNA reporters attached to the BP-modified fibre surface, releasing gold nanoparticles and reducing the local refractive index. The resulting optical shift is read out in real time through the SPR response of the BP-enhanced fibre probe, providing single-nucleotide-resolved detection directly on the plasmonic interface.

With this combined strategy, the sensor achieved a limit of detection of 21 attomolar in buffer and successfully distinguished single-base-mismatched microRNAs. In tests on aqueous-humor samples from patients with PCNSL, the CRISPR-BP-FOSPR assay produced results that closely matched clinical qPCR data, despite operating without any amplification steps.
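To appreciate what attomolar sensitivity means, it helps to count molecules. The arithmetic below is a quick sanity check using Avogadro’s number and assumed sample volumes, not figures from the paper.

```python
# What "21 attomolar" means in absolute copy numbers -- a quick sanity
# check, not a figure from the paper. The sample volumes are assumptions.

N_A = 6.022e23   # Avogadro's number, molecules per mole
c = 21e-18       # mol/L, the reported limit of detection

for vol_mL in (1.0, 0.1, 0.001):
    moles = c * vol_mL * 1e-3          # concentration x volume in litres
    print(f"{vol_mL:6.3f} mL at 21 aM -> ~{moles * N_A:,.0f} molecules")
# Roughly 12,600 molecules in 1 mL, ~1,300 in 100 µL and ~13 in 1 µL.
```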

Because aqueous-humor aspiration is a minimally invasive ophthalmic procedure, this BP-driven plasmonic platform may offer a practical route for early PCNSL screening, longitudinal monitoring, and potentially the diagnosis of other neurological diseases reflected in eye-fluid biomarkers. More broadly, the work showcases how black-phosphorus-based charge-transfer interfaces can be used to engineer next-generation, fibre-integrated biosensors that combine extreme sensitivity with molecular precision.

Do you want to learn more about this topic?

Theoretical and computational tools to model multistable gene regulatory networks by Federico Bocci, Dongya Jia, Qing Nie, Mohit Kumar Jolly and José Onuchic (2023)


5f electrons and the mystery of δ-plutonium

10 December 2025 at 10:18

Plutonium is a fascinating element. It was first chemically isolated in 1941 at the University of California, but its discovery was hidden until after the Second World War. There are six distinct allotropic phases of plutonium with very different properties. At ambient pressure, continuously increasing the temperature converts the room-temperature, simple monoclinic α phase through five phase transitions, the final one occurring at approximately 450 °C.

The delta (δ) phase is perhaps the most interesting allotrope of plutonium. δ-plutonium is technologically important and has a very simple crystal structure, yet its electronic structure has been debated for decades. Researchers have long attempted to understand its anomalous behaviour and how the properties of δ-plutonium are connected to its 5f electrons.

The 5f electrons are found in the actinide group of elements, which includes plutonium. Their behaviour is counterintuitive. They are sensitive to temperature, pressure and composition, and can behave both in a localised manner, staying close to the nucleus, and in a delocalised (itinerant) manner, more spread out and contributing to bonding. Both states can support magnetism, depending on the actinide element. The 5f electrons contribute to δ-phase stability, to anomalies in the material’s volume and bulk modulus, and to a negative thermal expansion whereby the δ phase shrinks when heated.

Research group from Lawrence Livermore National Laboratory. Left to right: Lorin Benedict, Alexander Landa, Kyoung Eun Kweon, Emily Moore, Per Söderlind, Christine Wu, Nir Goldman, Randolph Hood and Aurelien Perron. Not in image: Babak Sadigh and Lin Yang (Courtesy: Blaise Douros/Lawrence Livermore National Laboratory)

In this work, the researchers present a comprehensive model to predict the thermodynamic behaviour of δ-plutonium, which has a face-centred cubic structure. They use density functional theory, a computational technique that explores the overall electron density of the system and incorporate relativistic effects to capture the behaviour of fast-moving electrons and complex magnetic interactions. The model includes a parameter-free orbital polarization mechanism to account for orbital-orbital interactions, and incorporates anharmonic lattice vibrations and magnetic fluctuations, both transverse and longitudinal modes, driven by temperature-induced excitations. Importantly, it is shown that negative thermal expansion results from magnetic fluctuations.
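Schematically, a model of this kind amounts to a Helmholtz free energy assembled from electronic, vibrational and magnetic pieces and minimized over volume at each temperature. The LaTeX sketch below shows one generic way to write that decomposition; the paper’s exact terms and notation may differ.

```latex
% Generic free-energy decomposition for a magnetic metal (schematic;
% the paper's exact terms and notation may differ): electronic energy
% with orbital polarization + anharmonic lattice vibrations +
% transverse/longitudinal magnetic fluctuations.
\[
  F(V,T) = E_{\mathrm{el}}(V,T) + F_{\mathrm{vib}}(V,T) + F_{\mathrm{mag}}(V,T),
  \qquad
  \left.\frac{\partial F}{\partial V}\right|_{T} = 0 .
\]
% Thermal expansion follows from how the minimizing volume V(T) moves
% with temperature; negative expansion means V(T) shrinks as T rises,
% here driven by the magnetic term F_mag.
```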

This is the first model to integrate electronic effects, magnetic fluctuations, and lattice vibrations into a cohesive framework that aligns with experimental observations and semi-empirical models such as CALPHAD. It also accounts for fluctuating states beyond the ground state and explains how gallium composition influences thermal expansion. Additionally, the model captures the positive thermal expansion behaviour of the high-temperature epsilon phase, offering new insight into plutonium’s complex thermodynamics.

Read the full article

First principles free energy model with dynamic magnetism for δ-plutonium

Per Söderlind et al 2025 Rep. Prog. Phys. 88 078001

Do you want to learn more about this topic?

Pu 5f population: the case for n = 5.0 by J G Tobin and M F Beaux II (2025)


Scientists explain why ‘seeding’ clouds with silver iodide is so efficient

10 December 2025 at 09:58

Silver iodide crystals have long been used to “seed” clouds and trigger precipitation, but scientists have never been entirely sure why the material works so well for that purpose. Researchers at TU Wien in Austria are now a step closer to solving the mystery thanks to a new study that characterized surfaces of the material in atomic-scale detail.

“Silver iodide has been used in atmospheric weather modification programs around the world for several decades,” explains Jan Balajka from TU Wien’s Institute of Applied Physics, who led this research. “In fact, it was chosen for this purpose as far back as the 1940s because of its atomic crystal structure, which is nearly identical to that of ice – it has the same hexagonal symmetry and very similar distances between atoms in its lattice structure.”
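The closeness of that match can be made concrete with commonly quoted room-temperature lattice constants for wurtzite β-AgI and ice Ih; the values in the sketch below are textbook figures, not measurements from this study.

```python
# Lattice match between wurtzite beta-AgI and hexagonal ice Ih, using
# commonly quoted room-temperature lattice constants (textbook values,
# not measurements from this study).

a_agi, c_agi = 4.59, 7.51   # angstrom, beta-AgI
a_ice, c_ice = 4.52, 7.36   # angstrom, ice Ih

print(f"a-axis mismatch: {(a_agi - a_ice) / a_ice:+.1%}")   # ~ +1.5%
print(f"c-axis mismatch: {(c_agi - c_ice) / c_ice:+.1%}")   # ~ +2.0%
# A mismatch of only a couple of percent is why Vonnegut singled out
# AgI as an ice-nucleating template back in the 1940s.
```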

The basic idea, Balajka continues, originated with the 20th-century American atmospheric scientist Bernard Vonnegut, who suggested in 1947 that introducing small silver iodide (AgI) crystals into a cloud could provide nuclei for ice to grow on. But while Vonnegut’s proposal worked (and helped to inspire his brother Kurt’s novel Cat’s Cradle), this simple picture is not entirely accurate. The stumbling block is that nucleation occurs at the surface of a crystal, not inside it, and the atomic structure of an AgI surface differs significantly from its interior.

A task that surface science has solved

To investigate further, Balajka and colleagues used high-resolution atomic force microscopy (AFM) and advanced computer simulations to study the atomic structure of 2‒3 nm diameter AgI crystals when they are broken into two pieces. The team’s measurements revealed that the surfaces of both freshly cleaved structures differed from those found inside the crystal.

More specifically, team member Johanna Hütner, who performed the experiments, explains that when an AgI crystal is cleaved, the silver atoms end up on one side while the iodine atoms appear on the other. This has implications for ice growth, because while the silver side maintains a hexagonal arrangement that provides an ideal template for the growth of ice layers, the iodine side reconstructs into a rectangular pattern that no longer lattice-matches the hexagonal symmetry of ice crystals. The iodine side is therefore incompatible with the epitaxial growth of hexagonal ice.

“Our work solves this decades-long controversy of the surface vs bulk structure of AgI, and shows that structural compatibility does matter,” Balajka says.

Difficult experiments

According to Balajka, the team’s experiments were far from easy. Many experimental methods for studying the structure and properties of material surfaces are based on interactions with charged particles such as electrons or ions, but AgI is an electrical insulator, which “excludes most of the tools available,” he explains. Using AFM enabled them to overcome this problem, he adds, because this technique detects interatomic forces between a sharp tip and the surface and does not require a conductive sample.

Another problem is that AgI is photosensitive and decomposes when exposed to visible light. While this property is useful in other contexts – AgI was a common ingredient in early photographic plates – it created complications for the TU Wien team. “Conventional AFM setups make use of optical laser detection to map the topography of a sample,” Balajka notes.

To avoid destroying their sample while studying it, the researchers therefore had to use a non-contact AFM based on a piezoelectric sensor that detects electrical signals and does not require optical readout. They also adapted their setup to operate in near-darkness, using only red light while manipulating the AgI, to ensure that stray light did not degrade the samples.

The computational modelling part of the work introduced yet another hurdle to overcome. “Both Ag and I are atoms with a high number of electrons in their electron shells and are thus highly polarizable,” Balajka explains. “The interaction between such atoms cannot be accurately described by standard computational modelling methods such as density functional theory (DFT), so we had to employ highly accurate random-phase approximation (RPA) calculations to obtain reliable results.”

Highly controlled conditions

The researchers acknowledge that their study, which is detailed in Science Advances, was conducted under highly controlled conditions – ultrahigh vacuum, low temperatures and a dark environment – that are very different from those that prevail inside real clouds. “The next logical step for us is therefore to confirm whether our findings hold under more representative conditions,” Balajka says. “We would like to find out whether the structure of AgI surfaces is the same in air and water, and if not, why.”

The researchers would also like to better understand the atomic arrangement of the rectangular reconstruction of the iodine surface. “This would complete the picture for the use of AgI in ice nucleation, as well as our understanding of AgI as a material overall,” Balajka says.



Slow spectroscopy sheds light on photodegradation

9 December 2025 at 18:22

Using a novel spectroscopy technique, physicists in Japan have revealed how organic materials accumulate electrical charge through long-term illumination by sunlight – leading to material degradation. Ryota Kabe and colleagues at the Okinawa Institute of Science and Technology have shown how charge separation occurs gradually via a rare multi-photon ionization process, offering new insights into how plastics and organic semiconductors degrade in sunlight.

In a typical organic solar cell, an electron-donating material is interfaced with an electron acceptor. When the donor absorbs a photon, one of its electrons may jump across the interface, creating a bound electron-hole pair that may eventually dissociate – yielding two free charges from which useful electrical work can be extracted.

Although such an interface vastly boosts the efficiency of this process, it is not necessary for charge separation to occur when an electron donor is illuminated. “Even single-component materials can generate tiny amounts of charge via multiphoton ionization,” Kabe explains. “However, experimental evidence has been scarce due to the extremely low probability of this process.”

To trigger charge separation in this way, an electron needs to absorb one or more additional photons while in its excited state. Since the vast majority of electrons fall back into their ground states before this can happen, the spectroscopic signature of this charge separation is very weak. This makes it incredibly difficult to detect using conventional spectroscopy techniques, which can generally only make observations over timescales of up to a few milliseconds.
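A minimal steady-state rate model makes the role of the excited-state lifetime explicit: the longer an excited electron survives, the better the odds that a second photon arrives in time. The rates in the Python sketch below are illustrative assumptions, not values measured by the Okinawa team.

```python
# Minimal steady-state rate model for two-step (resonance-enhanced)
# photoionization under weak continuous light. All rates are illustrative
# assumptions, not values measured by the Okinawa team.
#
#   ground --(k_exc)--> excited --(k_ion)--> ionized
#                  \--(1/tau)--> back to ground
#
# Steady-state excited population: n* = k_exc*tau / (1 + k_exc*tau),
# so the ionization rate per molecule is k_ion * n*.

def ionization_rate(k_exc, k_ion, tau):
    n_star = k_exc * tau / (1.0 + k_exc * tau)
    return k_ion * n_star

k_exc = 1.0    # 1/s, slow excitation under weak continuous illumination
k_ion = 1e-3   # 1/s, second-photon absorption from the excited state

for label, tau in [("short-lived singlet, tau = 10 ns", 1e-8),
                   ("long-lived triplet,  tau = 1 s  ", 1.0)]:
    print(f"{label}: {ionization_rate(k_exc, k_ion, tau):.2e} ionizations/s")
# ~5e7 times more ionization for the long-lived state: the second photon
# finally has time to arrive, which is why long triplet lifetimes make
# this rare channel observable at all.
```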

The opposite approach

“While weak multiphoton pathways are easily buried under much stronger excited-state signals, we took the opposite approach in our work,” Kabe describes. “We excited samples for long durations and searched for traces of accumulated charges in the slow emission decay.”

Key to this approach was an electron donor called NPD. This organic material has a relatively long triplet lifetime, where an excited electron is prevented from transitioning back to its ground state. As a result, these molecules emit phosphorescence over relatively long timescales.

In addition, Kabe’s team dispersed their NPD samples into different host materials with carefully selected energy levels. In one medium, the energies of both the highest-occupied and lowest-unoccupied molecular orbitals lay below NPD’s corresponding levels, so that the host material acted as an electron acceptor. As a result, charge transfer occurred in the same way as it would across a typical donor-acceptor interface.

Yet in another medium, the host’s lowest-unoccupied orbital lay above NPD’s – blocking charge transfer, and allowing triplet states to accumulate instead. In this case, the only way for charge separation to occur was through multi-photon ionization.

Slow emission decay analysis

Since NPD’s long triplet lifetime allowed its electrons to be excited gradually over an extended period of illumination, its weak charge accumulation became detectable through slow emission decay analysis. In contrast, more conventional methods involve multiple, ultra-fast laser pulses, severely restricting the timescale over which measurements can be made. Altogether, this approach enabled the team to clearly distinguish between the two charge generation pathways.

“Using this method, we confirmed that charge generation occurred via resonance-enhanced multiphoton ionization mediated by long-lived triplet states, even in single-component organic materials,” Kabe describes.

This result offers insights into how plastics and organic semiconductors are degraded by sunlight over years or decades. The conventional explanation is that sunlight generates free radicals. These are molecules that lose an electron through ionization, leaving behind an unpaired electron which readily reacts with other molecules in the surrounding environment. Since photodegradation unfolds over such a long timescale, researchers could not observe this charge generation in single-component organic materials – until now.

“The method will be useful for analysing charge behaviour in organic semiconductor devices and for understanding long-term processes such as photodegradation that occur gradually under continuous light exposure,” Kabe says.

The research is described in Science Advances.


Fermilab opens new building dedicated to Tevatron pioneer Helen Edwards

9 December 2025 at 15:59

Fermilab has officially opened a new building named after the particle physicist Helen Edwards. Officials from the lab and the US Department of Energy (DOE) opened the Helen Edwards Engineering Research Center at a ceremony held on 5 December. The new building is Fermilab’s largest purpose-built laboratory and office space since the lab’s iconic Wilson Hall, which was completed in 1974.

Construction of the Helen Edwards Engineering Research Center began in 2019 and was completed three years later. The centre is a 7500 m² multi-storey lab and office building that is adjacent and connected to Wilson Hall.

The new centre is designed as a collaborative lab where engineers, scientists and technicians design, build and test technologies across several areas of research such as neutrino science, particle detectors, quantum science and electronics.

The centre also features cleanrooms, vibration-sensitive labs and cryogenic facilities in which the components of the near detector for the Deep Underground Neutrino Experiment will be assembled and tested.

A pioneering spirit

With a PhD in experimental particle physics from Cornell University, Edwards was heavily involved with commissioning the university’s 10 GeV electron synchrotron. In 1970 Fermilab’s director Robert Wilson appointed Edwards as associate head of the lab’s booster section and she later became head of the accelerator division.

While at Fermilab, Edwards’ primary responsibility was designing, constructing, commissioning and operating the Tevatron, which led to the discoveries of the top quark in 1995 and the tau neutrino in 2000.

Edwards retired in the early 1990s but continued to work as a guest scientist at Fermilab, and officially switched the Tevatron off during a ceremony held on 30 September 2011. Edwards died in 2016.

Darío Gil, the undersecretary for science at the DOE, says that Edwards’ scientific work “is a symbol of the pioneering spirit of US research”.

“Her contributions to the Tevatron and the lab helped the US become a world leader in the study of elementary particles,” notes Gil. “We honour her legacy by naming this research centre after her as Fermilab continues shaping the next generation of research using [artificial intelligence], [machine learning] and quantum physics.”


Memristors could measure a single quantum of resistance

9 December 2025 at 10:52

A proposed new way of defining the standard unit of electrical resistance would do away with the need for strong magnetic fields when measuring it. The new technique is based on memristors, which are programmable resistors originally developed as building blocks for novel computing architectures, and its developers say it would considerably simplify the experimental apparatus required to measure a single quantum of resistance for some applications.

Electrical resistance is a physical quantity that represents how much a material opposes the flow of electrical current. It is measured in ohms (Ω), and since 2019, when the base units of the International System of Units (SI) were most recently revised, the ohm has been defined in terms of the von Klitzing constant h/e², where h and e are the Planck constant and the charge on an electron, respectively.
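Because the 2019 revision fixed h and e to exact values, both the von Klitzing constant and the conductance quantum G0 = 2e²/h – the quantity at the heart of the memristor measurements described below – can be computed directly. A minimal sketch in Python, using the exact SI values:

```python
# Exact SI values fixed by the 2019 redefinition of the base units
h = 6.62607015e-34   # Planck constant, J s
e = 1.602176634e-19  # elementary charge, C

R_K = h / e**2       # von Klitzing constant (quantum of resistance), ohms
G_0 = 2 * e**2 / h   # conductance quantum, siemens

print(f"R_K = {R_K:.3f} ohm")  # ≈ 25812.807 ohm
print(f"G_0 = {G_0:.6e} S")    # ≈ 7.748092e-05 S
```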

To measure this resistance with high precision, scientists use the fact that the von Klitzing constant is related to the quantized change in the Hall resistance of a two-dimensional electron system (such as the one that forms in a semiconductor heterostructure) in the presence of a strong magnetic field. This quantized change in resistance is known as the quantum Hall effect (QHE), and in a material like GaAs or AlGaAs, it shows up at fields of around 10 tesla. Generating such high fields typically requires a superconducting electromagnet, however.

A completely different approach

Researchers connected to a European project called MEMQuD are now advocating a completely different approach, based on the ability of memristors to “remember” their previous resistance state even after they have been switched off. This state can be changed by applying a voltage or current.

In the new work, a team led by Gianluca Milano of Italy’s Istituto Nazionale di Ricerca Metrologica (INRiM); Vitor Cabral of the Instituto Português da Qualidade; and Ilia Valov of the Institute of Electrochemistry and Energy Systems at the Bulgarian Academy of Sciences studied a device based on memristive nanoionics cells made from conducting filaments of silver. When an electric field is applied to these filaments, their conductance changes in distinct, quantized steps.

The MEMQuD team reports that the quantum conductance levels achieved in this set-up are precise enough to be exploited as intrinsic standard values. Indeed, a large inter-laboratory comparison confirmed that the values deviated by just -3.8% and 0.6% from the agreed SI values for the fundamental quantum of conductance, G0, and 2G0, respectively. The researchers attribute this precision to tight, atomic-level control over the morphology of the nanochannels responsible for quantum conductance effects, which they achieved by electrochemically polishing the silver filaments into the desired configuration.

A national metrology institute condensed into a microchip

The researchers say their results are building towards a concept known as an “NMI-in-a-chip” – that is, condensing the services of a national metrology institute into a microchip. “This could lead to measuring devices that have their resistance references built directly into the chip,” says Milano, “so doing away with complex measurements in laboratories and allowing for devices with zero-chain traceability – that is, those that do not require calibration since they have embedded intrinsic standards.”

Yuma Okazaki of Japan’s National Institute of Advanced Industrial Science and Technology (AIST), who was not involved in this work, says that the new technique could indeed allow end users to directly access a quantum resistance standard.

“Notably, this method can be demonstrated at room temperature and under ambient conditions, in contrast to conventional methods that require cryogenic and vacuum equipment, which is expensive and requires a lot of electrical power,” Okazaki says. “If such a user-friendly quantum standard becomes more stable and its uncertainty is improved, it could lead to a new calibration scheme for ensuring the accuracy of electronics used in extreme environments, such as space or the deep ocean, where traditional quantum standards that rely on cryogenic and vacuum conditions cannot be readily used.”

The MEMQuD researchers, who report their work in Nature Nanotechnology, now plan to explore ways to further decrease deviations from the agreed SI values for G0 and 2G0. These include better material engineering, an improved measurement protocol, and strategies for topologically protecting the memristor’s resistance.

The post Memristors could measure a single quantum of resistance appeared first on Physics World.

Oak Ridge Quantum Science Center prioritizes joined-up thinking, multidisciplinary impacts

8 décembre 2025 à 15:00

Travis Humble is a research leader who’s thinking big, dreaming bold, yet laser-focused on operational delivery. The long game? To translate advances in fundamental quantum science into a portfolio of enabling technologies that will fast-track the practical deployment of quantum computers for at-scale scientific, industrial and commercial applications.

As director of the Quantum Science Center (QSC) at Oak Ridge National Laboratory (ORNL) in East Tennessee, Humble and his management team are well placed to transform that research vision into scientific, economic and societal upside. Funded to the tune of $115 million through its initial five-year programme (2020–25), QSC is one of five dedicated National Quantum Information Science Research Centers (NQISRC) within the US Department of Energy (DOE) National Laboratory system.

Validation came in spades last month when, despite the current turbulence around US science funding, QSC was given follow-on DOE backing of $125 million over five years (2025–30) to create “a new scientific ecosystem” for fault-tolerant, quantum-accelerated high-performance computing (QHPC). In short, QSC will target the critical research needed to amplify the impact of quantum computing through its convergence with leadership-class exascale HPC systems.

“Our priority in Phase II QSC is the creation of a common software ecosystem to host the compilers, programming libraries, simulators and debuggers needed to develop hybrid-aware algorithms and applications for QHPC,” explains Humble. Equally important, QSC researchers will develop and integrate new techniques in quantum error correction, fault-tolerant computing protocols and hybrid algorithms that combine leading-edge computing capabilities for pre- and post-processing of quantum programs. “These advances will optimize quantum circuit constructions and accelerate the most challenging computational tasks within scientific simulations,” Humble adds.

Classical computing, quantum opportunity

At the heart of the QSC programme sits ORNL’s leading-edge research infrastructure for classical HPC, a capability that includes Frontier, the first supercomputer to break the exascale barrier and still one of the world’s most powerful. On that foundation, QSC is committed to building QHPC architectures that take advantage of both quantum computers and exascale supercomputing to tackle all manner of scientific and industrial problems beyond the reach of today’s HPC systems alone.

“Hybrid classical-quantum computing systems are the future,” says Humble. “With quantum computers connecting both physically and logically to existing HPC systems, we can forge a scalable path to integrate quantum technologies into our scientific infrastructure.”

Frontier, a high-performance supercomputer
Quantum acceleration ORNL’s current supercomputer, Frontier, was the first high-performance machine to break the exascale barrier. Plans are in motion for a next-generation supercomputer, Discovery, to come online at ORNL by 2028. (Courtesy: Carlos Jones/ORNL, US DOE)

Industry partnerships are especially important in this regard. Working in collaboration with the likes of IonQ, Infleqtion and QuEra, QSC scientists are translating a range of computationally intensive scientific problems – quantum simulations of exotic matter, for example – onto the vendors’ quantum computing platforms, with encouraging results.

“With our broad representation of industry partners,” notes Humble, “we will establish a common framework by which scientific end-users, software developers and hardware architects can collaboratively advance these tightly coupled, scalable hybrid computing systems.”

It’s a co-development model that industry values greatly. “Reciprocity is key,” Humble adds. “At QSC, we get to validate that QHPC can address real-world research problems, while our industry partners gather user feedback to inform the ongoing design and optimization of their quantum hardware and software.”

Quantum impact

Quantum computing systems will continue to improve at pace, with more qubits, enhanced fidelity, error correction and fault tolerance the key reference points on the development roadmap. Phase II QSC, for its part, will integrate five parallel research thrusts to advance the viability and uptake of QHPC technologies.

The collaborative software effort, led by ORNL’s Vicente Leyton, will develop openQSE, an adaptive, end-to-end software ecosystem for QHPC systems and applications. Yigit Subasi from Los Alamos National Laboratory (LANL) will lead the hybrid algorithms thrust, which will design algorithms that combine conventional and quantum methods to solve challenging problems in the simulation of model materials.

Meanwhile, the QHPC architectures thrust, under the guidance of ORNL’s Chris Zimmer, will co-design hybrid computing systems that integrate quantum computers with leading-edge HPC systems. The scientific applications thrust, led by LANL’s Andrew Sornberger, will develop and validate applications of quantum simulation to be implemented on prototype QHPC systems. Finally, ORNL’s Michael McGuire will lead the thrust to establish experimental baselines for quantum materials that ultimately validate QHPC simulations against real-world measurements.

Longer term, ORNL is well placed to scale up the QHPC model. After all, the laboratory is credited with pioneering the hybrid supercomputing model that uses graphics processing units in addition to conventional central processing units (including the launch in 2012 of Titan, the first supercomputer of this type operating at over 10 petaFLOPS).

“The priority for all the QSC partners,” notes Humble, “is to transition from this still-speculative research phase in quantum computing, while orchestrating the inevitable convergence between quantum technology, existing HPC capabilities and evolving scientific workflows.”

Collaborate, coordinate, communicate

Much like its NQISRC counterparts (which have also been allocated further DOE funding through 2030), QSC provides the “operational umbrella” for a broad-scope collaboration of more than 300 scientists and engineers from 20 partner institutions. With its own distinct set of research priorities, that collective activity cuts across other National Laboratories (Los Alamos and Pacific Northwest), universities (among them Berkeley, Cornell and Purdue) and businesses (including IBM and IQM) to chart an ambitious R&D pathway addressing quantum-state (qubit) resilience, controllability and, ultimately, the scalability of quantum technologies.

“QSC is a multidisciplinary melting pot,” explains Humble, “and I would say, alongside all our scientific and engineering talent, it’s the pooled user facilities that we are able to exploit here at Oak Ridge and across our network of partners that gives us our ‘grand capability’ in quantum science [see box, “Unique user facilities unlock QSC opportunities”]. Certainly, when you have a common research infrastructure, orchestrated as part of a unified initiative like QSC, then you can deliver powerful science that translates into real-world impacts.”

Unique user facilities unlock QSC opportunities

Stephen Streiffer tours the LINAC Tunnel at the Spallation Neutron Source
Neutron insights ORNL director Stephen Streiffer tours the linear accelerator tunnel at the Spallation Neutron Source (SNS). QSC scientists are using the SNS to investigate entirely new classes of strongly correlated materials that demonstrate topological order and quantum entanglement. (Courtesy: Alonda Hines/ORNL, US DOE)

Deconstructed, QSC’s Phase I remit (2020–25) spanned three dovetailing and cross-disciplinary research pathways: discovery and development of advanced materials for topological quantum computing (in which quantum information is stored in a stable topological state – or phase – of a physical system rather than the properties of individual particles or atoms); development of next-generation quantum sensors (to characterize topological states and support the search for dark matter); as well as quantum algorithms and simulations (for studies in fundamental physics and quantum chemistry).

Underpinning that collective effort: ORNL’s unique array of scientific user facilities. A case in point is the Spallation Neutron Source (SNS), an accelerator-based neutron-scattering facility that enables a diverse programme of pure and applied research in the physical sciences, life sciences and engineering. QSC scientists, for example, are using SNS to investigate entirely new classes of strongly correlated materials that demonstrate topological order and quantum entanglement – properties that show great promise for quantum computing and quantum metrology applications.

“The high-brightness neutrons at SNS give us access to this remarkable capability for materials characterization,” says Humble. “Using the SNS neutron beams, we can probe exotic materials, recover the neutrons that scatter off of them and, from the resultant signals, infer whether or not the materials exhibit quantum properties such as entanglement.”

While SNS may be ORNL’s “big-ticket” user facility, the laboratory is also home to another high-end resource for quantum studies: the Center for Nanophase Materials Sciences (CNMS), one of the DOE’s five national Nanoscience Research Centers, which offers QSC scientists access to specialist expertise and equipment for nanomaterials synthesis; materials and device characterization; as well as theory, modelling and simulation in nanoscale science and technology.

Thanks to these co-located capabilities, QSC scientists pioneered another intriguing line of enquiry – one that will now be taken forward elsewhere within ORNL – by harnessing so-called quantum spin liquids, in which electron spins can become entangled with each other to demonstrate correlations over very large distances (relative to the size of individual atoms).

In this way, it is possible to take materials that have been certified as quantum-entangled and use them to design new types of quantum devices with unique geometries – as well as connections to electrodes and other types of control systems – to unlock novel physics and exotic quantum behaviours. The long-term goal? Translation of quantum spin liquids into a novel qubit technology to store and process quantum information.

SNS, CNMS and the Oak Ridge Leadership Computing Facility (OLCF) are DOE Office of Science user facilities.

When he’s not overseeing the technical direction of QSC, Humble is acutely attuned to the need for sustained and accessible messaging. The priority? To connect researchers across the collaboration – physicists, chemists, material scientists, quantum information scientists and engineers – as well as key external stakeholders within the DOE, government and industry.

“In my experience,” he concludes, “the ability of the QSC teams to communicate efficiently – to understand each other’s concepts and reasoning and to translate back and forth across disciplinary boundaries – remains fundamental to the success of our scientific endeavours.”

Further information

Listen to the Physics World podcast: Oak Ridge’s Quantum Science Center takes a multidisciplinary approach to developing quantum materials and technologies

Scaling the talent pipeline in quantum science

Quantum science graduate students and postdoctoral researchers present and discuss their work during a poster session
The next generation Quantum science graduate students and postdoctoral researchers present and discuss their work during a poster session at the fifth annual QSC Summer School. Hosted at Purdue University in April this year, the school is one of several workforce development efforts supported by QSC. (Courtesy: Dave Mason/Purdue University)

With an acknowledged shortage of skilled workers across the quantum supply chain, QSC is doing its bit to bolster the scientific and industrial workforce. Front and centre: the fifth annual QSC Summer School, held at Purdue University in April this year, which hosted 130 graduate students (the largest cohort to date) for an intensive four-day training programme.

The Summer School sits as part of a long-term QSC initiative to equip ambitious individuals with the specialist domain knowledge and skills needed to thrive in a quantum sector brimming with opportunity – whether that’s in scientific research or out in industry with hardware companies, software companies or, ultimately, the end-users of quantum technologies in key verticals like pharmaceuticals, finance and healthcare.

“While PhD students and postdocs are integral to the QSC research effort, the Summer School exposes them to the fundamental ideas of quantum science elaborated by leading experts in the field,” notes Vivien Zapf, a condensed-matter physicist at Los Alamos National Laboratory who heads up QSC’s advanced characterization efforts.

“It’s all about encouraging the collective conversation,” she adds, “with lots of opportunities for questions and knowledge exchange. Overall, our emphasis is very much on training up scientists and engineers to work across the diversity of disciplines needed to translate quantum technologies out of the lab into practical applications.”

The programme isn’t for the faint-hearted, though. Student delegates kicked off this year’s proceedings with a half-day of introductory presentations on quantum materials, devices and algorithms. Next up: three and a half days of intensive lectures, panel discussions and poster sessions covering everything from entangled quantum networks to quantum simulations of superconducting qubits.

Many of the Summer School’s sessions were also made available virtually on Purdue’s Quantum Coffeehouse Live Stream on YouTube – the streamed content reaching quantum learners across the US and further afield. Lecturers were drawn from the US National Laboratories, leading universities (such as Harvard and Northwestern) and the quantum technology sector (including experts from IBM, PsiQuantum, NVIDIA and JPMorganChase).

The post Oak Ridge Quantum Science Center prioritizes joined-up thinking, multidisciplinary impacts appeared first on Physics World.

So you want to install a wind turbine? Here’s what you need to know

8 décembre 2025 à 12:00

As a physicist in industry, I spend my days developing new types of photovoltaic (PV) panels. But I’m also keen to do something for the transition to green energy outside work, which is why I recently installed two PV panels on the balcony of my flat in Munich. Fitting them was great fun – and I can now enjoy sunny days even more knowing that each panel is generating electricity.

However, the panels, which each have a peak power of 440 W, don’t cover all my electricity needs, which prompted me to take an interest in a plan to build six wind turbines in a forest near me on the outskirts of Munich. Curious about the project, I particularly wanted to find out when the turbines will start generating electricity for the grid. So when I heard that a weekend cycle tour of the site was being organized to showcase it to local residents, I grabbed my bike and joined in.

As we cycle, I discover that the project – located in Forstenrieder Park – is the joint effort of four local councils and two “citizen-energy” groups, who’ve worked together for the last five years to plan and start building the six turbines. Each tower will be 166 m high and the rotor blades will be 80 m long, with the plan being for them to start operating in 2027.

I’ve never thought of Munich as a particularly windy city. But tour leader Dieter Maier, who’s a climate adviser to Neuried council, explains that at the height at which the blades operate, there’s always a steady, reliable flow of wind. In fact, each turbine has a rated power output of 6.5 MW and is expected to deliver about 10 GWh of energy over the course of a year.
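As a sanity check on those figures – my own back-of-the-envelope arithmetic, not the project’s – a 6.5 MW turbine delivering 10 GWh a year is running at a capacity factor of roughly 18%, which seems plausible for an inland forest site:

```python
# Back-of-the-envelope check of the quoted turbine figures
rated_power_mw = 6.5      # rated output per turbine, MW
annual_energy_gwh = 10.0  # expected yearly yield per turbine, GWh

hours_per_year = 365 * 24                               # 8760 h
max_yield_gwh = rated_power_mw * hours_per_year / 1000  # ≈ 56.9 GWh at full power all year

capacity_factor = annual_energy_gwh / max_yield_gwh
print(f"capacity factor ≈ {capacity_factor:.1%}")       # ≈ 17.6%
```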

Practical questions

Cycling around, I’m excited to think that a single turbine could end up meeting the entire electricity demand of Neuried. But installing wind turbines involves much more than just the technicalities of generating electricity. How do you connect the turbines to the grid? How do you ensure planes don’t fly into the turbines? What about wildlife conservation and biodiversity?

At one point on our tour, we cycle round a 90-degree bend in the forest and I wonder how a huge, 80 m-long blade can be transported round such a tight angle. Trees will almost certainly have to be felled to get the blade in place, which sounds questionable for a supposedly green project. Fortunately, project leaders have been working with the local forest manager and conservationists, finding ways to help improve the local biodiversity despite the loss of trees.

As a representative of BUND (one of Germany’s biggest conservation charities) explains on the tour, a natural, or “unmanaged”, forest consists of a mix of areas with a higher or lower density of trees. But Forstenrieder Park has been a managed forest for well over a century and is mostly thick with trees. Clearing trees for the turbines will therefore allow conservationists to grow more of the bushes and plants that currently struggle to find space to flourish.

Small group of bikes at the edge of a large clearing in a forest
Cut and cover Trees in Forstenrieder Park have had to be chopped down to provide room for new wind turbines to be installed, but the open space will let conservationists grow plants and bushes to boost biodiversity. (Courtesy: Janina Moereke)

To avoid endangering birds and bats native to this forest, meanwhile, the turbines will be turned off when the animals are most active, which coincidentally corresponds to low-wind periods in Munich. Insurance costs have to be factored in too. Thankfully, it’s quite unlikely that a turbine will burn down or get ice all over its blades, which means liability insurance costs are low. But vandalism is an ever-present worry.

In fact, at the end of our bike tour, we’re taken to a local wind turbine that is already up and running, about 13 km south of Forstenrieder Park. This turbine, I’m disappointed to discover, was vandalized back in 2024, which led to it being fenced off and video surveillance cameras being installed.

But for all the difficulties, I’m excited by the prospect of the wind turbines supporting the local energy needs. I can’t wait for the day when I’m on my balcony, solar panels at my side, sipping a cup of tea made with water boiled by electricity generated by the rotor blades I can see turning round and round on the horizon.

The post So you want to install a wind turbine? Here’s what you need to know appeared first on Physics World.

Galactic gamma rays could point to dark matter

5 décembre 2025 à 15:21
Fermi telescope data
Excess radiation Gamma-ray intensity map excluding components other than the halo, spanning approximately 100° in the direction of the centre of the Milky Way. The blank horizontal bar is the galactic plane area, which was excluded from the analysis to avoid strong astrophysical radiation. (Courtesy: Tomonori Totani/The University of Tokyo)

Gamma rays emitted from the halo of the Milky Way could be produced by hypothetical dark-matter particles. That is the conclusion of an astronomer in Japan who has analysed data from NASA’s Fermi Gamma-ray Space Telescope. The energy spectrum of the emission is what would be expected from the annihilation of particles called WIMPs. If this can be verified, it would mark the first observation of dark matter via electromagnetic radiation.

Since the 1930s astronomers have known that there is something odd about galaxies, galaxy clusters and larger structures in the universe. The problem is that there is not nearly enough visible matter in these objects to explain their dynamics and structure. A rotating galaxy, for example, should be flinging out its stars because it does not have enough self-gravitation to hold itself together.

Today, the most popular solution to this conundrum is the existence of a hypothetical substance called dark matter. Dark-matter particles would have mass and interact with each other and normal matter via the gravitational force, gluing rotating galaxies together. However, the fact that we have never observed dark matter directly means that the particles must rarely, if ever, interact via the other three forces.

Annihilating WIMPs

The weakly interacting massive particle (WIMP) is a dark-matter candidate that interacts via the weak nuclear force (or a similarly weak force). As a result of this interaction, pairs of WIMPs are expected to occasionally annihilate to create high-energy gamma rays and other particles. If this is true, dense areas of the universe such as galaxies should be sources of these gamma rays.
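This density dependence is what makes galaxies promising targets. For self-conjugate WIMPs of mass mχ, the textbook expression for the annihilation gamma-ray flux from a patch of sky (a standard formula, not one quoted from Totani’s paper) is:

```latex
\frac{\mathrm{d}\Phi}{\mathrm{d}E}
  = \frac{\langle \sigma v \rangle}{8\pi m_\chi^{2}}
    \,\frac{\mathrm{d}N_\gamma}{\mathrm{d}E}
    \int_{\Delta\Omega}\!\!\int_{\text{l.o.s.}} \rho^{2}(\mathbf{r})\,\mathrm{d}l\,\mathrm{d}\Omega
```

Here ⟨σv⟩ is the velocity-averaged annihilation cross-section, dNγ/dE the photon spectrum per annihilation and ρ the dark-matter density along the line of sight; the ρ² factor is why dense regions such as the galactic halo should be the brightest sources.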

Now, Tomonori Totani of the University of Tokyo has analysed data from the Fermi telescope and identified an excess of gamma rays emanating from the halo of the Milky Way. What is more, Totani’s analysis suggests that the energy spectrum of the excess radiation (from about 10 to 100 GeV) is consistent with hypothetical WIMP annihilation processes.

“If this is correct, to the extent of my knowledge, it would mark the first time humanity has ‘seen’ dark matter,” says Totani. “This signifies a major development in astronomy and physics,” he adds.

While Totani is confident of his analysis, his conclusion must be verified independently. Furthermore, work will be needed to rule out conventional astrophysical sources of the excess radiation.

Catherine Heymans, the Astronomer Royal for Scotland, told Physics World: “I think it’s a really nice piece of work, and exactly what should be happening with the Fermi data.” She describes Totani’s paper as “well written and thorough”. The research is described in Journal of Cosmology and Astroparticle Physics.

The post Galactic gamma rays could point to dark matter appeared first on Physics World.

Simple feedback mechanism keeps flapping flyers stable when hovering

5 décembre 2025 à 10:00

Researchers in the US have shed new light on the puzzling and complex flight physics of creatures such as hummingbirds, bumblebees and dragonflies that flap their wings to hover in place. According to an interdisciplinary team at the University of Cincinnati, the mechanism these animals deploy can be described by a very simple, computationally basic, stable and natural feedback mechanism that operates in real time. The work could aid the development of hovering robots, including those that could act as artificial pollinators for crops.

If you’ve ever watched a flapping insect or hummingbird hover in place – often while engaged in other activities such as feeding or even mating – you’ll appreciate how remarkable they are. To stay aloft and stable, these animals must constantly sense their position and motion and make corresponding adjustments to their wing flaps.

Feedback mechanism relies on two main components

Biophysicists have previously put forward many highly complex explanations for how they do this, but according to the Cincinnati team of Sameh Eisa and Ahmed Elgohary, some of this complexity is not necessary. Earlier this year, the pair developed their own mathematical and control theory based on a mechanism they call “extremum seeking for vibrational stabilization”.

Eisa describes this mechanism as “very natural” because it relies on just two main components. The first is the wing flapping motion itself, which he says is “naturally built in” for flapping creatures that use it to propel themselves. The second is a simple feedback mechanism involving sensations and measurements related to the altitude at which the creatures aim to stabilize their hovering.

The general principle, he continues, is that a system (in this case an insect or hummingbird) can steer itself towards a stable position by continuously adjusting a high-amplitude, high-frequency input control or signal (in this case, a flapping wing action). “This adjustment is simply based on the feedback of measurement (the insects’ perceptions) and stabilization (hovering) occurs when the system optimizes what it is measuring,” he says.
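The loop is simple enough to sketch in a few lines. Below is a minimal, hypothetical Python implementation of perturbation-based extremum seeking (an illustration of the general technique, not the authors’ code): a high-frequency sinusoid – the stand-in for wing flapping – dithers the input, the measured objective is demodulated against the same sinusoid to estimate the local gradient, and that estimate is integrated so the system drifts towards the optimum.

```python
import math

def extremum_seeking(measure, steps=20000, dt=1e-3,
                     a=0.1, omega=50.0, gain=2.0):
    """Minimal perturbation-based extremum seeker (illustrative only).

    measure(u) returns the objective to be maximized at input u –
    here a proxy for how close the flyer is to its target altitude.
    """
    u_hat = 0.0  # slowly varying estimate of the optimal input
    for k in range(steps):
        t = k * dt
        dither = a * math.sin(omega * t)    # high-frequency "flapping" perturbation
        y = measure(u_hat + dither)         # sensed objective at the perturbed input
        grad_est = y * math.sin(omega * t)  # demodulation ≈ scaled local gradient
        u_hat += gain * grad_est * dt       # climb the estimated gradient
    return u_hat

# Toy objective with its maximum at u = 1.5
print(extremum_seeking(lambda u: -(u - 1.5) ** 2))  # converges near 1.5
```

With the toy objective peaked at u = 1.5 (a stand-in for the preferred hover altitude), the estimate converges to the peak without any explicit model of the dynamics – which is what makes the mechanism computationally basic.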

As well as being relatively easy to describe, Eisa tells Physics World that this mechanism is biologically plausible and computationally basic, dramatically simplifying the physics of hovering. “It is also categorically different from all available results and explanations in the literature for how stable hovering by insects and hummingbirds can be achieved,” he adds.

Researchers at dinner
The researchers and colleagues. (Courtesy: S Eisa)

Interdisciplinary work

In the latest study, which is detailed in Physical Review E, the researchers compared their simulation results to reported biological data on a hummingbird and five flapping insects (a bumblebee, a cranefly, a dragonfly, a hawkmoth and a hoverfly). They found that their simulation fit the data very closely. They also ran an experiment on a flapping, light-sensing robot and observed that it behaved like a moth: it elevated itself to the level of the light source and then stabilized its hovering motion.

Eisa says he has always been fascinated by such optimized biological behaviours. “This is especially true for flyers, where mistakes in execution could potentially mean death,” he says. “The physics behind the way they do it is intriguing and it probably needs elegant and sophisticated mathematics to be described. However, the hovering creatures appear to be doing this very simply and I found discovering the secret of this puzzle very interesting and exciting.”

Eisa adds that this element of the work ended up being very interdisciplinary, and both his own PhD in applied mathematics and the aerospace engineering background of Elgohary came in very useful. “We also benefited from lengthy discussions with a biologist colleague who was a reviewer of our paper,” Eisa says. “Luckily, they recognized the value of our proposed technique and ended up providing us with very valuable inputs.”

Eisa thinks the work could open up new lines of research in several areas of science and engineering. “For example, it opens up new ideas in neuroscience and animal sensory mechanisms and could almost certainly be applied to the development of airborne robotics and perhaps even artificial pollinators,” he says. “The latter might come in useful in the future given the high rate of death many species of pollinating insects are encountering today.”

The post Simple feedback mechanism keeps flapping flyers stable when hovering appeared first on Physics World.

Building a quantum future using topological phases of matter and error correction

4 décembre 2025 à 15:55

This episode of the Physics World Weekly podcast features Tim Hsieh of Canada’s Perimeter Institute for Theoretical Physics. We explore some of today’s hottest topics in quantum science and technology – including topological phases of matter, quantum error correction and quantum simulation.

Our conversation begins with an exploration of the quirky properties of quantum matter and how these can be exploited to create quantum technologies. We look at the challenges that must be overcome to create large-scale quantum computers, and Hsieh reveals which problem he would solve first if he had access to a powerful quantum processor.

This interview was recorded earlier this autumn when I had the pleasure of visiting the Perimeter Institute and speaking to four physicists about their research. This is the third of those conversations to appear on the podcast.

The first interview in this series from the Perimeter Institute was with Javier Toledo-Marín, “Quantum computing and AI join forces for particle physics”; and the second was with Bianca Dittrich, “Quantum gravity: we explore spin foams and other potential solutions to this enduring challenge”.

This episode is supported by the APS Global Physics Summit, which takes place on 15–20 March, 2026, in Denver, Colorado, and online.

The post Building a quantum future using topological phases of matter and error correction appeared first on Physics World.

Generative AI model detects blood cell abnormalities

4 décembre 2025 à 14:00
Blood cell images
Generative classification The CytoDiffusion classifier accurately identifies a wide range of blood cell appearances and detects unusual or rare blood cells that may indicate disease. The diagonal grid elements display original images of each cell type, while the off-diagonal elements show heat maps that provide insight into the model’s decision-making rationale. (Courtesy: Simon Deltadahl)

The shape and structure of blood cells provide vital indicators for diagnosis and management of blood disease and disorders. Recognizing subtle differences in the appearance of cells under a microscope, however, requires the skills of experts with years of training, motivating researchers to investigate whether artificial intelligence (AI) could help automate this onerous task. A UK-led research team has now developed a generative AI-based model, known as CytoDiffusion, that characterizes blood cell morphology with greater accuracy and reliability than human experts.

Conventional discriminative machine learning models can match human performance at classifying cells in blood samples into predefined classes. But discriminative models, which learn to recognize cell images based on expert labels, struggle with never-before-seen cell types and images from differing microscopes and staining techniques.

To address these shortfalls, the team – headed up at the University of Cambridge, University College London and Queen Mary University of London – created CytoDiffusion around a diffusion-based generative AI classifier. Rather than just learning to separate cell categories, CytoDiffusion models the full range of blood cell morphologies to provide accurate classification with robust anomaly detection.
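The paper describes CytoDiffusion’s actual architecture; purely to illustrate the underlying “diffusion classifier” idea – corrupt an image with noise, ask a class-conditional denoiser to predict that noise, and pick the class whose prediction error is smallest – here is a minimal, hypothetical sketch (the model and data are toy stand-ins, not the authors’ code):

```python
import numpy as np

rng = np.random.default_rng(0)

def diffusion_classify(x, eps_model, classes, n_draws=64):
    """Pick the class whose conditional denoiser best predicts injected noise.

    Up to constants, the class with the smallest average denoising error is
    the most probable one under the class-conditional generative model.
    """
    scores = {}
    for c in classes:
        errs = []
        for _ in range(n_draws):
            t = rng.uniform(0.05, 0.95)                  # random diffusion time
            eps = rng.normal(size=x.shape)               # noise to inject
            x_t = np.sqrt(1 - t) * x + np.sqrt(t) * eps  # corrupted input
            errs.append(np.mean((eps_model(x_t, t, c) - eps) ** 2))
        scores[c] = np.mean(errs)
    return min(scores, key=scores.get)

# Toy demo: two "cell types" idealized as point masses at different means
mu = {0: np.zeros(4), 1: np.ones(4)}

def toy_eps_model(x_t, t, c):
    # Exact noise predictor when class c is concentrated at mu[c]
    return (x_t - np.sqrt(1 - t) * mu[c]) / np.sqrt(t)

print(diffusion_classify(mu[0] + 0.1, toy_eps_model, classes=[0, 1]))  # -> 0
```

Because the classifier scores every class with the same generative model, an input that fits no class well produces uniformly poor scores – one route to the anomaly detection described below.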

“Our approach is motivated by the desire to achieve a model with superhuman fidelity, flexibility and metacognitive awareness that can capture the distribution of all possible morphological appearances,” the researchers write.

Authenticity and accuracy

For AI-based analysis to be adopted in the clinic, it’s essential that users trust a model’s learned representations. To assess whether CytoDiffusion could effectively capture the distribution of blood cell images, the team used it to generate synthetic blood cell images. Analysis by experienced haematologists revealed that these synthetic images were near-indistinguishable from genuine images, showing that CytoDiffusion genuinely learns the morphological distribution of blood cells rather than using artefactual shortcuts.

The researchers used multiple datasets to develop and evaluate their diffusion classifier, including CytoData, a custom dataset containing more than half a million anonymized cell images from almost 3000 blood smear slides. In standard classification tasks across these datasets, CytoDiffusion achieved state-of-the-art performance, matching or exceeding the capabilities of traditional discriminative models.

Effective diagnosis from blood smear samples also requires the ability to detect rare or previously unseen cell types. The researchers evaluated CytoDiffusion’s ability to detect blast cells (immature blood cells) in the test datasets. Blast cells are associated with blood malignancies such as leukaemia, and high detection sensitivity is essential to minimize false negatives.

In one dataset, CytoDiffusion detected blast cells with sensitivity and specificity of 0.905 and 0.962, respectively. In contrast, a discriminative model exhibited a poor sensitivity of 0.281. In datasets with erythroblasts as the abnormal cells, CytoDiffusion again outperformed the discriminative model, demonstrating that it can detect abnormal cell types not present in its training data, with the high sensitivity required for clinical applications.

Robust model

It’s important that a classification model is robust to different imaging conditions and can function with sparse training data, as commonly found in clinical applications. When trained and tested on diverse image datasets (different hospitals, microscopes and staining procedures), CytoDiffusion achieved state-of-the-art accuracy in all cases. Likewise, after training on limited subsets of 10, 20 and 50 images per class, CytoDiffusion consistently outperformed discriminative models, particularly in the most data-scarce conditions.

Another essential feature of clinical classification tasks, whether performed by a human or an algorithm, is knowing the uncertainty in the final decision. The researchers developed a framework for evaluating uncertainty and showed that CytoDiffusion produced superior uncertainty estimates to human experts. With uncertainty quantified, cases with high certainty could be processed automatically, with uncertain cases flagged for human review.

“When we tested its accuracy, the system was slightly better than humans,” says first author Simon Deltadahl from the University of Cambridge in a press statement. “But where it really stood out was in knowing when it was uncertain. Our model would never say it was certain and then be wrong, but that is something that humans sometimes do.”

Finally, the team demonstrated CytoDiffusion’s ability to create heat maps highlighting regions that would need to change for an image to be reclassified. This feature provides insight into the model’s decision-making process and shows that it understands subtle differences between similar cell types. Such transparency is essential for clinical deployment of AI, making models more trustworthy as practitioners can verify that classifications are based on legitimate morphological features.

“The true value of healthcare AI lies not in approximating human expertise at lower cost, but in enabling greater diagnostic, prognostic and prescriptive power than either experts or simple statistical models can achieve,” adds co-senior author Parashkev Nachev from University College London.

CytoDiffusion is described in Nature Machine Intelligence.

The post Generative AI model detects blood cell abnormalities appeared first on Physics World.

Light pollution from satellite mega-constellations threatens space-based observations

4 décembre 2025 à 12:50

Almost every image that will be taken by future space observatories in low-Earth orbit could be tainted due to light contamination from satellites. That is according to a new analysis from researchers at NASA, which stresses that light pollution from satellites orbiting Earth must be reduced to guarantee astronomical research is not affected.

The number of satellites orbiting Earth has increased from about 2000 in 2019 to 15 000 today. Many of these are part of so-called mega-constellations that provide services such as Internet coverage around the world, including in areas that were previously unable to access it. Examples of such constellations include SpaceX’s Starlink as well as Amazon’s Kuiper and Eutelsat’s OneWeb.

Many of these mega-constellations share the same space as space-based observatories such as NASA’s Hubble Space Telescope. This means that the telescopes can capture streaks of reflected light from the satellites that render the images or data completely unusable for research purposes. That is despite the anti-reflective coatings applied to some newer satellites in SpaceX’s Starlink constellation, for example.

Previous work has explored the impact of such satellite constellations on ground-based astronomy, both optical and radio. Yet their impact on telescopes in space has been overlooked.

To find out more, Alejandro Borlaff from NASA’s Ames Research Center, and colleagues simulated the view of four space-based telescopes: Hubble and the near-infrared observatory SPHEREx, which launched in 2025, as well as the European Space Agency’s proposed near-infrared ARRAKIHS mission and China’s planned Xuntian telescope.

These observatories are, or will be, placed between 400 and 800 km above the Earth’s surface.

The authors found that if the population of mega-constellation satellites grows to the 56 000 projected by the end of the decade, satellite streaks would contaminate about 39.6% of Hubble’s images and 96% of images from the other three telescopes.

Borlaff and colleagues predict that the average number of satellites observed per exposure would be 2.14 for Hubble, 5.64 for SPHEREx, 69 for ARRAKIHS, and 92 for Xuntian.

The authors note that one solution could be to deploy satellites at lower orbits than the telescopes operate, which would make them about four magnitudes dimmer. The downside is that emissions from these lower satellites could have implications for Earth’s ozone layer.

An ‘urgent need for dialogue’

Katherine Courtney, chair of the steering board for the Global Network on Sustainability in Space, says that without astronomy, the modern space economy “simply wouldn’t exist”.

“The space industry owes its understanding of orbital mechanics, and much of the technology development that has unlocked commercial opportunities for satellite operators, to astronomy,” she says. “The burgeoning growth of the satellite population brings many benefits to life on Earth, but the consequences for the future of astronomy must be taken into consideration.”

Courtney adds that there is now “an urgent need for greater dialogue and collaboration between astronomers and satellite operators to mitigate those impacts and find innovative ways for commercial and scientific operations to co-exist in space.”

  • Katherine Courtney, chair of the Global Network on Sustainability in Space, and Alice Gorman from Flinders University in Adelaide, Australia, appeared on a Physics World Live panel discussion about the impact of space debris, held on 10 November. A recording of the event is available here.

The post Light pollution from satellite mega-constellations threatens space-based observations appeared first on Physics World.

Physicists use a radioactive molecule’s own electrons to probe its internal structure

4 décembre 2025 à 10:00

Physicists have obtained the first detailed picture of the internal structure of radium monofluoride (RaF) thanks to the molecule’s own electrons, which penetrated the nucleus of the molecule and interacted with its protons and neutrons. This behaviour is known as the Bohr-Weisskopf effect, and study co-leader Shane Wilkins says that this marks the first time it has been observed in a molecule. The measurements themselves, he adds, are an important step towards testing for nuclear symmetry violation, which might explain why our universe contains much more matter than antimatter.

RaF contains the radioactive isotope 225Ra, which is not easy to make, let alone measure. Producing it requires a large accelerator facility, where it is created at high temperature and velocity, and it is only available in tiny quantities (less than a nanogram in total) for short periods (225Ra has a nuclear half-life of around 15 days).

“This imposes significant challenges compared to the study of stable molecules, as we need extremely selective and sensitive techniques in order to elucidate the structure of molecules containing 225Ra,” says Wilkins, who performed the measurements as a member of Ronald Fernando Garcia Ruiz’s research group at the Massachusetts Institute of Technology (MIT), US.

The team chose RaF despite these difficulties because theory predicts that it is particularly sensitive to small nuclear effects that break the symmetries of nature. “This is because, unlike most atomic nuclei, the radium atom’s nucleus is octupole deformed, which basically means it has a pear shape,” explains the study’s other co-leader, Silviu-Marian Udrescu.

Electrons inside the nucleus

In their study, which is detailed in Science, the MIT team and colleagues at CERN, the University of Manchester in the UK and KU Leuven in Belgium focused on RaF’s hyperfine structure. This structure arises from interactions between nuclear and electron spins, and studying it can reveal valuable clues about the nucleus. For example, the nuclear magnetic dipole moment can provide information on how protons and neutrons are distributed inside the nucleus.

In most experiments, physicists treat electron-nucleus interactions as taking place at (relatively) long ranges. With RaF, that’s not the case. Udrescu describes the radium atom’s electrons as being “squeezed” within the molecule, which increases the probability that they will interact with, and penetrate, the radium nucleus. This behaviour manifests itself as a slight shift in the energy levels of the radium atom’s electrons, and the team’s precision measurements – combined with state-of-the-art molecular structure calculations – confirm that this is indeed what happens.

“We see a clear breakdown of this [long-range interactions] picture because the electrons spend a significant amount of time within the nucleus itself due to the special properties of this radium molecule,” Wilkins explains. “The electrons thus act as highly sensitive probes to study phenomena inside the nucleus.”

Searching for violations of fundamental symmetries

According to Udrescu, the team’s work “lays the foundations for future experiments that use this molecule to investigate nuclear symmetry violation and test the validity of theories that go beyond the Standard Model of particle physics.” In this model, each of the matter particles we see around us – from baryons like protons to leptons such as electrons – should have a corresponding antiparticle that is identical in every way apart from its charge and magnetic properties (which are reversed).

The problem is that the Standard Model predicts that the Big Bang that formed our universe nearly 14 billion years ago should have generated equal amounts of antimatter and matter – yet measurements and observations made today reveal an almost entirely matter-based universe. Subtler differences between matter particles and their antimatter counterparts might explain why the former prevailed, so by searching for these differences, physicists hope to explain antimatter-matter asymmetry.

Wilkins says the team’s work will be important for future such searches in species like RaF. Indeed, Wilkins, who is now at Michigan State University’s Facility for Rare Isotope Beams (FRIB), is building a new setup to cool and slow beams of radioactive molecules to enable higher-precision spectroscopy of species relevant to nuclear structure, fundamental symmetries and astrophysics. His long-term goal, together with other members of the RaX collaboration (which includes FRIB and the MIT team as well as researchers at Harvard University and the California Institute of Technology), is to implement advanced laser-based techniques using radium-containing molecules.

The post Physicists use a radioactive molecule’s own electrons to probe its internal structure appeared first on Physics World.

Quantum-scale thermodynamics offers a tighter definition of entropy

3 décembre 2025 à 17:18

A new, microscopic formulation of the second law of thermodynamics for coherently driven quantum systems has been proposed by researchers in Switzerland and Germany. The researchers applied their formulation to several canonical quantum systems, such as a three-level maser. They believe the result provides a tighter definition of entropy in such systems, and could form a basis for further exploration.

In any physical process, the first law of thermodynamics says that the total energy must be conserved, with some converted to useful work and the remainder dissipated as heat. The second law says that, in any allowed process, the total entropy – which quantifies the energy dispersed into uncontrolled degrees of freedom – must never decrease.
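In standard textbook form (not notation specific to this paper), the two laws read:

```latex
\mathrm{d}U = \delta Q + \delta W \quad \text{(first law)},
\qquad
\Delta S_{\mathrm{tot}} \geq 0 \quad \text{(second law)}
```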

“I like to think of work being mediated by degrees of freedom that we control and heat being mediated by degrees of freedom that we cannot control,” explains theoretical physicist Patrick Potts of the University of Basel in Switzerland. “In the macroscopic scenario, for example, work would be performed by some piston – we can move it.” The heat, meanwhile, goes into modes such as phonons generated by friction.

Murky at small scales

This distinction, however, becomes murky at small scales: “Once you go microscopic everything’s microscopic, so it becomes much more difficult to say ‘what is it that that you control – where is the work mediated – and what is it that you cannot control?’,” says Potts.

Potts and colleagues in Basel and at RWTH Aachen University in Germany examined the case of optical cavities driven by laser light, systems that can do work: “If you think of a laser as being able to promote a system from a ground state to an excited state, that’s very important to what’s being done in quantum computers, for example,” says Potts. “If you rotate a qubit, you’re doing exactly that.”

The light interacts with the cavity and makes an arbitrary number of bounces before leaking out. This emergent light is traditionally treated as heat in quantum simulations. However, it can still be partially coherent – if the cavity is empty, it can be just as coherent as the incoming light and can do just as much work.

In 2020, quantum optician Alexia Auffèves of Université Grenoble Alpes in France and colleagues noted that the coherent component of the light exiting a cavity could potentially do work. In the new study, the researchers embedded this in a consistent thermodynamic framework. They studied several examples and formulated physically consistent laws of thermodynamics.

In particular, they looked at the three-level maser, which is a canonical example of a quantum heat engine. However, it has generally been modelled semi-classically by assuming that the cavity contains a macroscopic electromagnetic field.

Work vanishes

“The old description will tell you that you put energy into this macroscopic field and that is work,” says Potts, “But once you describe the cavity quantum mechanically using the old framework then – poof! – the work is gone…Putting energy into the light field is no longer considered work, and whatever leaves the cavity is considered heat.”

The researchers’ new thermodynamic treatment allows them to describe the cavity quantum mechanically and to quantify the minimum entropy of the radiation that emerges – how much of it must be attributed to uncontrolled degrees of freedom that can do no useful work, and how much can remain coherent.

The researchers are now applying their formalism to study thermodynamic uncertainty relations as an extension of the traditional second law of thermodynamics. “It’s actually a trade-off between three things – not just efficiency and power, but fluctuations also play a role,” says Potts. “So the more fluctuations you allow for, the higher you can get the efficiency and the power at the same time. These three things are very interesting to look at with this new formalism because these thermodynamic uncertainty relations hold for classical systems, but not for quantum systems.”

“This [work] fits very well into a question that has been heavily discussed for a long time in the quantum thermodynamics community, which is how to properly define work and how to properly define useful resources,” says quantum theorist Federico Cerisola of the UK’s University of Exeter. “In particular, they very convincingly argue that, in the particular family of experiments they’re describing, there are resources that have been ignored in the past when using more standard approaches that can still be used for something useful.”

Cerisola says that, in his view, the logical next step is to propose a system – ideally one that can be implemented experimentally – in which radiation that would traditionally have been considered waste actually does useful work.

The research is described in Physical Review Letters.  

The post Quantum-scale thermodynamics offers a tighter definition of entropy appeared first on Physics World.

Bring gravity back down to Earth: from giraffes and tree snakes to ‘squishy’ space–time

3 décembre 2025 à 14:00

When I was five years old, my family moved into a 1930s semi-detached house with a long strip of garden. At the end of the garden was a miniature orchard of eight apple trees the previous owners had planted – and it was there that I, much like another significantly more famous physicist, learned an important lesson about gravity.

As I read in the shade of the trees, an apple would sometimes fall with a satisfying thunk into the soft grass beside me. Less satisfyingly, they sometimes landed on my legs, or even my head – and the big cooking apples really hurt. I soon took to sitting on old wooden pallets crudely wedged among the higher branches. It was not comfortable, but at least I could return indoors without bruises.

The effects of gravity become common sense so early in life that we rarely stop to think about them past childhood. In his new book Crush: Close Encounters with Gravity, James Riordon has decided to take us back to the basics of this most fundamental of forces. Indeed, he explores an impressively wide range of topics – from why we dream of falling and why giraffes should not exist (but do), to how black holes form and the existence of “Planet 9”.

Riordon, a physicist turned science writer, makes for a deeply engaging author. He is not afraid to put himself into the story, introducing difficult concepts through personal experience and explaining them with the help of everything including the kitchen sink, which in his hands becomes an analogue for a black hole.

Gravity as a subject can easily be both too familiar and too challenging. In Riordon’s words, “Things with mass attract each other. That’s really all there is to Newtonian gravity.” Albert Einstein’s theory of general relativity, by contrast, is so intricate that it takes years of university-level study to truly master. Riordon avoids both pitfalls: he manages to make the simple fascinating again, and the complex understandable.

He provides captivating insights into how gravity has shaped the animal kingdom, a perspective I had never much considered. Did you know that tree snakes have their hearts positioned closer to their heads than their land-based cousins? I certainly didn’t. The higher placement ensures a steady blood flow to the brain, even when the snake is climbing vertically. It is one of many examples that make you look again at the natural world with fresh eyes.

Riordon’s treatment of gravity in Einstein’s abstract space–time is equally impressive, perhaps unsurprisingly, as his previous books include Very Easy Relativity and Relatively Easy Relativity. Riordon takes a careful, patient approach – though I have never before heard general relativity reduced to “space–time is squishy”. But why not? The phrase sticks and gives us a handhold as we scale the complications of the theory. For those who want to extend the challenge, a mathematical background to the theory is provided in an appendix, and every chapter is well referenced and accompanied with suggestions for further reading.

If anything, I found myself wanting more examples of gravity as experienced by humans and animals on Earth, as opposed to in the context of the astronomical realm. I found these down-to-earth chapters the most fascinating: they formed a bridge between the vast and the local, reminding us that the same force that governs the orbits of galaxies also brings an apple to the ground. This may be a reaction only felt by astronomers like me, who already spend their days looking upward. I can easily see how the balance Riordon chose is necessary for someone without that background, and Einstein’s gravity does require galactic scales to appreciate, after all.

Crush is a generally uncomplicated and pleasurable read. The anecdotes can sometimes be a little long-winded and there are parts of the book that are not without challenge. But it is pitched perfectly for the curious general reader and even for those dipping their toes into popular science for the first time. I can imagine an enthusiastic A-level student devouring it; it is exactly the kind of book I would have loved at that age. Even if some of it would have gone over my head, Riordon’s enthusiasm and gift for storytelling would have kept me more than interested, as I sat up on that pallet in my favourite apple tree.

I left that house, and that tree, a long time ago, but just a few miles down the road from where I live now stands another, far more famous apple tree. In the garden of Woolsthorpe Manor near Grantham, Newton is said to have watched an apple fall. From that small event, he began to ask the questions that reshaped his and our understanding of the universe. Whether or not the story is true hardly matters – Newton was constantly inspired by the natural world, so it isn’t improbable, and that apple tree remains a potent symbol of curiosity and insight.

“[Newton] could tell us that an apple falls, and how quickly it will do it. As for the question of why it falls, that took Einstein to answer,” writes Riordon. Crush is a crisp and fresh tour through a continuum from orchards to observatories, showing that every planetary orbit, every pulse of starlight and every falling apple is part of the same wondrous story.

  • 2025 MIT Press 288pp £27.00hb


Ice XXI appears in a diamond anvil cell

3 December 2025 at 13:00

A new phase of water ice, dubbed ice XXI, has been discovered by researchers working at the European XFEL and PETRA III facilities. The ice, which exists at room temperature and is structurally distinct from all previously observed phases of ice, was produced by rapidly compressing water to a pressure of 2 GPa. The finding could shed light on how different ice phases form at high pressures, including on icy moons and planets.

On Earth, ice can take many forms, and its properties depend strongly on its structure. The main type of naturally occurring ice is hexagonal ice (Ih), so called because the water molecules arrange themselves in a hexagonal lattice (this is why snowflakes have six-fold symmetry). However, under certain conditions – usually involving very high pressures and low temperatures – ice can take on other structures. Indeed, 20 different forms of ice have been identified so far, denoted by Roman numerals (ice I, II, III and so on up to ice XX).

Pressures of up to 2 GPa allow ice to form even at room temperature

Researchers from the Korea Research Institute of Standards and Science (KRISS) have now produced a 21st form of ice by applying pressures of up to two gigapascals – roughly 20 000 times normal air pressure at sea level. Such pressures allow ice to form even at room temperature, albeit only within a device capable of producing them: a dynamic diamond anvil cell (dDAC).

“In this special pressure cell, samples are squeezed between the tips of two opposing diamond anvils and can be compressed along a predefined pressure pathway,” explains Cornelius Strohm, a member of the DESY HIBEF team that set up the experiment using the High Energy Density (HED) instrument at the European XFEL.

Much more tightly packed molecules

The structure of ice XXI is different from all previously observed phases of ice because its molecules are much more tightly packed. This gives it the largest unit cell volume of all currently known types of ice, says KRISS scientist Geun Woo Lee. It is also metastable, meaning that it can exist even though another form of ice (in this case ice VI) would be more stable under the conditions in the experiment.

“This rapid compression of water allows it to remain liquid up to higher pressures, where it should have already crystallized to ice VI,” explains Lee. “Ice VI is an especially intriguing phase, thought to be present in the interior of icy moons such as Titan and Ganymede. Its highly distorted structure may allow complex transition pathways that lead to metastable ice phases.”

Ice XXI has a body-centred tetragonal crystal structure

To study how the new ice sample formed, the researchers rapidly compressed and decompressed it over 1000 times in the diamond anvil cell while imaging it every microsecond using the European XFEL, which produces X-ray pulses at megahertz rates. They found that the liquid water crystallizes into different structures depending on how supercompressed it is.

The KRISS team then used the P02.2 beamline at PETRA III to determine that ice XXI has a body-centred tetragonal crystal structure with a large unit cell (a = b = 20.197 Å and c = 7.891 Å) at approximately 1.6 GPa. This unit cell contains 152 water molecules, resulting in a density of 1.413 g cm−3.
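As a quick consistency check (my own arithmetic, not part of the published analysis), those numbers hang together: a tetragonal cell has volume a²c, and dividing the mass of 152 water molecules by that volume reproduces the quoted density. Only the molar mass of water and Avogadro’s constant are assumed.

# Sanity check of the ice XXI density from the reported unit cell
a = 20.197e-8                 # lattice parameter a = b, in cm (20.197 Å)
c = 7.891e-8                  # lattice parameter c, in cm (7.891 Å)
n_molecules = 152             # water molecules per unit cell
molar_mass = 18.015           # g/mol for H2O
avogadro = 6.02214076e23      # molecules per mole

volume = a * a * c                              # tetragonal cell volume, cm^3
mass = n_molecules * molar_mass / avogadro      # mass of one unit cell, g
print(f"density = {mass / volume:.3f} g/cm^3")  # prints: density = 1.413 g/cm^3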

The experiments were far from easy, recalls Lee. Upon crystallization, ice XXI grows upwards (that is, in the vertical direction), which makes it difficult to precisely analyse its crystal structure. “The difficulty for us is to keep it stable for a long enough period to make precise structural measurements in single crystal diffraction study,” he says.

The multiple pathways of ice crystallization unearthed in this work, which is detailed in Nature Materials, imply that many more ice phases may exist. Lee says it is therefore important to analyse the mechanism behind the formation of these phases. “This could, for example, help us better understand the formation and evolution of these phases on icy moons or planets,” he tells Physics World.


Studying the role of the quantum environment in attosecond science

3 December 2025 at 11:00

Attosecond science is undoubtedly one of the fastest-growing branches of physics today.

Its popularity was demonstrated by the award of the 2023 Nobel Prize in Physics to Anne L’Huillier, Paul Corkum and Ferenc Krausz for experimental methods that generate attosecond pulses of light for the study of electron dynamics in matter.

One of the most important processes in this field is dephasing. This happens when an electron loses its phase coherence because of interactions with its surroundings.

This loss of coherence can obscure the fine details of electron dynamics, making it harder to capture precise snapshots of these rapid processes.

The most common way to model this process in light-matter interactions is by using the relaxation time approximation. This approach greatly simplifies the picture as it avoids the need to model every single particle in the system.

Its use is fine for dilute gases, but it doesn’t work as well with intense lasers and denser materials, such as solids, because it greatly overestimates ionisation.

This is a significant problem as ionisation is the first step in many processes such as electron acceleration and high-harmonic generation.
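To make the standard picture concrete, here is a minimal sketch (my own illustration, not the Ottawa group’s code) of the relaxation time approximation for a single driven two-level system: every off-diagonal element of the density matrix decays at one phenomenological rate, 1/T2, written below in Lindblad form and integrated with a simple Euler step. All parameter values are arbitrary placeholders.

import numpy as np

# Pauli matrices for the two-level system
sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)

rabi = 1.0         # drive strength (hypothetical, arbitrary units)
T2 = 5.0           # phenomenological dephasing time (hypothetical)
gamma = 1.0 / T2   # decay rate of the coherences

H = 0.5 * rabi * sigma_x                         # resonant drive, rotating frame
rho = np.array([[1, 0], [0, 0]], dtype=complex)  # start in the ground state

dt, n_steps = 1e-3, 20000                        # total time = 20 = 4 * T2
for _ in range(n_steps):
    unitary = -1j * (H @ rho - rho @ H)          # coherent evolution
    dephase = 0.5 * gamma * (sigma_z @ rho @ sigma_z - rho)  # damps rho_01 at rate gamma
    rho = rho + dt * (unitary + dephase)

print("excited-state population:", rho[1, 1].real)   # tends towards 0.5
print("remaining coherence:", abs(rho[0, 1]))        # decays towards 0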

To address this shortcoming, a team led by researchers from the University of Ottawa has developed a new method.

By introducing a heat bath into the model, they were able to represent the many-body environment that interacts with the electrons without significantly increasing the complexity of the calculation.

This new approach should enable the identification of new effects in attosecond science or wherever strong electromagnetic fields interact with matter.

Read the full article

Strong field physics in open quantum systems

N. Boroumand et al. 2025 Rep. Prog. Phys. 88 070501


