
Micronozzle could give laser-driven particle accelerators a boost

24 June 2025 at 14:52

Proton energies achievable in laser accelerators could be tripled by using specially designed micronozzle targets, according to computer simulations done by physicists in Japan and India. In their design, the electric field generated in the micronozzle would be funnelled towards the outgoing protons, allowing the acceleration to proceed for much longer. The researchers believe the technique could prove useful in nuclear fusion, hadron therapy and materials science.

Conventional accelerators use oscillating electric fields to drive charged particles to relativistic speeds. The Large Hadron Collider at CERN, for example, uses radio-frequency oscillations to achieve proton energies of nearly 7 TeV.

These accelerators tend to be very large, which limits where they can be built. Laser acceleration, which involves using high-energy laser pulses to accelerate charged particles, offers a way to create much more compact accelerators.

Crucial to inertial confinement

Laser acceleration is crucial to inertial confinement fusion, and the high-energy proton beams produced by laser accelerators are used in laboratories for a variety of scientific applications, including laboratory astrophysics.

The standard techniques for laser acceleration involve firing a laser pulse at a proton target surrounded by metal foil. Solid hydrogen only exists near absolute zero, so the proton target can be a hydrogen-rich compound such as a hydride or a polymer. The femtosecond laser pulse concentrates a huge amount of energy into a tiny area and this instantly turns the target into a plasma. The light’s oscillating electromagnetic field drives electrons through the plasma, leaving behind the much heavier ions and creating a huge electric field that can accelerate protons.

In the new work, physicist Masakatsu Murakami and colleagues at the University of Osaka in Japan, together with researchers at the Indian Institute of Technology Hyderabad, used computer modelling to examine the effect of changing the shape of the metal surrounding the target from a simple planar foil to a two-headed nozzle, with the target placed at the narrowest point. During the first stage of the acceleration process, the wide head of the nozzle behaves like a lens, concentrating the electric field from a wide area to produce an enhanced flow of hot electrons towards the centre. This electric current on the nozzle enhances ablation of protons from the hydrogen rod, kicking them forward into the vacuum.

“Just like a rocket nozzle”

Subsequently, the electrons keep moving through the “skirt” of the nozzle, creating a powerful electric field that, owing to the nozzle’s shape, remains focused on the accelerating proton pulse as it travels away into the vacuum. “With the single hydrogen rod and the single foil, the protons are accelerated only during the laser illumination,” explains Murakami. “However, interestingly with the micronozzle target, the acceleration keeps going even after the laser pulse illumination… Most of the plasma expands in a small volume together with the protons – just like a rocket nozzle,” he says. Whereas the standard proton energies achievable with a laser accelerator today are around 400 MeV, the researchers estimate that their micronozzle design could allow energies into the gigaelectronvolt regime without changing anything else.

Murakami has been studying nuclear fusion for 40 years and believes that “this method will be used for fast ignition of laser fusion”. However, he says, its potential uses go far beyond this. Proton beam therapy generally uses protons with energies of 200–300 MeV to treat cancer by delivering a high dose of radiation to the tumour and a much lower dose to surrounding healthy tissue. “Even higher energy is required to target cancers that are located in deeper parts of the body,” he says. The technique could also be useful for materials science techniques such as proton radiography or for simulation of the physics of astrophysical objects such as neutron stars. “I’m planning to do proof of principle experiments in the near future,” says Murakami. 

Accelerator physicist Nicholas Dover of Imperial College London describes the work as “very interesting,” adding, “This target that they propose is a very complex thing to make. It would be a big project for a target fabrication lab to generate something like this – it’s not something we just cook up in our lab. Having these numerical optimizations is really helpful for us.” He notes, however, that one reason accelerator physicists often use planar targets (essentially pieces of kitchen foil) is the need to replace them after every shot. In scientific applications, this may not matter, he says. Applications in fields like medicine, however, would probably require the development of mass-production facilities to fabricate the targets economically.

The research is described in Scientific Reports.


Harnessing the power of light for healthcare

24 June 2025 at 12:00

Light has always played a central role in healthcare, enabling a wide range of tools and techniques for diagnosing and treating disease. Nick Stone from the University of Exeter is a pioneer in this field, working with technologies ranging from laser-based cancer therapies to innovative spectroscopy-based diagnostics. Stone was recently awarded the Institute of Physics’ Rosalind Franklin Medal and Prize for developing novel Raman spectroscopic tools for rapid in vivo cancer diagnosis and monitoring. Physics World’s Tami Freeman spoke with Stone about his latest research.

What is Raman spectroscopy and how does it work?

Think about how we see the sky. It is blue due to elastic (specifically Rayleigh) scattering – when an incident photon scatters off a particle without losing any energy. But in about one in a million events, photons interacting with molecules in the atmosphere will be inelastically scattered. This changes the energy of the photon as some of it is taken by the molecule to make it vibrate.

If you shine laser light on a molecule and cause it to vibrate, the photon that is scattered from that molecule will be shifted in energy by a specific amount relating to the molecule’s vibrational mode. Measuring the wavelength of this inelastically scattered light reveals which molecule it was scattered from. This is Raman spectroscopy.
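In symbols (a standard textbook relation, not specific to Stone’s work): a Stokes-scattered photon leaves with energy

E_scattered = E_incident − ħω_vib

where ω_vib is the frequency of the molecular vibration, and spectroscopists quote the resulting Raman shift as a difference in wavenumbers,

Δν̃ = 1/λ_incident − 1/λ_scattered

As an illustrative example, a 785 nm laser photon (about 12,740 cm⁻¹) scattering off a 1000 cm⁻¹ vibrational mode emerges at about 11,740 cm⁻¹, or roughly 852 nm.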

Because most of the time we’re working at room or body temperatures, most of what we observe is Stokes Raman scattering, in which the laser photons lose energy to the molecules. But if a molecule is already vibrating in an excited state (at higher temperature), it can give up energy and shift the laser photon to a higher energy. This anti-Stokes spectrum is much weaker, but can be very useful – as I’ll come back to later.

How are you using Raman spectroscopy for cancer diagnosis?

A cell in the body is basically a nucleus (one set of molecules) surrounded by the cytoplasm (another set of molecules). These molecules change subtly depending on the phenotype [set of observable characteristics] of the particular cell. If you have a genetic mutation, which is what drives cancer, the cell tends to change its relative expression of proteins, nucleic acids, glycogen and so on.

We can probe these molecules with light, and therefore determine their molecular composition. Cancer diagnostics involves identifying minute changes between the different compositions. Most of our work has been in tissues, but it can also be done in biofluids such as tears, blood plasma or sweat. You build up a molecular fingerprint of the tissue or cell of interest, and then you can compare those fingerprints to identify the disease.

We tend to perform measurements under a microscope and, because Raman scattering is a relatively weak effect, this requires good optical systems. We’re trying to use a single wavelength of light to probe molecules of interest and look for wavelengths that are shifted from that of the laser illumination. Technology improvements have provided holographic filters that remove the incident laser wavelength readily, and less complex systems that enable rapid measurements.

Raman spectroscopy can classify tissue samples removed in cancer surgery, for example. But can you use it to detect cancer without having to remove tissue from the patient?

Absolutely, we’ve developed probes that fit inside an endoscope for diagnosing oesophageal cancer.

Earlier in my career I worked on photodynamic therapy. We would look inside the oesophagus with an endoscope to find disease, then give the patient a phototoxic drug that would target the diseased cells. Shining light on the drug causes it to generate singlet oxygen that kills the cancer cells. But I realized that the light we were using could also be used for diagnosis.

Currently, to find this invisible disease, you have to take many, many biopsies. But our in vivo probes allow us to measure the molecular composition of the oesophageal lining using Raman spectroscopy and determine where to take biopsies from. Oesophageal cancer has a really bad outcome once it’s diagnosed symptomatically, but if you can find the disease early you can deliver effective treatments. That’s what we’re trying to do.

Two photos: macro of a narrow probe inside a tube a few millimetres wide; a doctor wearing scrubs feeding a narrow tube into a piece of surgical equipment
Tiny but mighty (left) A Raman probe protruding from the instrument channel of an endoscope. (right) Oliver Old, consultant surgeon, passing the probe down an endoscope for a study led by the University of Exeter, with the University of Bristol and Gloucestershire Hospitals NHS Foundation Trust as partners. (Courtesy: RaPIDE Team)

The very weak Raman signal, however, causes problems. With a microscope, we can use advanced filters to remove the incident laser wavelength. But sending light down an optical fibre generates unwanted signal, and we also need to remove elastically scattered light from the oesophagus. So we had to put a filter on the end of this tiny 2 mm fibre probe. In addition, we don’t want to collect photons that have travelled a long way through the body, so we needed a confocal system. We built a really complex probe, working in collaboration with John Day at the University of Bristol – it took a long time to optimize the optics and the engineering.

Are there options for diagnosing cancer in places that can’t be accessed via an endoscope?

Yes, we have also developed a smart needle probe that’s currently in trials. We are using this to detect lymphomas – the primary cancer in lymph nodes – in the head and neck, under the armpit and in the groin.

If somebody comes forward with lumps in these areas, they usually have a swollen lymph node, which shows that something is wrong. Most often it’s following an infection and the node hasn’t gone back down in size.

This situation usually requires surgical removal of the node to decide whether cancer is present or not. Instead, we can just insert our needle probe and send light in. By examining the scattered light and measuring its fingerprint we can identify if it’s lymphoma. Indeed, we can actually see what type of cancer it is and where it has come from. 

Nick Stone sat on stage holding up a prototype needle probe
Novel needle Nick Stone demonstrates a prototype Raman needle probe. (Courtesy: Matthew Jones Photography)

Currently, the prototype probe is quite bulky because we are trying to make it low in cost. It has to have a disposable tip, so we can use a new needle each time, and the filters and optics are all in the handpiece.

Are you working on any other projects at the moment?

As people don’t particularly want a needle stuck in them, we are now trying to understand where the photons travel if you just illuminate the body. Red and near-infrared light travel a long way through the body, so we can use near-infrared light to probe photons that have travelled many, many centimetres.

We are doing a study looking at calcifications in a very early breast cancer called ductal carcinoma in situ (DCIS) – it’s a Cancer Research UK Grand Challenge called DCIS PRECISION, and we are just moving on to the in vivo phase.

Calcifications aren’t necessarily a sign of breast cancer – they are mostly benign; but in patients with DCIS, the composition of the calcifications can show how their condition will progress. Mammographic screening is incredibly good at picking up breast cancer, but it’s also incredibly good at detecting calcifications that are not necessarily breast cancer yet. The problem is how to treat these patients, so our aim is to determine whether the calcifications are completely fine or if they require biopsy.

We are using Raman spectroscopy to understand the composition of these calcifications, which are different in patients who are likely to progress onto invasive disease. We can do this in biopsies under a microscope and are now trying to see whether it works using transillumination, where we send near-infrared light through the breast. We could use this to significantly reduce the number of biopsies, or monitor individuals with DCIS over many years.

Light can also be harnessed to treat disease, for example using photodynamic therapy as you mentioned earlier. Another approach is nanoparticle-based photothermal therapy. How does this work?

This is an area I’m really excited about. Nanoscale gold can enhance Raman signals by many orders of magnitude – it’s called surface-enhanced Raman spectroscopy. We can also “label” these nanoparticles by adding functional molecules to their surfaces. We’ve used unlabelled gold nanoparticles to enhance signals from the body and labelled gold to find things.

During that process, we also realized that we can use gold to provide heat. If you shine light on gold at its resonant frequency, it will heat the gold up and can cause cell death. You could easily blow holes in people with a big enough laser and lots of nanoparticles – but what we want to do is more subtle. We’re decorating the tiny gold nanoparticles with a label that will tell us their temperature.

By measuring the ratio between Stokes and anti-Stokes scattering signals (which are enhanced by the gold nanoparticles), we can measure the temperature of the gold when it is in the tumour. Then, using light, we can keep the temperature at a suitable level for treatment to optimize the outcome for the patient.
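The relation behind this thermometry is the Boltzmann factor (a minimal sketch that ignores wavelength-dependent scattering and detection factors, which must be calibrated in practice): for a vibrational mode of energy ħω_vib at temperature T, the anti-Stokes to Stokes intensity ratio scales as

I_anti-Stokes / I_Stokes ∝ exp(−ħω_vib / k_B T)

so once the prefactor is calibrated, the measured ratio of the two gold-enhanced signals yields the local nanoparticle temperature.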

Ideally, we want to use 100 nm gold particles, but that is not something you can simply excrete through the kidneys. So we’ve spent the last five years trying to create nanoconstructs made from 5 nm gold particles that replicate the properties of 100 nm gold, but can be excreted. We haven’t demonstrated this excretion yet, but that’s the process we’re looking at.

This research is part of a project to combine diagnosis and heat treatment into one nanoparticle system – if the Raman spectra indicate cancer, you could then apply light to the nanoparticle to heat and destroy the tumour cells. Can you tell us more about this?

We’ve just completed a five-year programme called Raman Nanotheranostics. The aim is to label our nanoparticles with appropriate antibodies that will help the nanoparticles target different cancer types. This could provide signals that tell us what is or is not present and help decide how to treat the patient.

We have demonstrated the ability to perform treatments in preclinical models, control the temperature and direct the nanoparticles. We haven’t yet achieved a multiplexed approach with all the labels and antibodies that we want. But this is a key step forward and something we’re going to pursue further.

We are also trying to put labels on the gold that will enable us to measure and monitor treatment outcomes. We can use molecules that change in response to pH, or the reactive oxygen species that are present, or other factors. If you want personalized medicine, you need ways to see how the patient reacts to the treatment, how their immune system responds. There’s a whole range of things that will enable us to go beyond just diagnosis and therapy, to actually monitor the treatment and potentially apply a boost if the gold is still there.

Looking to the future, what do you see as the most promising applications of light within healthcare?

Light has always been used for diagnosis: “you look yellow, you’ve got something wrong with your liver”; “you’ve got blue-tinged lips, you must have oxygen depletion”. But it’s getting more and more advanced. I think what’s most encouraging is our ability to measure molecular changes that potentially reveal future outcomes of patients, and individualization of the patient pathway.

But the real breakthrough is what’s on our wrists. We are all walking around with devices that shine light into us – to measure heartbeat, blood oxygenation and so on. There are already Raman spectrometers of about that size. They’re not good enough for biological measurements yet, but it doesn’t take much of a technology step forward.

I could one day have a chip implanted in my wrist that could do all the things the gold nanoconstructs might do, and my watch could read it out. And this is just Raman – there are a whole host of approaches, such as photoacoustic imaging or optical coherence tomography. Combining different techniques together could provide greater understanding in a much less invasive way than many traditional medical methods. Light will always play a really important role in healthcare.


Ultrafast PET imaging could shed light on cardiac and neurological disease

24 June 2025 at 10:00

Dynamic PET imaging is an important preclinical research tool used to visualize real-time functional information in a living animal. Currently, however, the temporal resolution of small-animal PET scanners is on the order of seconds, which is too slow to image blood flow in the heart or track the brain’s neuronal activity. To remedy this, the Imaging Physics Group at the National Institutes for Quantum Science and Technology (QST) in Japan has developed an ultrasensitive small-animal PET scanner that enables sub-second dynamic imaging of a rat.

The limited temporal resolution of conventional preclinical PET scanners stems from their low sensitivity (around 10%), caused by relatively thin detection crystals (10 mm) and a short axial field-of-view (FOV). Thus the QST team built a system based on four-layer, depth-encoding detectors with a total thickness of 30 mm. The scanner has a 325.6 mm-long axial FOV, providing total-body coverage without any bed movement, while a small inner diameter of 155 mm further increases detection efficiency.
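A back-of-the-envelope solid-angle estimate, not taken from the paper, shows why this geometry helps: for a point source at the centre of a cylindrical scanner of axial length L and bore diameter D, the detectors subtend a solid-angle fraction

Ω/4π = L/√(L² + D²)

which for L = 325.6 mm and D = 155 mm comes to about 0.90. With geometric coverage this close to complete, the peak sensitivity is set mainly by the stopping power of the 30 mm-thick crystals.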

“The main application of the total-body small-animal PET (TBS-PET) scanner will be assessment of new radiopharmaceuticals, especially for cardiovascular and neurodegenerative diseases, by providing total-body rodent PET images with sub-second temporal resolution,” first author Han Gyu Kang tells Physics World. “In addition, the scanner will be used for in-beam PET imaging, and single-cell tracking, where ultrahigh sensitivity is required.”

Performance evaluation

The TBS-PET scanner contains six detector rings, each incorporating 10 depth-of-interaction (DOI) detectors. Each DOI detector comprises a four-layer zirconium-doped gadolinium oxyorthosilicate (GSOZ) crystal array (16×16 crystals per layer) and an array of multi-anode photomultiplier tubes. The team selected GSOZ crystals because they have no intrinsic radiation signal, thus enabling low activity PET imaging.

The researchers performed a series of tests to characterize the scanner performance. Measurements of a 68Ge line source at the centre of the FOV showed that the TBS-PET had an energy resolution of 18.4% and a coincidence timing resolution of 7.9 ns.

Imaging a NEMA 22Na point source revealed a peak sensitivity of 45.0% in the 250–750 keV energy window – more than four times that of commercial or laboratory small-animal PET scanners. The system exhibited a uniform spatial resolution of around 2.6 mm across the FOV, thanks to the four-layer DOI information, which effectively reduced the parallax error.

In vivo imaging

Kang and colleagues next obtained in vivo total-body PET images of healthy rats using a single bed position. Static imaging using Na18F and 18F-FDG tracers clearly visualized bone structures and glucose metabolism, respectively, of the entire rat body.

Moving to dynamic imaging, the researchers injected an 18F-FDG bolus into the tail vein of an anesthetized rat over 15 s, followed by a saline flush 15 s after the injection. They acquired early-phase dynamic PET data every second until 27 s after injection. To enable sub-second PET imaging, they used custom-written software to subdivide the list-mode data (originally framed at 1 s) into time frames of 0.5 s, 0.25 s and 0.1 s.
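Because every coincidence event in list-mode data carries its own timestamp, the re-framing step amounts to re-histogramming the event stream. Below is a minimal Python sketch of that step (the array and function names are illustrative – the team’s custom software is not described in detail, and a full workflow would reconstruct an image from the events in each frame rather than just count them):

import numpy as np

def rebin_list_mode(event_times_s, frame_s, t_max_s):
    # Count coincidence events falling in each fixed-length time frame
    edges = np.arange(0.0, t_max_s + frame_s, frame_s)
    counts, _ = np.histogram(event_times_s, bins=edges)
    return edges[:-1], counts

# e.g. re-frame 27 s of events at 0.25 s resolution:
# frame_starts, frame_counts = rebin_list_mode(event_times, 0.25, 27.0)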

Dynamic PET images with a 0.5 s time frame clearly visualized the blood stream from the tail to the heart through the iliac vein and inferior vena cava for the first 2 s, after which the tracer reached the right atrium and right ventricle. At 4.0 s after injection, blood flowed from the left ventricle into the brain via the carotid arteries. The cortex and kidneys were identified 5.5 s after injection. After roughly 17.5 s, the saline peak could be identified in the time-activity curves (TACs).

At 0.25 s temporal resolution, the early-phase images visualized the first pass blood circulation of the rat heart, showing the 18F-FDG bolus flowing from the inferior vena cava to the right ventricle from 2.25 s. The tracer next circulated to the lungs via the pulmonary artery from 2.5 s, and then flowed to the left ventricle from 3.75 s.

The TACs clearly visualized the time dispersion between the right and left ventricles (1.25 s). This value can change for animals with cardiac disease, and the team plans to explore the benefit of fast temporal resolution PET for diagnosing cardiovascular and neurodegenerative diseases.

The researchers conclude that the TBS-PET scanner enables dynamic imaging with a nearly real-time frame rate, visualizing cardiac function and pulmonary circulation of a rat with 0.25 s temporal resolution, a feat that is not possible with conventional small-animal PET scanners.

“One drawback of the TBS-PET scanner is the relatively low spatial resolution of around 2.6 mm, which is limited by the relatively large crystal pitch of 2.85 mm,” says Kang. “To solve this issue, we are now developing a new small-animal PET scanner employing three-layer depth-encoding detectors with 0.8 mm crystal pitch, towards our final goal of sub-millimetre and sub-second temporal resolution PET imaging in rodent models.”

The TBS-PET scanner is described in Physics in Medicine & Biology.



Cosmic conflict continues: new data fuel the Hubble tension debate

23 June 2025 at 13:00

A bumper crop of measurements of the expansion rate of the universe has stretched the Hubble tension as taut as it has ever been, leaving scientists grappling for a solution.

Over 500 researchers have come together in the “CosmoVerse” consortium to produce a new white paper that delves into the various cosmological tensions between theory and observation. These include the Hubble tension, which is the bewildering discrepancy in the expansion rate of the universe, referred to as the Hubble constant (H0).

Predictive measurements made by applying the standard model of cosmology to the cosmic microwave background (CMB) give H0 as 67.4 km/s/Mpc. In other words, every volume of space a million parsecs across (one parsec is 3.26 light years) should be expanding by 67.4 kilometres every second.
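Written as an equation, this is the Hubble–Lemaître law relating a galaxy’s recession velocity v to its distance d:

v = H0 × d

so, as a worked example, a galaxy 100 Mpc away should recede at 67.4 × 100 ≈ 6740 km/s under the CMB-calibrated value.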

Yet that’s not what Hubble’s law – which tells us the expansion rate based on a given object’s velocity away from us and its distance – says, as demonstrated by the CosmoVerse White Paper.

“The paper’s been getting a lot of attention in our field,” Joe Jensen of Utah Valley University tells Physics World. “You can easily see that the vast majority of measurements fall around 73 km/s/Mpc, with varying uncertainties.”

There’s no known reason why local measurements of H0 (based on supernovae observations) should differ from the CMB measurement. This discrepancy leads to two possibilities. Either there are unknown systematic uncertainties in measurements that skew the results, or cosmology’s standard model is wrong and new physics is needed.

A lot at stake

The highest rung on the cosmic distance ladder is the type Ia supernova – a white dwarf explosion. These supernovae have a standardizable brightness, based on their light curves, that makes them perfect for judging how far away they are. These measurements are calibrated by lower rungs on the ladder, such as Cepheid variable stars or the peak brightness of red giant stars (referred to as the “tip of the red giant branch”, or TRGB).

If the tension is real, then different calibrators should still give the same result. One of the few outliers is found in a new paper published in The Astrophysical Journal by the Chicago–Carnegie Hubble Program (CCHP) led by the University of Chicago’s Wendy Freedman.

CCHP’s latest paper uses the TRGB to arrive at a best value of 70.39 km/s/Mpc when combining measurements from the James Webb Space Telescope (JWST) – which is able to better resolve red giant stars in other galaxies – with Hubble Space Telescope data.

The CCHP team argue that this result is in line with the CMB measurements and removes the tension. However, their conclusion has met opposition.

“Their result is sort of in the middle of the Hubble tension, so I’m surprised that they would say they rule it out,” Dan Scolnic, an astrophysicist at Duke University in the United States, tells Physics World.

At a meeting of the American Astronomical Society in January 2025, Scolnic declared that the Hubble tension was now a crisis. CCHP’s results do not dissuade him from this conclusion.

“For some reason they don’t include a number of supernovae in their sample that they could have,” says Scolnic. “Siyang Li [of Johns Hopkins University] led a paper [on which Scolnic is a co-author] that showed that if one uses their TRGB measurements, and the complete sample of supernovae, one goes back to higher H0.”

Freedman did not respond to Physics World‘s request for an interview.

Different approaches

Jensen has also led a team that recently conducted measurements of H0 using TRGB stars, but in a different way: by looking for surface brightness fluctuations (SBF).

“SBF is a statistical method that measures the brightnesses of red giant stars even when they cannot be measured individually,” says Jensen.

Individual stars in galaxies cannot be resolved at great distances – their light blends together, and the more distant the galaxy, the smoother this blend is. We describe this blended light as the galaxy’s surface brightness; its fluctuations are statistical in nature and result from the discrete nature of stars.

In old elliptical galaxies, the surface brightness is dominated by red giant stars, which are evolved Sun-like stars. Measuring the SBF therefore provides a value for the TRGB, from which a distance can be determined.
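The statistical argument, in schematic form (not the team’s detailed analysis): if each pixel collects light from N stars, the mean surface brightness is independent of distance, but the pixel-to-pixel fluctuations have fractional amplitude

σ/⟨flux⟩ ∝ 1/√N ∝ 1/d

since a pixel of fixed angular size covers N ∝ d² stars at distance d. The smoother the image, the more distant the galaxy – and the absolute calibration comes from the brightness of the red giants that dominate the fluctuations.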

Using JWST images to measure the SBF of 14 elliptical galaxies, then using those to calibrate the distances to 60 more distant ellipticals, and then using that calibration to determine H0, Jensen’s team arrived at a value of 73.8 km/s/Mpc.

“The reason that we don’t get the same answer [as CCHP] is that we are not using the same JWST calibrators, and we don’t use type Ia to measure H0,” says Jensen.

This contradicts CCHP’s main assertion, which is that there must be unknown systematic uncertainties in either the type Ia supernovae or the Cepheids. Jensen’s team use neither, yet still find a tension.

Perhaps the most convincing evidence for the tension comes from the TDCOSMO (time-delay cosmography) team, who utilize gravitationally lensed quasars to measure H0.

Quasars fluctuate in brightness over a matter of days. When light from a quasar takes paths of varying lengths around a lensing object, it produces multiple images that have time lags relative to one another. The expansion of space can extend this time delay, providing a completely independent measure of H0.
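Schematically, for a lens at redshift z_l, the delay between two images is

Δt = (1 + z_l) × (D_Δt / c) × ΔΦ

where ΔΦ is the difference in the lensing (Fermat) potential between the two light paths and the “time-delay distance” D_Δt is a ratio of distances to lens and source that scales as 1/H0. Given a model of the lens, a measured delay therefore yields H0 with no distance ladder at all.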

In 2019 the H0LiCOW project used six gravitational lenses to arrive at a value of 73.3 km/s/Mpc. This result was met with some scepticism, so the researchers formed the new TDCOSMO consortium and “went on a six-year journey to see if their original measurement was okay,” says Scolnic.

TDCOSMO’s final conclusion is 72.1 km/s/Mpc, strongly supporting the tension. However, in all these measurements there’s wriggle room from various known measuring uncertainties.

“It’s important to remember that the uncertainties put us in only mild disagreement,” says Jensen. “I expect that we will soon know if the disagreement can be explained by the mundane choices of calibration galaxies and processing techniques.”

If it cannot, then the inescapable conclusion is that there’s something wrong with our understanding of the universe. Figuring that out could be the next great quest in cosmology.


Vera C Rubin Observatory reveals its first spectacular images of the cosmos

23 June 2025 at 07:01

The first spectacular images from the Vera C Rubin Observatory have been released today showing millions of galaxies and Milky Way stars and thousands of asteroids in exquisite detail.

Based at Cerro Pachón in the Andes, the Vera C Rubin Observatory contains the Legacy Survey of Space and Time (LSST) camera – the largest digital camera ever built. Almost two decades in the making, the 3200-megapixel instrument forms the heart of the observatory’s 8.4 m Simonyi Survey Telescope.

The imagery released today, which took just 10 hours of observations, is a small preview of the Observatory’s upcoming 10-year scientific mission.

The image above is of the Trifid and Lagoon nebulas. This picture combines 678 separate images taken by the Vera C. Rubin Observatory in just over seven hours of observing time. It reveals otherwise faint or invisible details, such as the clouds of gas and dust that comprise the Trifid nebula (top right) and the Lagoon nebula, which are several thousand light-years away from Earth.

The image below is of the Virgo cluster. It shows a small section of the Virgo cluster, featuring two spiral galaxies (lower right), three merging galaxies (upper right) and several groups of distant galaxies.

Virgo cluster
Cosmic expanse: Vera C. Rubin Observatory’s view of the Virgo cluster. (Courtesy: NSF-DOE Vera C. Rubin Observatory)

Star mapper

Later this year, the Vera C Rubin Observatory, which is funded by the National Science Foundation and the Department of Energy’s Office of Science, will begin a decade-long survey of the southern hemisphere sky.

The LSST will take a complete picture of the southern night sky every 3-4 nights. It will then repeat this process over a decade to produce almost 1000 full images of the sky (a decade spans roughly 3650 nights, so a full pass every 3-4 nights adds up to around 1000 sweeps).

This will be used to plot the positions and measure the brightness of objects in the sky to help improve our understanding of dark matter and dark energy. It will examine 20 billion galaxies as well as produce the most detailed star map of the Milky Way, imaging 17 billion stars and cataloguing some six million small objects within our solar system including asteroids.

Cosmic pioneer

The Vera C Rubin Observatory
On top of the world: Later this year, the Vera C Rubin Observatory will begin a decade-long survey of the southern hemisphere sky. (Courtesy: NSF-DOE Vera C. Rubin Observatory)

The observatory is named in honour of the US astronomer Vera C Rubin. In 1970, working with Kent Ford Jr, she observed that the outer stars orbiting in the Andromeda galaxy were all doing so at the same speed.

Examining more galaxies still, they found that the rotation curves – the orbital speed of visible stars within a galaxy plotted against their radial distance from the galaxy’s centre – contradicted Keplerian predictions.

They also found that stars near the outer edges of the galaxies were orbiting so fast that the galaxies should be flying apart.

Rubin and Ford Jr’s observations led them to predict that there was some mass, dubbed “dark matter”, inside the galaxies responsible for the anomalous motions – something their telescopes couldn’t see, but which was there in quantities about six times the amount of the luminous matter present.



Conflicting measurements of helium’s charge radius may be reconciled by new calculations

20 June 2025 at 17:50

Independent measurements of the charge radius of the helium-3 nucleus using two different methods have yielded significantly different results – prompting a re-evaluation of underlying theory to reconcile them. The international CREMA Collaboration used muonic helium-3 ions to determine the radius, whereas a team in the Netherlands used a quantum-degenerate gas of helium-3 atoms.

The charge radius is a statistical measure of how far the electric charge of a particle extends into space. Both groups were mystified by the discrepancy in the values – which hints at physics beyond the Standard Model of particle physics. However, new theoretical calculations inspired by the results may have already resolved the discrepancy.

Both groups studied the difference between the charge radii of the helium-3 and helium-4 nuclei. CREMA used muonic helium ions, in which the electrons are replaced by a single negative muon. Muons are much more massive than electrons, so they spend more time near the nucleus – and are therefore more sensitive to the charge radius.

Shorter wavelengths

Muonic atoms have spectra at much shorter wavelengths than normal atoms. This affects quantities such as the Lamb shift – the energy difference between the 2S1/2 and 2P1/2 atomic states, which are split by interactions with virtual photons and by vacuum polarization, an effect that is most intense near the nucleus. More importantly, a muon in an S orbital is far more sensitive to the finite size of the nucleus.

In 2010, CREMA used spectroscopy of muonic hydrogen to conclude that the charge radius of the proton is significantly smaller than the then-accepted value. The same procedure was then used with muonic helium-4 ions. Now, CREMA has used pulsed laser spectroscopy of muonic helium-3 ions to extract several key parameters, including the Lamb shift, and used them to calculate the charge radius of the helium-3 nucleus. They then calculated the difference from the helium-4 charge radius. The value they obtained was 15 times more accurate than any previously reported.
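Schematically, such analyses work because the muonic Lamb shift can be decomposed as

ΔE_LS = δE_QED − C⟨r²⟩ + δE_nucl

where δE_QED collects calculable quantum-electrodynamics terms, C is a known coefficient, ⟨r²⟩ is the mean-square charge radius and δE_nucl covers nuclear-structure effects such as polarizability. With ΔE_LS measured spectroscopically and the other terms computed, ⟨r²⟩ is the only unknown. (This is the generic structure of the extraction; the paper’s full decomposition contains further terms.)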

Meanwhile, at the Free University of Amsterdam in the Netherlands, researchers were taking a different approach, using conventional helium-3 atoms. This has significant challenges, because the effect of the nucleus on electrons is much smaller. However, it also means that an electron affects the nucleus it measures less than does a muon, which mitigates a source of theoretical uncertainty.

The Amsterdam team utilized the fact that the 2S triplet state in helium is extremely long-lived. “If you manage to get the atom up there, it’s like a new ground state, and that means you can do laser cooling on it and it allows very efficient detection of the atoms,” explains Kjeld Eikema, who co-leads the team following the death of its original leader, Wim Vassen, in 2019. In 2018, the Amsterdam group created an ultracold Bose–Einstein condensate (BEC) of helium-4 atoms in the 2S triplet state in an optical dipole trap before using laser spectroscopy to measure the ultra-narrow transition between the 2S triplet state and the higher 2S singlet state.

Degenerate Fermi gas

In the new work, the researchers turned to helium-3, which does not form a BEC but instead forms a degenerate Fermi gas. Interpreting the spectra of this gas required new insights in itself. “Current theoretical models are insufficiently accurate to determine the charge radii from measurements on two-electron atoms,” Eikema explains. However, “the nice thing is that if you measure the transition directly in one isotope and then look at the difference with the other isotope, then most complications from the two electrons are common mode and drop out,” he says. This can be used to determine the difference in the charge radii.
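In outline, the measured isotope shift of the transition decomposes as

δν = ν(³He) − ν(⁴He) = δν_point + K[r²(³He) − r²(⁴He)]

where δν_point is the shift calculated for point-like nuclei (mass-dependent and QED terms) and K is a computed coefficient, so theory for those two quantities converts the measured δν directly into the squared-radius difference. (Again a schematic form – the published analysis includes smaller corrections.)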

The researchers obtained a value that was even more precise than CREMA’s and larger by 3.6σ. The groups could find no obvious explanation for the discrepancy. “The scope of the physics involved in doing and interpreting these experiments is quite massive,” says Eikema; “a comparison is so interesting, because you can say ‘Well, is all this physics correct then? Are electrons and muons the same aside from their mass? Did we do the quantum electrodynamics correct for both normal atoms and muonic atoms? Did we do the nuclear polarization correctly?’” The results of both teams are described in Science (CREMA, Amsterdam).

While these papers were undergoing peer review, the work attracted the attention of two groups of theoretical physicists – one led by Xiao-Qiu Qi of the Wuhan Institute of Physics and Mathematics in China, and the other by Krzysztof Pachucki of the University of Warsaw in Poland. Both revised the calculation of the hyperfine structure of helium-3, finding that incorporating previously neglected higher orders into the calculation produced an unexpectedly large shift.

“Suddenly, by plugging this new value into our experiment – ping! – our determination comes within 1.2σ of theirs,” says Eikema, “which is a triumph for all the physics involved, and it shows how, by showing there’s a difference, other people think, ‘Maybe we should go and check our calculations,’ and it has improved the calculation of the hyperfine effect.” In this manner, the ever-improving experiments and theory calculations continue to probe the limits of the Standard Model.

Xiao-Qiu Qi and colleagues describe their calculations in Physical Review Research, while Pachucki’s team have published in Physical Review A.

Eikema adds: “Personally, I would have adjusted the value in our paper according to these new calculations, but Science preferred to keep the paper as it was at the time of submission and peer review, with an added final paragraph to explain the latest developments.”

Theoretical physicist Marko Horbatsch at Canada’s York University is impressed by the experimental results and bemused by the presentation. “I would say that their final answer is a great success,” he concludes. “There is validity in having the CREMA and Eikema work published side-by-side in a high-impact journal. It’s just that the fact that they agree should not be confined to a final sentence at the end of the paper.”


Simulation of capsule implosions during laser fusion wins Plasma Physics and Controlled Fusion Outstanding Paper Prize

20 June 2025 at 17:00

Computational physicist Jose Milovich of the Lawrence Livermore National Laboratory (LLNL) and colleagues have been awarded the 2025 Plasma Physics and Controlled Fusion (PPCF) Outstanding Paper Prize for their computational research on capsule implosions during laser fusion.

The work – Understanding asymmetries using integrated simulations of capsule implosions in low gas-fill hohlraums at the National Ignition Facility – is an important part of understanding the physics at the heart of inertial confinement fusion (ICF).

Fusion is usually pursued via one of two types of plasma confinement. Magnetic confinement uses magnetic fields to hold a deuterium–tritium (D-T) plasma stable, while inertial confinement uses rapid compression, usually by lasers, to confine a plasma for a short period of time.

The award-winning work was based on experiments carried out at the National Ignition Facility (NIF) based in California, which is one of the leading fusion centres in the world.

During NIF’s ICF experiments, a slight imbalance of the laser can induce motion of the hot central core of an ignition capsule, which contains the D-T fuel. This effect results in a reduced performance.

Experiments at NIF in 2018 found that laser imbalances alone, however, could not account for the motion of the capsule. The simulations carried out by Milovich and colleagues demonstrated that other factors were at play such as non-concentricity of the layers of the material surrounding the D-T fuel as well as “drive perturbations” induced by diagnostic windows on the implosion.

Computational physicist Jose Milovich
Computational physicist Jose Milovich of the Lawrence Livermore National Laboratory. (Courtesy: LLNL)

Changes made following the team’s findings then helped towards the recent demonstration of “energy breakeven” at NIF in December 2022.

Awarded each year, the PPCF prize aims to highlight work of the highest quality and impact published in the journal.  The award was judged on originality, scientific quality and impact as well as being based on community nominations and publication metrics. The prize will be presented at the 51st European Physical Society Conference on Plasma Physics in Vilnius, Lithuania, on 7–11 July.

The journal is now seeking nominations for next year’s prize, which will focus on papers in magnetic confinement fusion.

Below, Milovich talks to Physics World about the prize, the future of fusion and his advice for early-career researchers.

What does winning the 2025 PPCF Outstanding Paper Prize mean to you and for your work?

The award is an incredible honour to me and my collaborators as a recognition of the detailed work required to make inertial fusion in the laboratory a reality and the dream of commercial fusion energy a possibility. The paper presented numerical confirmation of how seemingly small effects can significantly impact the performance of fusion targets.  This study led to target modifications and revised manufacturing specifications for improved performance.  My collaborators and I would like to deeply thank PPCF for granting us this award.

What excites you about fusion?

Nuclear fusion is the process that powers the stars, and achieving those conditions in the laboratory is exciting in many ways.  It is an interesting scientific problem in its own right and it is an incredibly challenging engineering problem to handle the extreme conditions required for successful energy production. This is an exciting time since the possibility of realizing this energy source became tangibly closer two years ago when NIF successfully demonstrated that more energy can be released from D-T fusion than the laser energy delivered to the target.

What are your thoughts on the future direction of ICF and NIF?

While the challenges ahead to make ICF commercially feasible are daunting, we are well positioned to address them by developing new technologies and innovative target configurations. Applications of artificial intelligence to reactor plant designs, optimized operations, and improvements on plasma confinement could potentially lead to improved designs at a fraction of the cost. The challenges are many but the potential for providing a clean and inexhaustible source of energy for the benefit of mankind is invigorating.

What advice would you give to people thinking about embarking on a career in fusion?

This is an exciting time to get involved in fusion. The latest achievements at NIF have shown that fusion is possible. There are countless difficulties to overcome, making it an ideal time to devote one’s career in this area. My advice is to get involved now since, at this early stage, any contribution will have a major and lasting impact on mankind’s future energy needs.


AI algorithms in radiology: how to identify and prevent inadvertent bias

20 June 2025 at 09:30

Artificial intelligence (AI) has the potential to generate a sea change in the practice of radiology, much like the introduction of radiology information system (RIS) and picture archiving and communication system (PACS) technology did in the late 1990s and 2000s. However, AI-driven software must be accurate, safe and trustworthy, factors that may not be easy to assess.

Machine learning software is trained on databases of radiology images. But these images might lack the data or procedures needed to prevent algorithmic bias. Such algorithmic bias can cause clinical errors and performance disparities that affect a subset of the analyses that the AI performs, unintentionally disadvantaging certain groups of patients.

A multinational team of radiology informaticists, biomedical engineers and computer scientists has identified potential pitfalls in the evaluation and measurement of algorithmic bias in AI radiology models. Describing their findings in Radiology, the researchers also suggest best practices and future directions to mitigate bias in three key areas: medical image datasets; demographic definitions; and statistical evaluations of bias.

Medical imaging datasets

The medical image datasets used for training and evaluation of AI algorithms are reflective of the population from which they are acquired. It is natural that a dataset acquired in a country in Asia will not be representative of the population in a Nordic country, for example. But if there’s no information available about the image acquisition location, how might this potential source of bias be determined?

Paul Yi
Team leader Paul Yi. (Courtesy: RSNA)

Lead author Paul Yi, of St. Jude Children’s Research Hospital in Memphis, TN, and coauthors advise that many existing medical imaging databases lack a comprehensive set of demographic characteristics, such as age, sex, gender, race and ethnicity. Additional potential confounding factors include the scanner brand and model, the radiology protocols used for image acquisition, radiographic views acquired, the hospital location and disease prevalence. In addition to incorporating these data, the authors recommend that raw image data are collected and shared without institution-specific post-processing.

The team advise that generative AI, a set of machine learning techniques that generate new data, provides the potential to create synthetic imaging datasets with more balanced representation of both demographic and confounding variables. This technology is still in development, but might provide a solution to overcome pitfalls related to measurement of AI biases in imperfect datasets.

Defining demographics

Radiology researchers lack consensus with respect to how demographic variables should be defined. Observing that demographic categories such as gender and race are self-identified characteristics informed by many factors, including society and lived experiences, the authors advise that concepts of race and ethnicity do not necessarily translate outside of a specific society and that biracial individuals reflect additional complexity and ambiguity.

They emphasize that ensuring accurate measurements of race- and/or ethnicity-based biases in AI models is important to enable accurate comparison of bias evaluations. This not only has clinical implications, but is also essential to prevent health policies being established in error from erroneous AI-derived findings, which could potentially perpetuate pre-existing inequities.

Statistical evaluations of bias

The researchers define bias in the context of demographic fairness and how it reflects differences in metrics between demographic groups. However, establishing consensus on the definition of bias is complex, because bias can have different clinical and technical meanings. They point out that in statistics, bias refers to a discrepancy between the expected value of an estimated parameter and its true value.

As such, the radiology speciality needs to establish a standard notion of bias, as well as tackle the incompatibility of fairness metrics, the tools that measure whether a machine learning model treats certain demographic groups differently. Currently there is no universal fairness metric that can be applied to all cases and problems, and the authors do not think there ever will be one.

Different operating points of predictive AI models may result in different performance, which could in turn lead to different demographic biases. These operating points need to be documented, and the thresholds used should be reported both in research studies and by commercial AI software vendors.
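As a concrete illustration of threshold dependence, the Python sketch below computes one common family of fairness statistics – per-group true- and false-positive rates, whose gaps define the “equalized odds” criterion. It is an illustrative example only, not a metric the authors prescribe, and all names in it are hypothetical:

import numpy as np

def group_rates(scores, labels, groups, threshold):
    # Per-group TPR and FPR of a binary classifier at one operating point
    preds = scores >= threshold
    rates = {}
    for g in np.unique(groups):
        m = groups == g
        pos = labels[m] == 1
        neg = labels[m] == 0
        tpr = np.sum(preds[m] & pos) / max(np.sum(pos), 1)
        fpr = np.sum(preds[m] & neg) / max(np.sum(neg), 1)
        rates[g] = (tpr, fpr)
    return rates

# Auditing across a sweep of thresholds exposes biases that a single
# default cut-off can hide:
# for t in (0.3, 0.5, 0.7):
#     print(t, group_rates(scores, labels, groups, t))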

Key recommendations

The authors suggest some key courses of action to mitigate demographic biases in AI in radiology:

  • Improve reporting of demographics by establishing a consensus panel to define and update reporting standards.
  • Improve dataset reporting of non-demographic factors, such as imaging scanner vendor and model.
  • Develop a standard lexicon of terminology for concepts of fairness and AI bias concepts in radiology.
  • Develop standardized statistical analysis frameworks for evaluating demographic bias of AI algorithms based on clinical contexts.
  • Require greater demographic detail to evaluate algorithmic fairness in scientific manuscripts relating to AI models.

Yi and co-lead collaborator Jeremias Sulam, of Hopkins BME, Whiting School of Engineering, tell Physics World that their assessment of pitfalls and recommendations to mitigate demographic biases reflect years of multidisciplinary discussion. “While both the clinical and computer science literature had been discussing algorithmic bias with great enthusiasm, we learned quickly that the statistical notions of algorithmic bias and fairness were often quite different between the two fields,” says Yi.

“We noticed that progress to minimize demographic biases in AI models is often hindered by a lack of effective communication between the computer science and statistics communities and the clinical world, radiology in particular,” adds Sulam.

A collective effort to address the challenges posed by bias and fairness is important, notes Melissa Davis of the Yale School of Medicine in an accompanying editorial in Radiology. “By fostering collaboration between clinicians, researchers, regulators and industry stakeholders, the healthcare community can develop robust frameworks that prioritize patient safety and equitable outcomes,” she writes.


Helgoland: leading scientists reflect on 100 years of quantum physics and look to the future

19 June 2025 at 14:59

Last week, Physics World’s Matin Durrani boarded a ferry in Hamburg that was bound for Helgoland – an archipelago in the North Sea about 70 km off the north-west coast of Germany.

It was a century ago in Helgoland that the physicist Werner Heisenberg devised the mathematical framework that underpins our understanding of quantum physics.

Matin was there with some of the world’s leading quantum physicists for the conference Helgoland 2025: 100 Years of Quantum Mechanics – which celebrated Heisenberg’s brief stay in Helgoland.

He caught up with three eminent physicists and asked them to reflect on Heisenberg’s contributions to quantum mechanics and look forward to the next 100 years of quantum science and technology. They are Tracy Northup at the University of Vienna; Michelle Simmons of the University of New South Wales, Sydney; and Peter Zoller of the University of Innsbruck.

• Don’t miss the 2025 Physics World Quantum Briefing, which is free to read via this link.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the next 12 months for more coverage of the IYQ.

Find out more on our quantum channel.


Laser World of Photonics showcases cutting-edge optical innovation

19 June 2025 at 10:45

Laser World of Photonics, the leading trade show for the laser and photonics industry, takes place in Munich from 24 to 27 June. Attracting visitors and exhibitors from around the world, the event features 11 exhibition areas covering the entire spectrum of photonic technologies – including illumination and energy, biophotonics, data transmission, integrated photonics, laser systems, optoelectronics, sensors and much more.

Running parallel and co-located with Laser World of Photonics is World of Quantum, the world’s largest trade fair for quantum technologies. Showcasing all aspects of quantum technologies – from quantum sensors and quantum computers to quantum communications and cryptography – the event provides a platform to present innovative quantum-based products and discuss potential applications.

Finally, the World of Photonics Congress (running from 22 to 27 June) features seven specialist conferences, over 3000 lectures and around 6700 experts from scientific and industrial research.

The event is expected to attract around 40,000 visitors from 70 countries, with the trade shows incorporating 1300 exhibitors from 40 countries. Here are some of the companies and product innovations to look out for on the show floor.

HOLOEYE unveils compact 4K resolution spatial light modulator

HOLOEYE Photonics AG, a leading provider of spatial light modulator (SLM) devices, announces the release of the GAEA-C spatial light modulator, a compact version of the company’s high-resolution SLM series. The GAEA-C will be officially launched at Laser World of Photonics, showcasing its advanced capabilities and cost-effective design.

The GAEA-C spatial light modulator
Compact and cost-effective The GAEA-C spatial light modulator is ideal for a variety of applications requiring precise light modulation. (Courtesy: HOLOEYE)

The GAEA-C is a phase-only SLM with a 4K resolution of 4094 x 2400 pixels, with an exceptionally small pixel pitch of 3.74 µm. This compact model is equipped with a newly developed driver solution that not only reduces costs but also enhances phase stability, making it ideal for a variety of applications requiring precise light modulation.

The GAEA-C SLM features a reflective liquid crystal on silicon (LCOS) display (phase only). Other parameters include a fill factor of 90%, an input frame rate of 30 Hz and a maximum spatial resolution of 133.5 lp/mm.

The GAEA-C is available in three versions, each optimized for a different wavelength range: a VIS version (420–650 nm), a NIR version (650–1100 nm) and a version tailored for the telecommunications waveband around 1550 nm. This versatility ensures that the GAEA-C can meet the diverse needs of industries ranging from telecoms to scientific research.

HOLOEYE continues to lead the market with its innovative SLM solutions, providing unparalleled resolution and performance. The introduction of the GAEA-C underscores HOLOEYE’s commitment to delivering cutting-edge technology that meets the evolving demands of its customers.

  • For more information about the GAEA-C and other SLM products, visit HOLOEYE at booth #225 in Hall A2.

Avantes launches NIR Enhanced spectrometers

At this year’s Laser World of Photonics, Avantes unveils its newest generation of spectrometers: the NEXOS NIR Enhanced and VARIUS NIR Enhanced. Both instruments mark a significant leap in near-infrared (NIR) spectroscopy, offering up to 2x improved sensitivity and unprecedented data quality for integration into both research and industry applications.

NEXOS NIR Enhanced spectrometer
Solving spectroscopy challenges Visit Avantes at booth 218, Hall A3, for hands-on demonstrations of its newest generation of spectrometers. (Courtesy: Avantes)

Compact, robust and highly modular, the NEXOS NIR Enhanced spectrometer redefines performance in a small form factor. It features enhanced NIR quantum efficiency in the 700–1100 nm range, with up to 2x increased sensitivity, fast data transfer and improved signal-to-noise ratio. The USB-powered spectrometer is designed with a minimal footprint of just 105 x 80 x 20 mm and built using AvaMation production for top-tier reproducibility and scalability. It also offers seamless integration with third-party software platforms.

The NEXOS NIR Enhanced is ideal for food sorting, Raman applications and VCSEL/laser system integration, providing research-grade performance in a compact housing. See the NEXOS NIR Enhanced product page for further information.

Designed for flexibility and demanding industrial environments, the VARIUS NIR Enhanced spectrometer introduces a patented optical bench for supreme accuracy, with replaceable slits for versatile configurations. The spectrometer offers a dual interface – USB 3.0 and Gigabit Ethernet – plus superior stray light suppression, high dynamic range and enhanced NIR sensitivity in the 700–1100 nm region.

With its rugged form factor (183 x 130 x 45.2 mm) and semi-automated production process, the VARIUS NIR is optimized for real-time applications, ensuring fast data throughput and exceptional reliability across industries. For further information, see the VARIUS NIR Enhanced product page.

Avantes invites visitors to experience both systems live at Laser World of Photonics 2025. Meet the team for hands-on demonstrations, product insights and expert consultations. Avantes offers free feasibility studies and tailored advice to help you identify the optimal solution for your spectroscopy challenges.

  • For more information, visit www.avantes.com or meet Avantes at booth #218 in Hall A3.

HydraHarp 500: a new era in time-correlated single-photon counting

Laser World of Photonics sees PicoQuant introduce its newest generation of event timer and time-correlated single-photon counting (TCSPC) unit – the HydraHarp 500. Setting a new standard in speed, precision and flexibility, the TCSPC unit is freely scalable with up to 16 independent channels and a common sync channel, which can also serve as an additional detection channel if no sync is required.

HydraHarp 500
Redefining what’s possible PicoQuant presents HydraHarp 500, a next-generation TCSPC unit that maximizes precision, flexibility and efficiency. (Courtesy: PicoQuant)

At the core of the HydraHarp 500 is its outstanding timing precision and accuracy, enabling precise photon timing measurements at exceptionally high data rates, even in demanding applications.

In addition to the scalable channel configuration, the HydraHarp 500 offers flexible trigger options to support a wide range of detectors, from single-photon avalanche diodes to superconducting nanowire single-photon detectors. Seamless integration is ensured through versatile interfaces such as USB 3.0 or an external FPGA interface for data transfer, while White Rabbit synchronization allows precise cross-device coordination for distributed setups.

The HydraHarp 500 is engineered for high-throughput applications, making it ideal for rapid, large-volume data acquisition. It offers 16+1 fully independent channels for true simultaneous multi-channel data recording and efficient data transfer via USB or the dedicated FPGA interface. Additionally, the HydraHarp 500 boasts industry-leading, extremely low dead-time per channel and no dead-time across channels, ensuring comprehensive datasets for precise statistical analysis.

The HydraHarp 500 is fully compatible with UniHarp, a sleek, powerful and intuitive graphical user interface. UniHarp revolutionizes the interaction with PicoQuant’s TCSPC and time tagging electronics, offering seamless access to advanced measurement modes like time trace, histogram, unfold, raw and correlation (including FCS and g²).
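
To illustrate what a g² correlation involves, here is a minimal sketch (in Python) of the time-tag cross-correlation that TCSPC electronics perform – a toy version of the idea, not PicoQuant’s implementation or software interface:

import numpy as np

def g2_delays(t_a, t_b, max_tau):
    """Collect start-stop delays (t_b - t_a) within +/- max_tau between two
    sorted arrays of photon arrival times; histogramming these delays
    gives an (unnormalized) g2(tau) curve."""
    delays = []
    j0 = 0
    for ta in t_a:
        while j0 < len(t_b) and t_b[j0] < ta - max_tau:
            j0 += 1                      # slide the coincidence window forward
        j = j0
        while j < len(t_b) and t_b[j] <= ta + max_tau:
            delays.append(t_b[j] - ta)   # record one coincidence pair
            j += 1
    return np.asarray(delays)

# Toy data: two detectors seeing uncorrelated (Poissonian) light
rng = np.random.default_rng(seed=1)
t_a = np.sort(rng.uniform(0.0, 1.0, 100_000))   # ~100 kcps for 1 s
t_b = np.sort(rng.uniform(0.0, 1.0, 100_000))
tau = g2_delays(t_a, t_b, max_tau=1e-6)
hist, edges = np.histogram(tau, bins=81, range=(-1e-6, 1e-6))
# For uncorrelated light this histogram is flat (g2 ~ 1); antibunched
# single-photon emission would instead show a dip at tau = 0.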

Step into the future of photonics and quantum research with the HydraHarp 500. Whether it’s achieving precise photon correlation measurements, ensuring reproducible results or integrating advanced setups, the HydraHarp 500 redefines what’s possible – offering precision, flexibility and efficiency combined with reliability and seamless integration to achieve breakthrough results.

  • For more information, visit www.picoquant.com or contact PicoQuant at info@picoquant.com.

  • Meet PicoQuant at booth #216 in Hall B2.

SmarAct showcases integrated, high-precision technologies

With a strong focus on turnkey, application-specific solutions, SmarAct offers nanometre-precise motion systems, measurement equipment and scalable micro-assembly platforms for photonics, quantum technologies, semiconductor manufacturing and materials research – whether in research laboratories or high-throughput production environments.

SmarAct’s high-precision technologies
State-of-the-art solutions The SmarAct Group returns to Laser World of Photonics in 2025 with a comprehensive showcase of integrated, high-precision technologies. (Courtesy: SmarAct)

At Laser World of Photonics, SmarAct presents a new modular multi-axis positioning system for quantum computing applications and photonic integrated circuit (PIC) testing. The compact system is made entirely from titanium and features a central XY stage with integrated rotation, flanked by two XYZ modules – one equipped with a tip-tilt goniometer.

For cryogenic applications, the system can be equipped with cold plates and copper braids to provide a highly stable temperature environment, even at millikelvin levels. Thanks to its modularity, the platform can be reconfigured for tasks such as low-temperature scanning or NV centre characterization. When combined with SmarAct’s interferometric sensors, the system delivers unmatched accuracy and long-term stability under extreme conditions.

Also debuting is the SGF series of flexure-based goniometers – compact, zero-backlash rotation stages developed in collaboration with the University of Twente. Constructed entirely from non-ferromagnetic materials, the goniometers are ideal for quantum optics, electron and ion beam systems. Their precision has been validated in a research paper presented at EUSPEN 2023.

Targeting the evolving semiconductor and photonics markets, SmarAct’s optical assembly platforms enable nanometre-accurate alignment and integration of optical components. At their core is a modular high-performance toolkit for application-specific configurations, with the new SmarAct robot control software serving as the digital backbone. Key components include SMARPOD parallel kinematic platforms, long-travel SMARSHIFT electromagnetic linear stages and ultraprecise microgrippers – all seamlessly integrated to perform complex optical alignment tasks with maximum efficiency.

Highlights at Laser World of Photonics include a gantry-based assembly system developed for the active alignment of beam splitters and ferrules, and a compact, fully automated fibre array assembly system designed for multicore and polarization-maintaining fibres. Also on display are modular probing systems for fast, accurate and reliable alignment of fibres and optical elements – providing the positioning precision required for chip- and wafer-level testing of PICs prior to packaging. Finally, the microassembly platform P50 from SmarAct Automation offers a turnkey solution for automating critical micro-assembly tasks such as handling, alignment and joining of tiny components.

Whether you’re working on photonic chip packaging, quantum instrumentation, miniaturized medical systems or advanced semiconductor metrology, SmarAct invites researchers, engineers and decision-makers to experience next-generation positioning, automation and metrology solutions live in Munich.

  • Visit SmarAct at booth #107 in Hall B2.

 

The post Laser World of Photonics showcases cutting-edge optical innovation appeared first on Physics World.

Liquid carbon reveals its secrets

19 juin 2025 à 10:00

Thanks to new experiments using the DIPOLE 100-X high-performance laser at the European X-ray Free Electron Laser (XFEL), an international collaboration of physicists has obtained the first detailed view of the microstructure of carbon in its liquid state. The work will help refine models of liquid carbon, enabling important insights into the role that it plays in the interior of ice giant planets like Uranus and Neptune, where liquid carbon exists in abundance. It could also inform the choice of ablator materials in future technologies such as nuclear fusion.

Carbon is one of the most abundant elements on Earth and indeed in the universe, but we still know very little about how it behaves in its liquid state. This is because producing liquid carbon is extremely difficult: at ambient pressure it sublimes rather than melts, and the liquid phase only forms at pressures of at least several hundred atmospheres. What is more, carbon has the highest melting temperature (roughly 4500 °C) of all known materials under these high-pressure conditions, which means that no vessel can contain it for long enough for it to be studied and characterized.

In situ probing laser compression technique

There is an alternative, though, which involves using X-ray free electron laser pulses – such as those produced at the European XFEL – to transform solid carbon into a liquid for a few nanoseconds. The next challenge is to make measurements during this very short period of time. But this is exactly what a team led by Dominik Kraus of the University of Rostock and the Helmholtz-Zentrum Dresden-Rossendorf (HZDR) has succeeded in doing.

In their work, Kraus and colleagues transiently created liquid carbon by driving strong compression waves into solid carbon samples using the pulsed high-energy laser DIPOLE 100-X, which is a new experimental platform at the European XFEL. In this way, the researchers were able to achieve pressures exceeding one million atmospheres, with the compression waves simultaneously heating the samples to around 7000 K to form liquid carbon. They then obtained in situ snapshots of the structure using ultrabright X-ray pulses at the European XFEL that lasted just 25 fs – that is, about 100,000 times shorter than the already very short lifetime of the liquid carbon samples.

Relevance to planetary interiors and inertial fusion

Studying liquid carbon is important for modelling the interior of planets such as the ice giants Neptune and Uranus, as well as the atmosphere of white dwarfs, in which it also exists, explains Kraus. The insights gleaned from the team’s experiments will help to clarify the role that liquid carbon plays in the ice giants and perhaps even comparable carbon-rich exoplanets.

Liquid carbon also forms as a transient state during some technical processes, like in the synthesis of carbon-based materials such as carbon nanotubes, nanodiamonds or “Q-carbon”, and may be key for the synthesis of new carbon materials, such as the long sought after (but still only predicted) “BC-8” structure. The team’s findings could also help inform the choice of materials for inertial fusion implosions aiming for clean and reliable energy production, where carbon is used as an ablator material.

“Because of its relevance in these areas, I had already tried to study liquid carbon during my doctoral work more than 10 years ago,” Kraus says. “Without an XFEL for characterization, I could only obtain a tiny hint of the liquid structure of carbon (and with large error bars) and was barely able to refine any existing models.”

Until now, however, that early work was considered the best attempt to characterize the structure of liquid carbon at Mbar pressures, he tells Physics World. “Using the XFEL as a characterization tool and the subsequent analysis was incredibly simple in comparison to all the previous work and, in the end, the most important challenge was to get the European XFEL facility ready – something that I had already discussed more than 10 years ago too when the first plans were being made for studying matter under extreme conditions at such an installation.”

The results of the new study, which is detailed in Nature, prove that simple models cannot describe the liquid state of carbon very well, and that sophisticated atomistic simulations are required for predicting processes involving this material, he says.

Looking forward, the Rostock University and HZDR researchers now plan to extend their methodology to the liquid states of various other materials. “In particular, we will study mixtures of light elements that may exist in planetary interiors and the resulting chemistry at extreme conditions,” reveals Kraus. “This work may also be interesting for forming doped nanodiamonds or other phases with potential technological applications.”

The post Liquid carbon reveals its secrets appeared first on Physics World.

Tiny laser delivers high-quality, narrowband light for metrology

18 juin 2025 à 18:00

A new solid-state laser can make a vast number of precise optical measurements each second, while sweeping across a broad range of optical wavelengths. Created by a team led by Qiang Lin at the University of Rochester in the US, the device can be fully integrated onto a single chip.

Optical metrology is a highly versatile technique that uses light to gather information about the physical properties of target objects. It involves illuminating a sample and measuring the results with great precision – using techniques such as interferometry and spectroscopy. In the 1960s, the introduction of lasers and the coherent light they emit boosted the technique to an unprecedented level of precision. This paved the way for advances ranging from optical clocks to the detection of gravitational waves.

Yet despite the indispensable role they have played so far, lasers have also created a difficult challenge. To ensure the best possible precision, experimentalists must achieve very tight control over the wavelength, phase, polarization and other properties of the laser light. This is very difficult to do within the tiny solid-state laser diodes that are very useful in metrology.

Currently, the light from laser diodes is improved externally using optical modules. This added infrastructure is inherently bulky and it remains difficult to integrate the entire setup onto chip-scale components – which limits the development of small, fast lasers for metrology.

Two innovations

Lin and colleagues addressed this challenge by designing a new laser with two key components. One is a laser cavity that comprises a thin film of lithium niobate. Thanks to the Pockels effect, this material’s refractive index can vary depending on the strength of an applied electric field. This provides control over the wavelength of the light amplified by the cavity.

The other component is a distributed Bragg reflector (DBR), which is a structure containing periodic grooves that create alternating regions of refractive index. With the right spacing of these grooves, a DBR can strongly reflect light at a single, narrow linewidth, while transmitting all other wavelengths. In previous studies, lasers were created by etching a DBR directly onto a lithium niobate film – but due to the material’s optical properties, this resulted in a broad linewidth.
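
For readers who want the reflection condition explicitly, a grating of period Λ reflects most strongly at the Bragg wavelength – a textbook relation rather than a figure from the paper:

\[ \lambda_{\mathrm{B}} = 2\,n_{\mathrm{eff}}\,\Lambda , \]

where n_eff is the effective refractive index of the guided mode, so anything that changes n_eff shifts the reflected wavelength.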

“Instead, we developed an ‘extended DBR’ structure, where the Bragg grating is defined in a silica cladding,” explains team member Mingxiao Li at the University of California Santa Barbara. “This allowed for flexible control over the grating strength, via the thickness and etch depth of the cladding. It also leverages silica’s superior etchability to achieve low scattering strength, which is essential for narrow linewidth operation.”

Using a system of integrated electrodes, Lin’s team can adjust the strength of the electric field they applied to the lithium niobate film. This allows them to rapidly tune the wavelengths amplified by the cavity via the Pockels effect. In addition, they used a specially designed waveguide to control the phase of light passing into the cavity. This design enabled them to tune their laser over a broad range of wavelengths, without needing external correction modules to achieve narrow linewidths.

Narrowband performance

Altogether, the laser demonstrated an outstanding performance on a single chip – producing a clean, single wavelength with very little noise. Most importantly, the light had a linewidth of just 167 Hz – the narrowest achieved to date for a single-chip lithium niobate laser. This exceptional performance enabled the laser to rapidly sweep across a bandwidth of over 10 GHz – equivalent to scanning quintillions of points per second.

“These capabilities translated directly into successful applications,” Li describes. “The laser served as the core light source in a high-speed LIDAR system, measuring the velocity of a target 0.4 m away with better than 2 cm distance resolution. The system supports a velocity measurement as high as the orbital velocity of a satellite in low Earth orbit – around 7.91 km/s – at 1 m.” Furthermore, Lin’s team were able to lock their laser’s frequency to a reference gas cell, integrated directly onto the same chip.
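
The quoted figures hang together on the back of an envelope (our estimate, not a calculation from the paper): for a frequency-swept ranging measurement, the distance resolution is set by the sweep bandwidth B via

\[ \Delta R = \frac{c}{2B} = \frac{3\times 10^{8}\ \mathrm{m\,s^{-1}}}{2\times(10\times 10^{9}\ \mathrm{s^{-1}})} = 1.5\ \mathrm{cm}, \]

consistent with the better-than-2 cm resolution mentioned above.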

By eliminating the need for bulky control modules, the team’s design could now pave the way for the full miniaturization of optical metrology – with immediate benefits for technologies including optical clocks, quantum computers, self-driving vehicles, and many others.

“Beyond these, the laser’s core advantages – exceptional coherence, multifunctional control, and scalable fabrication – position it as a versatile platform for transformative advances in high-speed communications, ultra-precise frequency generation, and microwave photonics,” Lin says.

The new laser is described in Light: Science & Applications.

The post Tiny laser delivers high-quality, narrowband light for metrology appeared first on Physics World.

Astronomers capture spectacular ‘thousand colour’ image of the Sculptor Galaxy

18 juin 2025 à 14:01

Astronomers at the European Southern Observatory’s Very Large Telescope (VLT) have created a thousand colour image of the nearby Sculptor Galaxy.

First discovered by Caroline Herschel in 1783, the spiral galaxy lies 11 million light-years away and is one of the brightest galaxies in the sky.

While conventional images contain only a handful of colours, this new map contains thousands, which helps astronomers to understand the age, composition and motion of the stars, gas and dust within it.

To create the image, researchers observed the galaxy for over 50 hours with the Multi Unit Spectroscopic Explorer (MUSE) instrument on the VLT, which is based at the Paranal Observatory in Chile’s Atacama Desert.

The team then stitched together over 100 exposures to cover an area of the galaxy about 65 000 light-years wide.

The image revealed around 500 planetary nebulae – regions of gas and dust cast off from dying Sun-like stars – that can be used as distance markers to their host galaxies.

“Galaxies are incredibly complex systems that we are still struggling to understand,” notes astronomer Enrico Congiu, lead author of the study. “The Sculptor Galaxy is in a sweet spot – it is close enough that we can resolve its internal structure and study its building blocks with incredible detail, but at the same time, big enough that we can still see it as a whole system.”

Future work will involve understanding how gas flows, changes its composition, and forms stars in the galaxy. “How such small processes can have such a big impact on a galaxy whose entire size is thousands of times bigger is still a mystery,” adds Congiu.

The post Astronomers capture spectacular ‘thousand colour’ image of the Sculptor Galaxy appeared first on Physics World.

Delving into the scientific mind, astronomy’s happy accidents, lit science experiments at home, the art of NASA: micro reviews of recent books

18 juin 2025 à 12:00

The Shape of Wonder: How Scientists Think, Work and Live
By Alan Lightman and Martin Rees

In their delightful new book, cosmologist Martin Rees and physicist and science writer Alan Lightman seek to provide “an honest picture of scientists as people and how they work and think”. The Shape of Wonder does this by exploring the nature of science, examining the role of critical thinking, and looking at how scientific theories are created and revised as new evidence emerges. It also includes profiles of individual scientists, ranging from historical Nobel-prize winners such as physicist Werner Heisenberg and biologist Barbara McClintock, to rising stars like CERN theorist Dorota Grabowska. Matin Durrani

  • 2025 Pantheon Books

Our Accidental Universe: Stories of Discovery from Asteroids to Aliens
By Chris Lintott

TV presenter and physics professor Chris Lintott brings all his charm and wit to his new book Our Accidental Universe. He looks at astronomy through the lens of the human errors and accidents that lead to new knowledge. It’s a loose theme that allows him to skip from the search for alien life to pulsars and the Hubble Space Telescope. Lintott has visited many of the facilities he discusses, and spoken to many people working in these areas, adding a personal touch to his stated aim of elucidating how science really gets done. Kate Gardner

  • 2024 Penguin

Science is Lit: Awesome Electricity and Mad Magnets
By Big Manny (Emanuel Wallace)

Want to feed your child’s curiosity about how things work (and don’t mind creating a mini lab in your house)? Take a look at Awesome Electricity and Mad Magnets, the second in the Science is Lit series by Emanuel Wallace – aka TikTok star “Big Manny”. Wallace introduces four key concepts of physics – force, sound, light and electricity – in an enthusiastic and fun way that’s accessible for 8–12 year olds. With instructions for experiments kids can do at home, and a clear explanation of the scientific process, your child can really experience what it’s like to be a scientist. Sarah Tesh

  • 2025 Puffin
Painting of a grey-white lunar landscape featuring several astronauts and dozens of scientific apparatus
NASA art This concept painting by Robert McCall shows a telescope in a hypothetical lunar observatory, sheltered from the Sun to protect its lens. (Courtesy: Robert McCall)

Space Posters & Paintings: Art About NASA
By Bill Schwartz

Astronomy is the most visually gifted of all the sciences, with endless stunning photographs of our cosmos. But perhaps what sets NASA apart from other space agencies is its art programme, which has existed since 1962. In Space Posters and Paintings: Art about NASA, documentary filmmaker Bill Schwartz has curated a striking collection of nostalgic artworks that paint the history of NASA and its various missions across the solar system and beyond. Particularly captivating are pioneering artist Robert McCall’s paintings of the Gemini and Apollo missions. This large-format coffee-table book is a perfect purchase for any astronomy buff. Tushna Commissariat

  • 2024 ACC Art Books

The post Delving into the scientific mind, astronomy’s happy accidents, lit science experiments at home, the art of NASA: micro reviews of recent books appeared first on Physics World.

US astronomy facing ‘extinction level’ event following Trump’s 2026 budget request

17 juin 2025 à 16:01

The administration of US president Donald Trump has proposed drastic cuts to science that would have severe consequences for physics and astronomy if passed by the US Congress. The proposal could involve the cancellation of one of the twin US-based gravitational-wave detectors as well as the axing of a proposed next-generation ground-based telescope and a suite of planned NASA missions. Scientific societies, groups of scientists and individuals have expressed their shock over the scale of the reductions.

In the budget request, which represents the start of the budgeting procedure for the year from 1 October, the National Science Foundation (NSF) would see its funding plummet from $9bn to just $3.9bn – imperilling several significant projects. While the NSF had hoped to support both next-generation ground-based telescopes planned by the agency – the Giant Magellan Telescope (GMT) and the Thirty Meter Telescope (TMT) – the new budget would only allow one to be supported.

On 12 June the GMT, which is already 40% completed thanks to private funds, received NSF approval confirming that the observatory will advance into its “major facilities final design phase”, one of the final steps before becoming eligible for federal construction funding. The TMT, meanwhile, which is set to be built in Hawaii, has been hit with delays following protests over adding more telescopes to Mauna Kea. In a statement, the TMT International Observatory said it was “disappointed that the NSF’s current budget proposal does not include TMT”.

It is also possible that one of the twin Laser Interferometer Gravitational-Wave Observatory (LIGO) facilities – one in Hanford, Washington and the other in Livingston, Louisiana – would have to close down, as the budget proposes a 39.6% cut to LIGO operations. Having just one LIGO facility would significantly reduce its ability to identify and localize events that produce gravitational waves.

“This level of cut, if enacted, would drastically reduce the science coming out of LIGO and have long-term negative consequences for gravitational-wave astrophysics,” notes LIGO executive director David Reitze. LIGO officials told Physics World that the cuts would be “extremely punishing to US gravitational wave science” and would mean “layoffs to staff, reduced scientific output, and the loss of scientific leadership in a field that made first detections just under 10 years ago”.

NASA’s science funding, meanwhile, would fall by 47% year on year, and the agency as a whole would see more than 5500 staff lose their jobs as its workforce is slashed from 17 391 to just 11 853. NASA would also lose planned missions to Venus, Mars, Jupiter and the asteroid Apophis, which will pass close to Earth in 2029. Several scientific missions focusing on planet Earth would also be axed.

The American Astronomical Society expressed “grave concern” that the cuts to NASA and the NSF “would result in an historic decline of American investment in basic scientific research”. The Planetary Society called the proposed NASA budget “an extinction-level event for the space agency’s most productive, successful and broadly supported activity”. Before the cuts were announced, the Trump administration pulled its nomination of billionaire industrialist Jared Isaacman for NASA administrator after his supporter Elon Musk left his post as head of the “Department of Government Efficiency.”

‘The elephant in the room’

The Department of Energy, meanwhile, would see its overall budget dip slightly, from the current $34.0bn to a proposed $33.8bn, with defence-related work receiving a boost. Its non-defence budget, however, would fall by 26% from $16.83bn to $12.48bn. Michael Kratsios, Trump’s science adviser and head of the White House Office of Science and Technology Policy, sought to justify the administration’s planned cuts in a meeting at the National Academy of Sciences (NAS) on 19 May.

“Spending more money on the wrong things is far worse than spending less money on the right things,” Kratsios noted, adding that the country had received “diminishing returns” on its investments in science over the past four decades and that it now requires “new methods and approaches to supporting research”. He also suggested that research now undertaken at US universities falls short of what he called “gold standard science”, citing “political biases [that] have displaced the vital search for truth”. Universities, he stated, have lost public trust because they have “promoted diversity, equity and inclusion”.

The US science community, however, is unconvinced. “The elephant in the room right now is whether the drastic reductions in research budgets and new research policies across the federal agencies will allow us to remain a research and development powerhouse,” says Marcia McNutt, president of the National Academy of Sciences. “Thus, we are embarking on a radical new experiment in what conditions promote science leadership – with the US being the ‘treatment’ group, and China as the control.”

Former presidential science adviser Neal Lane, now at Rice University, told Physics World that while the US administration appears to value some aspects of scientific research such as AI, quantum, nuclear and biotechnologies, it “doesn’t seem to understand or acknowledge that technological advances and innovation often come from basic research in unlikely fields of science”. He expects the science community to “continue to push back” by writing and visiting members of Congress, many of whom support science, and “by speaking out to the public and encouraging various organizations to do the same”.

Indeed, an open letter by the group Stand Up for Science dated 26 May calls the administration’s stated commitment to “gold standard science” an approach “that will actually undermine scientific rigor and the transparent progress of science”. It would “introduce stifling limits on intellectual freedom in our nation’s laboratories and federal funding agencies”, the letter adds.

As of 13 June, the letter had more than 9250 signatures. Another letter, sent to Jay Bhattacharya, director of the National Institutes of Health (NIH), from some 350 NIH members, almost 100 of whom identified themselves, asserted that they “remain pressured to implement harmful measures” such as halting clinical trials midstream. In the budget request, the NIH would lose about 40% of its budget, leaving it with $27.5bn next year. The administration also plans to consolidate the NIH’s 27 institutes into just eight.

A political divide

On the day that the budget was announced, 16 states run by Democratic governors called on a federal court to block cuts in programmes and funding for the NSF. They point out that universities in their states could lose significant income if the cuts go ahead. In fact, the administration’s budget proposal is just that: a proposal. Congress will almost certainly make changes to it before presenting it to Trump for his signature. And while Republicans in the Senate and House of Representatives find it difficult to oppose the administration, science has historically enjoyed support by both Democrats and Republicans.

Despite that, scientists are gearing up for a difficult summer of speculation about financial support. “We are gaming matters at the moment because we are looking at the next budget cycle,” says Peter Littlewood, chair of the University of Chicago’s physics department. “The principal issues now are to bridge postdocs and graduating PhD students, who are in limbo because offers are drying up.” Littlewood says that, while alternative sources of funding such as philanthropic contributions can help, if the proposed government cuts are approved then philanthropy can’t replace federal support. “I’m less worried about whether this or that piece of research gets done than in stabilizing the pipeline, so all our discussions centre around that,” adds Littlewood.

Lane fears the cuts will put people off from careers in science, even in the unlikely event that all the cuts get reversed. “The combination of statements by the president and other administrative officials do considerable harm by discouraging young people born in the US and other parts of the world from pursuing their education and careers in [science] in America,” he says. “That’s a loss for all Americans.”

The post US astronomy facing ‘extinction level’ event following Trump’s 2026 budget request appeared first on Physics World.

Short-lived eclipsing binary pulsar spotted in Milky Way

17 juin 2025 à 14:00

Astronomers in China have observed a pulsar that becomes partially eclipsed by an orbiting companion star every few hours. This type of observation is very rare and could shed new light on how binary star systems evolve.

While most stars in our galaxy exist in pairs, the way these binary systems form and evolve is still little understood. According to current theories, when two stars orbit each other, one of them may expand so much that its atmosphere becomes large enough to encompass the other. During this “envelope” phase, mass can be transferred from one star to the other, causing the stars’ orbit to shrink over a period of around 1000 years. After this, the stars either merge or the envelope is ejected.

In the special case where one star in the pair is a neutron star, the envelope-ejection scenario should, in theory, produce a helium star that has been “stripped” of much of its material and a “recycled” millisecond pulsar – that is, a rapidly spinning neutron star that flashes radio pulses hundreds of times per second. In this type of binary system, the helium star can periodically eclipse the pulsar as it orbits around it, blocking its radio pulses and preventing us from detecting them here on Earth. Only a few examples of such a binary system have ever been observed, however, and all previous ones were in nearby dwarf galaxies called the Magellanic Clouds, rather than our own Milky Way.

A special pulsar

Astronomers led by Jinlin Han from the National Astronomical Observatories of China say they have now identified the first system of this type in the Milky Way. The pulsar in the binary, denoted PSR J1928+1815, had been previously identified using the Five-hundred-meter Aperture Spherical radio Telescope (FAST) during the FAST Galactic Plane Pulsar Snapshot survey. These observations showed that PSR J1928+1815 has a spin period of 10.55 ms, which is relatively short for a pulsar of this type and suggests it had recently sped up by accreting mass from a companion.

The researchers used FAST to observe this suspected binary system at radio frequencies ranging from 1.0 to 1.5 GHz over a period of four and a half years. They fitted the times that the radio pulses arrived at the telescope with a binary orbit model to show that the system has an eccentricity of less than 3 × 10⁻⁵. This suggests that the pulsar and its companion star are in a nearly circular orbit. The diameter of this orbit, Han points out, is smaller than that of our own Sun, and its period – that is, the time it takes the two stars to circle each other – is correspondingly short, at 3.6 hours. For a sixth of this time, the companion star blocks the pulsar’s radio signals.

The team also found that the rate at which the pulsar’s spin period is changing – the so-called spin period derivative – is unusually high for a millisecond pulsar, at 3.63 × 10⁻¹⁸ s s⁻¹. This shows that energy is rapidly being lost from the system as the pulsar spins down.
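
For context, plugging these numbers into the standard spin-down formula – assuming the canonical neutron-star moment of inertia I ≈ 10⁴⁵ g cm², our assumption rather than a value from the study – gives the rate of rotational energy loss

\[ \dot{E} = \frac{4\pi^{2} I \dot{P}}{P^{3}} \approx \frac{4\pi^{2} \times 10^{45}\ \mathrm{g\,cm^{2}} \times 3.63\times 10^{-18}}{(0.01055\ \mathrm{s})^{3}} \approx 1.2\times 10^{35}\ \mathrm{erg\,s^{-1}}, \]

an unusually large spin-down luminosity for a millisecond pulsar.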

“We knew that PSR J1928+1815 was special from November 2021 onwards,” says Han. “Once we’d accumulated data with FAST, one of my students, ZongLin Yang, studied the evolution of such binaries in general and completed the timing calculations from the data we had obtained for this system. His results suggested the existence of the helium star companion and everything then fell into place.”

Short-lived phenomenon

This is the first time such a short-lived (around 10⁷ years) binary consisting of a neutron star and a helium star has ever been detected, Han tells Physics World. “It is a product of the common envelope evolution that lasted for only 1000 years and that we couldn’t observe directly,” he says.

“Our new observation is the smoking gun for long-standing binary star evolution theories, such as those that describe how stars exchange mass and shrink their orbits, how the neutron star spins up by accreting matter from its companion and how the shared hydrogen envelope is ejected.”

The system could help astronomers study how neutron stars accrete matter and then cool down, he adds. “The binary detected in this work will evolve to become a system of two compact stars that will eventually merge and become a future source of gravitational waves.”

Full details of the study are reported in Science.

The post Short-lived eclipsing binary pulsar spotted in Milky Way appeared first on Physics World.

How quantum sensors could improve human health and wellbeing

17 juin 2025 à 12:00

As the world celebrates the 2025 International Year of Quantum Science and Technology, it’s natural that we should focus on the exciting applications of quantum physics in computing, communication and cryptography. But quantum physics is also set to have a huge impact on medicine and healthcare. Quantum sensors, in particular, can help us to study the human body and improve medical diagnosis – in fact, several systems are close to being commercialized.

Quantum computers, meanwhile, could one day help us to discover new drugs by providing representations of atomic structures with greater accuracy and by speeding up calculations to identify potential drug reactions. But what other technologies and projects are out there? How can we forge new applications of quantum physics in healthcare and how can we help discover new potential use cases for the technology?

Those are some of the questions tackled in a recent report, on which this Physics World article is based, published by Innovate UK in October 2024. Entitled Quantum for Life, the report aims to kickstart new collaborations by raising awareness of what quantum physics can do for the healthcare sector. While the report says quite a bit about quantum computing and quantum networking, this article will focus on quantum sensors, which are closer to being deployed.

Sense about sensors

The importance of quantum science to healthcare isn’t new. In fact, when a group of academics and government representatives gathered at Chicheley Hall back in 2013 to hatch plans for the UK’s National Quantum Technologies Programme, healthcare was one of the main applications they identified. The resulting £1bn programme, which co-ordinated the UK’s quantum-research efforts, was recently renewed for another decade and – once again – healthcare is a key part of the remit.

As it happens, most major hospitals already use quantum sensors in the form of magnetic resonance imaging (MRI) machines. Pioneered in the 1970s, these devices manipulate the quantum spin states of hydrogen atoms using magnetic fields and radio waves. By measuring how long those states take to relax, MRI can image soft tissues, such as the brain, and is now a vital part of the modern medicine toolkit.
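
In equation form – textbook NMR physics rather than anything specific to the report – those hydrogen nuclei precess at the Larmor frequency

\[ \nu = \frac{\gamma}{2\pi}\,B \approx 42.6\ \mathrm{MHz\,T^{-1}} \times B \]

for protons, which is why a 3 T clinical scanner operates at roughly 128 MHz.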

While an MRI machine measures the quantum properties of atoms, the sensor itself is classical, essentially consisting of electromagnetic coils that detect the magnetic flux produced when atomic spins change direction. More recently, though, we’ve seen a new generation of nanoscale quantum sensors that are sensitive enough to detect magnetic fields emitted by a target biological system. Others, meanwhile, consist of just a single atom and can monitor small changes in the environment.

As the Quantum for Life report shows, there are lots of different quantum-based companies and institutions working in the healthcare sector. There are also many promising types of quantum sensors, which use photons, electrons or spin defects within a material, typically diamond. But ultimately what matters is what quantum sensors can achieve in a medical environment.

Quantum diagnosis

While compiling the report, it became clear that quantum-sensor technologies for healthcare come in five broad categories. The first is what the report labels “lab diagnostics”, in which trained staff use quantum sensors to observe what is going on inside the human body. By monitoring everything from our internal temperature to the composition of cells, the sensors can help to identify diseases such as cancer.

Currently, the only way to definitively diagnose cancer is to take a sample of cells – a biopsy – and examine them under a microscope in a laboratory. Biopsies are often examined with visible light, but that can damage a sample, making diagnosis tricky. Another option is to use infrared radiation. By monitoring the specific wavelengths the cells absorb, the compounds in a sample can be identified, allowing molecular changes linked with cancer to be tracked.

Unfortunately, it can be hard to differentiate these signals from background noise. What’s more, infrared cameras are much more expensive than those operating in the visible region. One possible solution is being explored by Digistain, a company that was spun out of Imperial College, London, in 2019. It is developing a product called EntangleCam that uses two entangled photons – one infrared and one visible (figure 1).

1 Entangled thoughts

Diagram of a laser beam passing through a diamond, where it is split into two: a beam directed at a cancer cell and a beam that enters a single photon detector
a (Adapted from Quantum for Life: How UK Life Sciences and Healthcare Can Benefit from Quantum Technologies by IOP Publishing)

Two false-colour images of cancer cells – one in purple on beige background, one in bright greens, reds and yellows on black background
b (Courtesy: Digistain)

a One way in which quantum physics is benefiting healthcare is through entangled photons created by passing laser light through a nonlinear crystal (left). Each laser photon gets converted into two lower-energy photons – one visible, one infrared – in a process called spontaneous parametric down conversion. In technology pioneered by the UK company Digistain, the infrared photon can be sent through a sample, with the visible photon picked up by a detector. As the photons are entangled, the visible photon gives information about the infrared photon and the presence of, say, cancer cells. b Shown here are cells seen with traditional stained biopsy (left) and with Digistain’s method (right).

If the infrared photon is absorbed by, say, a breast cancer cell, that immediately affects the visible photon with which it is entangled. So by measuring the visible light, which can be done with a cheap, efficient detector, you can get information about the infrared photon – and hence the presence of a potential cancer cell (Phys. Rev. 108 032613). The technique could therefore allow cancer to be quickly diagnosed before a tumour has built up, although an oncologist would still be needed to identify the area for the technique to be applied.
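
The underlying bookkeeping is energy conservation in spontaneous parametric down conversion (textbook nonlinear optics, not a Digistain specification): each pump photon splits into a visible and an infrared photon satisfying

\[ \frac{1}{\lambda_{\mathrm{pump}}} = \frac{1}{\lambda_{\mathrm{vis}}} + \frac{1}{\lambda_{\mathrm{IR}}} , \]

so detecting the visible partner at a known wavelength pins down which infrared wavelength probed the sample.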

Point of care

The second promising application of quantum sensors lies in “point-of-care” diagnostics. We all became familiar with the concept during the COVID-19 pandemic when lateral-flow tests proved to be a vital part of the worldwide response to the virus. The tests could be taken anywhere and were quick, simple, reliable and relatively cheap. Something that had originally been designed to be used in a lab was now available to most people at home.

Quantum technology could let us miniaturize such tests further and make them more accurate, such that they could be used at hospitals, doctor’s surgeries or even at home. At the moment, biological indicators of disease tend to be measured by tagging molecules with fluorescent markers and measuring where, when and how much light they emit. But because some molecules are naturally fluorescent, those measurements have to be processed to eliminate the background noise.

One emerging quantum-based alternative is to characterize biological samples by measuring their tiny magnetic fields. This can be done, for example, using diamond specially engineered with nitrogen-vacancy (NV) defects. Each is made by removing two carbon atoms from the lattice and implanting a nitrogen atom in one of the gaps, leaving a vacancy in the other. Behaving like an atom with discrete energy levels, each defect’s spin state is influenced by the local magnetic field and can be “read out” from the way it fluoresces.
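
To make the “read out” step concrete: in optically detected magnetic resonance, the field component along the NV axis splits the defect’s mₛ = ±1 spin levels, and the separation of the two fluorescence dips follows the standard NV-magnetometry relation (textbook physics, not a specification of the devices described below):

\[ \Delta\nu = 2\,\gamma_{\mathrm{NV}}\,B_{\parallel}, \qquad \gamma_{\mathrm{NV}} \approx 28\ \mathrm{GHz\,T^{-1}}, \]

so a 1 µT field along the NV axis moves the resonances apart by roughly 56 kHz.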

One UK company working in this area is Element Six. It has joined forces with the US-based firm QDTI to make a single-crystal diamond-based device that can quickly identify biomarkers in blood plasma, cerebrospinal fluid and other samples extracted from the body. The device detects magnetic fields produced by specific proteins, which can help identify diseases in their early stages, including various cancers and neurodegenerative conditions like Alzheimer’s. Another firm using single-crystal diamond to detect cancer cells is Germany-based Quantum Total Analysis Systems (QTAS).

Matthew Markham, a physicist who is head of quantum technologies at Element Six, thinks that healthcare has been “a real turning point” for the company. “A few years ago, this work was mostly focused on academic problems,” he says. “But now we are seeing this technology being applied to real-world use cases and that it is transitioning into industry with devices being tested in the field.”

An alternative approach involves using tiny nanometre-sized diamond particles with NV centres, which have the advantage of being highly biocompatible. QT Sense of the Netherlands, for example, is using these nanodiamonds to build nano-MRI scanners that can measure the concentration of molecules that have an intrinsic magnetic field. This equipment has already been used by biomedical researchers to investigate single cells (figure 2).

2 Centre of attention

Artist's illustration of a diamond with light entering and exiting, plus a zoom in to show the atomic structure of a nitrogen-vacancy defect
(Courtesy: Element Six)

A nitrogen-vacancy defect in diamond – known as an NV centre – is made by removing two carbon atoms from the lattice and implanting a nitrogen atom in one of the gaps, leaving a vacancy in the other. Using a pulse of green laser light, NV centres can be sent from their ground state to an excited state. If the laser is switched off, the defects return to their ground state, emitting a visible photon that can be detected. However, the rate at which the fluorescent light drops while the laser is off depends on the local magnetic field. As companies like Element Six and QT Sense are discovering, NV centres in diamond are a great way of measuring magnetic fields in the human body, especially as the surrounding lattice of carbon atoms shields the NV centre from noise.

Australian firm FeBI Technologies, meanwhile, is developing a device that uses nanodiamonds to measure the magnetic properties of ferritin – a protein that stores iron in the body. The company claims its technology is nine orders of magnitude more sensitive than traditional MRI and will allow patients to monitor the amount of iron in their blood using a device that is accurate and cheap.

Wearable healthcare

The third area in which quantum technologies are benefiting healthcare is what’s billed in the Quantum for Life report as “consumer medical monitoring and wearable healthcare”. In other words, we’re talking about devices that allow people to monitor their health in daily life on an ongoing basis. Such technologies are particularly useful for people who have a diagnosed medical condition, such as diabetes or high blood pressure.

NIQS Tech, for example, was spun off from the University of Leeds in 2022 and is developing a highly accurate, non-invasive sensor for measuring glucose levels. Traditional glucose-monitoring devices are painful and invasive because they basically involve sticking a needle in the body. While newer devices use light-based spectroscopic measurements, they tend to be less effective for patients with darker skin tones.

The sensor from NIQS Tech instead uses a doped silica platform, which enables quantum interference effects. When placed in contact with the skin and illuminated with laser light, the device fluoresces, with the lifetime of the fluorescence depending on the amount of glucose in the user’s blood, regardless of skin tone. NIQS has already demonstrated proof of concept with lab-based testing and now wants to shrink the technology to create a wearable device that monitors glucose levels continuously.

Body imaging

The fourth application of quantum tech lies in body scanning, which allows patients to be diagnosed without needing a biopsy. One company leading in this area is Cerca Magnetics, which was spun off from the University of Nottingham. In 2023 it won the inaugural qBIG prize for quantum innovation from the Institute of Physics, which publishes Physics World, for developing wearable optically pumped magnetometers for magnetoencephalography (MEG), which measure magnetic fields generated by neuronal firings in the brain. Its devices can be used to scan patients’ brains in a comfortable seated position and even while they are moving.

Quantum-based scanning techniques could also help diagnose breast cancer, which is usually done by exposing a patient’s breast tissue to low doses of X-rays. The trouble with such mammograms is that all breasts contain a mix of low-density fatty and other, higher-density tissue. The latter creates a “white blizzard” effect against the dark background, making it challenging to differentiate between healthy tissue and potential malignancies.

That’s a particular problem for the roughly 40% of women who have a higher concentration of higher-density tissue. One alternative is to use molecular breast imaging (MBI), which involves imaging the distribution of a radioactive tracer that has been intravenously injected into a patient. This tracer, however, exposes patients to a higher (albeit still safe) dose of radiation than with a mammogram, which means that patients have to be imaged for a long time to get enough signal.

A solution could lie with the UK-based firm Kromek, which is using cadmium zinc telluride (CZT) semiconductors that produce a measurable voltage pulse from just a single gamma-ray photon. As well as being very efficient over a broad range of X-ray and gamma-ray photon energies, CZTs can be integrated onto small chips operating at room temperature. Preliminary results with Kromek’s ultralow-dose and ultrafast detectors show they work with barely one-eighth of the amount of tracer used in traditional MBI techniques.

Four samples of cadmium zinc telluride next to a ruler for scale
Faster and better Breast cancer is often detected with X-rays using mammography but it can be tricky to spot tumours in areas where the breast tissue is dense. One alternative is molecular breast imaging (MBI), which uses a radioactive tracer to “light up” areas of cancer in the breast and works even in dense breast tissue. However, MBI currently exposes patients to more radiation than with mammography, which is where cadmium zinc telluride (CZT) semiconductors, developed by the UK firm Kromek, could help. They produce a measurable voltage pulse from just a single gamma-ray photon, opening the door for “ultralow-dose MBI” – where much clearer images are created with barely one-eighth of the radiation. (Courtesy: Kromek)

“Our prototypes have shown promising results,” says Alexander Cherlin, who is principal physicist at Kromek. The company is now designing and building a full-size prototype of the camera as part of Innovate UK’s £2.5m “ultralow-dose” MBI project, which runs until the end of 2025. It involves Kromek working with hospitals in Newcastle along with researchers at University College London and the University of Newcastle.

Microscopy matters

The final application of quantum sensors to medicine lies in microscopy, which these days no longer just means visible light but everything from Raman and two-photon microscopy to fluorescence lifetime imaging and multiphoton microscopy. These techniques allow samples to be imaged at different scales and speeds, but they are all reaching various technological limits.

Quantum technologies can help us break those limits. Researchers at the University of Glasgow, for example, are among those to have used pairs of entangled photons to enhance microscopy through “ghost imaging”. One photon in each pair interacts with a sample, with the image built up by detecting the effect on its entangled counterpart. The technique avoids the noise created when imaging with low levels of light (Sci. Adv. 6 eaay2652).

Researchers at the University of Strathclyde, meanwhile, have used nanodiamonds to get around the problem that dyes added to biological samples eventually stop fluorescing. Known as photobleaching, the effect prevents samples from being studied after a certain time (R. Soc. Open Sci. 6 190589). In that work, samples could be imaged continuously using two-photon excitation microscopy, with a 10-fold increase in resolution.

Looking to the future

But despite the great potential of quantum sensors in medicine, there are still big challenges before the technology can be deployed in real, clinical settings. Scalability – making devices reliably, cheaply and in sufficient numbers – is a particular problem. Fortunately, things are moving fast. Even since the Quantum for Life report came out late in 2024, we’ve seen new companies being founded to address these problems.

One such firm is Bristol-based RobQuant, which is developing solid-state semiconductor quantum sensors for non-invasive magnetic scanning of the brain. Such sensors, which can be built with the standard processing techniques used in consumer electronics, allow for scans on different parts of the body. RobQuant claims its sensors are robust and operate at ambient temperatures without requiring any heating or cooling.

Agnethe Seim Olsen, the company’s co-founder and chief technologist, believes that making quantum sensors robust and scalable is vital if they are to be widely adopted in healthcare. She thinks the UK is leading the way in the commercialization of such sensors and will benefit from the latest phase of the country’s quantum hubs. Bringing academia and businesses together, they include the £24m Q-BIOMED biomedical-sensing hub led by University College London and the £27.5m QuSIT hub in imaging and timing led by the University of Birmingham.

Q-BIOMED is, for example, planning to use both single-crystal diamond and nanodiamonds to develop and commercialize sensors that can diagnose and treat diseases such as cancer and Alzheimer’s at much earlier stages of their development. “These healthcare ambitions are not restricted to academia, with many startups around the globe developing diamond-based quantum technology,” says Markham at Element Six.

As with the previous phases of the hubs, funding further research encourages start-ups – researchers from the forerunner of the QuSIT hub, for example, set up Cerca Magnetics. The growing maturity of some of these quantum sensors will undoubtedly attract established medical-technology companies. The next five years will be a busy and exciting time for the burgeoning use of quantum sensors in healthcare.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the next 12 months for more coverage of the IYQ.

Find out more on our quantum channel.

The post How quantum sensors could improve human health and wellbeing appeared first on Physics World.

Handheld device captures airborne signs of disease

16 juin 2025 à 16:00

A sensitive new portable device can detect gas molecules associated with certain diseases by condensing dilute airborne biomarkers into concentrated liquid droplets. According to its developers at the University of Chicago in the US, the device could be used to detect airborne viruses or bacteria in hospitals and other public places, improve neonatal care, and even allow diabetic patients to read glucose levels in their breath, to list just three examples.

Many disease biomarkers are only found in breath or ambient air at levels of a few parts per trillion. This makes them very difficult to detect compared with biomarkers in biofluids such as blood, saliva or mucus, where they are much more concentrated. Traditionally, reaching a high enough sensitivity required bulky and expensive equipment such as mass spectrometers, which are impractical for everyday environments.

Rapid and sensitive identification

Researchers led by biophysicist and materials chemist Bozhi Tian have now developed a highly portable alternative. Their new Airborne Biomarker Localization Engine (ABLE) can detect both non-volatile and volatile molecules in air in around 15 minutes.

This handheld device comprises a cooled condenser surface, an air pump and microfluidic enrichment modules, and it works in the following way. First, air that (potentially) contains biomarkers flows into a cooled chamber. Within this chamber, Tian explains, the supersaturated moisture condenses onto nanostructured superhydrophobic surfaces and forms droplets. Any particles in the air thus become suspended inside the droplets, which means they can be analysed using conventional liquid-phase biosensors such as colorimetric test strips or electrochemical probes. This allows them to be identified rapidly with high sensitivity.

Tiny babies and a big idea

Tian says the inspiration for this study, which is detailed in Nature Chemical Engineering, came from a visit he made to a neonatal intensive care unit (NICU) in 2021. “Here, I observed the vulnerability and fragility of preterm infants and realized how important non-invasive monitoring is for them,” Tian explains.

“My colleagues and I envisioned a contact-free system capable of detecting disease-related molecules in air. Our biggest challenge was sensitivity and initial trials failed to detect key chemicals,” he remembers. “We overcame this problem by developing a new enrichment strategy using nanostructured condensation and molecular sieves while also exploiting evaporation physics to stabilize and concentrate the captured biomarkers.”

The technology opens new avenues for non-contact, point-of-care diagnostics, he tells Physics World. Possible near-term applications include the early detection of ailments such as inflammatory bowel disease (IBD), which can lead to markers of inflammation appearing in patients’ breath. Respiratory disorders and neurodevelopmental conditions in babies could be detected in a similar way. Tian suggests the device could even be used for mental health monitoring via volatile stress biomarkers (again found in breath) and for monitoring air quality in public spaces such as schools and hospitals.

“Thanks to its high sensitivity and low cost (of around $200), ABLE could democratize biomarker sensing, moving diagnostics beyond the laboratory and into homes, clinics and underserved areas, allowing for a new paradigm in preventative and personalized medicine,” he says.

Widespread applications driven by novel physics

The University of Chicago scientists’ next goal is to further miniaturize and optimize the ABLE device. They are especially interested in enhancing its sensitivity and energy efficiency, as well as exploring the possibility of real-time feedback through closed-loop integration with wearable sensors. “We also plan to extend its applications to infectious disease surveillance and food spoilage detection,” Tian reveals.

The researchers are currently collaborating with health professionals to test ABLE in real-world settings such as NICUs and outpatient clinics. In the future, though, they also hope to explore novel physical processes that might improve the efficiency at which devices like these can capture hydrophobic or nonpolar airborne molecules.

According to Tian, the work has unveiled “unexpected evaporation physics” in dilute droplets with multiple components. Notably, the team has seen evidence that such droplets defy the limit set by Henry’s law, which states that at constant temperature, the amount of a gas that dissolves in a liquid of a given type and volume is directly proportional to the partial pressure of the gas in equilibrium with the liquid. “This opens a new physical framework for such condensation-driven sensing and lays the foundation for widespread applications in the non-contact diagnostics, environmental monitoring and public health applications mentioned,” Tian says.
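For reference, Henry’s law in its standard textbook form (a general relation, not a formula quoted from the paper) reads

\[
c = k_{\mathrm{H}}\,p,
\]

where c is the equilibrium concentration of the gas dissolved in the liquid, p is its partial pressure above the liquid and k_H is a temperature-dependent constant characteristic of each gas–solvent pair. The team’s evidence suggests that their multi-component droplets accumulate solutes beyond the concentration this linear relation would predict.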

The post Handheld device captures airborne signs of disease appeared first on Physics World.

‘Can’t get you out of my head’: using earworms to teach physics

16 juin 2025 à 12:00

When I’m sitting in my armchair, eating chocolate and finding it hard to motivate myself to exercise, a little voice in my head starts singing “You’ve got to move it, move it” to the tune of will.i.am’s “I like to move it”. The positive reinforcement and joy of this song as it plays on a loop in my mind propel me out of my seat and onto the tennis court.

Songs like this are earworms – catchy pieces of music that play on repeat in your head long after you’ve heard them. Some tunes are more likely to become earworms than others, and there are a few reasons for this.

To truly hook you in, the music must be repetitive so that the brain can easily finish it. Generally, it is also simple, and has a rising and falling pitch shape. While you need to hear a song several times for it to stick, once it’s wormed its way into your head, some lyrics become impossible to escape – “I just can’t get you out of my head”, as Kylie would say.

In his book Musicophilia, neurologist Oliver Sacks describes these internal music loops as “the brainworms that arrive unbidden and leave only on their own time”. They can fade away, but they tend to lie in wait, dormant until an association sets them off again – like when I need to exercise. But for me as a physics teacher for 16–18 year olds, this fact is more than just of passing interest: I use it in the classroom.

There are some common mistakes students make in physics, so I play songs in class that are linked (sometimes tenuously) to the syllabus to remind them to check their work. Before I continue, I should add that I’m not advocating rote learning without understanding – the explanation of the concept must always come first. But I have found the right earworm can be a great memory aid.

I’ve been a physics teacher for a while, and I’ll admit to a slight bias towards the music of the 1980s and 1990s. I play David Bowie’s “Changes” (which the students associate with the movie Shrek) when I ask the class to draw a graph, to remind them to check if they need to process – or change – the data before plotting. The catchy “Ch…ch…ch…changes” is now the irritating tune they hear when I look over their shoulders to check if they have found, for example, the sine values for Snell’s law, or the square root of tension if looking at the frequency of a stretched wire.

When describing how to verify the law of conservation of momentum, students frequently leave out the mechanism that makes the two trolleys stick together after the collision. Naturally, this is an opportunity for me to play Roxy Music’s “Let’s stick together”.

Meanwhile, “Ice ice baby” by Vanilla Ice is obviously the perfect earworm for calculating the specific latent heat of fusion of ice, a calculation in which students often drop parts of the equation because they forget that the ice both melts and changes temperature.

In the experiment where you charge a gold leaf electroscope by induction, pupils often fail to do the four steps in the correct order. I therefore play Shirley Bassey’s “Goldfinger” to remind pupils to earth the disc with their finger. Meanwhile, Spandau Ballet’s bold and dramatic “Gold” is reserved for Rutherford’s gold leaf experiment.

“Pump up the volume” by M|A|R|R|S or Ireland’s 1990 football song “Put ‘em under pressure” are obvious candidates for investigating Boyle’s law. I use “Jump around” by House of Pain when causing a current-carrying conductor in a magnetic field to experience a force.

Some people may think that linking musical lyrics and physics in this way is a waste of time. However, it also introduces some light-hearted humour into the classroom – and I find teenagers learn better with laughter. The students enjoy mocking my taste in music and coming up with suitable (more modern) songs, and we laugh together about the tenuous links I’ve made between lyrics and physics.

More importantly, this is how my memory works. I link phrases or lyrics to the important things I need to remember. Auditory information functions as a strong mnemonic. I am not saying that this works for everyone, but I have heard my students sing the lyrics to each other while studying in pairs or groups. I smile to myself as I circulate the room when I hear them saying phrases like, “No you forgot mass × specific latent heat – remember it’s ‘Ice, ice baby!’ ”.

On their last day of school – after two years of playing these tunes in class – I hold a quiz where I play a song and the students have to link it to the physics. It turns into a bit of a sing-along, with chocolate for prizes, and there are usually a few surprises in there too. Have a go yourself with the quiz below.

Earworms quiz

Can you match the following eight physics laws or experiments with the right song? If you can’t remember the songs, we’ve provided links – but beware, they are earworms!

Law or experiment

  1. Demonstrating resonance with Barton’s pendulums
  2. Joule’s law
  3. The latent heat of vaporization of water
  4. Measuring acceleration due to gravity
  5. The movement caused when a current is applied to a coil in a magnetic field
  6. Measuring the pascal
  7. How nuclear fission releases sustainable amounts of energy
  8. Plotting current versus voltage for a diode in forward bias

Artist and song

Answers will be revealed next month – just come back to this article to find out whether you got them all right.

The post ‘Can’t get you out of my head’: using earworms to teach physics appeared first on Physics World.

Yale researcher says levitated spheres could spot neutrinos ‘within months’

14 juin 2025 à 02:18

The Helgoland 2025 meeting, marking 100 years of quantum mechanics, has featured a lot of mind-bending fundamental physics, quite a bit of which has left me scratching my head.

So it was great to hear a brilliant talk by David Moore of Yale University about some amazing practical experiments using levitated, trapped microspheres as quantum sensors to detect what he calls the “invisible” universe.

If the work sounds familiar to you, that’s because Moore’s team won a Physics World Top 10 Breakthrough of the Year award in 2024 for using their technique to detect the alpha decay of individual lead-212 atoms.

Speaking in the Nordseehalle on the island of Helgoland, Moore explained the next stage of the experiment, which could see it detect neutrinos “in a couple of months” at the earliest – and “at least within a year” at the latest.

Of course, physicists have already detected neutrinos, but it’s a complicated business, generally involving huge devices in deep underground locations where background signals are minimized. Yale’s setup is much cheaper, smaller and more convenient, involving no more than a couple of lab benches.

As Moore explained, he and his colleagues first trap silica spheres at low pressure, before removing excess electrons to electrically neutralize them. They then stabilize the spheres’ rotation before cooling them to microkelvin temperatures.

In the work that won the Physics World award last year, the team used samples of radon-220, which decays first into polonium-216 and then lead-212. These nuclei embed themselves in the silica spheres, which recoil when the lead-212 decays by releasing an alpha particle (Phys. Rev. Lett. 133 023602).

Moore’s team is able to measure the tiny recoil by watching how light scatters off the spheres. “We can see the force imparted by a subatomic particle on a heavier object,” he told the audience at Helgoland. “We can see single nuclear decays.”
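To get a feel for how small that recoil is, here is an illustrative momentum-conservation estimate – the ~6 MeV alpha energy and ~1 ng sphere mass are round numbers assumed for illustration, not figures from the talk. A non-relativistic alpha particle carries momentum

\[
p_{\alpha} = \frac{\sqrt{2\,(m_{\alpha}c^{2})\,E_{\alpha}}}{c} \approx \frac{\sqrt{2\times(3727\ \mathrm{MeV})\times(6\ \mathrm{MeV})}}{c} \approx 210\ \mathrm{MeV}/c \approx 1.1\times10^{-19}\ \mathrm{kg\,m\,s^{-1}},
\]

so by momentum conservation a sphere of mass M ≈ 1 ng recoils at v = p_α/M ≈ 10⁻⁷ m/s – about a tenth of a micron per second, which conveys the precision demanded of the optical readout.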

Now the plan is to extend the experiment to detect neutrinos. These won’t (at least initially) be the neutrinos that stream through the Earth from the Sun or even those from a nuclear reactor.

Instead, the idea will be to embed the spheres with nuclei that undergo beta decay, releasing a much lighter neutrino in the process. Moore says the team will do this within a year and, one day, potentially even use it to spot dark matter.

“We are reaching the quantum measurement regime,” he said. It’s a simple concept, even if the name – “Search for new Interactions in a Microsphere Precision Levitation Experiment” (SIMPLE) – isn’t.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the next 12 months for more coverage of the IYQ.

Find out more on our quantum channel.

The post Yale researcher says levitated spheres could spot neutrinos ‘within months’ appeared first on Physics World.

Worm slime could inspire recyclable polymer design

13 juin 2025 à 09:53

The animal world – including some of its ickiest parts – never ceases to amaze. According to researchers in Canada and Singapore, velvet worm slime contains an ingredient that could revolutionize the design of high-performance polymers, making them far more sustainable than current versions.

“We have been investigating velvet worm slime as a model system for inspiring new adhesives and recyclable plastics because of its ability to reversibly form strong fibres,” explains Matthew Harrington, the McGill University chemist who co-led the research with Ali Miserez of Nanyang Technological University (NTU). “We needed to understand the mechanism that drives this reversible fibre formation, and we discovered a hitherto unknown feature of the proteins in the slime that might provide a very important clue in this context.”

The velvet worm (phylum Onychophora) is a small, caterpillar-like creature that lives in humid forests. Although several organisms, including spiders and mussels, produce protein-based slimy material outside their bodies, the slime of the velvet worm is unique. Produced from specialized papillae on each side of the worm’s head, and squirted out in jets whenever the worm needs to capture prey or defend itself, it quickly transforms from a sticky, viscoelastic gel into stiff, glassy fibres as strong as nylon.

When dissolved in water, these stiff fibres return to their biomolecular precursors. Remarkably, new fibres can then be drawn from the solution – implying that the instructions for fibre self-assembly are “encoded” within the precursors themselves, Harrington says.

High-molecular-weight protein identified

Previously, the molecular mechanisms behind this reversibility were little understood. In the present study, however, the researchers used protein sequencing and the AI-guided protein structure prediction algorithm AlphaFold to identify a specific high-molecular-weight protein in the slime. Containing a motif known as a leucine-rich repeat, this protein has a structure similar to that of a cell surface receptor protein called a Toll-like receptor (TLR).

In biology, Miserez explains, this type of receptor is involved in immune system response. It also plays a role in embryonic or neural development. In the worm slime, however, that’s not the case.

“We have now unveiled a very different role for TLR proteins,” says Miserez, who works in NTU’s materials science and engineering department. “They play a structural, mechanical role and can be seen as a kind of ‘glue protein’ at the molecular level that brings together many other slime proteins to form the macroscopic fibres.”

Miserez adds that the team found this same protein in different species of velvet worms that diverged from a common ancestor nearly 400 million years ago. “This means that this different biological function is very ancient from an evolutionary perspective,” he explains.

“It was very unusual to find such a protein in the context of a biological material,” Harrington adds. “By predicting the protein’s structure and its ability to bind to other slime proteins, we were able to hypothesize its important role in the reversible fibre formation behaviour of the slime.”

The team’s hypothesis is that the reversibility of fibre formation is based on receptor-ligand interactions between several slime proteins. While Harrington acknowledges that much work remains to be done to verify this, he notes that such binding is a well-described principle in many groups of organisms, including bacteria, plants and animals. It is also crucial for cell adhesion, development and innate immunity. “If we can confirm this, it could provide inspiration for making high-performance non-toxic (bio)polymeric materials that are also recyclable,” he tells Physics World.

The study, which is detailed in PNAS, was mainly based on computational modelling and protein structure prediction. The next step, say the McGill researchers, is to purify or recombinantly express the proteins of interest and test their interactions in vitro.

The post Worm slime could inspire recyclable polymer design appeared first on Physics World.

Helgoland researchers seek microplastics and microfibres in the sea

12 juin 2025 à 23:05

I’ve been immersed in quantum physics this week at the Helgoland 2025 meeting, which is being held to mark Werner Heisenberg’s seminal development of quantum mechanics on the island 100 years ago.

But when it comes to science, Helgoland isn’t only about quantum physics. It’s also home to an outpost of the Alfred Wegener Institute, the Helmholtz Centre for Polar and Marine Research named after the German scientist who was the brains behind continental drift.

Dating back to 1892, the Biological Institute Helgoland (BAH) has about 80 permanent staff. They include Sebastian Primpke, a polymer scientist who studies the growing danger that microplastics and microfibres pose to the oceans.

Microplastics – small plastic particles of any kind – generally range in size from one micron to about 5 mm. They are a big danger for fish and other forms of marine life, as Marric Stephens reported in this recent feature.

Primpke studies microplastics using biofilms attached to a grid immersed in a tank containing water piped continuously in from the North Sea. The tank is covered with a lid to keep samples in the dark, mimicking underwater conditions.

Photo of researcher looking at a computer screen.
Deep-sea spying A researcher looks at electron micrographs to spot microfibres in seawater samples. (Courtesy: Matin Durrani)

He and his team periodically remove samples from the films, studying them in the lab using infrared and Raman microscopes. They’re able to obtain information such as the length, width, area and perimeter of individual microplastic particles, as well as how convex or concave they are.
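The article doesn’t name the analysis software, but metrics like these are routinely extracted from thresholded micrographs with standard image-processing libraries. Below is a minimal sketch in Python using scikit-image (an assumption on our part – the lab’s actual toolchain is not named), with a synthetic binary image standing in for real data:

```python
# Minimal sketch of particle-shape analysis, assuming scikit-image;
# the Helgoland lab's actual toolchain is not named in the article.
import numpy as np
from skimage import measure

# Synthetic stand-in for a thresholded micrograph: 1 = particle, 0 = background
image = np.zeros((64, 64), dtype=np.uint8)
image[10:30, 12:18] = 1   # an elongated, fibre-like particle
image[40:52, 40:52] = 1   # a blocky, roughly convex particle

labels = measure.label(image)              # label connected particles
for region in measure.regionprops(labels):
    print(f"length    = {region.major_axis_length:.1f} px")
    print(f"width     = {region.minor_axis_length:.1f} px")
    print(f"area      = {region.area:.0f} px^2")
    print(f"perimeter = {region.perimeter:.1f} px")
    # solidity = area / convex-hull area: 1.0 is fully convex,
    # lower values indicate concave, irregular outlines
    print(f"solidity  = {region.solidity:.2f}")
```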

Other researchers at the Helgoland lab study microfibres, which can come from cellulose and artificial plastics, using electron microscopy. You can find out more information about the lab’s work here.

Primpke, who is a part-time firefighter, has lived and worked on Helgoland for a decade. He says it’s a small community, where everyone knows everyone else, which has its good and bad sides.

With only 1500 residents on the island, which lies 50 km from the mainland, finding good accommodation can be tricky. But with so many tourists, there are more amenities than you’d expect of somewhere of that size.

 

The post Helgoland researchers seek microplastics and microfibres in the sea appeared first on Physics World.

Exploring careers in healthcare for physicists and engineers

12 juin 2025 à 15:55

In this episode of the Physics World Weekly podcast we explore the career opportunities open to physicists and engineers looking to work within healthcare – as medical physicists or clinical engineers.

Physics World’s Tami Freeman is in conversation with two early-career physicists working in the UK’s National Health Service (NHS). They are Rachel Allcock, a trainee clinical scientist at University Hospitals Coventry and Warwickshire NHS Trust, and George Bruce, a clinical scientist at NHS Greater Glasgow and Clyde. We also hear from Chris Watt, head of communications and public affairs at IPEM, about the new IPEM careers guide.

This episode is supported by Radformation, which is redefining automation in radiation oncology with a full suite of tools designed to streamline clinical workflows and boost efficiency. At the centre of it all is AutoContour, a powerful AI-driven autocontouring solution trusted by centres worldwide.

The post Exploring careers in healthcare for physicists and engineers appeared first on Physics World.

Quantum island: why Helgoland is a great spot for fundamental thinking

12 juin 2025 à 02:00

Jack Harris, a quantum physicist at Yale University in the US, has a fascination with islands. He grew up on Martha’s Vineyard, an island just south of Cape Cod on the east coast of America, and believes that islands shape a person’s thinking. “Your world view has a border – you’re on or you’re off,” Harris said on a recent episode of the Physics World Stories podcast.

It’s perhaps not surprising, then, that Harris is one of the main organizers of a five-day conference taking place this week on Helgoland, where Werner Heisenberg discovered quantum mechanics exactly a century ago. Heisenberg had come to the tiny, windy, pollen-free island, which lies 50 km off the coast of Germany, in June 1925, to seek respite from the hay fever he was suffering from in Göttingen.

According to Heisenberg’s 1971 book Physics and Beyond, he supposedly made his breakthrough early one morning that month. Unable to sleep, Heisenberg left his guest house just before daybreak and climbed a tower at the top of the island’s southern headland. As the Sun rose, Heisenberg pieced together the curious observations of frequencies of light that materials had been seen to absorb and emit.

Photo of memorial stone on Helgoland honouring Werner Heisenberg
Where it all began This memorial stone and plaque sits at the spot of Werner Heisenberg’s achievements 100 years ago. (Courtesy: Matin Durrani)

While admitting that the real history of the episode isn’t as simple as Heisenberg made out, Harris believes it’s nevertheless a “very compelling” story. “It has a place and a time: an actual, clearly defined, quantized discrete place – an island,” Harris says. “This is a cool story to have as part of the fabric of [the physics] community.” Hardly surprising, then, that more than 300 physicists, myself included, have travelled from across the world to the Helgoland 2025 meeting.

Much time has been spent so far at the event discussing the fundamentals of quantum mechanics, which might seem a touch self-indulgent and esoteric given the burgeoning (and financially lucrative) applications of the subject. Do we really need to concern ourselves with, say, non-locality, the meaning of measurement, or the nature of particles, information and randomness?

Why did we need to hear Juan Maldacena, from the Institute for Advanced Study in Princeton, get so excited about information loss and black holes? (Fun fact: a “white” black hole the size of a bacterium would, he claimed, be as hot as the Sun and emit so much light we could see it with the naked eye.)

But the fundamental questions are fascinating in their own right. What’s more, if we want to build, say, a quantum computer, it’s not just a technical and engineering endeavour. “To make it work you have to absorb a lot of the foundational topics of quantum mechanics,” says Harris, pointing to challenges such as knowing what kinds of information alter how a system behaves. “We’re at a point where real-word practical things like quantum computing, code breaking and signal detection hinge on our ability to understand the foundational questions of quantum mechanics.”

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the next 12 months for more coverage of the IYQ.

Find out more on our quantum channel.

The post Quantum island: why Helgoland is a great spot for fundamental thinking appeared first on Physics World.

‘The Trump uncertainty principle’ is destroying the position and momentum of US science

11 juin 2025 à 12:00

The Heisenberg uncertainty principle holds things together. Articulated by the German physicist Werner Heisenberg almost a century ago, it remains the foundation of the physical world. Its name suggests the rule of the vague and temporary. But the principle is quantitative. A high uncertainty about the position of, say, an electron is compensated by a low uncertainty in its momentum. The principle is vital in helping us to understand chemical bonding, which is what holds matter together.
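For the record, the quantitative statement is the familiar inequality

\[
\Delta x\,\Delta p \ge \frac{\hbar}{2},
\]

where Δx and Δp are the standard deviations of position and momentum and ħ is the reduced Planck constant: squeeze one, and the other must grow.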

The Trump uncertainty principle, which I hereby coin, does the opposite; it tears things apart. Having taken effect on the US president’s inauguration day back in January, it almost immediately began damaging scientific culture. Researchers can no longer be sure if their grants will be delayed or axed – or if new proposals are even in the ballpark of the potentially fundable. Work is being stalled, erased or doomed, especially in the medical and environmental sciences.

The Trump uncertainty principle, or TUP for short, is implemented in several ways. One is through new policies at funding agencies like the National Science Foundation (NSF) and the National Institutes of Health (NIH). Those new policies, the administration claims, are designed to promote “science, national health, prosperity, and defense”. Despite being exactly the same as the old policies, they’ve been used to justify the cancellation of 400 grants at the NSF alone and to hollow out the NSF, NIH and other key US science funding agencies.

The Trump administration has sought to terminate billions of dollars’ worth of grants at Harvard University alone. It wants to ban US universities from recruiting international students and has even been cancelling the visas of current students, many of whom are enrolled in the sciences. It also wants to vet what prospective students have posted on social media, despite Trump’s supposed support for free speech. Harvard is already suing the administration over these actions.

Back in March the Office for Civil Rights of the US Department of Education sent letters to Harvard and 59 other universities, including Columbia, Cornell, Princeton, Stanford and Yale, accusing them of what it considers “discrimination and harassment”. The office threatened “potential enforcement actions if institutions do not fulfil their obligations under Title VI of the Civil Rights Act”, which “prohibits discrimination against or otherwise excluding individuals on the basis of race, color, or national origin”.

“Saddening, traumatic and unnecessary”

But the impact of the Trump uncertainty principle reaches far beyond these 60 institutions, because it is destroying the bonding between those institutions and the labs, universities and companies that collaborate with them. It is also badly damaging the hiring of postdocs, the ability to attract undergraduates, the retention of skilled support staff, and laboratory maintenance. Most disruptively of all, the Trump uncertainty principle provides no explanation for why or where it shows up, or what it is going to be applied to.

Stony Brook University, where I teach, is a research incubator not on the list of 60 institutions of higher learning threatened by the Department of Education. But many of my colleagues have had their NIH, NSF or Department of Energy funding paused, left unrenewed, or suspended without explanation, and nobody could tell them whether or when it might be restored or why it was stopped in the first place.

Support for 11 graduate students at Stony Brook was terminated. Though it was later restored after months of uncertainty, nobody knows if it might happen again. I, too, had a grant stopped, though it was due to a crude error and the money started up again. Everyone in the sciences I’ve spoken to – faculty, staff and students – is affected in one way or another by the Trump uncertainty principle even if they haven’t lost funding or jobs.

It is easy to sound hyperbolic. It is possible that Trump’s draconian cuts may be reversed, that the threats won’t be implemented, that they won’t stand up in court, and that the Trump administration will actually respect the court decisions. But that’s not the point. You can’t plan ahead if you are unsure how much money you have, or even why you may be in the administration’s cross-hairs. That’s what is most destructive to US science. It’s also saddening, traumatic and unnecessary.

Maintaining any culture, including an academic research one, requires supporting an active and ongoing dynamic between past, present and future. It consists of an inherited array of resources, a set of ideas about how to go forward, and existing habits and practices about how best to move from one to the other. The Trump administration targets all three. It has slashed budgets and staff of long-standing scientific institutions and redirected future-directed scientific programmes at its whim. The Trump uncertainty principle also comes into play by damaging the existing habits and practices in the present.

The critical point

In his 2016 book The Invention of Science, David Wootton – a historian at the University of York in the UK – defined scientific culture as being “innovative, combative, competitive, but at the same time obsessed with accuracy”. Science isn’t the only kind of culture, he admitted, but it’s “a practical and effective one if your goal is the acquisition of new knowledge”. It seeks to produce knowledge about the world that can withstand criticism – “bomb-proof”, as Wootton put it.

Bomb-proof knowledge is what Trump fears the most, and he is undermining it by injecting uncertainty into the culture that produces it. The administration says that the Trump uncertainty principle is grounded in the fight against financial waste, fraud and discrimination. But proof of the principle is missing.

How do you save money by ending, say, a programme aimed at diagnosing tuberculosis? Why does a study of maternal health promote discrimination? What does research into Alzheimer’s disease have to do with diversity? Has ending scientific study of climate change got anything to do with any of this?

The justifications are not credible, and their lack of credibility is a leading factor in damaging scientific culture. Quite simply, the Trump uncertainty principle is destroying the position and momentum of US science.

The post ‘The Trump uncertainty principle’ is destroying the position and momentum of US science appeared first on Physics World.

Sound waves control droplet movement in microfluidic processor

11 juin 2025 à 10:00

Thanks to a new sound-based control system, a microfluidic processor can precisely manipulate droplets with an exceptionally broad range of volumes. The minimalist device is compatible with many substrates, including metals, polymers and glass. It is also biocompatible, and its developers at the Hong Kong Polytechnic University say it could be a transformative tool for applications in biology, chemistry and lab-on-a-chip systems.

Nano- and microfluidic systems use the principles of micro- and nanotechnology, biochemistry, engineering and physics to manipulate the behaviour of liquids on a small scale. Over the past few decades, they have revolutionized fluid processing, enabling researchers in a host of fields to perform tasks on chips that would previously have required painstaking test-tube-based work. The benefits include real-time, high-throughput testing for point-of care diagnostics using tiny sample sizes.

Microfluidics also play a role in several everyday technologies, including inkjet printer heads, pregnancy tests and, as the world recently discovered, tests for viruses like SARS-CoV-2, which causes COVID-19. Indeed, the latter example involves a whole series of fluidic operations, as viral RNA is extracted from swabs, amplified and quantified using the polymerase chain reaction (PCR).

In each of these operations, it is vital to avoid contaminating the sample with other fluids. Researchers have therefore been striving to develop contactless techniques – for instance, those that rely on light, heat or magnetic and electric fields to move the fluids around. However, such approaches often require strong fields or high temperatures that can damage delicate chemical or biological samples.

In recent years, scientists have experimented with using acoustic fields instead. However, this method was previously found to work only for certain types of fluids, and with a limited volume range from hundreds of nanolitres (nL) to tens of microlitres (μL).

Versatile, residue-free fluid control

The new sound-controlled fluidic processor (SFP) developed by Liqiu Wang and colleagues is not bound by this limit. Thanks to an ultrasonic transducer and a liquid-infused slippery surface that minimizes adhesion of the samples, it can manipulate droplets with volumes from 1 nL to 3000 μL. “By adjusting the sound source’s position, we can shape acoustic pressure fields to push, pull, mix or even split droplets on demand,” explains Wang. “This method ensures versatile, residue-free fluid control.”

The technique’s non-invasive nature and precision make it ideal for point-of-care diagnostics, drug screening and automated biochemical assays, Wang adds. “It could also help streamline reagent delivery in high-throughput systems,” he tells Physics World.

A further use, Wang suggests, would be fundamental biological applications such as organoid research. Indeed, the Hong Kong researchers demonstrated this by culturing mouse primary liver organoids and screening for molecules like verapamil, a drug that can protect the liver by preventing harmful calcium buildup.

Wang and colleagues, who report their work in Science Advances, say they now plan to integrate their sound-controlled fluidic processor into fully automated, programmable lab-on-a-chip systems. “Future steps include miniaturization and incorporating multiple acoustic sources for parallel operations, paving the way for next-generation diagnostics and chemical processing,” Wang reveals.

The post Sound waves control droplet movement in microfluidic processor appeared first on Physics World.

Quartet of Nobel laureates sign Helgoland’s ‘gold book’

11 juin 2025 à 00:04

The first session at the Helgoland 2025 meeting marking the centenary of quantum mechanics began with the four Nobel-prize-winning physicists in attendance being invited on stage to sign the island’s memorial “gold book” and add a short statement to it.

Anton Zeilinger and Alain Aspect, who shared the 2022 Nobel prize with John Clauser for their work on entanglement and quantum information science, were first up on stage. They were followed by Serge Haroche and David Wineland, who shared the 2012 prize for their work on measuring and manipulating quantum systems.

During the coffee break, the book was placed on display for participants to view and add their own signatures if they wished. Naturally, being the nosey person I am, I was keen to see what the Nobel laureates had written.

Photo of four Nobel laureates on stage at Helgoland 2025.
Signing ceremony (From left to right) Anton Zeilinger, Alain Aspect, Serge Haroche and David Wineland troop on stage to sign the Helgoland book. (Courtesy: Matin Durrani)

Here, for the record, are their comments.

“Great sailing. Great people.” Anton Zeilinger

“C’est une émotion de se trouver à l’endroit où a commencé la mécanique quantique.” Alain Aspect [It’s an emotional feeling to find yourself in the place where quantum mechanics started.]

“Thank you for your warm welcome in Helgoland, an island which is known by all quantum physicists.” Serge Haroche

“An honor to be here.” David Wineland

All the comments made sense to me apart from that of Zeilinger, so after the evening’s panel debate on the foundations of quantum mechanics, in which he had taken part, I asked him what the reference to sailing was all about.

It turns out that Zeilinger – like Albert Einstein before him – is a keen sailor in his spare time, and he and his wife had come to Helgoland three days before the conference began to see the final stages of a North Sea regatta that takes place in late spring every year.

In fact, Zeilinger explained that the Helgoland meeting had to start on a Tuesday because, the day before, the venue had hosted the regatta’s awards ceremony.

As for the flag, it is that of Helgoland, with the green representing the land, the red for the island’s cliffs and the white for the sand on the beaches.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the next 12 months for more coverage of the IYQ.

Find out more on our quantum channel.

The post Quartet of Nobel laureates sign Helgoland’s ‘gold book’ appeared first on Physics World.

Conference marking 100 years of quantum mechanics starts in Hamburg

10 juin 2025 à 15:04

“This is a birthday party! Happy 100th birthday quantum mechanics,” said Jack Harris from Yale University in the US to whoops and cheers in the banqueting suite of the Hotel Atlantic in Hamburg, Germany.

Harris was addressing the 300 or so physicists attending the Helgoland 2025 conference, which is taking place from 9–14 June to mark Werner Heisenberg’s seminal work on quantum mechanics on the island of Helgoland in the North Sea exactly 100 years ago.

Photo of delegates at Helgoland 2025
Time to celebrate Participants gather ahead of the conference buffet dinner. (Courtesy: Matin Durrani)

“Heisenberg travelled to Helgoland to escape terrible allergies,” Harris told delegates, reminding them of how the 23-year-old had taken leave of absence from his postdoc supervisor Max Born in Göttingen for the fresh air of the treeless island. “His two weeks there was one of the watershed events in the discovery of quantum mechanics.”

Harris admitted, though, that it’s open to debate if Heisenberg’s fortnight on the island was as significant as is often made out, joking that – like quantum mechanics itself – “there are many interpretations that one can apply to this occasion”.

In one interpretation I hadn’t considered before, Harris pointed out that what might be regarded as an impediment or a disability – Heisenberg’s severe hay fever – turned out to be a positive force for science. “It actually brought him to Helgoland in the first place.”

Harris also took the opportunity to remind the audience of the importance of mentoring and helping each other in science. “How we treat others is as important as what we accomplish”, he said. “Another high standard to keep in mind is that science needs to be international and science needs to be inclusive. I am preaching to the choir but this is important to say out loud.”

Photo of Philip Ball at a conference
Destination Helgoland Science writer Philip Ball addresses delegates on the early years of quantum mechanics. (Courtesy: Matin Durrani)

Harris’s opening remarks were followed by a series of three talks. First was Douglas Stone from Yale University, who discussed the historical development of quantum science.

Next up was philosopher of science Elise Crull from the City University of New York, who looked into some of the early debates about the philosophical implications of quantum physics – including the pioneering contributions of Grete Hermann, who Sidney Perkowitz discussed in his recent feature for Physics World.

The final after-dinner speaker was science journalist Philip Ball, who explained how quantum theory developed in 1924–25 in the run-up to Helgoland. He focused, as he did in his recent feature for Physics World, on work carried out by Niels Bohr and others that turned out to be wrong but showed the intense turmoil in physics on the brink of quantum mechanics.

Helgoland 2025 features a packed five days of talks, poster sessions and debates – on the island of Helgoland itself – covering the past, present and future of quantum physics, with five Nobel laureates in attendance. In fact, Harris and his fellow scientific co-organizers – Časlav Brukner, Steven Girvin and Florian Marquardt – had so much to squeeze in that they could easily have “filled two or three solid programmes with people from whom we would have loved to hear”.

I’ll see over the next few days on Helgoland if they made the right speaker choices, but things have certainly got off to a good start.

• Elise Crull is appearing on the next episode of Physics World Live on Tuesday 17 June. You can register for free at this link.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the next 12 months for more coverage of the IYQ.

Find out more on our quantum channel.

 

The post Conference marking 100 years of quantum mechanics starts in Hamburg appeared first on Physics World.

Beyond the classroom: a high-school student’s week at the Institute of Physics

10 juin 2025 à 11:28

Year 12 students (aged 16 or 17) often do work experience while studying for their A-levels. It can provide valuable insights into what the working world is like and showcase what potential career routes are available. And that’s exactly why I requested to do my week of work experience at the Institute of Physics (IOP).

I’m studying maths, chemistry and physics, with a particular interest in the latter. I’m hoping to study physics or chemical physics at university so was keen to find out how the subject can be applied to business, and get a better understanding of what I want to do in the future. The IOP was therefore a perfect placement for me and here are a few highlights of what I did.

Monday

My week at the IOP’s headquarters in London began with a brief introduction to the Institute with the head of science and innovation, Anne Crean, and Katherine Platt, manager for the International Year of Quantum Science and Technology (IYQ). Platt, who planned and supervised my week of activities, then gave me a tour of the building and explained more about the IOP’s work, including how it aims to nurture upcoming physics innovation and projects, and give businesses and physicists resources and support.

My first task was working with Jenny Lovell, project manager in the science and innovation team. While helping her organize the latest round of the IOP’s medals and awards, she explained why the IOP honours the physics community in this way and described the different degrees of achievement that it recognizes.

Next I got to meet the IOP’s chief executive officer Tom Grinyer and, unexpectedly, the president-elect Michele Dougherty, who is a space physicist at Imperial College London. They are both inspiring people, who gave me some great advice about how I might go about my future in physics. They talked about the exciting opportunities available as a woman in physics, and how no matter where I start, I can go into many different sectors as the subject is so applicable.

Michele Dougherty, Naeya Mistry and Tom Grinyer at the Institute of Physics, London
Top people Naeya Mistry (centre) got some valuable advice from the chief executive officer of the Institute of Physics, Tom Grinyer (right), and the president-elect, Michele Dougherty (left). (Courtesy: IOP)

To round off the day, I sat in a meeting about how the science and innovation team can increase engagement, before starting on a presentation I was due to make on Thursday about quantum physics and young people.

Tuesday

My second day began with a series of meetings. First up was the science and innovation team’s weekly stand-up meeting. I then attended a larger staff meeting with most of IOP’s employees, which proved informative and gave me a chance to see how different teams interact with each other. Next was the science and innovation managers’ meeting, where I took the minutes as they spoke.

I then met the data science lead, Robert Cocking, who went through his work on data insights. He talked about IOP membership statistics in the UK and Ireland, as well as age and gender splits, and how he can do similar breakdowns for the different areas of special interest (such as quantum physics or astronomy). I found the statistics around the representation of girls in the physics community, specifically at A-level, particularly fascinating as it applies to me. Notably, although a lower percentage of girls take A-level physics compared to boys, a higher proportion of those girls go on to study it at university.

The day ended with some time to work on my presentation and research different universities and pathways I could take once I have finished my A-levels.

Wednesday

It was a steady start to Wednesday as I continued with my presentation and research with Platt’s help. Later in the morning, I attended a meeting with the public engagement team about Mimi’s Tiny Adventure, a children’s book written by Toby Shannon-Smith, public programmes manager at IOP, and illustrated by Pauline Gregory. The book, which is the third in the Mimi’s Adventures series, is part of the IOP’s Limit Less campaign to engage young people in physics, and will be published later this year to coincide with the IYQ. It was interesting to see how the IOP advertises physics to a younger audience and makes it more engaging for them.

Platt and I then had a video call with the Physics World team at IOP Publishing in Bristol, joining for their daily news meeting before having an in-depth chat with the editor-in-chief, Matin Durrani, and feature editors, Tushna Commissariat and Sarah Tesh. After they gave me a brief introduction to the magazine, website and team structure, we discussed physics careers. It was good to hear the editors’ insights as they cover a broad range of jobs in Physics World and all have a background in physics. It was particularly good to hear from Durrani as he studied chemical physics, which combines my three subjects and my passions.

Thursday

On Thursday I met David Curry, founder of Quantum Base Alpha – a start-up using quantum-inspired algorithms to solve issues facing humanity. We talked about physics in a business context, what he and his company do, and what he hopes for the future of quantum.

I then gave my presentation on “Why should young people care about quantum?”. I detailed the importance of quantum physics, the major things happening in the field and what it can become, as well as the careers quantum will offer in the future. I also discussed diversity and representation in the physics community, and how that is translated to what I see in everyday life, such as in my school and class. As a woman of colour going into science, technology, engineering and mathematics (STEM), I think it is important for me to have conversations around diversity of both gender and race, and the combination of the two. After my presentation, Curry gave me some feedback, and we discussed what I am aiming to do at university and beyond.

Friday

For my final day, I visited the University of Sussex, where I toured the campus with Curry’s daughter Kitty, an undergraduate student studying social sciences. I then met up again with Curry, who introduced me to Thomas Clarke, a PhD student in Sussex’s ion quantum technologies group. We went to the physics and maths building, where he explained the basics of quantum computing to me, and the struggles the group faces implementing it on a larger scale.

Clarke then gave us a tour of the lab that he shares with other PhD students, and showed us his experiments, which consisted of multiple lasers that made up their trapped ion quantum computing platform. As we read off the oscilloscope attached to the laser system, it was interesting to hear that a lot of his work involved trial and error, and the visit helped me realize that I am probably more interested in the experimental side of physics than in pure theory.

My work experience week at the IOP has been vital in helping me to understand how physics can be applied in both business and academia. Thanks to the IOP’s involvement in the IYQ, I now have a deeper understanding of quantum science and how it might one day be applied to almost every aspect of physics – including chemical physics – as the sector grows in interest and funding. It’s been an eye-opening week, and I’ve returned to school excited and better informed about my potential next career steps.

The post Beyond the classroom: a high-school student’s week at the Institute of Physics appeared first on Physics World.

Generative AI speeds medical image analysis without impacting accuracy

10 juin 2025 à 10:05

Artificial intelligence (AI) holds great potential for a range of data-intensive healthcare tasks: detecting cancer in diagnostic images, segmenting images for adaptive radiotherapy and perhaps one day even fully automating the radiation therapy workflow.

Now, for the first time, a team at Northwestern Medicine in Illinois has integrated a generative AI tool into a live clinical workflow to draft radiology reports on X-ray images. In routine use, the AI model increased documentation efficiency by an average of 15.5%, while maintaining diagnostic accuracy.

Medical images such as X-ray scans play a central role in diagnosing and staging disease. To interpret an X-ray, a patient’s imaging data are typically input into the hospital’s PACS (picture archiving and communication system) and sent to radiology reporting software. The radiologist then reviews and interprets the imaging and clinical data and creates a report to help guide treatment decisions.

To speed up this process, Mozziyar Etemadi and colleagues proposed that generative AI could create a draft report that radiologists could then check and edit, saving them from having to start from scratch. To enable this, the researchers built a generative AI model specifically for radiology at Northwestern, based on historical data from the 12-hospital Northwestern Medicine network.

They then integrated this AI model into the existing radiology clinical workflow, enabling it to receive data from the PACS and generate a draft AI report. Within seconds of image acquisition, this report is available within the reporting software, enabling radiologists to create a final report from the AI-generated draft.
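As a concrete – and entirely hypothetical – picture of that flow, the sketch below mimics the plumbing described in the article: a study arrives from the PACS, a generative model drafts a report, and the draft lands in the reporting software for the radiologist to edit. None of the names or structures are from the Northwestern system.

```python
# Hypothetical sketch of the PACS -> model -> reporting-software flow
# described above; names and structure are illustrative, not Northwestern's.
from dataclasses import dataclass

@dataclass
class Study:
    study_id: str
    pixel_data: bytes       # image bytes as received from the PACS
    clinical_context: str   # e.g. indication or order information

def draft_report(study: Study) -> str:
    """Stand-in for the generative model: returns a draft for review."""
    return (f"DRAFT [{study.study_id}] Findings: ... Impression: ... "
            "(AI-generated; pending radiologist review)")

def on_study_received(study: Study, reporting_queue: dict[str, str]) -> None:
    # Within seconds of acquisition, the draft is pushed into the
    # reporting software, where the radiologist edits and signs it.
    reporting_queue[study.study_id] = draft_report(study)

queue: dict[str, str] = {}
on_study_received(Study("CXR-0001", b"", "cough, rule out pneumonia"), queue)
print(queue["CXR-0001"])
```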

“Radiology is a great fit [for generative AI] because the practice of radiology is inherently generative – radiologists are looking very carefully at images and then generating text to summarize what is in the image,” Etemadi tells Physics World. “This is similar, if not identical, to what generative models like ChatGPT do today. Our [AI model] is unique in that it is far more accurate than ChatGPT for this task, was developed years earlier and is thousands of times less costly.”

Clinical application

The researchers tested their AI model on radiographs obtained at Northwestern hospitals over a five-month period, reporting their findings in JAMA Network Open. They first examined the AI model’s impact on documentation efficiency for 23 960 radiographs. Unlike previous AI investigations that only used chest X-rays, this work covered all anatomies, with 18.3% of radiographs from non-chest sites (including the abdomen, pelvis, spine, and upper and lower extremities).

Use of the AI model increased report completion efficiency by 15.5% on average – reducing mean documentation time from 189.2 s to 159.8 s – with some radiologists achieving gains as high as 40%. The researchers note that this corresponds to a time saving of more than 63 h over the five months, representing a reduction from roughly 79 to 67 radiologist shifts.
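Those headline numbers are self-consistent: the quoted means give

\[
\frac{189.2\ \mathrm{s} - 159.8\ \mathrm{s}}{189.2\ \mathrm{s}} \approx 0.155,
\]

which is the reported 15.5% average gain in documentation efficiency.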

To assess the quality of the AI-based documentation, they investigated the rate at which addenda (used to rectify reporting errors) were made to the final reports. Addenda were required in 17 model-assisted reports and 16 non-model reports, suggesting that use of AI did not impact the quality of radiograph interpretation.

To further verify this, the team also conducted a peer review analysis – in which a second radiologist rates a report according to how well they agree with its findings and text quality – in 400 chest and 400 non-chest studies, split evenly between AI-assisted and non-assisted reports. The peer review revealed no differences in clinical accuracy or text quality between AI-assisted and non-assisted interpretations, reinforcing the radiologist’s ability to create high-quality documentation using the AI.

Rapid warning system

Finally, the researchers applied the model to flag unexpected life-threatening pathologies, such as pneumothorax (collapsed lung), using an automated prioritization system that monitors the AI-generated reports. The system exhibited a sensitivity of 72.7% and specificity of 99.9% for detecting unexpected pneumothorax. Importantly, these priority flags were generated between 21 and 45 s after study completion, compared with a median of 24.5 min for radiologist notifications.
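For readers less familiar with the jargon, these two figures have their standard definitions:

\[
\mathrm{sensitivity} = \frac{TP}{TP + FN}, \qquad \mathrm{specificity} = \frac{TN}{TN + FP},
\]

where TP, FN, TN and FP are the numbers of true positives, false negatives, true negatives and false positives, respectively. In other words, the system caught nearly three-quarters of the unexpected pneumothoraces while raising almost no false alarms.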

Etemadi notes that previous AI systems were designed to detect specific findings and output a “yes” or “no” for each disease type. The team’s new model, on the other hand, creates a full text draft containing detailed comments.

“This precise language can then be searched to make more precise and actionable alerts,” he explains. “For example, we don’t need to know if a patient has a pneumothorax if we already know they have one and it is getting better. This cannot be done with existing systems that just provide a simple yes/no response.”

The team is now working to increase the accuracy of the AI tool, to enable more subtle and rare findings, as well as expand beyond X-ray images. “We currently have CT working and are looking to expand to MRI, ultrasound, mammography, PET and more, as well as modalities beyond radiology like ophthalmology and dermatology,” says Etemadi.

The researchers conclude that their generative AI tool could help alleviate radiologist shortages, with radiologist and AI collaborating to improve clinical care delivery. They emphasize, though, that the technology won’t replace humans. “You still need a radiologist as the gold standard,” says co-author Samir Abboud in a press statement. “Our role becomes ensuring every interpretation is right for the patient.”

The post Generative AI speeds medical image analysis without impacting accuracy appeared first on Physics World.

There’s an elephant in the room at the Royal Society – and for once, it’s not (just) Elon Musk

9 juin 2025 à 16:00

Just over a week ago, US President Donald Trump released a budget proposal that would, if enacted, eviscerate science research across the country. Among other cuts, it proposes a 57% drop (relative to 2024) in funding for the National Science Foundation (NSF), which provides the lion’s share of government support for basic science. Within this, the NSF’s physics and mathematics directorate stands to lose more than a billion dollars, or 67% of its funding. And despite the past closeness between Trump and SpaceX boss Elon Musk, NASA faces cuts of 24%, including 50% of its science budget.

Of course, the US is not the only nation that funds scientific research, any more than NASA is the only agency that sends spacecraft to explore the cosmos. Still, both are big enough players (and big enough partners for the UK) that I expected these developments to feature at least briefly at last Tuesday’s Royal Society conference on the future of the UK space sector.

During the conference’s opening session, it occasionally seemed like David Parker, a former chief executive of the UK Space Agency (UKSA) who now works for the European Space Agency (ESA), might say a few words on the subject. His opening remarks focused on lessons the UK could learn from the world’s other space agencies, including NASA under the first Trump administration. At one point, he joked that all aircraft have four dimensions: span, length, height and politics. But as for the politics that threaten NASA in Trump’s second administration, Parker was silent.

Let’s talk about something else

This silence continued throughout the morning. All told, 19 speakers filed on and off the stage at the Royal Society’s London headquarters without so much as mentioning what the Nobel-Prize-winning astrophysicist Adam Riess called an “almost extinction level” event for research in their field.

The most surreal omission was in a talk by Sheila Rowan, a University of Glasgow astrophysicist and past president of the Institute of Physics (which publishes Physics World). Rowan was instrumental in the 2015 detection of gravitational waves at the US-based Laser Interferometer Gravitational-Wave Observatory (LIGO), and her talk focused on gravitational-wave research. Despite this, she did not mention that Trump’s budget would eliminate funding for one of the two LIGO detectors, drastically reducing the research LIGO can do.

When I contacted Rowan to ask why this was, she replied that she had prepared her talk before the budget was announced. The conference, she added, was “a great example of how fantastic science benefits not just the UK, but society more broadly, and globally, and that is a message we must never stop explaining”.

What’s at stake

Rowan ended her talk on a similarly positive note, with hopeful words about the future. “The things that will fly in 2075, we are thinking about now,” she said.

In some cases, that may be true. However, if Trump’s budget passes both houses of the US Congress (the House of Representatives has already passed a bill that would enact most of the administration’s wishes), the harsh reality is that many things space scientists are thinking about will never fly at all.

Over at Astrobites, a site where PhD students write about astronomy and astrophysics for undergraduates, Arizona State University student Skylar Grayson compiled a depressingly long list of threatened missions. Like other graphics that have circulated on social media since the budget announcement, Grayson’s places red X’s – indicating missions that are “fully dead” under the new budget – over dozens of projects. Affected missions range from well-known workhorses like Mars Orbiter and New Horizons to planning-stage efforts like the next-generation Earth-observing satellite Landsat Next. According to Landsat Next’s NASA webpage (still live at the time of writing), it is expected to launch no earlier than 2031. What does its future look like now?

And NASA’s own missions are just the start. Several missions led by other agencies – including high-profile ones like ESA’s Rosalind Franklin Mars rover – are under threat. This is because the new NASA budget would eliminate the US’s share of their funding, forcing partners to pick up the tab or see their investments go to waste. Did that possibility not deserve some mention at a conference on the future of the UK space sector?

The elephant in the room

Midway through the conference, satellite industry executive Andrew Stanniland declared that he was about to mention the “elephant in the room”. At last, I thought. Someone’s going to say something. However, Stanniland’s “elephant” was not the proposed gutting of NASA science. Instead, he wanted to discuss the apparently taboo topic of the Starlink network of communications satellites.

Like SpaceX, Tesla and, until recently, Trump’s budget-slashing “department of government efficiency”, Starlink is a Musk project. Musk is a Fellow of the Royal Society, and he remains so after the society’s leadership rejected a grassroots effort to remove him for, inter alia, calling for the overthrow of the UK government. Could it be that speakers were avoiding Musk, Trump and the new US science budget to spare the Royal Society’s blushes?

Exasperated, I submitted a question to the event’s online Q&A portal. “The second Trump administration has just proposed a budget for NASA that would gut its science funding,” I wrote. “How is this likely to affect the future of the space sector?” Alas, the moderator didn’t choose my question – though in fairness, five others also went unanswered, and Rowan, for the record, says that she could “of course” talk about whatever she wanted to.

Finally, in the event’s second-to-last session, the elephant broke through. During a panel discussion on international collaboration, an audience member asked, “Can we really operate [collaboratively] when we have an administration that’s causing irreparable harm to one of our biggest collaborators on the space science stage?”

In response, panellist Gillian Wright, a senior scientist at the UK Astronomy Technology Centre in Edinburgh, called it “an incredibly complicated question given the landscape is still shifting”. Nevertheless, she said, “My fear is that what goes won’t come back easily, so we do need to think hard about how we keep those scientific connections alive for the future, and I don’t know the answer.” The global focus of space science, Wright added, may be shifting away from the US and towards Europe and the global south.

And that was it.

A question of leadership

I logged out of the conference feeling depressed – and puzzled. Why had none of these distinguished speakers (partially excepting Wright) addressed one of the biggest threats to the future of space science? One possible answer, suggested to me on social media by the astrophysicist Elizabeth Tasker, is that individuals might hesitate to say anything that could be taken as an official statement, especially if their organization needs to maintain a relationship with the US. “I think it needs to be an agency-released statement first,” said Tasker, who works at (but was not speaking for) the Japan Aerospace Exploration Agency (JAXA). “I totally agree that silence is problematic for the community, and I think that’s where official statements come in – but those may need more time.”

Official statements from agencies and other institutions would doubtless be welcomed by members of the US science workforce whose careers and scientific dreams are at risk from the proposed budget. The initial signs, however, are not encouraging.

On the same day as the Royal Society event, the US’s National Academies of Sciences (NAS) hosted their annual “State of the Science” event in Washington, DC. According to reporting by John Timmer at Ars Technica, many speakers at this event were, if anything, even keener than the Royal Society speakers to avoid acknowledging the scale of the (real and potential) damage. A few oblique comments from NAS president Marcia McNutt; a few forthright ones from a Republican former congresswoman, Heather Wilson; but overall, a pronounced tendency to ignore the present in favour of a future that may never come.

Frankly, the scientific community on both sides of the Atlantic deserves better.

The post There’s an elephant in the room at the Royal Society – and for once, it’s not (just) Elon Musk appeared first on Physics World.

Quantum physics guides proton motion in biological systems

9 juin 2025 à 13:00

If you dig deep enough, you’ll find that most biochemical and physiological processes rely on shuttling hydrogen ions – protons – around living systems. Until recently, this proton transfer process was thought to occur when protons jump from water molecule to water molecule and between chains of amino acids. In 2023, however, researchers suggested that protons might, in fact, transfer at the same time as electrons. Scientists in Israel have now confirmed this is indeed the case, while also showing that proton movement is linked to the electrons’ spin, or magnetic moment. Since the properties of electron spin are defined by quantum mechanics, the new findings imply that essential life processes are intrinsically quantum in nature.

The scientists obtained this result by placing crystals of lysozyme – an enzyme commonly found in living organisms – on a magnetic substrate. Depending on the direction of the substrate’s magnetization, the spin of the electrons ejected from this substrate may be up or down. Once ejected from the substrate, the electrons enter the lysozyme crystals. There, they become coupled to phonons, or vibrations of the crystal lattice.

Crucially, this coupling is not random. Instead, the chirality, or “handedness”, of the phonons determines which electron spin they will couple with – a property known as chiral-induced spin selectivity.

Excited chiral phonons mediate electron spin coupling

When the scientists turned their attention to proton transfer through the lysozymes, they discovered that the protons moved much more slowly with one magnetization direction than they did with the opposite. This connection between proton transfer and spin-selective electron transfer did not surprise Yossi Paltiel, who co-led the study with his Hebrew University of Jerusalem (HUJI) colleagues Naama Goren, Nir Keren and Oded Livnah in collaboration with Nurit Ashkenazy of Ben Gurion University and Ron Naaman of the Weizmann Institute.

“Proton transfer in living organisms occurs in a chiral environment and is an essential process,” Paltiel says. “Since protons also have spin, it was logical for us to try to relate proton transfer to electron spin in this work.”

The finding could shed light on proton hopping in biological environments, Paltiel tells Physics World. “It may ultimately help us understand how information and energy are transferred inside living cells, and perhaps even allow us to control this transfer in the future.

“The results also emphasize the role of chirality in biological processes,” he adds, “and show how quantum physics and biochemistry are fundamentally related.”

The HUJI team now plans to study how the coupling between the proton transfer process and the transfer of spin polarized electrons depends on specific biological environments. “We also want to find out to what extent the coupling affects the activity of cells,” Paltiel says.

Their present study is detailed in PNAS.

The post Quantum physics guides proton motion in biological systems appeared first on Physics World.

Development and application of a 3-electrode setup for the operando detection of side reactions in Li-Ion batteries

9 juin 2025 à 11:13

Join us to learn about the development and application of a 3-electrode setup for the operando detection of side reactions in Li-ion batteries.

Detecting parasitic side reactions originating both from the cathode active materials (CAMs) and the electrolyte is paramount for developing more stable cell chemistries for Li-ion batteries. This talk will present a method for the qualitative analysis of electrolyte oxidation, as well as the quantification of lattice oxygen and transition-metal ions (TM ions) released from the CAM. It is based on a 3-electrode cell design employing a Vulcan carbon-based sense electrode (SE) that is held at a controlled voltage against a partially delithiated lithium iron phosphate (LFP) counter electrode (CE). At this SE, reductive currents can be measured while polarizing a CAM or carbon working electrode (WE) against the same LFP CE. In voltammetric scans, we show how the SE potential can be selected to specifically detect a given side reaction during CAM charge/discharge, allowing, e.g., discrimination between lattice oxygen, protons and dissolved TMs. Furthermore, it is shown via on-line electrochemical mass spectrometry (OEMS) that O2 reduction in the LP47 electrolyte used here consumes ~2.3 electrons/O2. Using this value, the lattice oxygen release deduced from the 3-electrode setup upon charging of the NCA WE is in good agreement with OEMS measurements up to NCA potentials of 4.65 V vs Li/Li⁺. At higher potentials, the contributions from the reduction of TM ions can be quantified by comparing the integrated SE current with the O2 evolution from OEMS.
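
As a rough illustration of how such a calibration factor is used in practice – the charge value below is invented, and only the ~2.3 electrons/O2 figure comes from the abstract – Faraday’s law converts an integrated reductive SE charge into an amount of reduced O2:

```python
# Sketch: convert an integrated sense-electrode (SE) charge into moles of
# reduced O2 via Faraday's law, using the ~2.3 electrons/O2 figure quoted
# above for the LP47 electrolyte. The charge value is purely illustrative.
F = 96485.332  # Faraday constant (C per mole of electrons)
Z_O2 = 2.3     # electrons consumed per O2 molecule (from the OEMS result)

def moles_o2_from_charge(q_coulombs: float) -> float:
    """Moles of O2 reduced at the SE for an integrated charge q."""
    return q_coulombs / (Z_O2 * F)

q_se = 0.050  # hypothetical integrated reductive SE charge, in coulombs
print(f"{moles_o2_from_charge(q_se) * 1e6:.2f} umol O2")  # ~0.23 umol
```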

Lennart Reuter headshot
Lennart Reuter

Lennart Reuter is a PhD student in the group of Prof Hubert A Gasteiger at the Chair of Technical Electrochemistry at TUM. His research focused on the interfacial processes in lithium-ion batteries that govern calendar life, cycle stability, and rate capability. He advanced the on-line electrochemical mass spectrometry (OEMS) technique to investigate gas evolution mechanisms from interfacial side reactions at the cathode and anode. His work also explored how SEI formation and graphite structural changes affect Li⁺ transport, using impedance spectroscopy and complementary analysis techniques.

 

Leonhard J Reinschluessel headshot
Leonhard J Reinschluessel

Leonhard J Reinschluessel is currently a PhD candidate at the Chair of Technical Electrochemistry in the Gasteiger research group at the Technical University of Munich (TUM). His current work encompasses an in-depth understanding of the complex interplay of cathode and electrolyte degradation mechanisms in lithium-ion batteries using operando lab-based and synchrotron techniques. He received his MSc in chemistry from TUM, where he investigated the mitigation of aging of FeNC-based cathode catalyst layers in PEMFCs in his thesis at the Gasteiger group.

The post Development and application of a 3-electrode setup for the operando detection of side reactions in Li-Ion batteries appeared first on Physics World.

People benefit from medicine, but machines need healthcare too

9 juin 2025 à 10:17

I began my career in the 1990s at a university spin-out company, working for a business that developed vibration sensors to monitor the condition of helicopter powertrains and rotating machinery. It was a job that led to a career developing technologies and techniques for checking the “health” of machines, such as planes, trains and trucks.

What a difference three decades has made. When I started out, we would deploy bespoke systems that generated limited amounts of data. These days, everything has gone digital and there’s almost more information than we can handle. We’re also seeing a growing use of machine learning and artificial intelligence (AI) to track how machines operate.

In fact, with AI being increasingly used in medical science – for example to predict a patient’s risk of heart attacks – I’ve noticed intriguing similarities between how we monitor the health of machines and the health of human bodies. Jet engines and hearts are very different objects, but in both cases monitoring devices give us a set of digitized physical measurements.

A healthy perspective

Sensors installed on a machine provide various basic physical parameters, such as its temperature, pressure, flow rate or speed. More sophisticated devices can yield information about, say, its vibration, acoustic behaviour, or (for an engine) oil debris or quality. Bespoke sensors might even be added if an important or otherwise unchecked aspect of a machine’s performance needs to be monitored – provided the benefits of doing so outweigh the cost.

Generally speaking, the sensors you use in a particular situation depend on what’s worked before and whether you can exploit other measurements, such as those controlling the machine. But whatever sensors are used, the raw data then have to be processed and manipulated to extract particular features and characteristics.

Once you’ve done all that, you can then determine the health of the machine, rather like in medicine. Is it performing normally? Does it seem to be developing a fault? If the machine appears to be going wrong, can you try to diagnose what the problem might be?

Generally, we do this by tracking a range of parameters to look for consistent behaviour, such as a steady increase, or by seeing if a parameter exceeds a pre-defined threshold. With further analysis, we can also try to predict the future state of the machine, work out what its remaining useful life might be, or decide if any maintenance needs scheduling.
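
A minimal sketch of those two checks – a fixed alarm threshold plus a crude consistent-trend test over a sliding window – might look like this in Python (the limit, window length and trend margin are illustrative values, not drawn from any real monitoring system):

```python
import statistics

ALARM_LIMIT = 80.0  # pre-defined threshold, e.g. a bearing temperature in deg C
WINDOW = 20         # number of recent samples to examine
TREND_MARGIN = 2.0  # how much the recent mean must rise to count as a trend

def check_health(samples: list[float]) -> str:
    """Classify a stream of sensor readings as normal, trending or alarming."""
    recent = samples[-WINDOW:]
    if recent[-1] > ALARM_LIMIT:
        return "alarm: threshold exceeded"
    # Crude trend test: compare the mean of the newer half of the window
    # with the mean of the older half.
    half = len(recent) // 2
    if statistics.mean(recent[half:]) > statistics.mean(recent[:half]) + TREND_MARGIN:
        return "warning: consistent increase detected"
    return "normal"

readings = [60 + 0.5 * i for i in range(30)]  # a slowly warming component
print(check_health(readings))  # -> "warning: consistent increase detected"
```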

A diagnosis typically involves linking various anomalous physical parameters (or symptoms) to a probable cause. As machines obey the laws of physics, a diagnosis can either be based on engineering knowledge or be driven by data – or sometimes the two together. If a concrete diagnosis can’t be made, you can still get a sense of where a problem might lie before carrying out further investigation or doing a detailed inspection.

One way of doing this is to use a “borescope” – essentially a long, flexible cable with a camera on the end. Rather like an endoscope in medicine, it allows you to look down narrow or difficult-to-reach cavities. But unlike medical imaging, which generally takes place in the controlled environment of a lab or clinic, machine data are typically acquired “in the field”. The resulting images can be tricky to interpret because the light is poor, the measurements are inconsistent, or the equipment hasn’t been used in the most effective way.

Even though it can be hard to work out what you’re seeing, in-situ visual inspections are vital as they provide evidence of a known condition, which can be directly linked to physical sensor measurements. It’s a kind of health status calibration. But if you want to get more robust results, it’s worth turning to advanced modelling techniques, such as deep neural networks.

One way to predict the wear and tear of a machine’s constituent parts is to use what’s known as a “digital twin”. Essentially a virtual replica of a physical object, a digital twin is created by building a detailed model and then feeding in real-time information from sensors and inspections. The twin basically mirrors the behaviour, characteristics and performance of the real object.
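
As a toy illustration of the concept – with a cubic load–wear law that is a common textbook assumption, not a description of any real product – a digital twin can be as simple as an object that folds each new operating record from the sensors into a running estimate of consumed life:

```python
class DigitalTwin:
    """Crude virtual replica tracking how much of an asset's life is used up."""

    def __init__(self, design_life_hours: float):
        self.design_life = design_life_hours
        self.consumed_life = 0.0  # equivalent operating hours consumed so far

    def ingest(self, hours: float, load_fraction: float) -> None:
        """Fold one period of operation into the twin's state.

        Running above rated load (load_fraction > 1) consumes life faster;
        the cubic exponent is an assumed, illustrative wear law.
        """
        self.consumed_life += hours * load_fraction ** 3

    @property
    def remaining_useful_life(self) -> float:
        return max(self.design_life - self.consumed_life, 0.0)

twin = DigitalTwin(design_life_hours=10_000)
twin.ingest(hours=100, load_fraction=1.2)  # a hard-driven interval
twin.ingest(hours=100, load_fraction=0.8)  # a gentle interval
print(f"{twin.remaining_useful_life:.0f} h of rated life remaining")
```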

Real-time monitoring

Real-time health data are great because they allow machines to be serviced as and when required, rather than following a rigid maintenance schedule. For example, if a machine has been deployed heavily in a difficult environment, it can be serviced sooner, potentially preventing an unexpected failure. Conversely, if it’s been used relatively lightly and not shown any problems, then maintenance could be postponed or reduced in scope. This saves time and money because the equipment will be out of action less than anticipated.

Having information about a machine’s condition at any point in time not only allows this kind of “intelligent maintenance” but also lets us use associated resources wisely. For example, we can work out which parts will need repairing or replacing, when the maintenance will be required and who will do it. Spare parts can therefore be ordered only when required, saving money and optimizing supply chains.

Real-time health-monitoring data are particularly useful for companies owning many machines of one kind, such as airlines with a fleet of planes or haulage companies with a lot of trucks. It gives them a better understanding not just of how machines behave individually, but also of how they behave collectively – a “fleet-wide” view. Noticing and diagnosing failures from data becomes an iterative process, helping manufacturers create new or improved machine designs.

This all sounds great, but in some respects, it’s harder to understand a machine than a human. People can be taken to hospitals or clinics for a medical scan, but a wind turbine or jet engine, say, can’t be readily accessed, switched off or sent for treatment. Machines also can’t tell us exactly how they feel.

However, even humans don’t always know when there’s something wrong. That’s why it’s worth taking a leaf out of industry’s book and considering regular health monitoring and checks. There are lots of brilliant apps out there to monitor and track your heart rate, blood pressure, physical activity and sugar levels.

Just as with a machine, you can avoid unexpected failure, reduce your maintenance costs, and make yourself more efficient and reliable. You could, potentially, even live longer too.

The post People benefit from medicine, but machines need healthcare too appeared first on Physics World.

Japan’s ispace suffers second lunar landing failure

6 juin 2025 à 15:04

The Japanese firm ispace has suffered another setback after its second attempt to land on the Moon ended in failure yesterday. The Hakuto-R Mission 2, also known as Resilience, failed to touch down near the centre of Mare Frigoris (sea of cold) in the far north of the Moon after a sensor malfunctioned during descent.

Launched on 15 January from the Kennedy Space Center, Florida, aboard a SpaceX Falcon 9 rocket, the craft spent four months travelling to the Moon before it entered lunar orbit on 7 May. It then spent the past month completing several lunar orbital manoeuvres.

During the descent phase, the 2.3 m-high lander began a landing sequence that involved firing its main propulsion system to gradually decelerate and adjust its attitude. ispace says that the lander was confirmed to be nearly vertical but then the company lost communication with the craft.

The firm concludes that the laser rangefinder experienced delays attempting to measure the distance to the lunar surface during descent, meaning that it was unable to decelerate sufficiently to carry out a soft landing.

“Given that there is currently no prospect of a successful lunar landing, our top priority is to swiftly analyse the telemetry data we have obtained thus far and work diligently to identify the cause,” noted ispace founder and chief executive officer Takeshi Hakamada in a statement. “We strive to restore trust by providing a report of the findings.”

The mission had been planned to operate for about two weeks. Resilience featured several commercial payloads, worth $16m, including a food-production experiment and a deep-space radiation probe. It also carried a rover, dubbed Tenacious, which was about the size of a microwave oven and would have collected and analysed lunar regolith.

The rover would have also delivered a Swedish artwork called The Moonhouse – a small red cottage with white corners – and placed it at a “symbolically meaningful” site on the Moon.

Lunar losses

The company’s first attempt to land on the Moon also ended in failure in 2023 when the Hakuto-R Mission 1 crash landed despite being in a vertical position as it carried out the final approach to the lunar surface.

The issue was put down to a software problem that incorrectly assessed the craft’s altitude during descent.

Had the latest attempt succeeded, ispace would have joined the US firms Intuitive Machines and Firefly Aerospace, which landed on the Moon last year and in March this year, respectively.

The second lunar loss casts doubt on ispace’s plans for further lunar landings and its grand aim of establishing a lunar colony of 1000 inhabitants by the 2040s.

The post Japan’s ispace suffers second lunar landing failure appeared first on Physics World.

Richard Bond and George Efstathiou: meet the astrophysicists who are shaping our understanding of the early universe

5 juin 2025 à 17:00

This episode of the Physics World Weekly podcast features George Efstathiou and Richard Bond, who share the 2025 Shaw Prize in Astronomy, “for their pioneering research in cosmology, in particular for their studies of fluctuations in the cosmic microwave background (CMB). Their predictions have been verified by an armada of ground-, balloon- and space-based instruments, leading to precise determinations of the age, geometry, and mass-energy content of the universe.”

Bond and Efstathiou talk about how the CMB emerged when the universe was just 380,000 years old and explain how the CMB is observed today. They explain why studying fluctuations in today’s CMB provides a window into the nature of the universe as it existed long ago, and how future studies could help physicists understand the nature of dark matter – which is one of the greatest mysteries in physics.

Efstathiou is emeritus professor of astrophysics at the University of Cambridge in the UK – and Richard Bond is a professor at the Canadian Institute for Theoretical Astrophysics (CITA) and university professor at the University of Toronto in Canada. Bond and Efstathiou share the 2025 Shaw Prize in Astronomy and its $1.2m prize money equally.

This podcast is sponsored by The Shaw Prize Foundation.

The post Richard Bond and George Efstathiou: meet the astrophysicists who are shaping our understanding of the early universe appeared first on Physics World.

Superconducting innovation: SQMS shapes up for scalable success in quantum computing

5 juin 2025 à 16:00

Developing quantum computing systems with high operational fidelity, enhanced processing capabilities plus inherent (and rapid) scalability is high on the list of fundamental problems preoccupying researchers within the quantum science community. One promising R&D pathway in this regard is being pursued by the Superconducting Quantum Materials and Systems (SQMS) National Quantum Information Science Research Center at the US Department of Energy’s Fermi National Accelerator Laboratory, the pre-eminent US particle physics facility on the outskirts of Chicago, Illinois.

The SQMS approach involves placing a superconducting qubit chip (held at temperatures as low as 10–20 mK) inside a three-dimensional superconducting radiofrequency (3D SRF) cavity – a workhorse technology for particle accelerators employed in high-energy physics (HEP), nuclear physics and materials science. In this set-up, it becomes possible to preserve and manipulate quantum states by encoding them in microwave photons (modes) stored within the SRF cavity (which is also cooled to the millikelvin regime).

Put another way: by pairing superconducting circuits and SRF cavities at cryogenic temperatures, SQMS researchers create environments where microwave photons can have long lifetimes and be protected from external perturbations – conditions that, in turn, make it possible to generate quantum states, manipulate them and read them out. The endgame is clear: reproducible and scalable realization of such highly coherent superconducting qubits opens the way to more complex and scalable quantum computing operations – capabilities that, over time, will be used within Fermilab’s core research programme in particle physics and fundamental physics more generally.

Fermilab is in a unique position to turn this quantum technology vision into reality, given its decades of expertise in developing high-coherence SRF cavities. In 2020, for example, Fermilab researchers demonstrated record coherence lifetimes (of up to two seconds) for quantum states stored in an SRF cavity.

“It’s no accident that Fermilab is a pioneer of SRF cavity technology for accelerator science,” explains Sir Peter Knight, senior research investigator in physics at Imperial College London and an SQMS advisory board member. “The laboratory is home to a world-leading team of RF engineers whose niobium superconducting cavities routinely achieve very high quality factors (Q) from 10¹⁰ to above 10¹¹ – figures of merit that can lead to dramatic increases in coherence time.”
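
As a rough back-of-the-envelope check – assuming a TESLA-type cavity operating near 1.3 GHz, a typical figure for Fermilab SRF cavities rather than one quoted here – a quality factor Q translates into a photon storage time of

$$\tau = \frac{Q}{2\pi f} \approx \frac{2\times10^{10}}{2\pi\times1.3\times10^{9}\,\mathrm{Hz}} \approx 2.4\ \mathrm{s},$$

which is consistent with the two-second coherence lifetimes mentioned above.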

Moreover, Fermilab offers plenty of intriguing HEP use-cases where quantum computing platforms could yield significant research dividends. In theoretical studies, for example, the main opportunities relate to the evolution of quantum states, lattice-gauge theory, neutrino oscillations and quantum field theories in general. On the experimental side, quantum computing efforts are being lined up for jet and track reconstruction during high-energy particle collisions; also for the extraction of rare signals and for exploring exotic physics beyond the Standard Model.

SQMS associate scientists Yao Lu and Tanay Roy
Collaborate to accumulate SQMS associate scientists Yao Lu (left) and Tanay Roy (right) worked with PhD student Taeyoon Kim (centre) to develop a two-qudit superconducting QPU with a record coherence lifetime (>20 ms). (Courtesy: Hannah Brumbaugh, Fermilab)

Cavities and qubits

SQMS has already notched up some notable breakthroughs on its quantum computing roadmap, not least the demonstration of chip-based transmon qubits (a type of charge qubit circuit exhibiting decreased sensitivity to noise) showing systematic and reproducible improvements in coherence, record-breaking lifetimes of over a millisecond, and reductions in performance variation.

Key to success here is an extensive collaborative effort in materials science and the development of novel chip fabrication processes, with the resulting transmon qubit ancillas shaping up as the “nerve centre” of the 3D SRF cavity-based quantum computing platform championed by SQMS. What’s in the works is essentially a unique quantum analogue of a classical computing architecture: the transmon chip providing a central logic-capable quantum information processor and microwave photons (modes) in the 3D SRF cavity acting as the random-access quantum memory.

As for the underlying physics, the coupling between the transmon qubit and discrete photon modes in the SRF cavity allows for the exchange of coherent quantum information, as well as enabling quantum entanglement between the two. “The pay-off is scalability,” says Alexander Romanenko, a senior scientist at Fermilab who leads the SQMS quantum technology thrust. “A single logic-capable processor qubit, such as the transmon, can couple to many cavity modes acting as memory qubits.”

In principle, a single transmon chip could manipulate more than 10 qubits encoded inside a single-cell SRF cavity, substantially streamlining the number of microwave channels required for system control and manipulation as the number of qubits increases. “What’s more,” adds Romanenko, “instead of using quantum states in the transmon [coherence times just crossed into milliseconds], we can use quantum states in the SRF cavities, which have higher quality factors and longer coherence times [up to two seconds].”
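
Schematically – in generic circuit-QED notation, not notation taken from the SQMS literature – this one-processor-to-many-memories coupling in the dispersive regime reads

$$H/\hbar = \frac{\omega_q}{2}\,\sigma_z + \sum_k \omega_k\, a_k^{\dagger}a_k + \sum_k \chi_k\, a_k^{\dagger}a_k\,\sigma_z,$$

where ω_q is the transmon frequency, ω_k are the cavity-mode frequencies and the mode-dependent dispersive shifts χ_k are what allow a single transmon to address each photonic memory mode individually.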

In terms of next steps, continuous improvement of the ancilla transmon coherence times will be critical to ensure high-fidelity operation of the combined system – with materials breakthroughs likely to be a key rate-determining step. “One of the unique differentiators of the SQMS programme is this ‘all-in’ effort to understand and get to grips with the fundamental materials properties that lead to losses and noise in superconducting qubits,” notes Knight. “There are no short-cuts: wide-ranging experimental and theoretical investigations of materials physics – per the programme implemented by SQMS – are mandatory for scaling superconducting qubits into industrial and scientifically useful quantum computing architectures.”

Laying down a marker, SQMS researchers recently achieved a major milestone in superconducting quantum technology by developing the longest-lived multimode superconducting quantum processing unit (QPU) ever built (coherence lifetime >20 ms). Their processor is based on a two-cell SRF cavity and leverages its exceptionally high quality factor (~10¹⁰) to preserve quantum information far longer than conventional superconducting platforms (typically 1 or 2 ms for rival best-in-class implementations).

Coupled with a superconducting transmon, the two-cell SRF module enables precise manipulation of cavity quantum states (photons) using ultrafast control/readout schemes (allowing for approximately 10⁴ high-fidelity operations within the qubit lifetime). “This represents a significant achievement for SQMS,” claims Yao Lu, an associate scientist at Fermilab and co-lead for QPU connectivity and transduction in SQMS. “We have demonstrated the creation of high-fidelity [>95%] quantum states with large photon numbers [20 photons] and achieved ultra-high-fidelity single-photon entangling operations between modes [>99.9%]. It’s work that will ultimately pave the way to scalable, error-resilient quantum computing.”

The SQMS multiqudit QPU prototype
Scalable thinking The SQMS multiqudit QPU prototype (above) exploits 3D SRF cavities held at millikelvin temperatures. (Courtesy: Ryan Postel, Fermilab)

Fast scaling with qudits

There’s no shortage of momentum either, with these latest breakthroughs laying the foundations for SQMS “qudit-based” quantum computing and communication architectures. A qudit is a multilevel quantum unit that can occupy more than two states and, in turn, hold a larger information density – i.e. instead of working with a large number of qubits to scale information-processing capability, it may be more efficient to maintain a smaller number of qudits (with each holding a greater range of values for optimized computations).
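
To put rough numbers on that density argument (a standard counting exercise, not a figure from SQMS): N qudits of dimension d span d^N basis states, so matching the state space of M qubits requires only

$$N = \frac{M}{\log_2 d}$$

qudits. For example, the 2²⁰ ≈ 10⁶ states of 20 qubits fit into roughly six ten-level qudits, since 20/log₂10 ≈ 6.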

Scale-up to a multiqudit QPU system is already underway at SQMS via several parallel routes (and all with a modular computing architecture in mind). In one approach, coupler elements and low-loss interconnects integrate a nine-cell multimode SRF cavity (the memory) to a two-cell SRF cavity quantum processor. Another iteration uses only two-cell modules, while yet another option exploits custom-designed multimodal cavities (10+ modes) as building blocks.

One thing is clear: with the first QPU prototypes now being tested, verified and optimized, SQMS will soon move to a phase in which many of these modules will be assembled and put together in operation. By extension, the SQMS effort also encompasses crucial developments in control systems and microwave equipment, where many devices must be synchronized optimally to encode and analyse quantum information in the QPUs.

Along a related coordinate, complex algorithms can benefit from fewer required gates and reduced circuit depth. What’s more, for many simulation problems in HEP and other fields, it’s evident that multilevel systems (qudits) – rather than qubits – provide a more natural representation of the physics in play, making simulation tasks significantly more accessible. The work of encoding several such problems into qudits – including lattice-gauge-theory calculations and others – is similarly ongoing within SQMS.

Taken together, this massive R&D undertaking – spanning quantum hardware and quantum algorithms – can only succeed with a “co-design” approach across strategy and implementation: from identifying applications of interest to the wider HEP community to full deployment of QPU prototypes. Co-design is especially suited to these efforts as it demands sustained alignment of scientific goals with technological implementation to drive innovation and societal impact.

In addition to their quantum computing promise, these cavity-based quantum systems will play a central role in serving both as the “adapters” and low-loss channels at elevated temperatures for interconnecting chip or cavity-based QPUs hosted in different refrigerators. These interconnects will provide an essential building block for the efficient scale-up of superconducting quantum processors into larger quantum data centres.

Researchers in the control room of the SQMS Quantum Garage facility
Quantum insights Researchers in the control room of the SQMS Quantum Garage facility, developing architectures and gates for SQMS hardware tailored toward HEP quantum simulations. From left to right: Nick Bornman, Hank Lamm, Doga Kurkcuoglu, Silvia Zorzetti, Julian Delgado, Hans Johnson (Courtesy: Hannah Brumbaugh)

“The SQMS collaboration is ploughing its own furrow – in a way that nobody else in the quantum sector really is,” says Knight. “Crucially, the SQMS partners can build stuff at scale by tapping into the phenomenal engineering strengths of the National Laboratory system. Designing, commissioning and implementing big machines has been part of the ‘day job’ at Fermilab for decades. In contrast, many quantum computing start-ups must scale their R&D infrastructure and engineering capability from a far-less-developed baseline.”

The last word, however, goes to Romanenko. “Watch this space,” he concludes, “because SQMS is on a roll. We don’t know which quantum computing architecture will ultimately win out, but we will ensure that our cavity-based quantum systems will play an enabling role.”

Scaling up: from qubits to qudits

Conceptual illustration of the SQMS Center’s superconducting TESLA cavity coupled to a transmon ancilla qubit
Left: conceptual illustration of the SQMS Center’s superconducting TESLA cavity coupled to a transmon ancilla qubit (AI-generated). Right: an ancilla qubit with two energy levels – ground ∣g⟩ and excited ∣e⟩ – is used to control a high-coherence (d+1) dimensional qudit encoded in a cavity resonator. The ancilla enables state preparation, control and measurement of the qudit. (Courtesy: Fermilab)

The post Superconducting innovation: SQMS shapes up for scalable success in quantum computing appeared first on Physics World.

Black-hole scattering calculations could shed light on gravitational waves

4 juin 2025 à 17:00

By adapting mathematical techniques used in particle physics, researchers in Germany have developed an approach that could boost our understanding of the gravitational waves that are emitted when black holes collide. Led by Jan Plefka at The Humboldt University of Berlin, the team’s results could prove vital to the success of future gravitational-wave detectors.

Nearly a decade on from the first direct observations of gravitational waves, physicists are hopeful that the next generation of ground- and space-based observatories will soon allow us to study these ripples in space–time with unprecedented precision. But to ensure the success of upcoming projects like the LISA space mission, the increased sensitivity offered by these detectors will need to be accompanied with a deeper theoretical understanding of how gravitational waves are generated through the merging of two black holes.

In particular, they will need to predict more accurately the physical properties of gravitational waves produced by any given colliding pair and account for factors including their respective masses and orbital velocities. For this to happen, physicists will need to develop more precise solutions to the relativistic two-body problem. This problem is a key application of the Einstein field equations, which relate the geometry of space–time to the distribution of matter within it.
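
For reference, these are the equations coupling the Einstein tensor G_{μν}, which encodes the space–time geometry, to the stress–energy tensor T_{μν} of the matter:

$$G_{\mu\nu} \equiv R_{\mu\nu} - \tfrac{1}{2}R\,g_{\mu\nu} = \frac{8\pi G}{c^{4}}\,T_{\mu\nu}.$$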

No exact solution

“Unlike its Newtonian counterpart, which is solved by Kepler’s Laws, the relativistic two-body problem cannot be solved exactly,” Plefka explains. “There is an ongoing international effort to apply quantum field theory (QFT) – the mathematical language of particle physics – to describe the classical two-body problem.”

In their study, Plefka’s team started from state-of-the-art techniques used in particle physics for modelling the scattering of colliding elementary particles, while accounting for their relativistic properties. When viewed from far away, each black hole can be approximated as a single point which, much like an elementary particle, carries a single mass, charge, and spin.

Taking advantage of this approximation, the researchers modified existing techniques in particle physics to create a framework called worldline quantum field theory (WQFT). “The advantage of WQFT is a clean separation between classical and quantum physics effects, allowing us to precisely target the classical physics effects relevant for the vast distances involved in astrophysical observables,” Plefka says.

Ordinarily, doing calculations with such an approach would involve solving millions of integrals that sum up every single contribution to the black hole pair’s properties across all possible ways that the interaction between them could occur. To simplify the problem, Plefka’s team used a new algorithm that identified relationships between the integrals. This reduced the problem to just 250 “master integrals”, making the calculation vastly more manageable.

With these master integrals, the team could finally produce expressions for three key physical properties of black-hole binaries within WQFT. These include the change in momentum during the gravity-mediated scattering of two black holes and the total energy radiated by both bodies over the course of the scattering.
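
Schematically – in generic post-Minkowskian notation rather than the paper’s own – such observables are organized as a series in Newton’s constant G, with the master integrals supplying the coefficients order by order:

$$\Delta p^{\mu} = \sum_{n\ge 1} G^{\,n}\,\Delta p^{\mu}_{(n)}.$$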

Genuine physical process

Altogether, the team’s WQFT framework produced the most accurate solution to the Einstein field equations achieved to date. “In particular, the radiated energy we found contains a new class of mathematical functions known as ‘Calabi–Yau periods’,” Plefka explains. “While these functions are well-known in algebraic geometry and string theory, this marks the first time they have been shown to describe a genuine physical process.”

With its unprecedented insights into the structure of the relativistic two-body problem, the team’s approach could now be used to build more precise models of gravitational-wave formation, which could prove invaluable for the next generation of gravitational-wave detectors.

More broadly, however, Plefka predicts that the appearance of Calabi–Yau periods in their calculations could lead to an entirely new class of mathematical functions applicable to many areas beyond gravitational waves.

“We expect these periods to show up in other branches of physics, including collider physics, and the mathematical techniques we employed to calculate the relevant integrals will no doubt also apply there,” he says.

The research is described in Nature.

The post Black-hole scattering calculations could shed light on gravitational waves appeared first on Physics World.

Harmonious connections: bridging the gap between music and science

4 juin 2025 à 12:00

CP Snow’s classic The Two Cultures lecture, published in book form in 1959, is the usual go-to reference when exploring the divide between the sciences and humanities. It is a culture war that was raging long before the term became social-media shorthand for today’s tribal battles over identity, values and truth.

While Snow eloquently lamented the lack of mutual understanding between scientific and literary elites, the 21st-century version of the two-cultures debate often plays out with a little less decorum and a lot more profanity. Hip hop duo Insane Clown Posse certainly didn’t hold back in their widely memed 2010 track “Miracles”, which included the lyric “And I don’t wanna talk to a scientist / Y’all motherfuckers lying and getting me pissed”. An extreme example to be sure, but it hammers home the point: Snow’s two-culture concerns continue to resonate strongly almost 70 years after his influential lecture and writings.

A Perfect Harmony: Music, Mathematics and Science by David Darling is the latest addition to a growing genre that seeks to bridge that cultural rift. Like Peter Pesic’s Music and the Making of Modern Science, Susan Rogers and Ogi Ogas’ This Is What It Sounds Like, and Philip Ball’s The Music Instinct, Darling’s book adds to the canon that examines the interplay between musical creativity and the analytical frameworks of science (including neuroscience) and mathematics.

I’ve also contributed, in a nanoscopically small way, to this music-meets-science corpus with an analysis of the deep and fundamental links between quantum physics and heavy metal (When The Uncertainty Principle Goes To 11), and have a long-standing interest in music composed from maths and physics principles and constants (see my Lateral Thoughts articles from September 2023 and July 2024). Darling’s book, therefore, struck a chord with me.

Darling is not only a talented science writer with an expansive back-catalogue to his name but he is also an accomplished musician (check out his album Songs Of The Cosmos), and his enthusiasm for all things musical spills off the page. Furthermore, he is a physicist, with a PhD in astronomy from the University of Manchester. So if there’s a writer who can genuinely and credibly inhabit both sides of the arts–science cultural divide, it’s Darling.

But is A Perfect Harmony in tune with the rest of the literary ensemble, or marching to a different beat? In other words, is this a fresh new take on the music-meets-maths (meets pop sci) genre or, like too many bands I won’t mention, does it sound suspiciously like something you’ve heard many times before? Well, much like an old-school vinyl album, Darling’s work has the feel of two distinct sides. (And I’ll try to make that my final spin on groan-worthy musical metaphors. Promise.)

Not quite perfect pitch

Although the subtitle for A Perfect Harmony is “Music, Mathematics and Science”, the first half of the book is more of a history of the development and evolution of music and musical instruments in various cultures, rather than a new exploration of the underpinning mathematical and scientific principles. Engaging and entertaining though this is – and all credit to Darling for working in a reference to Van Halen in the opening lines of chapter 1 – it’s well-worn ground: Pythagorean tuning, the circle of fifths, equal temperament, Music of the Spheres (not the Coldplay album, mercifully), resonance, harmonics, etc. I found myself wishing, at times, for a take that felt a little more off the beaten track.

One case in point is Darling’s brief discussion of the theremin. If anything earns the title of “The Physicist’s Instrument”, it’s the theremin – a remarkable device that exploits the innate electrical capacitance of the human body to load a resonant circuit and thus produce an ethereal, haunting tone whose pitch can be varied without any physical contact.

While I give kudos to Darling for highlighting the theremin, the brevity of the description is arguably a lost opportunity when put in the broader context of the book’s aim to explain the deeper connections between music, maths and science. This could have been a novel and fascinating take on the links between electrical and musical resonance that went well beyond the familiar territory mapped out in standard physics-of-music texts.

As the book progresses, however, Darling moves into more distinctive territory, choosing a variety of inventive examples that are often fascinating and never short of thought-provoking. I particularly enjoyed his description of orbital resonance in the system of seven planets orbiting the red dwarf TRAPPIST-1, 41 light-years from Earth. The orbital periods have ratios, which, when mapped to musical intervals, correspond to a minor sixth, a major sixth, two perfect fifths, a perfect fourth and another perfect fifth. And it’s got to be said that using the music of the eclectic Australian band King Gizzard and the Lizard Wizard to explain microtonality is nothing short of inspired.
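
The mapping is easy to verify yourself. A short script – using the published TRAPPIST-1 orbital periods (rounded) and a minimal just-intonation lookup table, both supplied here rather than taken from the book – recovers exactly the sequence of intervals listed above:

```python
# Map period ratios of adjacent TRAPPIST-1 planets onto the nearest
# just-intonation interval. Periods (days) are published values, rounded.
periods = [1.51, 2.42, 4.05, 6.10, 9.21, 12.35, 18.77]  # planets b to h

intervals = {
    "perfect fourth": 4 / 3,
    "perfect fifth": 3 / 2,
    "minor sixth": 8 / 5,
    "major sixth": 5 / 3,
}

for inner, outer in zip(periods, periods[1:]):
    ratio = outer / inner
    name = min(intervals, key=lambda k: abs(intervals[k] - ratio))
    print(f"{ratio:.3f} -> {name}")
# Output: minor sixth, major sixth, perfect fifth, perfect fifth,
#         perfect fourth, perfect fifth
```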

A Perfect Harmony doesn’t entirely close the cultural gap highlighted by Snow all those years ago, but it does hum along pleasantly in the space between. Though the subject matter occasionally echoes well-trodden themes, Darling’s perspective and enthusiasm lend it freshness. There’s plenty here to enjoy, especially for physicists inclined to tune into the harmonies of the universe.

• 2025 Oneworld Publications 288pp £10.99pb/£6.99ebook

The post Harmonious connections: bridging the gap between music and science appeared first on Physics World.

New analysis of M67 cluster helps decode the sound of stars

3 juin 2025 à 16:00

Stars are cosmic musical instruments: they vibrate with complex patterns that echo through their interiors. These vibrations, known as pressure waves, ripple through the star, much as seismic waves from earthquakes ripple through our planet. The frequencies of these waves hold information about the star’s mass, age and internal structure.

In a study led by researchers at UNSW Sydney, Australia, astronomer Claudia Reyes and colleagues “listened” to the sound from stars in the M67 cluster and discovered a surprising feature: a plateau in their frequency pattern. This plateau appears during the subgiant and red-giant phases, when stars expand and evolve after exhausting the hydrogen fuel in their cores. This feature, reported in Nature, reveals how deep the outer layers of the star have pushed into the interior and offers a new diagnostic to improve mass and age estimates of stars beyond the main sequence (the core-hydrogen-burning phase).

How do stars create sound?

Beneath the surface of stars, hot gases are constantly rising, cooling and sinking back down, much like hot bubbles in boiling water. This constant churning is called convection. As these rising and sinking gas blobs collide or burst at the stellar surface, they generate pressure waves. These are essentially acoustic waves, bouncing within the stellar interior to create standing wave patterns.

Stars do not vibrate at just one frequency; they oscillate simultaneously at multiple frequencies, producing a spectrum of sounds. These acoustic oscillations cannot be heard in space directly, but are observed as tiny fluctuations in the star’s brightness over time.

M67 cluster as stellar laboratory

Star clusters offer an ideal environment in which to study stellar evolution as all stars in a cluster form from the same gas cloud at about the same time with the same chemical compositions but with different masses. The researchers investigated stars from the open cluster M67, as this cluster has a rich population of evolved stars including subgiants and red giants with a chemical composition similar to the Sun’s. They measured acoustic oscillations in 27 stars using data from NASA’s Kepler/K2 mission.

Stars oscillate across a range of tones, and in this study the researchers focused on two key features in this oscillation spectrum: large and small frequency separations. The large frequency separation, which probes stellar density, is the frequency difference between oscillations of the same angular degree (ℓ) but different radial orders (n). The small frequency separation refers to the frequency difference between modes of degrees ℓ and ℓ + 2 of consecutive orders of n. For main-sequence stars, small separations are reliable age indicators because their changes during hydrogen burning are well understood. In later stages of stellar evolution, however, their relationship to the stellar interior remained unclear.
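
In standard asteroseismic notation – a convention assumed here rather than spelled out in the study – with ν_{n,ℓ} the frequency of a mode of radial order n and angular degree ℓ, the two separations are

$$\Delta\nu = \nu_{n+1,\ell} - \nu_{n,\ell}, \qquad \delta\nu_{02} = \nu_{n,0} - \nu_{n-1,2}.$$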

In 27 stars, Reyes and colleagues investigated the small separation between modes of degrees 0 and 2. Plotting a graph of small versus large frequency separations for each star, called a C–D diagram, they uncovered a surprising plateau in small frequency separations.

C–D diagrams for two M67 stars
A surprising feature C–D diagram showing different evolutionary stages of stars of 1 solar mass (left) and 1.7 solar masses (right) made from stellar models. Each point represents a specific stage in stellar evolution from the main sequence (A) to the red giant (F). The plateau seen from points F to E during the post-main-sequence phase reveals a transition in the stellar interior. (Courtesy: CC BY 4.0/C Reyes et al. Nature 10.1038/s41586-025-08760-2)

The researchers traced this plateau to the evolution of the lower boundary of the star’s convective envelope. As the envelope expands and cools, this boundary sinks deeper into the interior. Along this boundary, the density and sound speed change rapidly due to the difference in chemical composition on either side. These steep changes cause acoustic glitches that disturb how the pressure waves move through the star and temporarily stall the evolution of the small frequency separations, observed as a plateau in the frequency pattern.

This stalling occurs at a specific stage in stellar evolution – when the convective envelope deepens enough to encompass nearly 80% of the star’s mass. To confirm this connection, the researchers varied the amount of convective boundary mixing in their stellar models. They found that the depth of the envelope directly influenced both the timing and shape of the plateau in the small separations.

A new window on galactic history

This plateau serves as a new diagnostic tool to identify a specific evolutionary stage in red giant stars and improve estimates of their mass and age.

“The discovery of the ‘plateau’ frequencies is significant because it represents one more corroboration of the accuracy of our stellar models, as it shows how the turbulent regions at the bottom of a star’s envelope affect the sound speed,” explains Reyes, who is now at the Australian National University in Canberra. “They also have great potential to help determine with ease and great accuracy the mass and age of a star, which is of great interest for galactic archaeology, the study of the history of our galaxy.”

The sounds of starquakes offer a new window to study the evolution of stars and, in turn, recreate the history of our galaxy. Clusters like M67 serve as benchmarks to study and test stellar models and understand the future evolution of stars like our Sun.

“We plan to look for stars in the field which have very well-determined masses and which are in their ‘plateau’ phase,” says Reyes. “We will use these stars to benchmark the diagnostic potential of the plateau frequencies as a tool, so it can later be applied to stars all over the galaxy.”

The post New analysis of M67 cluster helps decode the sound of stars appeared first on Physics World.
