
Quantum computing on the verge: a look at the quantum marketplace of today

14 October 2025 at 17:40

“I’d be amazed if quantum computing produces anything technologically useful in ten years, twenty years, even longer.” So wrote University of Oxford physicist David Deutsch – often considered the father of the theory of quantum computing – in 2004. But, as he added in a caveat, “I’ve been amazed before.”

We don’t know how amazed Deutsch, a pioneer of quantum computing, would have been had he attended a meeting at the Royal Society in London in February on “the future of quantum information”. But it was tempting to conclude from the event that quantum computing has now well and truly arrived, with working machines that harness quantum mechanics to perform computations being commercially produced and shipped to clients. Serving as the UK launch of the International Year of Quantum Science and Technology (IYQ) 2025, it brought together some of the key figures of the field to spend two days discussing quantum computing as something like a mature industry, even if one in its early days.

Werner Heisenberg – who worked out the first proper theory of quantum mechanics 100 years ago – would surely have been amazed to find that the formalism he and his peers developed to understand the fundamental behaviour of tiny particles had generated new ways of manipulating information to solve real-world problems in computation. So far, quantum computing – which exploits phenomena such as superposition and entanglement to potentially achieve greater computational power than the best classical computers can muster – hasn’t tackled any practical problems that can’t be solved classically.

Although the fundamental quantum principles are well-established and proven to work, there remain many hurdles that quantum information technologies have to clear before this industry can routinely deliver resources with transformative capabilities. But many researchers think that moment of “practical quantum advantage” is fast approaching, and an entire industry is readying itself for that day.

Entangled marketplace

So what are the current capabilities and near-term prospects for quantum computing?

The first thing to acknowledge is that a booming quantum-computing market exists. Devices are being produced for commercial use by a number of tech firms, from the likes of IBM, Google, Canada-based D-Wave and Rigetti, which have been in the field for a decade or more, to relative newcomers such as Nord Quantique (Canada), IQM (Finland), Quantinuum (UK and US), Orca (UK), PsiQuantum (US) and Silicon Quantum Computing (Australia).

The global quantum ecosystem

Map showing global investment in quantum computing (Courtesy: QURECA)

We are on the cusp of a second quantum revolution, with quantum science and technologies growing rapidly across the globe. This includes quantum computers; quantum sensing (ultra-high precision clocks, sensors for medical diagnostics); as well as quantum communications (a quantum internet). Indeed, according to the State of Quantum 2024 report, a total of 33 countries around the world currently have government initiatives in quantum technology, of which more than 20 have national strategies with large-scale funding. As of this year, worldwide investments in quantum tech – by governments and industry – exceed $55.7 billion, and the market is projected to reach $106 billion by 2040. With the multitude of ground-breaking capabilities that quantum technologies bring globally, it’s unsurprising that governments all over the world are eager to invest in the industry.

With data from a number of international reports and studies, quantum education and skills firm QURECA has summarized key programmes and efforts around the world. These include total government funding spent through 2025, as well as future commitments spanning 2–10 year programmes, varying by country. These initiatives generally represent government agencies’ funding announcements, related to their countries’ advancements in quantum technologies, excluding any private investments and revenues.

A supply chain is also organically developing, which includes manufacturers of specific hardware components, such as Oxford Instruments and Quantum Machines, and software developers like Riverlane, based in Cambridge, UK, and QC Ware in Palo Alto, California. Supplying the last link in this chain are a range of eager end-users, from finance companies such as JP Morgan and Goldman Sachs to pharmaceutical companies such as AstraZeneca and engineering firms like Airbus. Quantum computing is already big business, with around 400 active companies and current global investment estimated at around $2 billion.

But the immediate future of all this buzz is hard to assess. When the chief executive of computer giant Nvidia announced at the start of 2025 that “truly useful” quantum computers were still two decades away, the previously burgeoning share prices of some leading quantum-computing companies plummeted. They have since recovered somewhat, but such volatility reflects the fact that quantum computing has yet to prove its commercial worth.

The field is still new and firms need to manage expectations and avoid hype while also promoting an optimistic enough picture to keep investment flowing in. “Really amazing breakthroughs are being made,” says physicist Winfried Hensinger of the University of Sussex, “but we need to get away from the expectancy that [truly useful] quantum computers will be available tomorrow.”

The current state of play is often called the “noisy intermediate-scale quantum” (NISQ) era. That’s because the “noisy” quantum bits (qubits) in today’s devices are prone to errors for which no general and simple correction process exists. Current quantum computers can’t therefore carry out practically useful computations that could not be done on classical high-performance computing (HPC) machines. It’s not just a matter of better engineering either; the basic science is far from done.

Building up: quantum computing behemoth IBM says that by 2029, its fault-tolerant system should accurately run 100 million gates on 200 logical qubits, thereby truly achieving quantum advantage. (Courtesy: IBM)

“We are right on the cusp of scientific quantum advantage – solving certain scientific problems better than the world’s best classical methods can,” says Ashley Montanaro, a physicist at the University of Bristol who co-founded the quantum software company Phasecraft. “But we haven’t yet got to the stage of practical quantum advantage, where quantum computers solve commercially important and practically relevant problems such as discovering the next lithium-ion battery.” It’s no longer if or how, but when that will happen.

Pick your platform

As the quantum-computing business is such an emerging area, today’s devices use wildly different types of physical systems for their qubits. There is still no clear sign as to which of these platforms, if any, will emerge as the winner. Indeed many researchers believe that no single qubit type will ever dominate.

The top-performing quantum computers, like those made by Google (with its 105-qubit Willow chip) and IBM (which has made the 1121-qubit Condor), use qubits in which information is encoded in the wavefunction of a superconducting material. Until recently, the strongest competing platform seemed to be trapped ions, where the qubits are individual ions held in electromagnetic traps – a technology being developed into working devices by the US company IonQ, spun out from the University of Maryland, among others.

But over the past few years, neutral trapped atoms have emerged as a major contender, thanks to advances in controlling the positions and states of these qubits. Here the atoms are prepared in highly excited electronic states called Rydberg atoms, which can be entangled with one another over a few microns. A Harvard startup called QuEra is developing this technology, as is the French start-up Pasqal. In September a team from the California Institute of Technology announced a 6100-qubit array made from neutral atoms. “Ten years ago I would not have included [neutral-atom] methods if I were hedging bets on the future of quantum computing,” says Deutsch’s Oxford colleague, the quantum information theorist Andrew Steane. But like many, he thinks differently now.

Some researchers believe that optical quantum computing, using photons as qubits, will also be an important platform. One advantage here is that photonic signals travelling through existing telecommunications networks need no complex conversion on their way to or from the processing units, which is also handy for photonic interconnections between chips. What’s more, photonic circuits can work at room temperature, whereas trapped ions and superconducting qubits need to be cooled. Photonic quantum computing is being developed by firms like PsiQuantum, Orca and Xanadu.

Other efforts, for example at Intel and Silicon Quantum Computing in Australia, make qubits from either quantum dots (Intel) or precision-placed phosphorus atoms (SQC), both in good old silicon, which benefits from a very mature manufacturing base. “Small qubits based on ions and atoms yield the highest quality processors”, says Michelle Simmons of the University of New South Wales, who is the founder and CEO of SQC. “But only atom-based systems in silicon combine this quality with manufacturability.”

Spinning around: Intel’s silicon spin qubits are now being manufactured on an industrial scale. (Courtesy: Intel Corporation)

And it’s not impossible that entirely new quantum computing platforms might yet arrive. At the start of 2025, researchers at Microsoft’s laboratories in Washington State caused a stir when they announced that they had made topological qubits from semiconducting and superconducting devices, which are less error-prone than those currently in use. The announcement left some scientists disgruntled because it was not accompanied by a peer-reviewed paper providing the evidence for these long-sought entities. But in any event, most researchers think it would take a decade or more for topological quantum computing to catch up with the platforms already out there.

Each of these quantum technologies has its own strengths and weaknesses. “My personal view is that there will not be a single architecture that ‘wins’, certainly not in the foreseeable future,” says Michael Cuthbert, founding director of the UK’s National Quantum Computing Centre (NQCC), which aims to facilitate the transition of quantum computing from basic research to an industrial concern. Cuthbert thinks the best platform will differ for different types of computation: cold neutral atoms might be good for quantum simulations of molecules, materials and exotic quantum states, say, while superconducting and trapped-ion qubits might be best for problems involving machine learning or optimization.

Measures and metrics

Given these pros and cons of different hardware platforms, one difficulty in assessing their merits is finding meaningful metrics for making comparisons. Should we be comparing error rates, coherence times (basically how long qubits remain entangled), gate speeds (how fast a single computational step can be conducted), circuit depth (how many steps a single computation can sustain), number of qubits in a processor, or what? “The metrics and measures that have been put forward so far tend to suit one or other platform more than others,” says Cuthbert, “such that it becomes almost a marketing exercise rather than a scientific benchmarking exercise as to which quantum computer is better.”

The NQCC evaluates the performance of devices using a factor known as the “quantum operation” (QuOp). This is simply the number of quantum operations that can be carried out in a single computation, before the qubits lose their coherence and the computation dissolves into noise. “If you want to run a computation, the number of coherent operations you can run consecutively is an objective measure,” Cuthbert says. If we want to get beyond the NISQ era, he adds, “we need to progress to the point where we can do about a million coherent operations in a single computation. We’re now at the level of maybe a few thousand. So we’ve got a long way to go before we can run large-scale computations.”
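To get a feel for the gap Cuthbert describes, here is a minimal Python sketch that approximates the number of coherent operations as the ratio of coherence time to gate duration. This is a deliberate simplification of the QuOp figure, which counts operations in a real computation where gate errors dominate long before decoherence, and every device number below is a hypothetical placeholder rather than a measured specification.

```python
# Rough sketch: approximate the number of consecutive coherent operations as
# (coherence time) / (gate duration). This is a simplification of the QuOp idea
# described above; real QuOp counts are lower because gate errors, not just
# decoherence, limit a computation. All numbers below are hypothetical.

hypothetical_devices = {
    # name: (coherence time in seconds, gate duration in seconds)
    "superconducting (illustrative)": (200e-6, 50e-9),
    "trapped ion (illustrative)":     (1.0,    10e-6),
    "neutral atom (illustrative)":    (0.5,    1e-6),
}

TARGET_OPS = 1_000_000  # the ~million coherent operations mentioned above

for name, (t_coh, t_gate) in hypothetical_devices.items():
    ops = t_coh / t_gate  # crude upper bound on coherent operations in one run
    print(f"{name:32s} ~{ops:>12,.0f} ops  ({ops / TARGET_OPS:6.1%} of target)")
```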

One important issue is how amenable the platforms are to making larger quantum circuits. Cuthbert contrasts the issue of scaling up – putting more qubits on a chip – with “scaling out”, whereby chips of a given size are linked in modular fashion. Many researchers think it unlikely that individual quantum chips will have millions of qubits like the silicon chips of today’s machines. Rather, they will be modular arrays of relatively small chips linked at their edges by quantum interconnects.

Having made the Condor, IBM now plans to focus on modular architectures (scaling out) – a necessity anyway, since superconducting qubits are micron-sized, so a chip with millions of them would be “bigger than your dining room table”, says Cuthbert. But superconducting qubits are not easy to scale out because microwave frequencies that control and read out the qubits have to be converted into optical frequencies for photonic interconnects. Cold atoms are easier to scale up, as the qubits are small, while photonic quantum computing is easiest to scale out because it already speaks the same language as the interconnects.

To build so-called “fault-tolerant” quantum computers, quantum platforms must solve the issue of error correction, which will enable more extensive computations without the results degrading into mere noise.
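As a toy illustration of why error correction matters, the short sketch below simulates the simplest classical analogue: a three-bit repetition code with majority voting, in which the logical error rate falls quadratically with the physical error rate. Real quantum error correction (surface codes and the like) is far more involved; this is only meant to convey the scaling idea, and the error rates used are arbitrary.

```python
# Toy illustration: three-bit repetition code with majority voting.
# If each bit flips independently with probability p, the encoded (logical) bit
# is wrong only when two or more copies flip: p_logical = 3p^2(1-p) + p^3.
# Real quantum codes are more subtle, but quadratic suppression is the key idea.

def logical_error(p):
    return 3 * p**2 * (1 - p) + p**3

for p in (0.1, 0.01, 0.001):  # arbitrary physical error rates
    print(f"physical error rate {p:>6}: logical error rate {logical_error(p):.2e}")
```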

In part two of this feature, we will explore how this is being achieved and meet the various firms developing quantum software. We will also look into the potential high-value commercial uses for robust quantum computers – once such devices exist.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the year for more coverage of the IYQ.

Find out more on our quantum channel.

The post Quantum computing on the verge: a look at the quantum marketplace of today appeared first on Physics World.

Phase shift in optical cavities could detect low-frequency gravitational waves

13 October 2025 at 10:00

A network of optical cavities could be used to detect gravitational waves (GWs) in an unexplored range of frequencies, according to researchers in the UK. Using technology already within reach, the team believes that astronomers could soon be searching for ripples in space–time across the milli-Hz frequency band, spanning roughly 10⁻⁵ Hz to 1 Hz.

GWs were first observed a decade ago and since then the LIGO–Virgo–KAGRA detectors have spotted GWs from hundreds of merging black holes and neutron stars. These detectors work in the 10 Hz–30 kHz range. Researchers have also had some success at observing a GW background at nanohertz frequencies using pulsar timing arrays.

However, GWs have yet to be detected in the milli-Hz band, which should include signals from binary systems of white dwarfs, neutron stars, and stellar-mass black holes. Many of these signals would emanate from the Milky Way.

Several projects are now in the works to explore these frequencies, including the space-based interferometers LISA, Taiji, and TianQin; as well as satellite-borne networks of ultra-precise optical clocks. However, these projects are still some years away.

Multidisciplinary effort

Joining these efforts was a collaboration called QSNET, which was part of the UK’s Quantum Technology for Fundamental Physics (QTFP) programme. “The QSNET project was a network of clocks for measuring the stability of fundamental constants,” explains Giovanni Barontini at the University of Birmingham. “This programme brought together physics communities that normally don’t interact, such as quantum physicists, technologists, high energy physicists, and astrophysicists.”

QTFP ended this year, but not before Barontini and colleagues had made important strides in demonstrating how milli-Hz GWs could be detected using optical cavities.

Inside an ultrastable optical cavity, light at specific resonant frequencies bounces constantly between a pair of opposing mirrors. When this resonant light is produced by a specific atomic transition, the frequency of the light in the cavity is very precise and can act as the ticking of an extremely stable clock.

“Ultrastable cavities are a main component of modern optical atomic clocks,” Barontini explains. “We demonstrated that they have reached sufficient sensitivities to be used as ‘mini-LIGOs’ and detect gravitational waves.”

When such a GW passes through an optical cavity, the spacing between its mirrors does not change in any detectable way. However, QSNET results have led Barontini’s team to conclude that milli-Hz GWs alter the phase of the light inside the cavity. What is more, they conclude that this effect would be detectable in the most precise optical cavities currently available.

“Methods from precision measurement with cold atoms can be transferred to gravitational-wave detection,” explains team member Vera Guarrera. “By combining these toolsets, compact optical resonators emerge as credible probes in the milli-Hz band, complementing existing approaches.”

Ground-based network

Their compact detector would comprise two optical cavities at 90° to each other – each operating at a different frequency – and an atomic reference at a third frequency. The phase shift caused by a passing gravitational wave is revealed in a change in how the three frequencies interfere with each other. The team proposes linking multiple detectors to create a global, ground-based network. This, they say, could detect a GW and also locate the position of its source in the sky.
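One way to picture that interference scheme is with a toy numerical model: two cavity signals whose frequencies are nudged in opposite directions by a milli-Hz strain, beaten against a common reference. The sketch below is purely illustrative; the assumed coupling (fractional frequency shifts of ±h/2 on the two cavities) and every number in it are placeholders chosen for the demonstration, not values from the QSNET analysis.

```python
# Toy model of the three-frequency scheme described above: two cavities at 90
# degrees plus an atomic reference. Assumption (not from the paper): a strain
# h(t) shifts the two cavity frequencies fractionally by +h/2 and -h/2, while
# the reference is unaffected. All numbers are illustrative placeholders.
import numpy as np

fs = 10.0                        # sampling rate in Hz
t = np.arange(0, 20_000, 1 / fs) # ~5.5 hours of synthetic data
f_gw = 3e-3                      # milli-Hz gravitational wave (hypothetical)
h0 = 1e-19                       # hypothetical strain amplitude
nu_laser = 2.8e14                # optical carrier frequency (illustrative)

h = h0 * np.sin(2 * np.pi * f_gw * t)

# Phase accumulated by each beat note against the reference (radians).
# Common-mode laser noise cancels in the difference of the two beats.
common_noise = np.cumsum(np.random.normal(0, 1e-3, t.size))  # arbitrary units
phase_beat_1 = 2 * np.pi * nu_laser * np.cumsum(+h / 2) / fs + common_noise
phase_beat_2 = 2 * np.pi * nu_laser * np.cumsum(-h / 2) / fs + common_noise

signal = phase_beat_1 - phase_beat_2  # differential phase isolates the GW term
print(f"peak differential phase: {np.max(np.abs(signal)):.2e} rad")
```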

By harnessing this existing technology, the researchers now hope that future studies could open up a new era of discovery of GWs in the milli-Hz range, far sooner than many projects currently in development.

“This detector will allow us to test astrophysical models of binary systems in our galaxy, explore the mergers of massive black holes, and even search for stochastic backgrounds from the early universe,” says team member Xavier Calmet at the University of Sussex. “With this method, we have the tools to start probing these signals from the ground, opening the path for future space missions.”

Barontini adds, “Hopefully this work will inspire the build-up of a global network of sensors that will scan the skies in a new frequency window that promises to be rich in sources – including many from our own galaxy.”

The research is described in Classical and Quantum Gravity.

 

The post Phase shift in optical cavities could detect low-frequency gravitational waves appeared first on Physics World.

Hints of a boundary between phases of nuclear matter found at RHIC

9 October 2025 at 17:30

In a major advance for nuclear physics, scientists on the STAR Detector at the Relativistic Heavy Ion Collider (RHIC) in the US have spotted subtle but striking fluctuations in the number of protons emerging from high-energy gold–gold collisions. The observation might be the most compelling sign yet of the long-sought “critical point” marking a boundary separating different phases of nuclear matter, similar to how water can exist in liquid or vapour phases depending on temperature and pressure.

Team member Frank Geurts at Rice University in the US tells Physics World that these findings could confirm that the “generic physics properties of phase diagrams that we know for many chemical substances apply to our most fundamental understanding of nuclear matter, too.”

A phase diagram maps how a substance transforms between solid, liquid, and gas. For everyday materials like water, the diagram is familiar, but the behaviour of nuclear matter under extreme heat and pressure remains a mystery.

Atomic nuclei are made of protons and neutrons tightly bound together. These protons and neutrons are themselves made of quarks that are held together by gluons. When nuclei are smashed together at high energies, the protons and neutrons “melt” into a fluid of quarks and gluons called a quark–gluon plasma. This exotic high-temperature state is thought to have filled the universe just microseconds after the Big Bang.

Smashing gold ions

The quark–gluon plasma is studied by accelerating heavy ions like gold nuclei to nearly the speed of light and smashing them together. “The advantage of using heavy-ion collisions in colliders such as RHIC is that we can repeat the experiment many millions, if not billions, of times,” Geurts explains.

By adjusting the collision energy, researchers can control the temperature and density of the fleeting quark–gluon plasma they create. This allows physicists to explore the transition between ordinary nuclear matter and the quark–gluon plasma. Within this transition, theory predicts the existence of a critical point where gradual change becomes abrupt.

Now, the STAR Collaboration has focused on measuring the minute fluctuations in the number of protons produced in each collision. These “proton cumulants,” says Geurts, are statistical quantities that “help quantify the shape of a distribution – here, the distribution of the number of protons that we measure”.

In simple terms, the first two cumulants correspond to the average and width of that distribution, while higher-order cumulants describe its asymmetry and sharpness. Ratios of these cumulants are tied to fundamental properties known as susceptibilities, which become highly sensitive near a critical point.
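To see what these quantities look like in practice, the short Python sketch below computes the first four cumulants, and the ratios C2/C1, C3/C2 and C4/C2, for a synthetic event-by-event proton-number distribution. The Poisson-like toy data are invented purely for illustration and have nothing to do with the STAR measurements.

```python
# Sketch: cumulants of an event-by-event proton-number distribution.
# C1 = mean, C2 = variance, C3 ~ asymmetry, C4 ~ sharpness of the tails.
# The synthetic data below are purely illustrative, not STAR data.
import numpy as np

rng = np.random.default_rng(0)
protons_per_event = rng.poisson(lam=12.0, size=1_000_000)  # toy "measured" counts

def cumulants(x):
    mu = x.mean()
    d = x - mu
    c1 = mu
    c2 = np.mean(d**2)
    c3 = np.mean(d**3)
    c4 = np.mean(d**4) - 3 * c2**2
    return c1, c2, c3, c4

c1, c2, c3, c4 = cumulants(protons_per_event)
print(f"C1={c1:.3f}  C2={c2:.3f}  C3={c3:.3f}  C4={c4:.3f}")

# Ratios such as C2/C1, C3/C2 and C4/C2 cancel trivial volume effects and are
# the quantities compared with theory; for a pure Poisson distribution they
# are all close to 1, so deviations are what experiments look for.
print(f"C2/C1={c2/c1:.3f}  C3/C2={c3/c2:.3f}  C4/C2={c4/c2:.3f}")
```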

Unexpected discovery

Over three years of experiments, the STAR team studied gold–gold collisions at a wide range of energies, using sophisticated detectors to track and identify the protons and antiprotons created in each event. By comparing how the number of these particles changed with energy, the researchers discovered something unexpected.

As the collision energy decreased, the fluctuations in proton numbers did not follow a smooth trend. “STAR observed what it calls non-monotonic behaviour,” Geurts explains. “While at higher energies the ratios appear to be suppressed, STAR observes an enhancement at lower energies.” Such irregular changes, he says, are consistent with what might happen if the collisions pass near the critical point – the boundary separating different phases of nuclear matter.

For Volodymyr Vovchenko, a physicist at the University of Houston who was not involved in the research, the new measurements represent “a major step forward”. He says that “the STAR Collaboration has delivered the most precise proton-fluctuation data to date across several collision energies”.

Still, interpretation remains delicate. The corrections required to extract pure physical signals from the raw data are complex, and theoretical calculations lag behind in providing precise predictions for what should happen near the critical point.

“The necessary experimental corrections are intricate,” Vovchenko says, and some theoretical models “do not yet implement these corrections in a fully consistent way.” That mismatch, he cautions, “can blur apples-to-apples comparisons.”

The path forward

The STAR team is now studying new data from lower-energy collisions, focusing on the range where the signal appears strongest. The results could reveal whether the observed pattern marks the presence of a nuclear matter critical point or stems from more conventional effects.

Meanwhile, theorists are racing to catch up. “The ball now moves largely to theory’s court,” Vovchenko says. He emphasizes the need for “quantitative predictions across energies and cumulants of various order that are appropriate for apples-to-apples comparisons with these data.”

Future experiments, including RHIC’s fixed-target program and new facilities such as the FAIR accelerator in Germany, will extend the search even further. By probing lower energies and producing vastly larger datasets, they aim to map the transition between ordinary nuclear matter and quark–gluon plasma with unprecedented precision.

Whether or not the critical point is finally revealed, the new data are a milestone in the exploration of the strong force and the early universe. As Geurts put it, these findings trace “landmark properties of the most fundamental phase diagram of nuclear matter,” bringing physicists one step closer to charting how everything  – from protons to stars – first came to be.

The research is described in Physical Review Letters.

The post Hints of a boundary between phases of nuclear matter found at RHIC appeared first on Physics World.

A low vibration wire scanner fork for free electron lasers

7 October 2025 at 16:51
High performance, proven, wire scanner for transverse beam profile measurement for the latest generation of low emittance accelerators and FELs. (Courtesy: UHV Design)

A new high-performance wire scanner fork that the latest generation of free electron lasers (FELs) can use for measuring beam profiles has been developed by UK-based firm UHV Design. Produced using technology licensed from the Paul Scherrer Institute (PSI) in Switzerland, the device could be customized for different FELs and low emittance accelerators around the world. It builds on the company’s PLSM range, which allows heavy objects to be moved very smoothly and with minimal vibrations.

The project began 10 years ago when the PSI was starting to build the Swiss Free Electron Laser and equipping the facility, explains Jonty Eyres of UHV Design. The remit for UHV Design was to provide a stiff, very smooth, bellows-sealed, ultra-high-vacuum-compatible linear actuator that could move a wire fork without vibrating it adversely. The fork, designed by PSI, can hold wires in two directions and can therefore scan the intensity of the beam profile in both X and Y planes using just one device, as opposed to two or more as in previous such structures.

“We decided to employ an industrial integrated ball screw and linear slide assembly with a very stiff frame around it, the construction of which provides the support and super smooth motion,” he says. “This type of structure is generally not used in the ultra-high vacuum industry.”

The position of the wire fork is determined through a (radiation-hard) side-mounted linear optical encoder in conjunction with the PSI’s own motor and gearbox assembly. A power-off brake is also incorporated to avoid any issues with back-driving under vacuum load if electrical power were to be lost to the PLSM. All electrical connections are terminated with UTO-style connectors to PSI specification.

Long-term reliability was important to avoid costly and unnecessary downtime, particularly between planned FEL maintenance shutdowns. The industrial ball-screw and slide assembly was therefore an ideal choice, used in conjunction with a bellows assembly rated for 500,000 cycles, with an option to increase this to 1 million cycles.

Eyres and his UHV Design team began by building a prototype that the PSI tested themselves with a high-speed camera. Once the prototype was validated, the UHV engineers built a batch of 20 identical units to prove that the device could be replicated in terms of constraints and tolerances.

The real challenge in constructing this device, says Eyres, was about trying to minimize the amount of vibration on the wire, which, for PSI, is typically between 5 and 25 microns thick. This is only possible if the vibration of the wire during a scan is low compared to the cross section of the wire – that is, about a micron for a 25-micron wire. “Otherwise, you are just measuring noise,” explains Eyres. “The small vibration we achieved can be corrected for in calculations, so providing an accurate value for the beam profile intensity.”
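One simple way to picture the correction Eyres describes is to treat the measured profile as the true beam profile broadened by the wire’s residual jitter, which for independent, roughly Gaussian contributions adds in quadrature. The Python sketch below illustrates that assumption with invented numbers; it is not UHV Design’s or PSI’s actual analysis procedure.

```python
# Sketch: if the wire jitters by sigma_vib while scanning a roughly Gaussian
# beam, the measured width is broadened in quadrature (assuming independent
# Gaussian contributions):
#   sigma_measured**2 = sigma_beam**2 + sigma_vib**2
# All numbers below are illustrative only.
import math

sigma_measured_um = 30.0  # hypothetical fitted width of the scanned profile (microns)
sigma_vib_um = 1.0        # ~1 micron residual wire vibration, as quoted above

sigma_beam_um = math.sqrt(sigma_measured_um**2 - sigma_vib_um**2)
bias = sigma_measured_um - sigma_beam_um
print(f"corrected beam width: {sigma_beam_um:.3f} um "
      f"(vibration bias only {bias:.3f} um, i.e. {bias / sigma_measured_um:.2%})")
```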

UHV Design holds the intellectual property rights for the linear actuator and PSI the property rights of the fork. Following the success of the project and a subsequent agreement between the two, it was recently decided that UHV Design would buy the licence to promote the wire fork, allowing the company to sell the device or a version of it to any institution or company operating a FEL or low-emittance accelerator. “The device is customizable and can be adapted to different types of fork, wires, motors or encoders,” says Eyres. “The heart of the design remains the same: a very stiff structure and its integrated ball screw and linear slide assembly. But it can be tailored to meet the requirements of different beam lines in terms of stroke size, specific wiring and the components employed.”

UHV Design’s linear actuator was installed on the Swiss FEL in 2016 and has been performing very well since, says Eyres.

A final and important point to note, he adds, is that UHV Design built an identical copy of the actuator when it took on the licence agreement, so that the company could prove it could still reproduce the same performance. “We built an exact copy of the wire scanner, including the PSI fork assembly, and sent it to the PSI, who then used the very same high-speed camera rig that they’d employed in 2015 to directly compare the new actuator with the original ones supplied. They reported that the results were indeed comparable, meaning that if fitted to the Swiss FEL today, it would perform in the same way.”

For more information: https://www.uhvdesign.com/products/linear-actuators/wire-scanner/

The post A low vibration wire scanner fork for free electron lasers appeared first on Physics World.

Rapid calendar life screening of electrolytes for silicon anodes using voltage holds

7 October 2025 at 15:48


Silicon-based lithium-ion batteries exhibit severe time-based degradation, resulting in poor calendar lives. In this webinar, we will talk about how calendar aging is measured, why the traditional measurement approaches are time-intensive, and why new approaches are needed to optimize materials for next-generation silicon-based systems. Using this new approach, we also screen multiple new electrolyte systems that can lead to calendar-life improvements in Si-containing batteries.

An interactive Q&A session follows the presentation.

Ankit Verma

Ankit Verma’s expertise is in physics-based and data-driven modeling of lithium-ion and next-generation lithium-metal batteries. His interests lie in unraveling the coupled reaction-transport-mechanics behavior in these electrochemical systems, with experiment-driven validation to provide predictive insights for practical advancements. Predominantly, he is working on improving the energy density and calendar life of silicon anodes as part of the Silicon Consortium Project, understanding solid-state battery limitations, and upcycling end-of-life electrodes as part of the ReCell Center.

Verma’s past works include optimization of lithium-ion battery anodes and cathodes for high-power and fast-charge applications and understanding electrodeposition stability in metal anodes.

 


The post Rapid calendar life screening of electrolytes for silicon anodes using voltage holds appeared first on Physics World.

Radioactive BEC could form a ‘superradiant neutrino laser’

4 October 2025 at 14:48

Radioactive atoms in a Bose–Einstein condensate (BEC) could form a “superradiant neutrino laser” in which the atomic nuclei undergo accelerated beta decay. The hypothetical laser has been proposed by two researchers in the US who say that it could be built and tested. While such a neutrino laser has no obvious immediate applications, further developments could potentially assist in the search for background neutrinos from the Big Bang – an important goal of neutrino physicists.

Neutrinos – the ghostly particles produced in beta decay – are notoriously difficult to detect or manipulate because of the weakness of their interaction with matter. They cannot be used to produce a conventional laser because they would pass straight through mirrors unimpeded. More fundamentally, neutrinos are fermions rather than bosons such as photons. This prevents neutrinos forming a two-level system with a population inversion as only one neutrino can occupy each quantum state in a system.

However, another quantum phenomenon called superradiance can also increase the intensity and coherence of emitted radiation. This occurs when the emitters are sufficiently close together to become indistinguishable. The emission then comes not from any single entity but from the collective ensemble. As it does not require the emitted particles to be quantum degenerate, this is not theoretically forbidden for fermions. “There are devices that use superradiance to make light sources, and people call them superradiant lasers – although that’s actually a misnomer,” explains neutrino physicist Benjamin Jones of the University of Texas at Arlington and a visiting professor at the University of Manchester. “There’s no stimulated emission.”

In their new work, Jones and colleague Joseph Formaggio of Massachusetts Institute of Technology propose that, in a BEC of radioactive atoms, superradiance could enhance the neutrino emission rate and therefore speed up beta decay, with an initial burst before the expected exponential decay commences. “That has not been seen for nuclear systems so far – only for electronic ones,” says Formaggio. Rubidium was used to produce the first ever condensate in 1995 by Carl Wieman and Eric Cornell of the University of Colorado Boulder, and conveniently, one of its isotopes decays by beta emission with a half-life of 86 days.

Radioactive vapour

The presence of additional hyperfine states would make direct laser cooling of rubidium-83 more challenging than for the rubidium-87 isotope used by Wieman and Cornell, but not significantly more so than the condensation of rubidium-85, which has also been achieved. Alternatively, the researchers propose that a dual condensate could be created in which rubidium-83 is cooled by sympathetic cooling with rubidium-87. The bigger challenge, says Jones, is the Bose–Einstein condensation of a radioactive atom, which has yet to be achieved: “It’s difficult to handle in a vacuum system,” he explains, “You have to be careful to make sure you don’t contaminate your laboratory with radioactive vapour.”

If such a condensate were produced, the researchers predict that superradiance would increase with the size of the BEC. In a BEC of 106 atoms, for example, more than half the atoms would decay within three minutes. The researchers now hope to test this prediction. “This is one of those experiments that does not require a billion dollars to fund,” says Formaggio. “It is done in university laboratories. It’s a hard experiment but it’s not out of reach, and I’d love to see it done and be proven right or wrong.”
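To get a feel for the size of the predicted effect, the arithmetic below compares ordinary exponential decay of rubidium-83 (86-day half-life) with the claim that more than half of a 10⁶-atom condensate would decay within three minutes, and backs out the implied rate enhancement. It is a back-of-envelope comparison, not the authors’ calculation.

```python
# Back-of-envelope: how big a speed-up does "half the atoms decay in three
# minutes" imply, relative to ordinary exponential decay (86-day half-life)?
import math

half_life_normal_s = 86 * 24 * 3600  # ~86-day half-life of rubidium-83
t_burst_s = 3 * 60                   # three minutes, as quoted above
n_atoms = 1_000_000

# Fraction decayed in 3 minutes under normal exponential decay:
lam = math.log(2) / half_life_normal_s
frac_normal = 1 - math.exp(-lam * t_burst_s)
print(f"ordinary decay in 3 min: {frac_normal:.2e} of the sample "
      f"(~{frac_normal * n_atoms:.0f} atoms out of {n_atoms:,})")

# Effective rate needed for half the sample to decay in the same 3 minutes:
enhancement = half_life_normal_s / t_burst_s
print(f"implied rate enhancement if the 'half-life' drops to 3 min: ~{enhancement:,.0f}x")
```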

If the prediction were proved correct, the researchers suggest it could eventually lead towards a benchtop neutrino source. As the same physics applies to neutrino capture, this could theoretically assist the detection of neutrinos that decoupled from the hot plasma of the universe just seconds after the Big Bang – hundreds of thousands of years before photons in the cosmic microwave background. The researchers emphasize, however, that this would not currently be feasible.

Sound proposal

Neutrino physicist Patrick Huber of Virginia Tech is impressed by the work. “I think for a first, theoretical study of the problem this is very good,” he says. “The quantum mechanics seems to be sound, so the question is if you try to build an experiment what kind of real-world obstacles are you going to encounter?” He predicts that, if the experiment works, other researchers would quite likely find hitherto unforeseen applications.

Atomic, molecular and optical physicist James Thompson of the University of Colorado Boulder is sceptical, however. He says several important aspects are either glossed over or simply ignored. Most notably, he calculates that the de Broglie wavelength of the neutrinos would be below the Bohr radius – which would prevent a BEC from feasibly satisfying the superradiance criterion that the atoms be indistinguishable.

“I think it’s a really cool, creative idea to think about,” he concludes, “but I think there are things we’ve learned in atomic physics that haven’t really crept into [the neutrino physics] community yet. We learned them the hard way by building experiments, having them not work and then figuring out what it takes to make them work.”

The proposal is described in Physical Review Letters.

The post Radioactive BEC could form a ‘superradiant neutrino laser’ appeared first on Physics World.

US scientific societies blast Trump administration’s plan to politicize grants

2 October 2025 at 17:30

Almost 60 US scientific societies have signed a letter calling on the US government to “safeguard the integrity” of the peer-review process when distributing grants. The move is in response to an executive order issued by the Trump administration in August that places accountability for reviewing and awarding new government grants in the hands of agency heads.

The executive order – Improving Oversight of Federal Grantmaking – calls on each agency head to “designate a senior appointee” to review new funding announcements and to “review discretionary grants to ensure that they are consistent with agency priorities and the national interest.”

The order outlines several previous grants that it says have not aligned with the Trump administration’s current policies, claiming that in 2024 more than a quarter of new National Science Foundation (NSF) grants went to diversity, equity, and inclusion and what it calls “other far-left initiatives”.

“These NSF grants included those to educators that promoted Marxism, class warfare propaganda, and other anti-American ideologies in the classroom, masked as rigorous and thoughtful investigation,” the order states. “There is a strong need to strengthen oversight and coordination of, and to streamline, agency grantmaking to address these problems, prevent them from recurring, and ensure greater accountability for use of public funds more broadly.”

Increasing burdens

In response, the 58 societies – including the American Physical Society, the American Astronomical Society, the Biophysical Society, the American Geophysical Union and SPIE – have written to the majority and minority leaders of the US Senate and House of Representatives to voice their concerns that the order “raises the possibility of politicization” in federally funded research.

“Our nation’s federal grantmaking ecosystem serves as the gold standard for supporting cutting-edge research and driving technological innovation worldwide,” the letter states. “Without the oversight traditionally applied by appropriators and committees of jurisdiction, this [order] will significantly increase administrative burdens on both researchers and agencies, slowing, and sometimes stopping altogether, vital scientific research that our country needs.”

The letter says that more review and oversight by the US Congress is required before the order goes into effect, adding that the scientific community “is eager” to work with Congress and the Trump administration “to strengthen our scientific enterprise”.

The post US scientific societies blast Trump administration’s plan to politicize grants appeared first on Physics World.

Kirigami-inspired parachute falls on target

1 October 2025 at 17:00
On target: a kirigami-inspired parachute deploying to slow down the delivery of a water bottle from a drone. (Courtesy: Frédérick Gosselin)

Inspired by the Japanese art of kirigami, researchers in Canada and France have designed a parachute that can safely and accurately deliver its payloads when dropped directly above its target. Tested in realistic outdoor conditions, the parachute’s deformable design stabilizes the airflow around its porous structure, removing the need to drift as it falls. With its simple and affordable design, the parachute could have especially promising uses in areas including drone delivery and humanitarian aid.

When a conventional parachute is deployed, it cannot simply fall vertically towards its target. To protect itself from turbulence, which can cause its canopy to collapse, it glides at an angle that breaks the symmetry of the airflow around it, stabilizing the parachute against small perturbations.

But this necessity comes at a cost. When dropping a payload from a drone or aircraft, this gliding angle means parachutes will often drift far from their intended targets. This can be especially frustrating and potentially dangerous for operations such as humanitarian aid delivery, where precisely targeted airdrops are often vital to success.

To address this challenge, researchers led by David Mélançon at Polytechnique Montréal looked to kirigami, whereby paper is cut and folded to create elaborate 3D designs. “Previously, kirigami has been used to morph flat sheets into 3D shapes with programmed curvatures,” Mélançon explains. “We proposed to leverage kirigami’s shape morphing capability under fluid flow to design new kinds of ballistic parachutes.”

Wind-dispersed seeds

As well as kirigami, the team drew inspiration from nature. Instead of relying on a gliding angle, many wind-dispersed seeds are equipped with structures that stabilize the airflow around them, including the feathery bristles of dandelion seeds, which create a stabilized vortex in their wake, and the wings of sycamore and maple seeds, which cause them to spin rapidly as they fall. In each case, these mechanisms provide plants with passive control over where their seeds land and germinate.

For their design, Mélançon’s team created a parachute that can deform into a shape pre-programmed by a pattern of kirigami cuts, etched into a flexible disc using a laser cutter. “Our parachutes are simple flat discs, with circumferential slits inspired by a kirigami motif called a closed loop,” Mélançon describes. “Instead of attaching the payload with strings at the outer edge of the disk, we mount it directly at its centre.”

When dropped, a combination of air resistance and the weight of the free-falling payload deformed the parachute into an inverted, porous bell shape. “The slits in the kirigami pattern are stretched, forcing air through its multitude of small openings,” Mélançon continues. “This ensures that the air flows in an orderly manner without any major chaotic turbulence, resulting in a predictable trajectory.”

The researchers tested their parachute extensively using numerical simulations combined with wind tunnel experiments and outdoor tests, where they used the parachute to drop a water bottle from a hovering drone. In this case, the parachute delivered its payload safely to the ground from a height of 60 m directly above its target.

Easy to make

Mélançon’s team tested their design with a variety of parachute sizes and kirigami patterns, demonstrating that designs with lower load-to-area ratios and more deformable patterns can reach comparable terminal velocity to conventional parachutes – with far greater certainty over where they will land. Compared with conventional parachutes, which are often both complex and costly to manufacture, kirigami-based designs will be far easier to fabricate.
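For a sense of the numbers involved, a standard drag-balance estimate of terminal velocity is sketched below. The drag coefficient, payload mass and disc size are hypothetical, and this simple formula ignores the porosity and deformation that make the kirigami design interesting; it is only meant to show how the load-to-area ratio sets the descent speed.

```python
# Sketch: terminal velocity from a simple drag balance, m*g = 0.5*rho*Cd*A*v**2.
# All numbers are hypothetical; the real kirigami canopy is porous and deforms,
# so its effective drag coefficient is not captured by this toy estimate.
import math

g = 9.81          # m/s^2
rho_air = 1.2     # kg/m^3
m_payload = 0.5   # kg, e.g. a water bottle (hypothetical)
radius = 0.25     # m, hypothetical flat-disc parachute radius
cd = 1.2          # hypothetical drag coefficient for the deformed disc

area = math.pi * radius**2
v_terminal = math.sqrt(2 * m_payload * g / (rho_air * cd * area))
print(f"load-to-area ratio: {m_payload / area:.2f} kg/m^2")
print(f"estimated terminal velocity: {v_terminal:.1f} m/s")
```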

“Little hand labour is necessary,” Mélançon says. “We have made parachutes out of sheets of plastic, paper or cardboard. We need a sheet of material with a certain rigidity, that’s all.”

By building on their design, the researchers hope that future studies will pave the way for new improvements in package home delivery. It could even advance efforts to deliver urgently needed aid during conflicts and natural disasters to those who need it most.

The parachute is described in Nature.

The post Kirigami-inspired parachute falls on target appeared first on Physics World.

Destroyers of the world: the physicists who built nuclear weapons

1 October 2025 at 12:00

The title of particle physicist Frank Close’s engaging new book, Destroyer of Worlds, refers to Robert Oppenheimer’s famous comment after he witnessed the first detonation of an atomic bomb, known as the Trinity test, in July 1945. Quoting the Hindu scripture Bhagavad Gita, he said “Now I am become death, the destroyer of worlds.” But although Close devotes much space to the Manhattan Project, which Oppenheimer directed between 1942 and 1945, his book has a much wider remit.

Aimed at non-physicist readers with a strong interest in science, though undoubtedly appealing to physicists too, the book seeks to explain the highly complex physics and chemistry that led to the atomic bomb – a term first coined by H G Wells in his 1914 science-fiction novel The World Set Free. It also describes the contributions of numerous gifted scientists to the development of those weapons.

Close draws mainly on numerous published sources from this deeply analysed period, including Richard Rhodes’s seminal 1988 study The Making of the Atomic Bomb. He starts with Wilhelm Röntgen’s discovery of X-rays in 1895, before turning to the discovery of radioactivity by Henri Becquerel in 1896 – described by Close as “the first pointer to nuclear energy [that was] so insignificant that it was almost missed”. Next, he highlights the work on radium by Marie and Pierre Curie in 1898.

After discussing the emergence of nuclear physics, Close goes on to talk about the Allies’ development of the nuclear bomb. A key figure in this history was Enrico Fermi, who abandoned Fascist Italy in 1938 and emigrated to the US, where he worked on the Manhattan Project and built the first nuclear reactor, in Chicago, in 1942.

Within seconds of seeing Trinity’s blast in the desert in 1945, Fermi showed his legendary ability to estimate a physical phenomenon’s magnitude by shredding a sheet of paper into small pieces and throwing them into the air. The bomb’s shock wave blew this “confetti” (Close’s word) a few metres away. After measuring the exact distance, Fermi immediately estimated that the blast was equivalent to about 10,000 tonnes of TNT. This figure was not far off the 18,000 tonnes determined a week later following a detailed analysis by the project team.

The day after the Trinity test, a group of 70 scientists, led by Leo Szilard, sent a petition to US President Harry Truman, requesting him not to use the bomb against Japan. Albert Einstein agreed with the petition but did not sign it, having been excluded from the Manhattan Project on security grounds (though in 1939 he famously backed the bomb’s development, fearing that Nazi Germany might build its own device). Despite the protests, atomic bombs were dropped on Hiroshima and Nagasaki less than a month later – a decision that Close neither defends nor condemns.

Other key figures in the Manhattan Project were emigrants to the UK, who had fled Germany in the mid-1930s because of Nazi persecution of Jews, and later joined the secret British Tube Alloys bomb project. The best known are probably the nuclear physicists Otto Frisch and Rudolf Peierls, who initially worked together at the University of Birmingham for Tube Alloys before joining the Manhattan Project. They both receive their due from Close.

Oddly, however, he neglects to mention their fellow émigré Franz (Francis) Simon by name, despite acknowledging the importance of his work in demonstrating a technique to separate fissionable uranium-235 from the more stable uranium-238. In 1940 Simon, then working at the Clarendon Laboratory in wartime Oxford, showed that separation could be achieved by gaseous diffusion of uranium hexafluoride through a porous barrier, which he initially demonstrated by hammering his wife’s kitchen sieve flat to make the barrier.
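The physics behind that flattened-sieve demonstration can be put in rough numbers. Under the ideal-effusion assumption (Graham’s law), the single-stage enrichment factor is the square root of the ratio of the molecular masses of the two uranium hexafluorides, which is why wartime plants needed thousands of stages in cascade. The Python sketch below works through that arithmetic; it is a textbook idealization, not a description of the actual plant.

```python
# Sketch: ideal single-stage enrichment factor for gaseous diffusion of UF6,
# from Graham's law of effusion (rate ~ 1/sqrt(molecular mass)).
# This is a textbook idealization, ignoring real barrier and cascade losses.
import math

m_f = 18.998                     # atomic mass of fluorine
m_u235, m_u238 = 235.044, 238.051
m_235f6 = m_u235 + 6 * m_f       # molecular mass of 235-UF6
m_238f6 = m_u238 + 6 * m_f       # molecular mass of 238-UF6

alpha = math.sqrt(m_238f6 / m_235f6)  # ideal separation factor per stage, ~1.0043
print(f"ideal per-stage separation factor: {alpha:.5f}")

# Ideal number of stages (no losses) to go from natural 0.72% to ~90% U-235:
def abundance_ratio(f):
    return f / (1 - f)

n_stages = math.log(abundance_ratio(0.90) / abundance_ratio(0.0072)) / math.log(alpha)
print(f"ideal number of stages for ~90% enrichment: {n_stages:.0f}")
```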

As Close ably documents and explains, numerous individuals and groups eventually ensured the success of the Manhattan Project. In addition to ending the Second World War and preserving freedom against Fascism, there is an argument that it also set an example for the future of science as a highly collaborative, increasingly international albeit sometimes dangerous adventure.

Close finishes the book with a shorter discussion of the two decades of Cold War rivalry between scientists from the US and the Soviet Union to develop and test the hydrogen bomb. It features physicists such as Edward Teller and Andrei Sakharov, who led the efforts to build the American “Super Bomb” and the Soviet “Tsar Bomba”, respectively.

The book ends in around 1965, after the 1963 partial test-ban treaty signed by the US, the Soviet Union and the UK, which prevented further atmospheric tests of the hydrogen bomb for fear of their likely devastating effects on Earth’s atmosphere. As Close writes, the Tsar Bomba was more powerful than the meteorite impact 65 million years ago that wreaked global change and killed the dinosaurs, which had ruled for 150 million years.

“Within just one per cent of that time, humans have produced nuclear arsenals capable of replicating such levels of destruction,” Close warns. “The explosion of a gigaton weapon would signal the end of history. Its mushroom cloud ascending towards outer space would be humanity’s final vision.”

  • 2025 Allen Lane £25.00hb 321pp

The post Destroyers of the world: the physicists who built nuclear weapons appeared first on Physics World.

NASA criticized over its management of $3.3bn Dragonfly mission to Titan

30 September 2025 at 16:40

An internal audit has slammed NASA over its handling of the Dragonfly mission to Saturn’s largest moon, Titan. The drone-like rotorcraft, which is designed to land on and gather samples from Titan, has been hit by a two-year delay, with costs surging by $1bn to $3.3bn. NASA now envisions a launch date of July 2028 with Dragonfly arriving at Titan in 2034.

NASA chose Dragonfly in June 2019 as the next mission under its New Frontiers programme. Managed by the Johns Hopkins University Applied Physics Laboratory, it is a nuclear-powered, car-sized craft with eight rotors. Dragonfly will spend over three years studying potential landing sites before collecting data on Titan’s unique liquid environment and looking for signs that it could support life.

The audit, carried out by NASA’s Inspector General, took no issue with NASA’s tests of the rotors’ performance, which were carried out via simulations. Indeed, the mission team is already planning formal testing of the system to start in January. But the audit criticized NASA for letting Dragonfly’s development “proceed under less than ideal circumstances”, including with a “lower than optimum project cost reserves”.

The report aims to stop those problems from affecting future New Frontiers missions. Specifically, it calls on Nicky Fox, NASA’s associate administrator for its science mission directorate, to document lessons learned from NASA’s decision to start work on the project before establishing a baseline commitment.

It also says that NASA should maintain adequate levels of “unallocated future expenses” for the project and make sure that “the science community is informed of updates to the expected scope and cadence for future New Frontiers missions”. A NASA spokesperson told Physics World that NASA management agrees with the recommendations in the report, adding that the agency “will use existing resources to address [them]”.

The post NASA criticized over its management of $3.3bn Dragonfly mission to Titan appeared first on Physics World.

How the slowest experiment in the world became a fast success

30 September 2025 at 12:00

Nothing is really known about the origin of the world-famous “pitch-drop experiment” at the School of Physics, Trinity College Dublin. Discovered in the 1980s during a clear-out of dusty cupboards, this curious glass funnel contains a dark, black substance. All we do know is that it was prepared in October 1944 (assuming you trust the writing on it). We don’t know who filled the funnel, with what exactly, or why.

Placed on a shelf at Trinity, the funnel was largely ignored by generations of students passing by. But anyone who looked closely would have seen a drop forming slowly at the bottom of the funnel, preparing to join older drops that had fallen roughly once a decade. Then, in 2013 this ultimate example of “slow science” went viral when a webcam recorded a video of a tear-drop blob of pitch falling into the beaker below.

The video attracted more than two million hits on YouTube (a huge figure back then) and the story was covered on the main Irish evening TV news. We also had a visit from German news magazine Der Spiegel, while Discover named it as one of the top 100 science stories of 2013. As one of us (SH) described in a 2014 Physics World feature, the iconic experiment became “the drop heard round the world”.

Pitching the idea

Inspired by that interest, we decided to create custom-made replicas of the experiment to send to secondary schools across Ireland as an outreach initiative. It formed part of our celebrations of 300 years of physics at Trinity, which dates back to 1724 when the college established the Erasmus Smith’s Professorship in Natural and Experimental Philosophy.

An outreach activity that takes 10 years for anything to happen is obviously never going to work. Technical staff at Trinity’s School of Physics, who initiated the project, therefore experimented for months with different tar samples. Their goal was a material that appears solid but will lead to a falling drop every few months – not every decade.

After hitting upon a special mix of two types of bitumen in just the right proportion, the staff also built a robust experimental set-up consisting of a stand, a funnel and flask to hold any fallen drops. Each was placed on a wooden base and contained inside a glass bell jar. There were also a thermometer and a ruler for data-taking along with a set of instructions.

Over 100 schools – scattered all over Ireland – applied for one of the set-ups, with a total of 37 selected to take part. Most kits were personally hand-delivered to schools, which were also given a video explaining how to unpack and assemble the set-ups. On 27 November 2024 we held a Zoom call with all participating schools, culminating in the official call to remove the funnel stopper. The race was on.

Joining the race

Each school was asked to record the temperature and length of the thread of pitch slowly emerging from the funnel. They were also given a guide to making a time-lapse video of the drop and provided with information about additional experiments to explore the viscosity of other materials.

To process incoming data, we set up a website, maintained by yet another one of our technical staff. It contained interactive graphs showing the increase in drop length for every school, together with the temperature when the measurement was taken. All data were shared between schools.

After about four months, four schools had recorded a pitch drop and we decided to take stock at a half-day event at Trinity in March 2025. Attended by more than 80 pupils aged 12–18 and teachers from 17 schools, we were amazed by how much excitement our initiative had created. It spawned huge levels of engagement, with lots of colourful posters.

By the end of the school year, most had recorded a drop, showing our tar mix had worked well. Some schools had also done experiments testing other viscous materials, such as syrup, honey, ketchup and oil, examining the effect of temperature on flow rate. Others had studied the flow of granular materials, such as salt and seeds. One school had even captured on video the moment their drop fell, although sadly nobody was around to see it in person.

Some schools displayed the kits in their school entrance, others in their trophy cabinet. One group of students appeared on their local radio station; another streamed the set-up live on YouTube. The pitch-drop experiment has been a great way for students to learn basic scientific skills, such as observation, data-taking, data analysis and communication.

As for teachers, the experiment is an innovative way for them to introduce concepts such as viscosity and surface tension. It lets them explore the notion of multiple variables, measurement uncertainty and long-time-scale experiments. Some are now planning future projects on statistical analysis using the publicly available dataset or by observing the pitch drop in a more controlled environment.

Wouldn’t it be great if other physics departments followed our lead?

The post How the slowest experiment in the world became a fast success appeared first on Physics World.

Cosmic muons monitor river sediments surrounding Shanghai tunnel

25 septembre 2025 à 17:00
Photograph of the portable muon detector in the Shanghai tunnel
Trundling along A portable version of the team’s muon detector was used along the length of the tunnel. (Courtesy: Kim Siang Khaw et al/Journal of Applied Physics/CC BY 4.0)

Researchers in China say that they are the first to use cosmic-ray muography to monitor the region surrounding a tunnel. Described as a lightweight, robust and affordable scintillator setup, the technology was developed by Kim Siang Khaw at Shanghai Jiao Tong University and colleagues. They hope that their approach could provide a reliable and non-invasive method for the real-time monitoring of subterranean infrastructure.

Monitoring the structural health of tunnels and other underground infrastructure is challenging because of the lack of access. Inspection often relies on techniques such as borehole drilling, sonar scanning, and multibeam echo sounders to determine when maintenance is needed. These methods can be invasive, low resolution and involve costly and disruptive shutdowns. As a result there is often a trade-off between the quality of inspections and the frequency at which they are done.

This applies to the Shanghai Outer Ring Tunnel: a major travel artery in China’s largest city, which runs for almost 3 km beneath the Huangpu River. Completed in 2023, the submerged section of the tunnel is immersed in water-saturated sediment, creating a unique set of challenges for structural inspection.

Time-varying stresses

In particular, different layers of sediment surrounding the tunnel can vary widely in their density, permeability, and cohesion. As they build up above the tunnel, they can impart uneven, time-varying stresses, making it incredibly challenging for existing techniques to accurately assess when maintenance is needed.

To address these challenges, a multi-disciplinary team was formed to explore possible solutions. “During these talks, the [Shanghai Municipal Bureau of Planning and Natural Resources] emphasized the practical challenges of monitoring sediment build-up around critical infrastructure, such as the Shanghai Outer Ring Tunnel, without causing disruptive and costly shutdowns,” Khaw describes.

Among the most promising solutions they discussed was muography, which involves detecting the muons created when high-energy cosmic rays interact with Earth’s upper atmosphere. These muons can penetrate deep beneath Earth’s surface and are absorbed at highly predictable rates depending on the density of the material they pass through.

A simple version of muography involves placing a muon detector on the surface of an object and another detector beneath the object. By comparing the muon fluxes in the two detectors, the density of the object can be determined. By measuring the flux attenuation along different paths through the object, an image of the interior density of the object can be obtained.
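
A toy numerical example of this flux-comparison idea is sketched below. It assumes a deliberately simplified exponential attenuation law with a single effective attenuation length; real muography analyses use measured open-sky muon spectra and angle-dependent flux models, so every number here is purely illustrative.

```python
# Toy illustration of density estimation from a muon-flux comparison.
# Assumes a simplified exponential attenuation law with an effective
# attenuation length LAMBDA (in g/cm^2); real analyses use measured
# open-sky muon spectra and angle-dependent flux models.
import math

LAMBDA = 2.5e3       # hypothetical effective attenuation length, g/cm^2
PATH_LENGTH = 1.5e3  # path length through the overburden, cm (15 m)

def areal_density(flux_open_sky, flux_below):
    """Opacity (areal density, g/cm^2) inferred from the ratio of the two fluxes."""
    ratio = flux_below / flux_open_sky
    return -LAMBDA * math.log(ratio)

def mean_density(flux_open_sky, flux_below, path_length_cm=PATH_LENGTH):
    """Average density (g/cm^3) along the muon path."""
    return areal_density(flux_open_sky, flux_below) / path_length_cm

# Example: the detector beneath the object sees 30% of the open-sky flux.
print(mean_density(flux_open_sky=1.0, flux_below=0.30))  # ~ 2.0 g/cm^3
```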

Muography has been used for several decades in areas as diverse as archaeology, volcanology and monitoring riverbanks. So far, however, its potential for monitoring underground infrastructure has gone largely untapped.

“We took this ‘old-school’ technique and pioneered its use in a completely new scenario: dynamically monitoring low-density, watery sediment build-up above a submerged, operational tunnel,” Khaw explains. “Our approach was not just in the hardware, but in integrating the detector data with a simplified tunnel model and validating it against environmental factors like river tides.”

With its durable, lightweight, and affordable design, the scintillator features a dual-layer configuration that suppresses background noise while capturing cosmic muons over a broad range of angles. Crucially, it is portable and could be discreetly positioned inside an underground tunnel to carry out real-time measurements, even as traffic flows.

Sediment profiles

To test the design, Khaw’s team took measurements along the full length of the Shanghai Outer Ring Tunnel while it was undergoing maintenance, allowing them to map out a profile of the sediment surrounding the tunnel. They then compared their muon flux measurements with model predictions based on sediment profiles for the Huangpu River measured in previous years. They were pleased to obtain results that were better than anticipated.

“We didn’t know the actual tidal height until we completed the measurement and checked tidal gauge data,” Khaw describes. “The most surprising and exciting discovery was a clear anti-correlation between muon flux and the tidal height of the Huangpu River.” Unexpectedly, the detector was also highly effective at measuring the real-time height of water above the tunnel, with its detected flux closely following the ebb and flow of the tides.

Reassuringly, the team’s measurements confirmed that there are no as-yet unmapped obstructions or gaps in the sediment above the tunnel, thereby confirming the structure’s safety.

“Additionally, we have effectively shown a dual-purpose technology: it offers a reliable, non-invasive method for sediment monitoring and also reveals a new technique for tidal monitoring,” says Khaw. “This opens the possibility of using muon detectors as multi-functional sensors for comprehensive urban infrastructure and environmental oversight.”

The research is described in the Journal of Applied Physics.

The post Cosmic muons monitor river sediments surrounding Shanghai tunnel appeared first on Physics World.

Gyroscopic backpack improves balance for people with movement disorder

25 septembre 2025 à 10:00

A robotic backpack equipped with gyroscopes can enhance stability for people with severe balance issues and may eventually remove the need for mobility walkers. Designed to dampen unintended torso motion and improve balance, the backpack employs similar gyroscopic technology to that used by satellites and space stations to maintain orientation. Individuals with the movement disorder ataxia put the latest iteration of the device – the GyroPack – through its paces in a series of standing, walking and body motion exercises.

In development for over a decade, GyroPack is the brainchild of a team of neurologists, biomechanical engineers and rehabilitation specialists at the Radboud University Medical Centre, Delft University of Technology (TU Delft) and Erasmus Medical Centre. The first tests of its ability to improve balance in adults with ataxia, described in npj Robotics, produced results encouraging enough to continue the GyroPack’s development as a portable robotic wearable for individuals with neurological conditions.

Degenerative ataxias, a variety of diseases of the nervous system, cause progressive cerebellar dysfunction manifesting as symptoms including lack of coordination, imbalance when standing and difficulty walking. Ataxia can afflict people of all ages, including young children. Managing the progressive symptoms may require the lifetime use of cumbersome, heavily weighted walkers to aid mobility and prevent falls.

GyroPack design

The 6 kg version of the GyroPack tested in this study contains two control moment gyroscopes (CMGs), which are attitude-control devices that control orientation relative to a specific inertial frame of reference. Each CMG consists of a flywheel and a gimbal, which together generate the change in angular momentum that’s exerted onto the wearer to resist unintended torso rotations. Each CMG also contains an inertial measurement unit to determine the orientation and angular rate of change of the CMG.
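
The torque a CMG delivers follows from the standard gyroscopic relation: slewing the spinning flywheel’s angular-momentum vector with the gimbal produces a reaction torque on whatever the device is mounted to. Schematically (our notation, not the paper’s),

$$\boldsymbol{\tau}_{\mathrm{output}} = -\frac{\mathrm{d}\mathbf{h}}{\mathrm{d}t} = -\,\boldsymbol{\omega}_{\mathrm{gimbal}} \times \mathbf{h}_{\mathrm{flywheel}},$$

so a modest gimbal rate acting on a large stored angular momentum yields a sizeable output torque without needing a correspondingly large, heavy motor.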

The backpack also holds two independent, 1.5 kg miniaturized actuators designed by the team that convert energy into motion. The system is controlled by a laptop and powered through a separate power box that filters and electrically isolates signals for safety. All activities can be immediately terminated when an emergency stop button is pushed.

Lead researcher Jorik Nonnekes of Radboud UMC describes how the system works: “The change of orientation imposed by the gimbal motor, combined with the angular momentum of the flywheels, causes a free moment, or torque, that is exerted onto the system the CMG is attached to – which in this study is the human upper body,” he explains. “A cascaded control scheme reliably deals with actuator limitations without causing undesired disturbances on the user. The gimbals are controlled in such a way that the torque exerted on the trunk is proportional and opposite to the trunk’s angular velocity, which effectively lets the system damp rotational motion of the wearer. This damping has been shown to make balancing easier for unimpaired subjects and individuals post-stroke.”
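
The quoted control law – torque proportional and opposite to the trunk’s angular velocity – amounts to a rotational damper. The sketch below simulates that idea for a single rotational degree of freedom; the inertia, gain, torque limit and disturbance level are invented for illustration, and the real GyroPack’s cascaded controller, which also manages gimbal limitations, is not modelled.

```python
# Minimal sketch of velocity-proportional damping of trunk rotation.
# All numbers (inertia, gain, torque limit, disturbance) are hypothetical;
# the real GyroPack uses a cascaded control scheme that also manages
# gimbal limits, which is not modelled here.
import numpy as np

I_TRUNK = 1.5        # trunk moment of inertia about the roll axis, kg m^2
DAMPING_GAIN = 8.0   # controller gain c in tau = -c * omega, N m s/rad
TAU_MAX = 5.0        # actuator torque limit, N m
DT = 0.005           # integration time step, s

def simulate(duration=10.0, assistive=True, seed=0):
    """Integrate trunk angular velocity under random sway disturbances."""
    rng = np.random.default_rng(seed)
    n = int(duration / DT)
    omega = 0.0
    history = np.empty(n)
    for k in range(n):
        disturbance = rng.normal(0.0, 0.8)       # sway disturbance torque, N m
        tau = -DAMPING_GAIN * omega if assistive else 0.0
        tau = np.clip(tau, -TAU_MAX, TAU_MAX)    # respect the actuator limit
        omega += (disturbance + tau) / I_TRUNK * DT
        history[k] = omega
    return history

for mode in (False, True):
    rms = np.sqrt(np.mean(simulate(assistive=mode) ** 2))
    print(f"assistive={mode}: RMS trunk angular velocity = {rms:.3f} rad/s")
```

Running the two cases shows the damping term pulling the trunk’s angular velocity back towards zero, which is the sense in which the device “effectively lets the system damp rotational motion of the wearer”.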

Performance assessment

Study participant wearing the GyroPack
Exercise study A participant wearing the GyroPack. (Courtesy: npj Robot. 10.1038/s44182-025-00041-4)

For the study, 14 recruits diagnosed with degenerative ataxia performed five tasks: standing still with feet together and arms crossed for up to 30 s; walking on a treadmill for 2 min without using the handrail; making a clockwise and a counterclockwise 360° turn-in-place; performing a tandem stance with the heel of one foot touching the toes of the other for up to 30 s; and testing reactive balance by applying two forward and two backward treadmill perturbations.

The participants performed these tasks under three conditions, two whilst wearing the backpack and one without as a baseline. In one scenario, the backpack was operated in assistive mode to investigate its damping power and torque profiles. In the other, the backpack was in “sham mode”, without assistive control but with sound and motor vibrations indistinguishable from normal operation.

The researchers report that when fully operational, the GyroPack increased the user’s average standing time compared with not wearing the backpack at all. When used during walking, it reduced the variability of trunk angular velocity and the extrapolated centre-of-mass, two common indicators of gait stability. The trunk angular velocity variability also showed a significant reduction when comparing assistive to sham GyroPack modes. However, the performance of turn-in-place and perturbation recovery tasks were similar for all three scenarios.

Interestingly, wearing the backpack in the sham scenario improved walking tasks compared with not wearing a backpack at all. The researchers tentatively attributed this either to the extra weight on the torso improving body stabilization or to a placebo effect.

Next, the team plans to redesign the device to make it lighter and quieter. “It’s not yet suitable for everyday use,” says Nonnekes in a press statement. “But in the future, it could help people with ataxia participate more freely in daily life, like attending social events without needing a walker, which many find bulky and inconvenient. This could greatly enhance their mobility and overall quality of life.”

The post Gyroscopic backpack improves balance for people with movement disorder appeared first on Physics World.

Environmental physics should be on a par with quantum physics or optics

24 septembre 2025 à 12:00

The world is changing rapidly – economically, geopolitically, technologically, militarily and environmentally. But when it comes to the environment, many people feel the world is on the cusp of catastrophe. That’s especially true for anyone directly affected by endemic environmental disasters, such as drought or flooding, where mass outmigration is the only possible option.

The challenges are considerable and the crisis is urgent. But we know that physics has already contributed enormously to society – and I believe that environmental physics can make a huge difference by identifying, addressing and alleviating the problems at stake. However, physicists will only be able to make a difference if we put environmental physics at the centre of our university teaching.

Grounded in physics

Environmental physics is the study of how living organisms respond to their environment within the framework of physical principles and processes. It examines the interactions within and between the biosphere, the hydrosphere, the cryosphere, the lithosphere, the geosphere and the atmosphere. Stretching from geophysics, meteorology and climate change to renewable energy and remote sensing, it also covers soils and vegetation, the urban and built environment, and the survival of humans and animals in extreme environments.

Environmental physics was pioneered in the UK in the 1950s by the physicists Howard Penman and John Monteith, who were based at the Rothamsted Experimental Station, which is one of the oldest agricultural research institutions in the world. In recent decades, environmental physics has become more prevalent in universities across the world.

Some UK universities either teach environmental physics in their undergraduate physics degrees or include elements of it within environmental science degrees. That’s the approach taken, for example, by University College London as well as the universities of Cambridge, Leicester, Manchester, Oxford, Reading, Strathclyde and Warwick.

When it comes to master’s degrees in environmental physics, there are 17 related courses in the UK, including nuclear and environmental physics at Glasgow and radiation and environmental protection at Surrey. Even the London School of Economics has elements of environmental physics in some of its business, geography and economics degrees via a “physics of climate” course.

But we need to do more. The interdisciplinary nature of environmental physics means it overlaps with not just physics and maths but agriculture, biology, chemistry, computing, engineering, geology and health science too.

Indeed, recent developments in machine learning, digital technology and artificial intelligence (AI) have had an impact on environmental physics – for example, through the use of drones in environmental monitoring and simulations – while AI algorithms can catalyse modelling and weather forecasting. AI could also in future be used to predict natural disasters, such as earthquakes, tsunamis, hurricanes and volcanic eruptions, and to assess the health implications of environmental pollution.

Environmental physics is exciting and challenging, and has solid foundations in mathematics and the sciences through experiments both in the lab and in the field. Environmental measurements are a great way to learn about uncertainties, monitoring and modelling, while providing scope for project work and teamwork. A grounding in environmental physics can also open the door to lots of exciting career opportunities, with continuing environmental change meaning such research will remain vital.

Solving major regional and global environmental problems is a key part of sociopolitics and so environmental physics has a special role to play in the public arena. It gives students the chance to develop presentational and interpersonal skills that can be used to influence decision makers at local and national government level.

Taken together, I believe a module on environmental physics should, as a minimum, be a component of every undergraduate physics degree, ideally carrying the same weight as quantum or statistical physics or optics. Students of environmental physics have the potential to be enabled, engaged and, ultimately, empowered to meet the demands that the future holds.

The post Environmental physics should be on a par with quantum physics or optics appeared first on Physics World.

Negative time observed in photon-atom interaction

23 septembre 2025 à 17:00

“Negative time” might sound like science fiction, but an international team of theorists and experimentalists has determined that a photon can, in fact, spend a negative amount of time in an excited atomic state while passing through a cloud of atoms. The finding could have applications in studies of light-matter interactions and quantum sensing – though not, alas, in time travel or other sensational effects.

Quantum mechanics has produced a lot of weird results, and the latest originated in 2022 with an experiment conducted by physicists at the University of Toronto, Canada. Led by Aephraim Steinberg, they found that when a photon passing through a cloud of atoms excites an electron in one of the atoms, it seems to spend a similar amount of time in this atomic excitation as a photon that passes straight through the cloud, apparently without exciting an atom at all.

A theoretical framework

To understand the theory behind this counterintuitive result, Steinberg and colleagues worked with researchers from the Massachusetts Institute of Technology in the US, Griffith University in Australia, and the Indian Institute of Science Education and Research. The framework they developed, which they now describe in APL Quantum, involves a single photon being sent into an atom cloud that is continuously monitored with a so-called “weak probe” that detects the presence of an atomic excitation anywhere in the cloud. Integrating this weak-probe signal over time thus provides a measure of how long the photon spends in an excited atomic state before it leaves the cloud.
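
In schematic form (our notation, not necessarily that of the APL Quantum paper), the quantity obtained in this way is

$$\bar{\tau}_{e\,|\,T} \;=\; \int_{-\infty}^{\infty} \big\langle \hat{P}_e(t) \big\rangle_{w\,|\,T}\,\mathrm{d}t,$$

where $\hat{P}_e$ projects onto states with one atomic excitation in the cloud and the average is a weak value conditioned (“post-selected”) on the photon being transmitted. Because weak values are not confined to the eigenvalue range of the projector, which runs from 0 to 1, this conditioned dwell time can come out negative even though the unconditioned excitation time is always positive.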

After crunching the numbers, the researchers made a prediction that surprised even them: the average excitation time can be negative. They also found that this excitation time should be the same as another, more familiar, time known as the group delay.

The team’s lead theorist, Griffith’s Howard Wiseman, says it is important to distinguish between these two times. A negative group delay, he observes, can be explained in a relatively intuitive way. Because the front of the photon pulse exits the atom cloud before the peak of the pulse enters it, and the peak never exits because most of the photons are scattered, there is, he says, an “illusion” that the photons leave the medium before they arrive.

However, he continues, what this framework actually measures is the time a transmitted photon spends in the atom cloud. It says nothing about whether such a photon excites an atom on its way through the cloud. Although it is normally assumed that any photon that excites an atom gets randomly scattered and never reaches the detector, says Wiseman, “We now say that this is not true and forward-scattered photons actually contribute a lot to the average measurement”.

Real-time measurements

To test this theory, Steinberg and colleagues set up a new experiment that sends two counter-propagating laser beams into a cloud of 85Rb atoms that have been cooled to 60–70 µK. The first beam contains the photons that may give rise to atomic excitation and may be either transmitted or scattered. The second beam is used for the weak measurements and detects the presence of an excitation via tiny shifts in its phase. These measurements required a high level of stability and a low level of interference in all parts of the setup.

After refining their system, the researchers measured average atomic excitation times for transmitted photons ranging from (–0.82 ± 0.31)𝜏0 for the most narrowband pulse to (0.54 ± 0.28)𝜏0 for the most broadband pulse. Here, 𝜏0 is the excitation time averaged over both scattered and transmitted photons, which is always positive and ranges from 10 to 20 ns, depending on various parameters. This result shows that negative excitation times do indeed have a physical reality in quantum measurements.

A matter of time

According to Steinberg, while he and his colleagues previously knew that negative numbers could pop out of the mathematics, they tended to sweep them under the rug and make excuses for them, assuming that while they correctly described the location of a peak, they weren’t physically relevant. “I am now led to revisit this and say: those negative numbers appear to have more physical significance than we would previously have attributed to them,” he tells Physics World. As a result, he hopes to “begin to investigate more deeply what we think the meaning of a ‘negative time’ is”.

Jonte Hance, a quantum physicist at Newcastle University, UK, who was not involved in this research, warns that interpreting negative time too literally can lead to paradoxes that aren’t necessary for the physics to work. Nevertheless, he says, the “anomalous” values recorded in the weak measurement “point to something interesting and quantum happening”.

Hance explains that in his view, a negative value for the mean atomic excitation time for transmitted photons implies contextuality – a property of quantum systems whereby measuring the system in different ways can make it look like it has incompatible properties if we assume that measuring the system does nothing to it. “Contextuality seems to be one of the tell-tale signs a quantum scenario may provide us with an advantage at a certain task over all possible classical ways of doing that task,” he says. “And so it makes me excited for what this could be used for.”

The post Negative time observed in photon-atom interaction appeared first on Physics World.

Bridging the gap between scientists, policy makers and industry to build the quantum ecosystem

23 septembre 2025 à 12:00

When we started our PhDs in physics at Imperial College London, our paths seemed conventional: a lot of lab work, conferences and a bit of teaching on the side. What we did not expect was that within a couple of years we would be talking with MPs in the House of Commons, civil servants in Whitehall and business leaders in industry. We found ourselves contributing to policy reports and organizing roundtable discussions alongside policy-makers, scientists and investors; focusing on quantum technology and its impact on the economy and society.

Our journey into science policy engagement started almost by chance. Back in 2022 we received an e-mail from Imperial‘s Centre for Quantum Engineering Science and Technology (QuEST) advertising positions for PhD students to support evidence-based policy-making. Seeing it as an opportunity to contribute beyond the lab, we both took up the challenge. It became an integral part of our PhD experience. What started as a part-time role alongside our PhDs turned into something much more than that.

Mixing PhDs and policy

Three people stood in an otherwise empty large stone built chamber
Getting involved From left: Dimitrie Cielecki, Elizabeth Pasatembou and Michael Ho in the UK Houses of Parliament. (Courtesy: Craig Whittall)

Elizabeth Pasatembou

Elizabeth Pasatembou started her PhD in 2021, working with the particle-physics group and Centre for Cold Matter at Imperial College London. Her research focused on quantum sensing for fundamental physics as part of the Atom Interferometer Observatory and Network (AION) project. She will soon start as a postdoctoral fellow working on quantum communications with the Cyprus Quantum Communications Infrastructure (CyQCI) team at the Cyprus University of Technology, which is part of the pan-European Quantum Communication Infrastructure (EuroQCI) project.

Her interest in science policy engagement started out of curiosity and the desire to make a more immediate impact during her PhD. “Research can feel slow,” she says. “Taking up this role and getting involved in policy gave me the chance to use my expertise in a way that felt directly relevant, and develop new skills along the way. I also saw this as an opportunity to challenge myself and try something new.”

Pasatembou also worked on a collaborative project between Imperial’s Institute for Deep Tech Entrepreneurship and QuEST, conducting interviews with investors to inform the design of a tailored curriculum on quantum technologies for the investor community.

Dimitrie Cielecki

Dimitrie Cielecki joined Imperial’s Complex Nanophotonics group as a PhD candidate in 2021. The opportunity to work in science policy came at a time when his research was evolving in new directions. “The first year of my PhD was not straightforward, with my project taking unexpected, yet exciting, turns in the realm of photonics, but shifting away from quantum,” explains Cielecki, whose PhD topic was spatio-temporal light shaping for metamaterials.

After seeing an advert for a quantum-related policy fellowship, he decided to jump in. “I didn’t even know what supporting policy-making meant at that point,” he says. “But I quickly became driven by the idea that my actions and opinions could have a quick impact in this field.”

Cielecki is now a quantum innovation researcher at the Institute for Deep Tech Entrepreneurship in the Imperial Business School, where he is conducting research on the correlations between technical progress, investors’ confidence and commercial success in the emerging quantum sector.

We joined QuEST and the Imperial Policy Forum – the university’s policy engagement programme – in 2022 and were soon sitting at the table with leading voices in the nascent quantum technology field. We had many productive conversations with senior figures from most quantum technology start-ups in the UK. We also found ourselves talking to leaders of the National Quantum Technology Programme (including its chair, Sir Peter Knight); to civil servants from the Office for Quantum in the Department of Science, Innovation and Technology (DSIT); and to members of both the House of Commons and the House of Lords.

Sometimes we would carry out tasks such as identifying the relevant stakeholders for an event or a roundtable discussion with policy implications. Other times we would do desk research and contribute to reports used in the policy-making process. For example, we responded to the House of Commons written evidence inquiry on Commercialising Quantum Technologies (2023) and provided analysis and insights for the Regulatory Horizons Council report Regulating Quantum Technology Applications (2024). We also moderated a day of roundtable discussions with quantum specialists for the Parliamentary Office of Science and Technology’s briefing note Quantum Computing, Sensing and Communications (2025).

A two-way street

When studying science, we tend to think of it as a purely intellectual exercise, divorced from the real world. But we know that the field is applied to many areas of life, which is why countries, governments and institutions need policies to decide how science should be regulated, taught, governed and so on.

Science policy has two complementary sides. First, it’s about how governments and institutions support and shape the practice of science through, for example, how funding is allocated. Second, science policy looks at how scientific knowledge informs and guides policy decisions in society, which also links to the increasingly important area of evidence-informed policy-making. These two dimensions are of course linked – science policy connects the science and its applications to regulation, economics, strategy and public value.

Quantum policy specifically focuses on the frameworks, strategies and regulations that shape how governments, industries and research institutions develop and deploy quantum technologies. Many countries have published national quantum strategies, which include technology roadmaps tied to government investments. These outline the infrastructure needed to speed up the adoption of quantum technology – such as facilities, supply chains and a skilled workforce.

In the UK, the National Quantum Technology Programme (NQTP) – a government-led initiative that brings together industry, academia and government – has pioneered the idea of co-ordinated national efforts for the development of quantum technologies. Set up in 2014, the programme has influenced other countries to adopt a similar approach. The NQTP has been immensely successful in bringing together different groups from both the public and private sectors to create a productive environment that advances quantum science and technology. Co-operation and communication have been at the core of this programme, which has led to the UK’s 10-year National Quantum Strategy. Launched in 2023, this details specific projects to help accelerate technological progress and make the country a leading quantum-enabled economy. But that won’t happen unless we have mechanisms to help translate science into innovation, resilient supply chains, industry-led standardization, stable regulatory frameworks and a trained workforce.

cyber-security abstract illustration
Up for discussion Quantum topics being debated as national policy include quantum cryptography and security. (Courtesy: iStock/wavebreakmedia)

Quantum technologies can bring benefits for national security, from advanced sensing to secure communications. But their dual-use nature also poses potential threats as the technology matures, particularly with the prospect of cryptographically relevant quantum computers – machines powerful enough to break encryption. To mitigate these risks in a complex geopolitical landscape, governments need tailored regulations, whether that’s preparing for the transition to post-quantum cryptography (making communication safe from powerful code-cracking quantum computers) or controlling exports of sensitive products that could compromise security.

Like artificial intelligence (AI) and other emerging technologies, there are also ethical considerations to take into account when developing quantum technologies. In particular, we need policies to ensure transparency, inclusivity and equitable access. International organizations such as UNESCO and the World Economic Forum have already started integrating quantum into their policy agendas. But as quantum technology is such a rapidly evolving new field, we need to strike a balance between innovation and regulation. Too many rules can stifle innovation but, on the other hand, policy needs to keep up with innovation to avoid any future serious incidents.

Language barriers

Policy engagement involves collaborating with three sets of stakeholders – academia; industry and investors; and policy-makers. But as we started to work with these groups, we noticed each had a different way of communicating, creating a kind of language barrier. Scientists love throwing around equations, data and figures, often using highly technical terminology. Industry leaders and investors, on the other hand, talk in terms of how innovations could affect business performance and profitability, and what the risk for their investments could be. As for policy-makers, they focus more on how to distinguish between reality and hype, and look at budgets and regulations.

We found ourselves acting as cross-sector translators, seeking to bridge the gap between the three groups. We had to listen to each stakeholder’s requirements and understand what they needed to know. We then had to reframe technical insights and communicate them in a relevant and useful way – without simplifying the science. Once we grasped everyone’s needs and expectations, we offered relevant information, putting it into context for each group so everyone was on the same page.

To help us do this, we considered the stakeholders as “inventor”, “funder”, “innovator” or “regulator”. As quantum technology is such a rapidly growing sector, the groupings of academia, industry and policy-makers are so entangled that the roles are often blurred. This alternative framework helped us to identify the needs and objectives of the people we were working with and to effectively communicate our science or evidence-backed messages.

Finding the right people

During our time as policy fellows, we were lucky to have mentors to teach us how to navigate this quantum landscape. In terms of policy, Craig Whittall from the Imperial Policy Forum was our guide on protocol and policy scoping. We worked closely with QuEST management – Peter Haynes and Jess Wade – to organize discussions, collect evidence from researchers, generate policy leads, and formulate insights or recommendations. We also had the pleasure of working with other PhD students, including Michael Ho, Louis Chen and Victor Lovic, who shared the same passion for bridging quantum research and policy.

Having access to world-leading scientists and a large pool of early-career researchers spread across all departments and faculties, facilitated by the network in QuEST, made it easier for us to respond to policy inquiries. Early on, we mapped out what quantum-related research is going on at Imperial and created a database of the researchers involved. This helped inform the university’s strategy regarding quantum research, and let us identify who should contribute to the various calls for evidence by government or parliament offices.

Group of four people in front of large banner advertising QuEST
Getting started Imperial College London encourages its researchers – established and early-career – to get involved in shaping policy. From left: Dimitrie Cielecki, Michael Ho, Louis Chen, Elizabeth Pasatembou. (Courtesy: Elizabeth Pasatembou)

PhD students are often treated as learners rather than contributors. But our experience showed that, with the right support and guidance, early-career researchers (ECRs) such as ourselves can make a real impact by offering fresh perspectives and expertise. We are the scientists, innovators or funders of the future, so there is value in training people like us to understand the bigger picture as we embark on our careers.

To encourage young researchers to get involved in policy, QuEST and DSIT recently organized two policy workshops for ECR quantum tech specialists. Civil servants from the Office for Quantum explained their efforts and priorities, while we answered questions about our experience – the aim being to help ECRs to engage in policy-making, or choose it as a career option.

In April 2025 QuEST also launched an eight-week quantum primer for policy-makers. The course was modelled on a highly successful equivalent for AI, and aimed to help policy-makers make more technically informed policy decisions. The first cohort welcomed civil servants from across government, and the course was so highly reviewed that a second will run from October 2025.

Our experience with QuEST has shown us the importance of scientists taking an active role in policy-making. With the quantum sector evolving at a formidable rate, it is vital that a framework is in place to take research from the lab to society. Scientists, industry, investors and policy-makers need to work together to create regulations and policies that will ensure the responsible use of quantum technologies that will benefit us all.

The post Bridging the gap between scientists, policy makers and industry to build the quantum ecosystem appeared first on Physics World.

Delft Circuits: cryogenic RF cable innovations offer a flexible path to quantum scalability

22 septembre 2025 à 10:55
Flexible thinking, scalable innovation Delft Circuits has established itself as a one-stop shop for scalable cryogenic I/O assemblies in quantum computing. The company’s Cri/oFlex® cabling platform combines fully integrated filtering with a compact footprint and low heatload. (Courtesy: Delft Circuits)

As manufacturers in the nascent quantum supply chain turn their gaze towards at-scale commercial opportunities in quantum computing, the scenic city of Delft in the Netherlands is emerging as a heavyweight player in quantum science, technology and innovation. At the heart of this regional quantum ecosystem is Delft Circuits, a Dutch manufacturer of specialist I/O cabling solutions, which is aligning its product development roadmap to deliver a core enabling technology for the scale-up and industrial deployment of next-generation quantum computing, communications and sensing systems.

Kuitenbrouwer “Cri/oFlex® allows us to increase the I/O cabling density easily – and by a lot.” (Courtesy: Delft Circuits)

In brief, the company’s Cri/oFlex® cryogenic RF cables comprise a stripline (a type of transmission line) based on planar microwave circuitry – essentially a conducting strip encapsulated in dielectric material and sandwiched between two conducting ground planes. The use of the polyimide Kapton® as the dielectric ensures Cri/oFlex® cables remain flexible in cryogenic environments (which are necessary to generate quantum states, manipulate them and read them out), with silver or superconducting NbTi providing the conductive strip and ground layer. The standard product comes as a multichannel flex (eight channels per flex) with a range of I/O channel configurations tailored to the customer’s application needs, including flux bias lines, microwave drive lines, signal lines or read-out lines.

“As quantum computers evolve – think more and more qubits plus increasingly exacting requirements on gate fidelity – system developers will reach a point where current coax cabling technology doesn’t cut it anymore,” explains Daan Kuitenbrouwer, co-founder of Delft Circuits. “The key to our story is that Cri/oFlex® allows us to increase the I/O cabling density easily – and by a lot – to scale the number of channels in a single system while guaranteeing high gate fidelities [minimizing noise and heating] as well as market-leading uptime and reliability.”

Quantum alignment

To put some hard-and-fast performance milestones against that claim, Kuitenbrouwer and colleagues have just published a granular product development roadmap that aligns Cri/oFlex® cabling specifications against the anticipated evolution of quantum computing systems –  from 150+ qubits today out to 40,000 qubits and beyond in 2029 (see figure, “Quantum alignment”).

Quantum alignment The new product development roadmap from Delft Circuits starts with the guiding principles, highlighting performance milestones to be achieved by the quantum computing industry over the next five years – specifically, the number of physical qubits per system and gate fidelities. By extension, cabling metrics in the Delft Circuits roadmap focus on “quantity”: the number of I/O channels per loader (i.e. the wiring trees that insert into a cryostat, with typical cryostats having between 6–24 slots for loaders) and the number of channels per cryostat (summing across all loaders); also on “quality” (the crosstalk in the cabling flex). To complete the picture, the roadmap outlines product introductions at a conceptual level to enable both the quantity and quality timelines. (Courtesy: Delft Circuits)

“Our roadmap is all about enabling, from an I/O perspective, the transition of quantum technologies out of the R&D lab into at-scale practical applications,” says Kuitenbrouwer. “As such, we studied the development roadmaps of more than 10 full-stack quantum computing vendors to ensure that our ‘guiding principles’ align versus the aggregate view of quantity and quality of qubits targeted by the system developers over time.”

Notwithstanding the emphasis on technology innovation and continuous product improvement, Delft Circuits is also “coming of age” in line with the wider quantum community. Most notably, the company’s centre of gravity is shifting inexorably from academic end-users to servicing vendors large and small in the quantum supply chain. “What we see are full-stack quantum computing companies starting to embrace horizontal thinking – which, in our case, means a technology partner able to solve their entire I/O cabling challenge,” explains Kuitenbrouwer.

To gain traction, however, systems integrators at the sub-stack level must, as a given, design their product offering with industrial metrics front-and-centre – for example, scalability, manufacturability, reliability, cost per I/O channel and second-sourcing. Equally important is the need to forge long-term vendor-customer relationships that often move beyond the transactional into the realm of co-development and collaboration – though all against a standardized package of cabling options.

“We integrate Cri/oFlex® with cryostats that have relatively standard vacuum feedthroughs and thermalization – more or less the same across the board,” says Kuitenbrouwer. What changes is the type of qubit – superconducting, spin, photonic – which in turn determines the configuration of the I/O line and where to place the attenuators, low-pass filters and IR filters. “This is something we can adjust relatively easily – at high volume and high reliability – with the whole I/O package installed and tested at the customer premises,” he adds.

Timing is key for quantum advantage

Commercially, Delft Circuits is already making real headway, getting “in the door” with many of the leading developers of quantum computing systems in North America and Europe. One of the main reasons for that is the ability to respond to customer requirements in an agile and timely fashion, argues Sal Bosman, a fellow co-founder of Delft Circuits.

Bosman “Currently, we are the only industrial supplier able to deliver flexible circuits of superconducting materials at scale.” (Courtesy: Delft Circuits)

“We work on the basis of a very structured design process, playing to our strengths in superconductor fabrication, integrated microwave components and cryogenic engineering,” Bosman notes. “We have also developed our own in-house software to simulate the performance of Cri/oFlex® cabling in full-stack quantum systems. No other vendor can match this level of customer support and attention to detail.”

Right now, though, it’s all about momentum as Delft Circuits seeks to capitalize on its first-mover advantage and what Bosman claims is the unique value proposition of its Cri/oFlex® technology: a complete and inherently scalable I/O solution with integrated flex cables incorporating filters and high-density interconnects to quantum chips or control electronics.

With this in mind, the company is busy constructing a new 750 m² clean-room (with an option to double that footprint) alongside its existing 1000 m² in-house pilot-production and test facility. “Currently, we are the only industrial supplier able to deliver flexible circuits of superconducting materials at scale,” concludes Bosman.

“Over the next two to three years,” he adds, “we have a credible opportunity to grab significant market share when it comes to cabling I/O for quantum. Watch this space: a lot of customers are already coming to us saying ‘we don’t want to buy more coax, we want to work with you.’”

Location, location, location

Cryogenic integration Delft Circuits can supply fully pre-assembled loaders with Cri/oFlex® cabling inside. (Courtesy: Delft Circuits)

Delft Circuits sits within a thriving regional cluster for quantum science and technology called Quantum Delta Delft, which is centred around the canal-ringed city of Delft between The Hague and Rotterdam.

Formed in 2017 and initially located at the Faculty of Applied Sciences at Delft University of Technology (TU Delft), Delft Circuits has since grown as an independent company and is now based in the historic Cable District, where its facilities include a dedicated fabrication, pilot-production and testing area.

TU Delft is itself home to a high-profile interfaculty research institute called QuTech, a collaboration with the Netherlands Organisation for Applied Scientific Research (TNO) that’s tasked with developing full-stack hardware and software layers (including enhanced qubit technologies) for quantum computing and quantum communications systems.

Alongside this academic powerhouse, the Delft region has seen the emergence of other quantum tech start-ups like QuantWare (quantum chips), Qblox (control electronics) and Orange Quantum Systems (test solutions). All three companies work closely with Delft Circuits as part of the ImpaQT UA cooperative, a joint effort to develop open standards and interoperable technologies that enable system integrators to build quantum computing hardware stacks from off-the-shelf components.

“The ImpaQT UA story is ongoing,” explains Kuitenbrouwer. “As partners, we are super-complementary and collaborate closely to shape the future of quantum computing.” That’s why the new development roadmap is so important for Delft Circuits: to communicate a vision from the “component layer” up the value chain to the full-stack quantum computing companies.

As well as the talent pipeline that comes with proximity to TU Delft and QuTech, Quantum Delta Delft is home to TNO’s Quantum Information Technology Testing (QITT) Facility, which enables European companies to evaluate their cryogenic or non-cryogenic quantum devices and software in a full-stack quantum computing set-up.

The post Delft Circuits: cryogenic RF cable innovations offer a flexible path to quantum scalability appeared first on Physics World.

How the STFC Hartree Centre is helping UK industry de-risk quantum computing investment

18 septembre 2025 à 11:00

What role does the Hartree Centre play in quantum computing?

The Hartree Centre gives industry fast-track access to next-generation supercomputing, AI and digital capabilities. We are a “connector” when it comes to quantum computing, helping UK businesses and public-sector organizations to de-risk the early-stage adoption of a technology that is not yet ready to buy off-the-shelf. Our remit spans quantum software, theoretical studies and, ultimately, the integration of quantum computing into existing high-performance computing (HPC) infrastructure and workflows.

What does industry need when it comes to quantum computing?

It’s evident that industry wants to understand the commercial upsides of quantum computing, but doesn’t yet have the necessary domain knowledge and skill sets to take full advantage of the opportunities. By working with the STFC Hartree Centre, businesses can help their computing and R&D teams to bridge that quantum knowledge gap.   

How does the interaction with industry partners work?

The Hartree Centre’s quantum computing effort is built around a cross-disciplinary team of scientists and a mix of expertise spanning physics, chemistry, mathematics, computer science and quantum information science. We offer specialist quantum consultancy to clients across industries as diverse as energy, pharmaceuticals and food manufacturing.

How does that work in practice?

We begin by doing the due diligence on the client’s computing challenge, understanding the computational bottlenecks and, where appropriate, translating the research problem so that it can be executed, in whole or in part, on a quantum computer or a mixture of hybrid and quantum computing resources.

What are the operational priorities for the Hartree Centre in quantum computing?

Integrating classical HPC and quantum computing is a complex challenge along three main pathways: infrastructure – bridging fundamentally different hardware architectures; software – workflow management, resource scheduling and organization; and finally applications – adapting and optimizing computing workflows across quantum and classical domains. All of these competencies are mandatory for successful exploitation of quantum computing systems.

So it’s likely these pathways will converge?

Correct. Ultimately, the task is how to distribute a workload so that it runs partly on an HPC platform and partly on a quantum computer, when many of the algorithms and data streams must loop back and forth between the two systems.
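
As a rough sketch of what that back-and-forth looks like, the toy loop below has a classical optimizer repeatedly submit parameters to a quantum resource and post-process the results on the classical side. The function names and the faked backend are placeholders invented for illustration – this is not the Quantum Resource Management Interface API or any actual Hartree Centre tooling.

```python
# Generic sketch of a hybrid HPC/quantum workflow loop. submit_to_quantum_backend()
# and classical_cost() are placeholders standing in for whatever scheduler and
# application code are actually used; the "quantum" evaluation is faked classically.
import numpy as np

def submit_to_quantum_backend(parameters):
    """Placeholder: pretend to run a parametrized circuit and return a noisy
    expectation value (faked here with a classical function plus shot noise)."""
    return np.cos(parameters).sum() + np.random.normal(0.0, 0.001)

def classical_cost(expectation_value):
    """Placeholder for HPC-side post-processing of the quantum result."""
    return expectation_value ** 2

def hybrid_loop(n_params=4, n_iterations=100, learning_rate=0.05, fd_step=0.1):
    params = np.random.uniform(-np.pi, np.pi, n_params)
    for _ in range(n_iterations):
        # Each iteration bounces between the two domains: quantum evaluations
        # of the current (and shifted) parameters, then a classical update.
        base = classical_cost(submit_to_quantum_backend(params))
        grad = np.empty(n_params)
        for i in range(n_params):
            shifted = params.copy()
            shifted[i] += fd_step
            grad[i] = (classical_cost(submit_to_quantum_backend(shifted)) - base) / fd_step
        params -= learning_rate * grad
    return params, classical_cost(submit_to_quantum_backend(params))

final_params, final_cost = hybrid_loop()
print("final cost after hybrid optimization:", final_cost)
```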

How do you link up classical computing and quantum resources?

We have been addressing this problem with our quantum technology partners – IBM and Pasqal – and a team at Rensselaer Polytechnic Institute in New York. Together, we have introduced a Quantum Resource Management Interface – an open-source tool that supports unified job submission for quantum and classical computing tasks and that’s scalable to cloud computing environments. It’s the “black-box” solution industry has been looking for to bridge the established HPC and emerging quantum domains.

The STFC Hartree Centre
Quantum hub The STFC Hartree Centre employs more than 160 scientists and technologists who specialize in supercomputing, applied scientific computing, data science, AI, cloud and quantum computing. (Courtesy: STFC)

The Hartree Centre has a flagship collaboration with IBM in quantum computing. Can you tell us more?

The Hartree National Centre for Digital Innovation (HNCDI) is a £210m public–private partnership with IBM to create innovative digital technologies spanning HPC, AI, data analytics and quantum computing. HNCDI is the cornerstone of IBM’s quantum technology strategy in the UK and, over the past four years, the collaboration has clocked up more than 30 joint projects with industry. In each of these projects, HNCDI is using quantum computers to tackle problems that are out of reach for classical computers.

Do you have any examples of early wins for HNCDI in quantum?

One is streamlining drug discovery and development. As part of a joint effort with the pharmaceutical firm AstraZeneca and quantum-software developer Algorithmiq, we have improved the accuracy of molecular modelling with the help of quantum computing and, by extension, developed a better understanding of the molecular interactions and processes involved in drug synthesis. Another eye-catching development is Qiskit Machine Learning (ML), an open-source library for quantum machine-learning tasks on quantum hardware and classical simulators. While Qiskit ML started as a proof-of-concept library from IBM, our team at the Hartree Centre has, over the past couple of years, developed it into a modular tool for non-specialist users as well as quantum computational scientists and developers.

So quantum computing could play a big role in healthcare?

Healthcare has yielded productive lines of enquiry, including a proof-of-concept study to demonstrate the potential of quantum machine-learning in cancer diagnostics. Working with Royal Brompton and Harefield Hospitals and Imperial College London, we have evaluated histopathology datasets to categorize different types of breast-cancer cells through AI workflows. It’s research that could eventually lead to better predictions regarding the onset and progression of disease.

And what about other sectors?

We have been collaborating with the German power utility E.ON to study the complex challenges that quantum computing may be able to address in the energy sector – such as strategic infrastructure development, effective energy demand management and streamlined integration of renewable energy sources.

What does the next decade look like for the Hartree Centre’s quantum computing programme?

Longer term, the goal is to enable our industry partners to become at-scale end-users of quantum computing, delivering economic and societal impact along the way. As for our own development roadmap at the Hartree Centre, we are evaluating options for the implementation of a large-scale quantum computing platform to further diversify our existing portfolio of HPC, AI, data science and visual computing technologies.

STFC Hartree Centre: helping UK industry deliver societal impact

Vassil Alexandrov
Quantum returns “Our goal is to help UK industry generate economic growth and societal impact,” says Vassil Alexandrov, CSO of the STFC Hartree Centre. (Courtesy: STFC)

The Hartree Centre is part of the Science and Technology Facilities Council (STFC), one of the main UK research councils supporting fundamental and applied initiatives in astronomy, physics, computational science and space science.

Based at the Daresbury Laboratory, part of the Sci-Tech Daresbury research and innovation campus in north-west England, the Hartree Centre has more than 160 scientists and technologists specializing in supercomputing, applied scientific computing, data science, AI, cloud and quantum computing.

“Our goal is to help UK industry generate economic growth and societal impact by exploiting advanced HPC capabilities and digital technologies,” explains Vassil Alexandrov, chief science officer at STFC Hartree Centre.

One of the core priorities for Alexandrov and his team is the interface between “exascale” computing and scalable AI. It’s a combination of technologies that’s being lined up to tackle “grand challenges” like the climate crisis and the transition from fossil fuels to clean energy.

A case in point is the Climate Resilience Demonstrator, which uses “digital twins” to simulate how essential infrastructure like electricity grids and telecoms networks might respond to extreme weather events. “These kinds of insights are critical to protect communities, maintain service delivery and build more resilient public infrastructure,” says Alexandrov.

Elsewhere, as part of the Fusion Computing Lab, the Hartree Centre is collaborating with the UK Atomic Energy Authority on sustainable energy generation from nuclear fusion. “We have a joint team of around 60 scientists and engineers working on this initiative to iterate and optimize the building blocks for a fusion power plant,” notes Alexandrov. “The end-game is to deliver net power safely and affordably to the grid from magnetically confined fusion.”

Exascale computing and AI also underpin the Research Computing and Innovation Centre, a collaboration with AWE, the organization that runs research, development and support for the UK’s nuclear-weapons stockpile.

The post How the STFC Hartree Centre is helping UK industry de-risk quantum computing investment appeared first on Physics World.

Artificial intelligence could help detect ‘predatory’ journals

17 septembre 2025 à 11:42

Artificial intelligence (AI) could help sniff out questionable open-access publications that are more interested in profit than scientific integrity. That is according to an analysis of 15,000 scientific journals by an international team of computer scientists. They find that dubious journals tend to publish an unusually high number of articles and feature authors who have many affiliations and frequently self-cite (Sci. Adv. 11 eadt2792).

Open access removes the requirement for traditional subscriptions. Articles are instead made immediately and freely available for anyone to read, with publication costs covered by authors through an article-processing charge.

But as the popularity of open-access journals has risen, there has been a growth in “predatory” journals that exploit the open-access model by making scientists pay publication fees without a proper peer-review process in place.

To build an AI-based method for distinguishing legitimate from questionable journals, Daniel Acuña, a computer scientist at the University of Colorado Boulder, and colleagues used the Directory of Open Access Journals (DOAJ) – an online, community-curated index of open-access journals.

The researchers trained their machine-learning model on 12,869 journals indexed on the DOAJ and 2536 journals that have been removed from the DOAJ due to questionable practices that violate the community’s listing criteria. The team then tested the tool on 15,191 journals listed by Unpaywall, an online directory of free research articles.

To identify questionable journals, the AI system analyses journals’ bibliometric information and the content and design of their websites, scrutinising details such as the affiliations of editorial board members and the average author h-index – a metric that quantifies a researcher’s productivity and impact.
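
To make the approach concrete, the sketch below trains a generic classifier on journal-level features of the kind described above. The feature names, the toy data and the choice of a random forest are illustrative assumptions; they are not the study’s actual feature set, data or model.

```python
# Illustrative sketch of classifying journals from bibliometric features.
# Feature names, toy data and the random-forest choice are assumptions for
# the example only; they are not the study's actual pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)
n = 2000

# Hypothetical journal-level features: articles published per year,
# mean author h-index, self-citation rate, mean affiliations per author.
X = np.column_stack([
    rng.poisson(300, n),            # articles_per_year
    rng.normal(15, 5, n),           # mean_author_h_index
    rng.beta(2, 20, n),             # self_citation_rate
    rng.normal(1.5, 0.5, n),        # affiliations_per_author
])
# Toy labels (1 = questionable), generated so that high output volume,
# low h-index and high self-citation make a journal more likely to be flagged.
score = 0.004 * X[:, 0] - 0.08 * X[:, 1] + 8.0 * X[:, 2] + 0.5 * X[:, 3]
y = (score + rng.normal(0, 0.5, n) > np.median(score)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```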

The AI model flagged 1437 journals as questionable, with the researchers concluding that 1092 were genuinely questionable while 345 were false positives.

They also identified around 1780 problematic journals that the AI screening failed to flag. According to the study authors, their analysis shows that problematic publishing practices leave detectable patterns in citation behaviour such as the last authors having a low h-index together with a high rate of self-citation.

Acuña says the tool could help to pre-screen large numbers of journals, adding, however, that “human professionals should do the final analysis”. The researchers’ AI screening system isn’t publicly accessible, but they hope to make it available to universities and publishing companies soon.

The post Artificial intelligence could help detect ‘predatory’ journals appeared first on Physics World.

Space–time crystal emerges in a liquid crystal

16 septembre 2025 à 16:27

The first-ever “space–time crystal” has been created in the US by Hanqing Zhao and Ivan Smalyukh at the University of Colorado Boulder. The system is patterned in both space and time and comprises a rigid lattice of topological solitons that are sustained by steady oscillations in the orientations of liquid crystal molecules.

In an ordinary crystal, atomic or molecular structures repeat at periodic intervals in space. In 2012, however, Frank Wilczek suggested that systems might also exist with quantum states that repeat at perfectly periodic intervals in time – even as they remain in their lowest-energy state.

First observed experimentally in 2017, these time crystals are puzzling to physicists because they spontaneously break time–translation symmetry, which states that the laws of physics are the same no matter when you observe them. In contrast, a time crystal continuously oscillates over time, without consuming energy.

A space–time crystal is even more bizarre. In addition to breaking time–translation symmetry, such a system would also break spatial symmetry, just like the repeating molecular patterns of an ordinary crystal. Until now, however, a space–time crystal had not been observed directly.

Rod-like molecules

In their study, Zhao and Smalyukh created a space–time crystal in the nematic phase of a liquid crystal. In this phase the crystal’s rod-like molecules align parallel to each other and also flow like a liquid. Building on computer simulations, they confined the liquid crystal between two glass plates coated with a light-sensitive dye.

“We exploited strong light–matter interactions between dye-coated, light-reconfigurable surfaces, and the optical properties of the liquid crystal,” Smalyukh explains.

When the researchers illuminate the top plate with linearly polarized light at constant intensity, the dye molecules rotate to align perpendicular to the direction of polarization. This reorients nearby liquid crystal molecules, and the effect propagates deeper into the bulk. However, the influence weakens with depth, so that molecules farther from the top plate are progressively less aligned.

As light travels through this gradually twisting structure, its linear polarization is transformed, becoming elliptically polarized by the time it reaches the bottom plate. The dye molecules there become aligned with this new polarization, altering the liquid crystal alignment near the bottom plate. These changes propagate back upward, influencing molecules near the top plate again.
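The polarization change through the twisted structure can be pictured with standard Jones calculus. The sketch below is a minimal illustration rather than the authors’ model: it propagates linearly polarized light through a stack of thin birefringent slabs whose optic axis twists gradually with depth, and reports the ellipticity of the exiting field. The slab count, total twist and per-slab retardance are assumed values.

```python
# Minimal Jones-calculus sketch (not the authors' simulation): linear
# polarization becomes elliptical after passing through a gradually
# twisting birefringent medium. All numerical values are illustrative.
import numpy as np

def waveplate(theta, gamma):
    """Jones matrix of a thin slab with retardance gamma, fast axis at angle theta."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, s], [-s, c]])
    D = np.diag([np.exp(-1j * gamma / 2), np.exp(1j * gamma / 2)])
    return R.T @ D @ R

n_slabs = 200
total_twist = np.pi / 2      # assumed overall director twist from top to bottom
gamma_per_slab = 0.05        # assumed retardance of each thin slab (radians)

E = np.array([1.0, 0.0])     # linearly polarized light entering the top plate
for k in range(n_slabs):
    theta = total_twist * k / n_slabs
    E = waveplate(theta, gamma_per_slab) @ E

# Ellipticity angle: 0 for linear polarization, pi/4 for circular
Ex, Ey = E
S0 = abs(Ex)**2 + abs(Ey)**2
S3 = 2 * np.imag(np.conj(Ex) * Ey)
print("ellipticity angle (rad):", 0.5 * np.arcsin(S3 / S0))
```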

Feedback loop

This is a feedback loop, with the top and bottom plates continuously influencing each other via the polarized light passing through the liquid crystal.

“These light-powered dynamics in confined liquid crystals lead to the emergence of particle-like topological solitons and the space–time crystallinity,” Smalyukh says.

In this environment, particle-like topological solitons emerge as stable, localized twists in the liquid crystal’s orientation that do not decay over time. Like particles, the solitons move and interact with each other while remaining intact.

Once the feedback loop is established, these solitons emerge in a repeating lattice-like pattern. This arrangement not only persists as the feedback loop continues, but is sustained by it. This is a clear sign that the system exhibits crystalline order in time and space simultaneously.

Accessible system

Having confirmed their conclusions with simulations, Zhao and Smalyukh are confident this is the first experimental demonstration of a space–time crystal. The discovery that such an exotic state can exist in a classical, room-temperature system may have important implications.

“This is the first time that such a phenomenon is observed emerging in a liquid crystalline soft matter system,” says Smalyukh. “Our study calls for a re-examining of various time-periodic phenomena to check if they meet the criteria of time-crystalline behaviour.”

Building on these results, the duo hope to broaden the scope of time crystal research beyond a purely theoretical and experimental curiosity. “This may help expand technological utility of liquid crystals, as well as expand the currently mostly fundamental focus of studies of time crystals to more applied aspects,” Smalyukh adds.

The research is described in Nature Materials.

The post Space–time crystal emerges in a liquid crystal appeared first on Physics World.

Top quarks embrace in quasi-bound toponium

15 septembre 2025 à 17:35

For decades, physicists believed that the top quark, the heaviest known elementary particle, was too short-lived to form a temporary pair with its antimatter partner. Unlike lighter quarks, which can combine to form protons, neutrons, or longer-lived quark–antiquark pairs, the top quark decays almost instantly. This made the idea of a top–antitop bound state – a fleeting association held together by the strong force – seem impossible. But now, the CMS collaboration at the Large Hadron Collider (LHC) has found the first evidence of such a state, which is dubbed toponium.

Gautier Hamel de Monchenault, spokesperson for CMS, explains, “Many physicists long believed this was impossible. That’s why this result is so significant — it challenges assumptions that have been around for decades, and particle physics textbooks will likely need to be updated because of it.”

Protons and neutrons are formed from quarks, which are fundamental particles that cannot be broken down into smaller constituents.

“There are six types of quark,” explains the German physicist Christian Schwanenberger, who is at DESY and the University of Hamburg and was not involved in the study. “Five of them form bound states thanks to the strong force, one of the four fundamental forces of nature. The top quark, however, is somehow different. It is the heaviest fundamental particle we know, but so far we have not observed it forming bound states in the same way the others do.”

Quasi-bound state

The top quark’s extreme mass makes it decay almost immediately after it is produced. “The top and antitop quarks just have time to exchange a few gluons, the carriers of the strong force, before one of them decays, hence the appellation ‘quasi-bound state’,” Hamel de Monchenault explains.

By detecting these ephemeral interactions, physicists can observe the strong force in a new regime – and the CMS team developed a clever new method to do so. The breakthrough came when the team examined how the spins of the top quark and antitop quark influence each other to create a subtle signature in the particles produced when the quarks decay.

Top quarks are produced in proton–proton collisions at the LHC, where they quickly decay into other particles. Each top quark decays to a bottom quark, which forms a jet of particles that can be detected, and a W boson, which itself decays into lighter particles (leptons) such as electrons or muons, accompanied by neutrinos.

“We can detect the charged leptons directly and measure their energy very precisely, but we have to infer the presence of the neutrinos indirectly, through an imbalance of the total energy measured,” says Hamel de Monchenault. By studying the pattern and energy of the leptons and jets, the CMS team deduced the existence of top–antitop pairs and spotted the subtle signature of the fleeting quasi-bound state.
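The momentum-imbalance trick can be made concrete in a few lines of code. In the plane transverse to the beam, the missing momentum attributed to the escaping neutrinos is simply the negative vector sum of the transverse momenta of all the visible objects. The event content below is invented purely for illustration.

```python
# Illustrative sketch of inferring "invisible" neutrinos from the momentum
# imbalance of the visible particles. The event below is made up.
import numpy as np

# (pT in GeV, azimuthal angle phi in radians) for the visible objects:
# charged leptons and jets reconstructed by the detector.
visible = [(45.0, 0.3), (38.0, 2.9), (72.0, -1.1), (55.0, 1.8)]

px = sum(pt * np.cos(phi) for pt, phi in visible)
py = sum(pt * np.sin(phi) for pt, phi in visible)

met = np.hypot(px, py)           # magnitude of the missing transverse momentum
met_phi = np.arctan2(-py, -px)   # direction attributed to the escaping neutrinos

print(f"missing transverse momentum: {met:.1f} GeV at phi = {met_phi:.2f} rad")
```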

Statistical significance

The CMS researchers observed an excess of events in which the top and antitop quarks were produced almost at rest relative to each other – the precise condition needed for a quasi-bound state to form. “The signal has a statistical significance above 5σ, which means the chance it’s just a statistical fluctuation is less than one in a few million,” Hamel de Monchenault says.
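For readers who want to see where that number comes from: in particle physics a significance quoted in sigma is conventionally converted to a probability using the one-sided tail of a standard normal distribution.

```python
# One-sided p-value corresponding to a 5 sigma excess
from scipy.stats import norm

p_value = norm.sf(5)         # survival function, P(Z > 5)
print(f"p ≈ {p_value:.1e}")  # ≈ 2.9e-7, i.e. roughly 1 in 3.5 million
```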

While this excess accounts for only about 1% of top quark pair production, it aligns with predictions for toponium formation and offers insights into the strong force.

“Within the achieved precision, the result matches the predictions of advanced calculations involving the strong force,” explains Hamel de Monchenault. “An effect once thought too subtle to detect with current technology has now been observed. It’s comforting in a way: even the heaviest known quarks are not always alone – they can briefly embrace their opposites.”

Future directions

The discovery has energized the particle physics community. “Scientists are excited to explore the strong force in a completely new regime,” says Schwanenberger. Researchers will refine theoretical models, simulate toponium more precisely, and study its decay patterns and excited states. Much of this work will rely on the High-Luminosity LHC, expected to start operations around 2030, and potentially on future electron–positron colliders capable of studying top quarks with unprecedented precision.

“The present results are based on LHC data recorded between 2015 and 2018 [Run 2]. Since 2022, ATLAS and CMS have been recording data at a slightly higher energy, which is favourable for top quark production. The amount of data already surpasses that of Run 2, and we expect that with such huge amounts of data, the properties of this new signal can be studied in detail,” Hamel de Monchenault says.

This research could ultimately answer a fundamental question: is the top quark simply another quark like its lighter siblings, or could it hold the key to physics beyond the Standard Model? “Investigating different toponium states will be a key part of the top quark research programme,” Schwanenberger says. “It could reshape our understanding of matter itself and reveal whatever holds the world together in its inmost folds.”

The results are published in Reports on Progress in Physics.

The post Top quarks embrace in quasi-bound toponium appeared first on Physics World.

Researchers map the unrest in the Vulcano volcano

15 septembre 2025 à 10:00

The isle of Vulcano is part of the central volcanic ridge of the Aeolian archipelago in the Tyrrhenian Sea off southern Italy. Over the course of its history, Vulcano has undergone multiple explosive eruptions, with the last one thought to have occurred around 1888–1890. However, there is an active hydrothermal system under Vulcano that has shown evidence of intermittent magma and gas flows since 2021 – a sign that the volcano has been in a state of unrest.

During unrest, the volcanic risk increases significantly – and the summer months on the island currently attract a lot of tourists who might be at risk, even from minor eruptive events or episodes of increased degassing. To examine why this unrest has occurred, researchers from the University of Geneva have collaborated with the National Institute of Geophysics and Volcanology (INGV) in Italy to create a 3D model of the interior of the volcano on Vulcano, using a combination of nodal seismic networks and artificial intelligence (AI).

Until now, few studies have examined the deep underground details of volcanoes, instead capturing only a broad outline of their internal structure. This is because the geological domains where eruptions nucleate are often inaccessible to airborne geophysical techniques, while onshore studies don’t penetrate far enough into the volcanic plumbing system to show how the magma and hydrothermal fluids mix. Recent studies have outlined such plumbing systems, but they’ve not had sufficient resolution to distinguish the magma from the hydrothermal system.

3D modelling of the volcano

To better understand what could have caused the 2021 Vulcano unrest, the researchers deployed a nodal network of 196 seismic sensors across Vulcano and Lipari (another island in the archipelago) to measure secondary seismic waves (S-waves) using a technique called seismic ambient noise tomography. S-waves propagate more slowly through fluid-rich zones, which allows magma to be identified.

The researchers captured the S-wave data using the nodal sensor network and processed it with AI – using a deep neural network. This allowed the extensive seismic dispersion data to be quickly and automatically recovered, enabling generation of a 3D S-wave velocity model. The data were captured during the early phase of the volcano’s unrest, with the sensors recording the natural ground vibrations over a period of one month. The model revealed high-resolution tomography of the shallow part of a volcanic system in unrest, an approach the team compares to taking an “X-ray” of the volcano.
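The core idea of ambient noise tomography – cross-correlating long records of background vibrations at pairs of sensors to recover the travel time of surface waves between them – can be sketched in a few lines. The example below uses synthetic traces and an assumed sampling rate and inter-station delay; it is a conceptual illustration, not the team’s deep-learning pipeline.

```python
# Conceptual first step of ambient noise tomography: cross-correlating noise
# records from two sensors recovers the inter-station travel time, which in
# turn constrains the S-wave velocity along the path. Traces are synthetic.
import numpy as np
from scipy.signal import fftconvolve

fs = 50.0                              # assumed sampling rate (Hz)
t = np.arange(0, 3600, 1 / fs)         # one hour of synthetic "noise"
rng = np.random.default_rng(0)

source = rng.standard_normal(t.size)   # common ambient wavefield
delay = int(2.4 * fs)                  # assumed 2.4 s inter-station travel time
station_a = source + 0.5 * rng.standard_normal(t.size)
station_b = np.roll(source, delay) + 0.5 * rng.standard_normal(t.size)

# Cross-correlate and find the peak lag, which recovers the travel time
xcorr = fftconvolve(station_b, station_a[::-1], mode="full")
lags = np.arange(-t.size + 1, t.size) / fs
print(f"recovered travel time: {lags[np.argmax(xcorr)]:.2f} s")
```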

“Our study shows that our end-to-end ambient noise tomography method works with an unprecedented resolution due to using dense nodal seismic networks,” says lead author Douglas Stumpp from the University of Geneva. “The use of deep neural networks allowed us to quickly and accurately measure enormous seismic dispersion data to provide near-real time monitoring.”

The model showed that there was no new magma body between Lipari and Vulcano within the first 2 km of the Earth’s crust, but it did reveal regions that could host cooling melts at the base of the hydrothermal system. These were proposed to be degassing melts that could readily release gas and brines if disturbed by an earthquake – suggesting that tectonic fault dynamics may trigger volcanic unrest. It’s thought that the volcano might have released trapped fluids at depth after being perturbed by fault activity during the 2021 unrest.

Improving risk management

While this method doesn’t enable the researchers to predict when an eruption will happen, it provides significant insight into how the internal dynamics of volcanoes work during periods of unrest. The use of AI enables rapid processing of large amounts of data, so in the future the approach could be used as an early warning system by analysing the behaviour of the volcano as it unfolds.

In theory, this could help to design dynamic evacuation plans based on the direct real-time behaviour of the volcano, which would potentially save lives. The researchers state that this could take some time to develop due to the technical challenge of processing such massive volumes of data in real time – but they note that this is now more feasible thanks to machine learning and deep learning.

When asked about how the researchers plan to further develop the research, Stumpp concludes that “our study paves the ground for 4D ambient noise tomography monitoring – three dimensions of space and one dimension of time. However, I believe permanent and maintained seismic nodal networks with telemetric access to the data need to be implemented to achieve this goal”.

The research is published in Nature Communications.

The post Researchers map the unrest in the Vulcano volcano appeared first on Physics World.

High-speed 3D microscope improves live imaging of fast biological processes

12 septembre 2025 à 10:00

A new high-speed multifocus microscope could facilitate discoveries in developmental biology and neuroscience thanks to its ability to image rapid biological processes over the entire volume of tiny living organisms in real time.

The pictures from many 3D microscopes are obtained sequentially by scanning through different depths, making them too slow for accurate live imaging of fast-moving natural functions in individual cells and microscopic animals. Even current multifocus microscopes that capture 3D images simultaneously either have relatively poor image resolution or can only image to shallow depths.

In contrast, the new 25-camera “M25” microscope – developed during his doctorate by Eduardo Hirata-Miyasaki and his supervisor Sara Abrahamsson, both then at the University of California Santa Cruz, together with collaborators at the Marine Biological Laboratory in Massachusetts and the New Jersey Institute of Technology – enables high-resolution 3D imaging over a large field-of-view, with each camera capturing 180 × 180 × 50 µm volumes at a rate of 100 per second.

“Because the M25 microscope is geared towards advancing biomedical imaging we wanted to push the boundaries for speed, high resolution and looking at large volumes with a high signal-to-noise ratio,” says Hirata-Miyasaki, who is now based in the Chan Zuckerberg Biohub in San Francisco.

The M25, detailed in Optica, builds on previous diffractive-based multifocus microscopy work by Abrahamsson, explains Hirata-Miyasaki. In order to capture multiple focal planes simultaneously, the researchers devised a multifocus grating (MFG) for the M25. This diffraction grating splits the image beam coming from the microscope into a 5 × 5 grid of evenly illuminated 2D focal planes, each of which is recorded on one of the 25 synchronized machine vision cameras, so that every camera in the array is focused on a different depth within the sample. To avoid blurred images, a custom-designed blazed grating in front of each camera lens corrects for the chromatic dispersion (which spreads out light of different wavelengths) introduced by the MFG.
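As a rough sketch of what acquisition looks like on the software side, the snippet below stacks 25 simultaneously recorded focal planes into a single depth stack and estimates the raw data rate. The per-camera pixel count and bit depth are assumptions for illustration, not specifications of the instrument.

```python
# Assemble one volume from 25 simultaneous camera frames and estimate the
# raw data rate. Pixel counts and bit depth are assumed, not the M25's specs.
import numpy as np

n_cameras = 25               # 5 x 5 grid of focal planes
frame_shape = (512, 512)     # assumed sensor region per camera (pixels)
volumes_per_second = 100     # volume rate quoted in the article

# In a real acquisition each frame would come from a different camera,
# already sorted by the depth of its focal plane.
frames = [np.zeros(frame_shape, dtype=np.uint16) for _ in range(n_cameras)]
volume = np.stack(frames, axis=0)   # shape (25, 512, 512): a z-stack
print("volume shape (z, y, x):", volume.shape)

bytes_per_volume = volume.nbytes
print(f"raw data rate ≈ {bytes_per_volume * volumes_per_second / 1e9:.2f} GB/s")
```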

The team used computer simulations to reveal the optimal designs for the diffractive optics, before creating them at the University of California Santa Barbara nanofabrication facility by etching nanometre-scale patterns into glass. To encourage widespread use of the M25, the researchers have published the fabrication recipes for their diffraction gratings and made the bespoke software for acquiring the microscope images open source. In addition, the M25 mounts to the side port of a standard microscope, and uses off-the-shelf cameras and camera lenses.

The M25 can image a range of biological systems, since it can be used for fluorescence microscopy – in which fluorescent dyes or proteins are used to tag structures or processes within cells – and can also work in transmission mode, in which light is shone through transparent samples. The latter allows small organisms like C. elegans larvae, which are commonly used for biological research, to be studied without disrupting them.

The researchers performed various imaging tests using the prototype M25, including observations of the natural swimming motion of entire C. elegans larvae. This ability to study cellular-level behaviour in microscopic organisms over their whole volume may pave the way for more detailed investigations into how the nervous system of C. elegans controls its movement, and how genetic mutations, diseases or medicinal drugs affect that behaviour, Hirata-Miyasaki tells Physics World. He adds that such studies could further our understanding of human neurodegenerative and neuromuscular diseases.

“We live in a 3D world that is also very dynamic. So with this microscope I really hope that we can keep pushing the boundaries of acquiring live volumetric information from small biological organisms, so that we can capture interactions between them and also [see] what is happening inside cells to help us understand the biology,” he continues.

As part of his work at the Chan Zuckerberg Biohub, Hirata-Miyasaki is now developing deep-learning models for analysing multichannel live datasets of dynamic cells and organisms, like those acquired by the M25, “so that we can extract as much information as possible and learn from their dynamics”.

Meanwhile Abrahamsson, who is currently working in industry, hopes that other microscopy development labs will make their own M25 systems. She is also considering commercializing the instrument to help ensure its widespread use.

The post High-speed 3D microscope improves live imaging of fast biological processes appeared first on Physics World.

LIGO could observe intermediate-mass black holes using artificial intelligence

10 septembre 2025 à 19:41

A machine learning-based approach that could help astronomers detect lower-frequency gravitational waves has been unveiled by researchers in the UK, US, and Italy. Dubbed deep loop shaping, the system would apply real-time corrections to the mirrors used in gravitational wave interferometers. This would dramatically reduce noise in the system, and could lead to a new wave of discoveries of black hole and neutron star mergers – according to the team.

In 2015, the two LIGO interferometers made the very first observation of a gravitational wave, which was attributed to the merger of two black holes roughly 1.3 billion light-years from Earth.

Since then, numerous gravitational waves have been observed at frequencies of 30–2000 Hz. These are believed to come from the mergers of stellar-mass black holes and neutron stars.

So far, however, the lower reaches of the gravitational wave frequency spectrum (corresponding to much larger black holes) have gone largely unexplored. Being able to detect gravitational waves at 10–30 Hz would allow us to observe the mergers of intermediate-mass black holes of 100–100,000 solar masses. We could also measure the eccentricities of binary black hole orbits. However, these detections are not currently possible because of vibrational noise in the mirrors at the end of each interferometer arm.

Subatomic precision

“As gravitational waves pass through LIGO’s two 4-km arms, they warp the space between them, changing the distance between the mirrors at either end,” explains Rana Adhikari at Caltech, who is part of the team that has developed the machine-learning technique. “These tiny differences in length need to be measured to an accuracy of 10⁻¹⁹ m, which is 1/10,000th the size of a proton. [Vibrational] noise has limited LIGO for decades.”
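Those figures are easy to sanity-check. Taking an approximate proton charge radius of 0.84 fm, the quoted displacement sensitivity works out to roughly one ten-thousandth of a proton, and spread over a 4 km arm it corresponds to a strain of a few times 10⁻²³.

```python
# Back-of-envelope check of the quoted sensitivity figures
arm_length = 4e3             # m, LIGO arm length
displacement = 1e-19         # m, quoted measurement accuracy
proton_radius = 0.84e-15     # m, approximate proton charge radius

print(f"fraction of a proton: 1/{proton_radius / displacement:.0f}")   # ~1/8400
print(f"equivalent strain:    {displacement / arm_length:.1e}")        # ~2.5e-23
```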

To minimize noise today, these mirrors are suspended by a multi-stage pendulum system to suppress seismic disturbances. The mirrors are also polished and coated to eliminate surface imperfections almost entirely. On top of this, a feedback control system corrects for many of the remaining vibrations and imperfections in the mirrors.

Yet for lower-frequency gravitational waves, even this subatomic level of precision and correction is not enough. As a laser beam impacts a mirror, the mirror can absorb minute amounts of energy – creating tiny thermal distortions that complicate mirror alignment. In addition, radiation pressure from the laser, combined with seismic motions that are not fully eliminated by the pendulum system, can introduce unwanted vibrations in the mirror.

The team proposed that this problem could finally be addressed with the help of artificial intelligence (AI). “Deep loop shaping is a new AI method that helps us to design and improve control systems, with less need for deep expertise in control engineering,” says Jonas Buchli at Google DeepMind, who led the research. “While this is helping us to improve control over high precision devices, it can also be applied to many different control problems.”

Deep reinforcement learning

The team’s approach is based on deep reinforcement learning, whereby a system tests small adjustments to its controls and adapts its strategy over time through a feedback system of rewards and penalties.

With deep loop shaping, the team introduced smarter feedback controls for the pendulum system suspending the interferometer’s mirrors. This system can adapt in real time to keep the mirrors aligned with minimal control noise – counteracting thermal distortions, seismic vibrations, and forces induced by radiation pressure.
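To give a flavour of the reward-and-penalty idea – at nothing like the sophistication of deep loop shaping – the toy sketch below repeatedly perturbs a single feedback gain acting on a noisy mass-on-a-spring stand-in for a mirror, keeping a perturbation only if it reduces the residual motion. Every number in it is invented for illustration.

```python
# Toy "trial and reward" controller tuning (not DeepMind's method): keep a
# change to the feedback gain only if it lowers the residual mirror motion.
import numpy as np

rng = np.random.default_rng(1)

def residual_motion(gain, steps=2000, dt=1e-3):
    """RMS displacement of a damped oscillator with feedback and random kicks."""
    x, v = 0.0, 0.0
    total = 0.0
    for _ in range(steps):
        disturbance = rng.normal(scale=1e-3)
        force = -gain * x - 0.5 * v + disturbance   # feedback plus fixed damping
        v += force * dt
        x += v * dt
        total += x * x
    return np.sqrt(total / steps)

gain = 1.0
best = residual_motion(gain)
for _ in range(50):                      # crude trial-and-reward loop
    candidate = gain + rng.normal(scale=0.5)
    score = residual_motion(candidate)
    if score < best:                     # reward: smaller residual motion
        gain, best = candidate, score

print(f"tuned gain ≈ {gain:.2f}, residual motion ≈ {best:.2e} (arbitrary units)")
```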

“We tested our controllers repeatedly on the LIGO system in Livingston, Louisiana,” Buchli continues. “We found that they worked as well on hardware as in simulation, confirming that our controller keeps the observatory’s system stable over prolonged periods.”

Based on these promising results, the team is now hopeful that deep loop shaping could help to boost the cosmological reach of LIGO and other existing detectors, along with future generations of gravitational-wave interferometers.

“We are opening a new frequency band, and we might see a different universe much like the different electromagnetic bands like radio, light, and X-rays tell complementary stories about the universe,” says team member Jan Harms at the Gran Sasso Science Institute in Italy. “We would gain the ability to observe larger black holes, and to provide early warnings for neutron star mergers. This would allow us to tell other astronomers where to point their telescopes before the explosion occurs.”

The research is described in Science.

The post LIGO could observe intermediate-mass black holes using artificial intelligence appeared first on Physics World.
