
‘Breathing’ crystal reversibly releases oxygen

9 September 2025 at 17:03

A new transition-metal oxide crystal that reversibly and repeatedly absorbs and releases oxygen could be ideal for use in fuel cells and as the active medium in clean energy technologies such as thermal transistors, smart windows and new types of batteries. The “breathing” crystal, discovered by scientists at Pusan National University in Korea and Hokkaido University in Japan, is made from strontium, cobalt and iron and contains oxygen vacancies.

Transition-metal oxides boast a huge range of electrical properties that can be tuned all the way from insulating to superconducting. This means they can find applications in areas as diverse as energy storage, catalysis and electronic devices.

Among the different material parameters that can be tuned are the oxygen vacancies. Indeed, ordering these vacancies can produce new structural phases that show much promise for oxygen-driven programmable devices.

Element-specific behaviours

In the new work, a team of researchers led by physicist Hyoungjeen Jeen of Pusan and materials scientist Hiromichi Ohta in Hokkaido studied SrFe0.5Co0.5Ox. The researchers focused on this material, they say, since it belongs to the family of topotactic oxides, which are the main oxides being studied today in solid-state ionics. “However, previous work had not discussed which ion in this compound was catalytically active,” explains Jeen. “What is more, the cobalt-containing topotactic oxides studied so far were fragile and easily fractured during chemical reactions.”

The team succeeded in creating a unique platform from a solid solution of epitaxial SrFe0.5Co0.5O2.5 in which both the cobalt and iron ions bathed in the same chemical environment. “In this way, we were able to test which ion was better for reduction reactions and whether or not it sustained its structural integrity,” Jeen tells Physics World. “We found that our material showed element-specific reduction behaviours and reversible redox reactions.”

The researchers made their material using pulsed laser deposition, a technique ideal for the epitaxial synthesis of multi-element oxides, which allowed them to grow SrFe0.5Co0.5O2.5 crystals in which the iron and cobalt ions were randomly located. This random arrangement was key to the material’s ability to repeatedly release and absorb oxygen, they say.

“It’s like giving the crystal ‘lungs’ so that it can inhale and exhale oxygen on command,” says Jeen.

Stable and repeatable

This simple breathing picture comes from the difference in the catalytic activity of cobalt and iron in the compound, he explains. Cobalt ions prefer to lose and gain oxygen and these ions are the main sites for the redox activity. However, since iron ions prefer not to lose oxygen during the reduction reaction, they serve as pillars in this architecture. This allows for stable and repeatable oxygen release and uptake.

Until now, most materials that absorb and release oxygen in such a controlled fashion were either too fragile or only functioned at extremely high temperatures. The new material works under more ambient conditions and is stable. “This finding is striking in two ways: only cobalt ions are reduced, and the process leads to the formation of an entirely new and stable crystal structure,” explains Jeen.

The researchers also showed that the material could return to its original form when oxygen was reintroduced, so proving that the process is fully reversible. “This is a major step towards the realization of smart materials that can adjust themselves in real time,” says Ohta. “The potential applications include developing a cathode for intermediate solid oxide fuel cells, an active medium for thermal transistors (devices that can direct heat like electrical switches), smart windows that adjust their heat flow depending on the weather and even new types of batteries.”

Looking ahead, Jeen, Ohta and colleagues aim to investigate the material’s potential for practical applications.

They report their present work in Nature Communications.

The post ‘Breathing’ crystal reversibly releases oxygen appeared first on Physics World.

New hollow-core fibres break a 40-year limit on light transmission

9 September 2025 at 11:32

Optical fibres form the backbone of the Internet, carrying light signals across the globe. But some light is always lost as it travels, becoming attenuated by about 0.14 decibels per kilometre even in the best fibres. That means signals must be amplified every few dozen kilometres – a performance that hasn’t improved in nearly four decades.
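
To get a feel for why amplifiers are needed every few dozen kilometres, the decibel figure can be converted into remaining optical power. A minimal Python sketch (the 100 km span is just an illustrative choice):

```python
# Fibre loss in decibels converts to a power ratio via P_out/P_in = 10^(-alpha*L/10),
# where alpha is the attenuation in dB/km and L the fibre length in km.
def power_fraction(alpha_db_per_km: float, length_km: float) -> float:
    """Fraction of optical power remaining after a given fibre length."""
    return 10 ** (-alpha_db_per_km * length_km / 10)

# At the long-standing 0.14 dB/km limit, only ~4% of the light survives 100 km:
print(f"{power_fraction(0.14, 100):.1%}")  # prints "4.0%"
```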

Physicists at the University of Southampton, UK have now developed an alternative that could call time on that decades-long lull. Writing in Nature Photonics, they report hollow-core fibres that exhibit 35% less attenuation while transmitting signals 45% faster than standard glass fibres.

“A bit like a soap bubble”

The core of conventional fibres is made of pure glass and is surrounded by a cladding of slightly different glass. Because the core has a higher refractive index than the cladding, light entering the fibre reflects internally, bouncing back and forth in a process known as total internal reflection. This effect traps the light and guides it along the fibre’s length.

The Southampton team led by Francesco Poletti swapped the standard glass core for air. Because air is more transparent than glass, channelling light through it cuts down on scattering and speeds up signals. The problem is that air’s refractive index is lower, so the new fibre can’t use total internal reflection. Instead, Poletti and colleagues guided the light using a mechanism called anti-resonance, which requires the walls of the hollow core to be made from ultra-thin glass membranes.

“It’s a bit like a soap bubble,” Poletti says, explaining that such bubbles appear iridescent because their thin films reflect some wavelengths and let others through. “We designed our fibre the same way, with glass membranes that reflect light at certain frequencies back into the core.” That anti-resonant reflection, he adds, keeps the light trapped and moving through the fibre’s hollow centre.

Greener telecommunications

To make the new air-core fibre, the researchers stacked thin glass capillaries in a precise pattern, forming a hollow channel in the middle. Heating and drawing the stack into a hair-thin filament preserved this pattern on a microscopic scale. The finished fibre has a nested design: an air core surrounded by ultra-thin layers that provide anti-resonant guidance and cut down on leakage.

To test their design, the team measured transmission through a full spool of fibre, then cut the fibre shorter and compared the results. They also fired in light pulses and tracked the echoes. Their results show that the hollow fibres reduce attenuation to just 0.091 decibels per kilometre. This lower loss implies that fewer amplifiers would be needed in long cables, lowering costs and energy use. “There’s big potential for greener telecommunications when using our fibres,” says Poletti.
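
As a rough illustration of the amplifier-spacing argument, assume a fixed loss budget between amplifiers; the 20 dB figure below is an assumed round number for the sketch, not a value from the paper:

```python
def span_length_km(loss_budget_db: float, alpha_db_per_km: float) -> float:
    """Maximum fibre length before a given loss budget is used up."""
    return loss_budget_db / alpha_db_per_km

BUDGET_DB = 20.0  # hypothetical loss allowed between amplifiers
conventional = span_length_km(BUDGET_DB, 0.14)   # ≈ 143 km
hollow_core = span_length_km(BUDGET_DB, 0.091)   # ≈ 220 km
print(f"{hollow_core / conventional:.2f}x longer spans")
```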

Poletti adds that reduced attenuation (and thus lower energy use) is only one of the new fibre’s advantages. At the 0.14 dB/km attenuation benchmark, the new hollow fibre supports a bandwidth of 54 THz compared to 10 THz for a normal fibre. At the reduced 0.1 dB/km attenuation, the bandwidth is still 18 THz, which is close to twice that of a normal cable. This means that a single strand can carry far more channels at once.

Perhaps the most impressive advantage is that because the speed of light is faster in air than in glass, data could travel the same distance up to 45% faster. “It’s almost the same speed light takes when we look at a distant star,” Poletti says. The resulting drop in latency, he adds, could be crucial for real-time services like online gaming or remote surgery, and could also speed up computing tasks such as training large language models.
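
The latency claim follows directly from the group index of the medium. A sketch assuming typical values (about 1.45 for silica glass, 1.0003 for air; the 5000 km route is an arbitrary example):

```python
C_KM_PER_S = 299_792.458  # speed of light in vacuum

def one_way_latency_ms(length_km: float, group_index: float) -> float:
    """Propagation delay of light through a medium with the given group index."""
    return length_km * group_index / C_KM_PER_S * 1000

glass = one_way_latency_ms(5000, 1.45)   # ≈ 24.2 ms
air = one_way_latency_ms(5000, 1.0003)   # ≈ 16.7 ms
print(f"air is {(glass - air) / air:.0%} faster")  # ≈ 45%, matching the reported figure
```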

Field testing

As well as the team’s laboratory tests, Microsoft has begun testing the fibres in real systems, installing segments in its network and sending live traffic through them. These trials prove the hollow-core design works with existing telecom equipment, opening the door to gradual rollout. In the longer run, adapting amplifiers and other gear that are currently tuned for solid glass fibres could unlock even better performance.

Poletti believes the team’s new fibres could one day replace existing undersea cables. “I’ve been working on this technology for more than 20 years,” he says, adding that over that time, scepticism has given way to momentum, especially now with Microsoft as an industry partner. But scaling up remains a real hurdle. Making short, flawless samples is one thing; mass-producing thousands of kilometres at low cost is another. The Southampton team is now refining the design and pushing toward large-scale manufacturing. They’re hopeful that improvements could slash losses by another order of magnitude and that the anti-resonant design can be tuned to different frequency bands, including those suited to new, more efficient amplifiers.

Other experts agree the advance marks a turning point. “The work builds on decades of effort to understand and perfect hollow-core fibres,” says John Ballato, whose group at Clemson University in the US develops fibres with specialty cores for high-energy laser and biomedical applications. While Ballato notes that such fibres have been used commercially in shorter-distance communications “for some years now”, he believes this work will open them up to long-haul networks.


Indefinite causal order: how quantum physics is challenging our understanding of cause and effect

9 September 2025 at 10:01

The concept of cause and effect plays an important role in both our everyday lives, and in physics. If you set a ball down in front of a window and kick it hard, a split-second later the ball will hit the window and smash it. What we don’t observe is a world where the window smashes on its own, thereby causing the ball to be kicked – that would seem rather nonsensical. In other words, kick before smash, and smash before kick, are two different physical processes each having a unique and definite causal order.

But does definite causal order also reign supreme in the quantum world, where concepts like position and time can be fuzzy? Most physicists are happy to accept the paradox of Schrödinger’s cat – a thought experiment in which a cat hidden in a box is simultaneously dead and alive, until you open the box to check. Schrödinger’s cat illustrates the quantum concept of “superposition”, whereby a system can be in two or more states at the same time. It is only when a measurement is made (by opening the box) that the system collapses into one of its possible states.

But could two (or more) causally distinct processes occur at the same time in the quantum world? The answer, perhaps shockingly, is yes and this paradoxical phenomenon is called indefinite causal order (ICO).

Stellar superpositions and the order of time

It turns out that different causal processes can also exist in a superposition. One example is a thought experiment called the “gravitational quantum switch”, which was proposed in 2019 by Magdalena Zych of the University of Queensland and colleagues (Nat. Comms 10 3772). This features our favourite quantum observers Alice and Bob, who are in the vicinity of a very large mass, such as a star. Alice and Bob have initially synchronized clocks which, in the absence of gravitational effects, would continue to run at identical rates. However, Einstein’s general theory of relativity dictates that the flow of time is influenced by the distribution of matter in the vicinity of Alice and Bob. This means that if Alice is closer to the star than Bob, then her clock will run slower than Bob’s, and vice versa.
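
The clock-rate asymmetry can be sketched with the standard Schwarzschild time-dilation factor; the solar mass and the two radii below are arbitrary illustrative choices, not values from the thought experiment:

```python
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8       # speed of light, m/s
M_SUN = 1.989e30  # solar mass, kg

def clock_rate(r_m: float, mass_kg: float = M_SUN) -> float:
    """Ticking rate of a static clock at radius r, relative to a distant observer."""
    return (1 - 2 * G * mass_kg / (r_m * C ** 2)) ** 0.5

near = clock_rate(7.0e8)  # roughly one solar radius from the centre
far = clock_rate(1.4e9)   # twice as far away
assert near < far < 1.0   # the clock nearer the star runs slower
```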

Like with Schrödinger’s cat, quantum mechanics allows the star to be in a superposition of spatial states; meaning that in one state Alice is closer to the star than Bob, and in the other Bob is closer to the star than Alice. In other words, this is a superposition of a state in which Alice’s clock runs slower than Bob’s, and a state in which Bob’s clock runs slower than Alice’s.

Alice and Bob are both told they will receive a message at a specific time (say noon) and that they should then pass that message on to their counterpart. If Alice’s clock is running faster than Bob’s then she will receive the message first, and then pass it on to Bob, and vice versa. This superposition of Alice-to-Bob with Bob-to-Alice is an example of indefinite causal order.

Now, you might be thinking “so what”, because this seems to be a trivial example. But it becomes more interesting if you replace the message with a quantum particle like a photon, and have Alice and Bob perform different operations on that photon. If the two operations do not commute – such as rotations of the photon polarization in the X and Z planes – then the order in which the operations are done will affect the outcome.
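
The non-commutativity is easy to check explicitly. A minimal sketch using the Pauli X and Z matrices acting on a single polarization qubit:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)   # bit-flip (rotation about X)
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # phase-flip (rotation about Z)

psi = np.array([1, 0], dtype=complex)  # photon prepared in state |0>

alice_then_bob = Z @ (X @ psi)  # X applied first, then Z
bob_then_alice = X @ (Z @ psi)  # Z applied first, then X
print(np.allclose(alice_then_bob, bob_then_alice))  # prints "False"
```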

As a result, this “gravitational quantum switch” is a superposition of two different causal processes with two different outcomes. Alice and Bob could also perform more exotic operations on the photon, such as “measure-and-reprepare” operations (where a quantum system is first measured, and then, based on the measurement outcome, a new quantum state is prepared). In this case Alice measures the quantum state of the received photon and prepares a photon that she sends to Bob (or vice versa).

Much like Schrödinger’s cat, a gravitational quantum switch cannot currently be realized in the lab. But, never say never. Physicists have been able to create experimental analogues of some thought experiments, so who knows what the future will bring. Indeed, a gravitational quantum switch could provide important information regarding a quantum description of gravity – something that has eluded physicists ever since quantum mechanics and general relativity were being developed in the early 20th century.

Switches and superpositions

Moving on to more practical ICO experiments, physicists have already built and tested light-based quantum switches in the lab. Instead of having the position of the star determining whether Alice or Bob go first, the causal order is determined by a two-level quantum state – which can have a value of 0 or 1. If this control state is 0, then Alice goes first and if the control state is 1, then Bob goes first. Crucially, when the control state is in a superposition of 0 and 1 the system shows indefinite causal order (see figure 1).

1 Simultaneous paths

Illustration of a photon travelling between Alice and Bob on different routes
(Illustration courtesy: Mayank Shreshtha)

In this illustration of a quantum switch a photon (driving a car) can follow two different paths, each with a different causal order. One path (top) leads to Alice’s garage followed by a visit to Bob’s drive thru. The second path (middle) visits Bob first, and then Alice. The path taken by the photon is determined by a control qubit that is represented by a traffic light. If the value of the qubit is “0” then the photon visits Alice first; if the qubit is “1” then the photon visits Bob first. Both of these scenarios have definite causal order.

However, the control qubit can exist in a quantum superposition of “0” and “1” (bottom). In this superposition, the path followed by the photon – and therefore the temporal order in which it visits Alice and Bob – is not defined. This is an example of indefinite causal order. Of course, any attempt to identify exactly which path the photon goes through initially will destroy the superposition (and therefore the ICO) and the photon will take only one definite path.

The first such quantum switch was created in 2015 by Lorenzo Procopio (now at Germany’s University of Paderborn) and colleagues at the Vienna Center for Quantum Science and Technology (Nat. Comms 6 7913). Their quantum switch involves firing a photon at a beam splitter, which puts the photon into a superposition of a photon that has travelled straight through the splitter (state 0) and a photon that has been deflected by 90 degrees (state 1). This spatial superposition is the control state of the quantum switch, playing the role of the star in the gravitational quantum switch.

State 0 photons first travel to an Alice apparatus where a polarization rotation is done in a specific direction (say X). Then the photons are sent to a Bob apparatus where a non-commuting rotation (say Z) is done. Conversely, the photons that travel along the state 1 path encounter Bob before Alice.

Finally, the state 0 and state 1 paths are recombined at a second beamsplitter, which is monitored by two photon-detectors. Because Alice-then-Bob has a different effect on a photon than does Bob-then-Alice, interference can occur between recombined photons. This interference is studied by systematically changing certain aspects of the experiment. For example, by changing Alice’s direction of rotation or the polarization of the incoming photons.
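
The interference can be reproduced in a toy single-qubit simulation. This is a sketch, not the experimental parameters: Alice’s and Bob’s operations are taken to be Pauli X and Z rotations, and the second beamsplitter is modelled as forming the sum and difference of the two path amplitudes:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)   # Alice's operation (assumed)
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # Bob's operation (assumed)

psi = np.array([1, 1], dtype=complex) / np.sqrt(2)  # input polarization state

path0 = Z @ X @ psi  # state-0 path: Alice first, then Bob
path1 = X @ Z @ psi  # state-1 path: Bob first, then Alice

# Recombining at the second beamsplitter sends the symmetric and antisymmetric
# combinations of the two operator orders to the two detectors.
p_plus = np.linalg.norm((path0 + path1) / 2) ** 2
p_minus = np.linalg.norm((path0 - path1) / 2) ** 2
print(round(p_plus, 6), round(p_minus, 6))  # X and Z anticommute, so for this input
                                            # every photon lands on one detector
```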

In 2017 quantum-information researcher Giulia Rubino, then at the Vienna Center for Quantum Science and Technology, teamed up with Procopio and colleagues to verify ICO in their quantum switch using a “causal witness” (Sci. Adv. 3 e1602589). This involves doing a specific set of experiments on the quantum switch and calculating a mathematical entity (the causal witness) that reveals whether a system has definite or indefinite causal order. Sure enough, this test revealed that their system does indeed have ICO. Since then, physicists working in several independent labs have successfully created their own quantum switches.

Computational speed up?

While this effect might still seem somewhat obscure, in 2019, an international team led by the renowned Chinese physicist Jian-Wei Pan showed that a quantum switch can be very useful for doing computations that are distributed between two parties (Phys. Rev. Lett. 122 120504). In such a scenario a string of data is received and then processed by Alice, who then passes the results on to Bob for further processing. In an experiment using photons, they showed that ICO delivers an exponential speed-up of the rate at which longer strings are processed – compared to a system with no ICO.

Physicists are also exploring if ICO could be used to enhance quantum metrology. Indeed, recent calculations by Oxford University’s Giulio Chiribella and colleagues suggest that it could lead to a significant increase in precision when compared to techniques that involve states with definite causal order (Phys. Rev. Lett. 124 190503).

While other applications could be possible, it is often difficult to work out whether ICO offers the best solution to a specific problem. For example, physicists had thought a quantum switch offered an advantage when it comes to communicating along a noisy channel, but it turns out that some configurations of Alice and Bob with definite causal order were just as good as an ICO.

Beyond the quantum switch, there are other types of circuits that would display ICO. These include “quantum circuits with quantum control of causal order”, which have yet to be implemented in the lab because of their complexity.

But despite the challenges in creating ICO systems and proving that they outperform other solutions, it looks like ICO is set to join the ranks of other weird phenomena such as superposition and entanglement that have found practical applications in quantum technologies.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the year for more coverage of the IYQ.

Find out more on our quantum channel.



Reformulation of general relativity brings it closer to Newtonian physics

5 September 2025 at 15:37

The first-ever detection of gravitational waves was made by LIGO in 2015 and since then researchers have been trying to understand the physics of the black-hole and neutron-star mergers that create the waves. However, the physics is very complicated and is defined by Albert Einstein’s general theory of relativity.

Now Jiaxi Wu, Siddharth Boyeneni and Elias Most at the California Institute of Technology (Caltech) have addressed this challenge by developing a new formulation of general relativity that is inspired by the equations that describe electromagnetic interactions. They show that general relativity behaves in the same way as the gravitational inverse square law described by Isaac Newton more than 300 years ago. “This is a very non-trivial insight,” says Most.

One of the fascinations of black holes is the extreme physics they invoke. These astronomical objects pack so much mass into so little space that not even light can escape their gravitational pull. Black holes (and neutron stars) can exist in binary systems in which the objects orbit each other. These pairs eventually merge to create single black holes in events that create detectable gravitational waves. The study of these waves provides an important testbed for gravitational physics. However, the mathematics of general relativity that describe these mergers is very complicated.

Inverse square law

According to Newtonian physics, the gravitational attraction between two masses is proportional to the inverse of the square of the distance between them – the inverse square law. However, as Most points out, “Except in special cases, general relativity was not thought to act in the same way.”
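
Newton’s law itself is one line of code; a sketch (the masses and separation are arbitrary illustrative values):

```python
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def newton_force(m1_kg: float, m2_kg: float, r_m: float) -> float:
    """Newtonian attraction F = G * m1 * m2 / r^2."""
    return G * m1_kg * m2_kg / r_m ** 2

# The inverse square law: doubling the separation quarters the force.
f_near = newton_force(2e30, 2e30, 1e9)
f_far = newton_force(2e30, 2e30, 2e9)
print(f_near / f_far)  # prints "4.0"
```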

Over the past decade, gravitational-wave researchers have taken various approaches including post-Newtonian theory and effective one-body approaches to better understand the physics of black-hole mergers. One important challenge is how to model parameters such as orbital eccentricity and precession in black hole systems and how best to understand “ringdown”. The latter is the process whereby a black hole formed by a merger emits gravitational waves as it relaxes into a stable state.

The trio’s recasting of the equations of general relativity was inspired by the Maxwell equations that describe how electric and magnetic fields leapfrog each other through space. According to these equations, the forces between electric charges diminish according to the same inverse square law as Newton’s gravitational attraction.

Early reformulations

The original reformulations of “gravitoelectromagnetism” date back to the 1990s. Most explains that among those who did this early work was his Caltech colleague and LIGO Nobel laureate Kip Thorne, who exploited a special mathematical structure of the curvature of space–time.

“This structure mathematically looks like the equations governing light and the attraction of electric charges, but the physics is quite different,” Most tells Physics World. The gravito-electric field thus derived describes how an object might squish under the forces of gravity. “Mathematically this means that the previous gravito-electric field falls off with inverse distance cubed, which is unlike the inverse distance square law of Newtonian gravity or electrostatic attraction,” adds Most.

Most’s own work follows on from previous studies of the potential radio emission from the interaction of magnetic fields during the collision of neutron stars and black holes from which it seemed reasonable to then “think about whether some of these insights naturally carry over to Einstein’s theory of gravity”. The trio began with different formulations of general relativity and electromagnetism with the aim of deriving gravitational analogues for the electric and magnetic fields that behave more closely to classical theories of electromagnetism. They then demonstrated how their formulation might describe the behaviour of a non-rotating Schwarzschild black hole, as well as a black hole binary.

Not so different

“Our work says that actually general relativity is not so different from Newtonian gravity (or better, electric forces) when expressed in the right way,” explains Most. The actual behaviour predicted is the same in both formulations but the trio’s reformulation reveals how general relativity and Newtonian physics are more similar than they are generally considered to be. “The main new thing is then what does it mean to ‘observe’ gravity, and what does it mean to measure distances relative to how you ‘observe’.”

Alexander Phillipov is a black-hole expert at the University of Maryland in the US and was not directly involved with Most’s research. He describes the research as “very nice”, adding that while the analogy between gravity and electromagnetism has been extensively explored in the past, there is novelty in the interpretation of results from fully nonlinear general relativistic simulations in terms of effective electromagnetic fields. “It promises to provide valuable intuition for a broad class of problems involving compact object mergers.”

The research is described in Physical Review Letters.


Researchers create glow-in-the-dark succulents that recharge with sunlight

5 September 2025 at 14:17

“Picture the world of Avatar, where glowing plants light up an entire ecosystem,” describes Shuting Liu of South China Agricultural University in Guangzhou.

Well, that vision is now a step closer thanks to researchers in China who have created glow-in-the-dark succulents that recharge in sunlight.

Instead of coaxing cells to glow through genetic modification, the team instead used afterglow phosphor particles – materials similar to those found in glow-in-the-dark toys – that can absorb light and release it slowly over time.

The researchers then injected the particles into succulents, finding that they produced a strong glow, thanks to the narrow, uniform and evenly distributed channels within the leaf that helped to disperse the particles.

After a couple of minutes of exposure to sunlight or indoor LED light, the modified plants glowed for up to two hours. By using different types of phosphors, the researchers created plants that shine in various colours, including green, red and blue.

The team even built a glowing plant wall with 56 succulents, which was bright enough to illuminate nearby objects.

“I just find it incredible that an entirely human-made, micro-scale material can come together so seamlessly with the natural structure of a plant,” notes Liu. “The way they integrate is almost magical. It creates a special kind of functionality.”


Big data helps Gaelic football club achieve promotion following 135-year wait

5 September 2025 at 11:18

An astrophysics PhD student from County Armagh in Northern Ireland has combined his passion for science with Gaelic football to help his club achieve a historic promotion.

Eamon McGleenan plays for his local team – O’Connell’s GAC Tullysaran – and is a PhD student at Queen’s University Belfast, where he is a member of the Predictive Sports Analytics (PSA) research team, which was established in 2023.

McGleenan and his PhD supervisor David Jess teamed up with GAC Tullysaran to investigate whether data analysis and statistical techniques could improve their training and results.

Over five months, the Queen’s University researchers took over 550 million individual measurements from the squad, which included information such as player running speed, accelerations and heart rates.

“We applied mathematical models to the big data we obtained from the athletes,” notes McGleenan. “This allowed us to examine how the athletes evolved over time and we then provided key insights for the coaching staff, who then generated bespoke training routines and match tactics.”

The efforts immediately paid off as in July GAC Tullysaran won their league by two points and were promoted for the first time in 135 years to the top-flight Senior Football League, which they will start in March.

“The statistical insight provided by PSA is of great use and I like how it lets me get the balance of training right, especially in the run-up to match day,” noted Tullysaran manager Pauric McGlone, who adds that it also provided a bit of competition in the squad that ensured the players were “conditioned in a way that allows them to perform at their best”.

For more about the PSA’s activities, see here.


Zero-point motion of atoms measured directly for the first time

5 September 2025 at 10:11

Physicists in Germany say they have measured the correlated behaviour of atoms in molecules prepared in their lowest quantum energy state for the first time. Using a technique known as Coulomb explosion imaging, they showed that the atoms do not simply vibrate individually. Instead, they move in a coupled fashion that displays fixed patterns.

According to classical physics, molecules with no thermal energy – for example, those held at absolute zero – should not move. However, according to quantum theory, the atoms making up these molecules are never completely “frozen”, so they should exhibit some motion even at this chilly temperature. This motion comes from the atoms’ zero-point energy, which is the minimum energy allowed by quantum mechanics for atoms in their ground state at absolute zero. It is therefore known as zero-point motion.
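
The zero-point energy of a single vibrational mode, modelled as a quantum harmonic oscillator, is E = hf/2. A sketch, taking an illustrative mid-infrared vibration frequency of 30 THz (an assumed figure, not one from the study):

```python
H = 6.626e-34  # Planck constant, J s

def zero_point_energy_j(freq_hz: float) -> float:
    """Minimum energy E = h*f/2 of a harmonic oscillator mode at frequency f."""
    return 0.5 * H * freq_hz

e0 = zero_point_energy_j(30e12)    # ≈ 9.9e-21 J
print(f"{e0 / 1.602e-19:.3f} eV")  # a few hundredths of an eV, left over even at absolute zero
```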

Reconstructing the molecule’s original structure

To study this motion, a team led by Till Jahnke from the Institute for Nuclear Physics at Goethe University Frankfurt and the Max Planck Institute for Nuclear Physics in Heidelberg used the European XFEL in Hamburg to bombard their sample – an iodopyridine molecule consisting of 11 atoms – with ultrashort, high-intensity X-ray pulses. These high-intensity pulses violently eject electrons out of the iodopyridine, causing its constituent atoms to become positively charged (and thus to repel each other) so rapidly that the molecule essentially explodes.

To image the molecular fragments generated by the explosion, the researchers used a customized version of a COLTRIMS reaction microscope. This approach allowed them to reconstruct the molecule’s original structure.

From this reconstruction, the researchers were able to show that the atoms do not simply vibrate individually, but that they do so in correlated, coordinated patterns. “This is known, of course, from quantum chemistry, but it had so far not been measured in a molecule consisting of so many atoms,” Jahnke explains.

Data challenges

One of the biggest challenges Jahnke and colleagues faced was interpreting what the microscope data was telling them. “The dataset we obtained is super-rich in information and we had already recorded it in 2019 when we began our project,” he says. “It took us more than two years to understand that we were seeing something as subtle (and fundamental) as ground-state fluctuations.”

Since the technique provides detailed information that is hidden to other imaging approaches, such as crystallography, the researchers are now using it to perform further time-resolved studies – for example, of photochemical reactions. Indeed, they performed and published the first measurements of this type at the beginning of 2025, while the current study (which is published in Science) was undergoing peer review.

“We have pushed the boundaries of the current state-of-the-art of this measurement approach,” Jahnke tells Physics World, “and it is nice to have seen a fundamental process directly at work.”

For theoretical condensed matter physicist Asaad Sakhel at Balqa Applied University, Jordan, who was not involved in this study, the new work is “an outstanding achievement”. “Being able to actually ‘see’ zero-point motion allows us to delve deeper into the mysteries of quantum mechanics in our quest to a further understanding of its foundations,” he says.

The post Zero-point motion of atoms measured directly for the first time appeared first on Physics World.

Artificial intelligence predicts future directions in quantum science

4 septembre 2025 à 15:55

Can artificial intelligence predict future research directions in quantum science? Listen to this episode of the Physics World Weekly podcast to discover what is already possible.

My guests are Mario Krenn – who heads the Artificial Scientist Lab at Germany’s Max Planck Institute for the Science of Light – and Felix Frohnert, who is doing a PhD on the intersection of quantum physics and machine learning at Leiden University in the Netherlands.

Frohnert, Krenn and colleagues published a paper earlier this year called “Discovering emergent connections in quantum physics research via dynamic word embeddings” in which they analysed more than 66,000 abstracts from the quantum-research literature to see if they could predict future trends in the field. They were particularly interested in the emergence of connections between previously isolated subfields of quantum science.
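The core idea can be sketched in a few lines: if each research concept is assigned an embedding vector per year, the cosine similarity between two concepts’ vectors tracks how connected their subfields are, and a rising trend hints at an emerging connection. The sketch below is a toy illustration with made-up vectors, not the authors’ pipeline:

```python
import math

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy per-year embeddings for two hypothetical quantum-science concepts
# (entirely made up for illustration):
emb = {
    2019: ([1.0, 0.1, 0.0], [0.0, 1.0, 0.1]),
    2021: ([1.0, 0.4, 0.2], [0.3, 1.0, 0.2]),
    2023: ([1.0, 0.8, 0.5], [0.7, 1.0, 0.5]),
}

# A monotonically rising similarity would flag an emerging connection
trend = [round(cosine(u, v), 2) for u, v in emb.values()]
print(trend)
```

In the actual study the embeddings are learned from the abstracts themselves; here the rising similarity simply illustrates the kind of signal the authors look for.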

We chat about what motivated the duo to use machine learning to study quantum science; how their prediction system works; and I ask them whether they have been able to predict current trends in quantum science using historical data.

Their paper appears in the journal Machine Learning: Science and Technology. It is published by IOP Publishing, which also brings you Physics World. Krenn is on the editorial board of the journal, and in the podcast he explains why it is important to have a platform for publishing research at the intersection of physics and machine learning.

This article forms part of Physics World’s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the next 12 months for more coverage of the IYQ.

Find out more on our quantum channel.


The post Artificial intelligence predicts future directions in quantum science appeared first on Physics World.

Errors in A-level physics papers could jeopardize student university admissions, Institute of Physics warns

4 septembre 2025 à 15:30

Errors in some of this year’s A-level physics exam papers could leave students without good enough grades to study physics at university. The mistakes have forced Tom Grinyer, chief executive of the Institute of Physics (IOP), to write to all heads of physics at UK universities, calling on them to “take these exceptional circumstances into account during the final admissions process”. The IOP is particularly concerned about students whose grades are lower than expected or are “a significant outlier” compared to other subjects.

The mistakes in question appeared in the physics (A) exam papers 1 and 2 set by the OCR exam board. Erratum notices had been issued to students at the start of the exam in June, but a further error in paper 2 was only spotted after the exam had taken place, causing some students to get stuck. Physics paper 2 from the rival AQA exam board was also found to contain complex phrasing that hindered students’ ability to answer questions and led to time pressures.

A small survey of physics teachers carried out after the exam by the IOP, which publishes Physics World, reveals that 41% were dissatisfied with the OCR physics exam papers and more than half (58%) felt that students had a negative experience. Two-thirds of teachers, meanwhile, reported that students had a negative experience during the AQA exam. A-levels are mostly taken by 18-year-olds in England, Wales and Northern Ireland, with the grades being used by universities to decide admission.

Grinyer says that the IOP is engaging in “regular, open dialogue with exam boards” to ensure that the assessment process supports and encourages students, while maintaining the rigour and integrity of the qualification. “Our immediate concern,” Grinyer warns, “is that the usual standardization processes and adjustments to grade boundaries – particularly for the OCR paper with errors – may fail to compensate fully for the negative effect on exam performance for some individuals.”

An OCR spokesperson told Physics World that the exam board is “sorry to the physics students and teachers affected by errors in A-level physics this year”. The board says that it “evaluated student performance across all physics papers, and took all necessary steps to mitigate the impact of these errors”. The OCR claims that the 13,000 students who sat OCR A-level physics A this year “can be confident” in their A-level physics results.

“We have taken immediate steps to review and strengthen our quality assurance processes to prevent such issues from occurring in the future,” the OCR adds. “We appreciated the opportunity to meet with the Institute of Physics to discuss these issues, and also to discuss our shared interest in encouraging the growth of this vital subject.”

Almost 23,500 students sat AQA A-level physics this year and an AQA spokesperson told Physics World that the exam board “listened to feedback and took steps to make A-level physics more accessible” to students and that there “is no need for universities to make an exception for AQA physics outcomes when it comes to admissions criteria”.

“These exam papers were error-free, as teachers and students would expect, and we know that students found the papers this year to be more accessible than last year,” they say. “We’ll continue to engage with any feedback that we receive, including feedback from the Institute of Physics, to explore how we can enhance our A-level physics assessments and give students the best possible experience when they sit exams.”

Students ‘in tears’

The IOP now wants A-level physics students to be given a “fair opportunity” when it comes to university admissions. “These issues are particularly concerning for students on widening participation pathways, many of whom already face structural barriers to high-stakes assessment,” the IOP letter states. “The added challenge of inaccessible or error-prone exam papers risks compounding disadvantage and may not reflect the true potential of these students.”

The IOP also contacted AQA last year over inaccessible contexts and language used in previous physics exams. But despite AQA’s assurances that the problems would be addressed, some of the same issues have now recurred. Helen Sinclair, head of physics at the all-girls Wimbledon High School, believes that the “variable quality” of recent A-level papers has had “far-reaching consequences” for young people thinking of studying physics at university.

“Our students have exceptionally high standards for themselves and the opaque nature of many questions affects them deeply, no matter what grades they ultimately achieve. This has even led some to choose to apply for other subjects at university,” she told Physics World. “This is not to say that papers should not be challenging; however, better scaffolding within some questions would help students anchor themselves in what is an already stressful environment, and would ultimately enable them to better demonstrate their full potential within an exam.”

Students come out of the exams feeling disheartened, and those students share their perceptions with younger students

Abbie Hope, Stokesley School

Those concerns are echoed by Abbie Hope, head of physics at Stokesley School near Middlesbrough. She says the errors in this year’s exam papers are “not acceptable” and believes that OCR has “failed their students”. Hope says that AQA physics papers in recent years have been “very challenging” and have resulted in students feeling like they cannot do physics. She also says some have emerged from exam halls in tears.

“Students come out of the exams feeling disheartened and share their perceptions with younger students,” she says. “I would rather students sat a more accessible paper, with higher grade boundaries so they feel more successful when leaving the exam hall, rather than convinced they have underachieved and then getting a surprise on results day.” Hope fears the mistakes will undermine efforts to encourage uptake and participation in physics and that exam boards need to serve students and teachers better.

A ‘growing unease’

Rachael Houchin, head of physics at Royal Grammar School Newcastle, says this year’s errors have added to her “growing unease” about the state of physics education in the UK. “Such incidents – particularly when they are public and recurring – do little to improve the perception of the subject or encourage its uptake,” she says. “Everyone involved in physics education – at any level – has a duty to get it right. If we fail, we risk physics drifting into the category of subjects taught predominantly in selective or independent schools, and increasingly absent from the mainstream.”

Hari Rentala, associate director of education and workforce at the IOP, is concerned that the errors unfairly “perpetuate the myth” that physics is a difficult subject. “OCR appear to have managed the situation as best they can, but this is not much consolation for how students will have felt during the exam and over the ensuing weeks,” says Rentala. “Once again AQA set some questions that were overly challenging. We can only hope that the majority of students who had a negative experience as a result of these issues at least receive a fair grade – as grade boundaries have been adjusted down.”

Mixed news for pupils

Despite the problems with some specific papers, almost 45,000 students took A-level physics in the UK this year – a rise of 4.3% on last year and the highest total for 25 years. Physics is now the sixth most popular subject at A-level, up from ninth last year, with girls representing a quarter of all candidates. Meanwhile, in Scotland the numbers of entries in National 5 and Higher physics were 13,680 and 8560, respectively, up from 13,355 and 8065 last year.

“We are delighted so many young people, and increasing numbers of girls, are hearing the message that physics can open up a lifetime of opportunities,” says Grinyer. “If we can build on this momentum there is a real opportunity to finally close the gap between boys and girls in physics at A-level. To do that we need to continue to challenge the stereotypes that still put too many young people off physics and ensure every young person knows that physics – and a career in science and innovation – could be for them.”

However, there is less good news for younger pupils, with a new IOP report finding that more than half a million GCSE students are expected to start the new school year with no physics teacher. It reveals that a quarter of English state schools have no specialist physics teachers at all, and the IOP fears that more than 12,000 students could miss out on taking A-level physics as a result. The IOP wants the UK government to invest £120m over the next 10 years to address the shortage by retaining, recruiting and retraining a new generation of physics teachers.

The post Errors in A-level physics papers could jeopardize student university admissions, Institute of Physics warns appeared first on Physics World.

Quantum sensors reveal ‘smoking gun’ of superconductivity in pressurized bilayer nickelates

4 septembre 2025 à 10:00

Physicists at the Chinese Academy of Sciences (CAS) have used diamond-based quantum sensors to uncover what they say is the first unambiguous experimental evidence for the Meissner effect – a hallmark of superconductivity – in bilayer nickelate materials at high pressures. The discovery could spur the development of highly sensitive quantum detectors that can be operated under high-pressure conditions.

Superconductors are materials that conduct electricity without resistance when cooled to below a certain critical transition temperature Tc. Apart from a sharp drop in electrical resistance, another important sign that a material has crossed this threshold is the appearance of the Meissner effect, in which the material expels a magnetic field from its interior (diamagnetism). This expulsion creates such a strong repulsive force that a magnet placed atop the superconducting material will levitate above it.

In “conventional” superconductors such as solid mercury, the Tc is so low that the materials must be cooled with liquid helium to keep them in the superconducting state. In the late 1980s, however, physicists discovered a new class of superconductors that have a Tc above the boiling point of liquid nitrogen (77 K). These “unconventional” or high-temperature superconductors are derived not from metals but from insulators containing copper oxides (cuprates).

Since then, the search has been on for materials that superconduct at still higher temperatures, and perhaps even at room temperature. Discovering such materials would have massive implications for technologies ranging from magnetic resonance imaging machines to electricity transmission lines.

Enter nickel oxides

In 2019 researchers at Stanford University in the US identified nickel oxides (nickelates) as additional high-temperature superconductors. This created a flurry of interest in the superconductivity community because these materials appear to superconduct in a way that differs from their copper-oxide cousins.

Among the nickelates studied, La3Ni2O7-δ (where δ can range from 0 to 0.04) is considered particularly promising because in 2023, researchers led by Meng Wang of China’s Sun Yat-Sen University spotted certain signatures of superconductivity at a temperature of around 80 K. However, these signatures only appeared when crystals of the material were placed in a device called a diamond anvil cell (DAC). This device subjects samples of material to extreme pressures of more than 400 GPa (about 4 million atmospheres) as it squeezes them between the flattened tips of two tiny, gem-grade diamond crystals.

The problem, explains Xiaohui Yu of the CAS’ Institute of Physics, is that it is not easy to spot the Meissner effect under such high pressures. This is because the structure of the DAC limits the available sample volume and hinders the use of highly sensitive magnetic measurement techniques such as SQUID magnetometry. Another problem is that the sample used in the 2023 study contains several competing phases that could mix and degrade the signal from the La3Ni2O7-δ.

Nitrogen-vacancy centres embedded as in-situ quantum sensors

In the new work, Yu and colleagues used nitrogen-vacancy (NV) centres embedded in the DAC as in-situ quantum sensors to track and image the Meissner effect in pressurized bilayer La3Ni2O7-δ. This newly developed magnetic sensing technique boasts both high sensitivity and high spatial resolution, Yu says. What is more, it fits perfectly into the DAC high-pressure chamber.

Next, they applied a small external magnetic field of around 120 G. Under these conditions, they measured the optically detected magnetic resonance (ODMR) spectra of the NV centres point by point. They could then extract the local magnetic field from the resonance frequencies of these spectra. “We directly mapped the Meissner effect of the bilayer nickelate samples,” Yu says, noting that the team’s image of the magnetic field clearly shows both a diamagnetic region and a region where magnetic flux is concentrated.
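The field-extraction step can be illustrated with a back-of-the-envelope sketch using standard NV-centre relations (not the team’s analysis code): the two ODMR resonances of an NV centre sit at f± = D ± γB, where D ≈ 2870 MHz is the zero-field splitting and γ ≈ 2.8 MHz/G is the NV gyromagnetic ratio, so the splitting between them directly gives the local field along the NV axis.

```python
D_MHZ = 2870.0          # NV ground-state zero-field splitting (MHz)
GAMMA_MHZ_PER_G = 2.8   # NV gyromagnetic ratio (MHz per gauss)

def field_from_odmr(f_minus_mhz: float, f_plus_mhz: float) -> float:
    """Local magnetic field (gauss) along the NV axis, from the two ODMR resonance frequencies."""
    return (f_plus_mhz - f_minus_mhz) / (2 * GAMMA_MHZ_PER_G)

# A ~120 G bias field, as applied in the experiment, splits the resonances
# by roughly 2 * 2.8 * 120 ≈ 672 MHz:
b = field_from_odmr(2534.0, 3206.0)
print(round(b, 1))  # 120.0
```

Repeating this extraction point by point across the sample is what turns the ODMR spectra into a magnetic-field map showing where flux is expelled and where it concentrates.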

Weak demagnetization signal

The researchers began their project in late 2023, shortly after receiving single-crystal samples of La3Ni2O7-δ from Wang. “However, after two months of collecting data, we still had no meaningful results,” Yu recalls. “From these experiments, we learnt that the demagnetization signal in La3Ni2O7-δ crystals was quite weak and that we needed to improve either the nickelate sample or the sensitivity of the quantum sensor.”

To overcome these problems, they switched to using polycrystalline samples, enhancing the quality of the nickelate samples by doping them with praseodymium to make La2PrNi2O7. This produced a sample with an almost pure bilayer structure and thus a much stronger demagnetization signal. They also used shallow NV centres implanted on the DAC culet (the small flattened face at the tip of the diamond anvil).

“Unlike the NV centres in the original experiments, which were randomly distributed in the pressure-transmitting medium and have relatively large ODMR widths, leading to only moderate sensitivity in the measurements, these shallow centres are evenly distributed and well aligned, making it easier for us to perform magnetic imaging with increased sensitivity,” Yu explains.

These improvements enabled the team to obtain a demagnetization signal from the La2PrNi2O7 and La3Ni2O7-δ samples, he tells Physics World. “We found that the diamagnetic signal from the La2PrNi2O7 samples is about five times stronger than that from the La3Ni2O7-δ ones prepared under similar conditions – a result that is consistent with the fact that the Pr-doped samples are of a better quality.”

Physicist Jun Zhao of Fudan University, China, who was not involved in this work, says that Yu and colleagues’ measurement represents “an important step forward” in nickelate research. “Such measurements are technically very challenging, and their success demonstrates both experimental ingenuity and scientific significance,” he says. “More broadly, their result strengthens the case for pressurized nickelates as a new platform to study high-temperature superconductivity beyond the cuprates. It will certainly stimulate further efforts to unravel the microscopic pairing mechanism.”

As well as allowing for the precise sensing of magnetic fields, NV centres can also be used to accurately measure many other physical quantities that are difficult to measure under high pressure, such as strain and temperature distribution. Yu and colleagues say they are therefore looking to further expand the application of these structures for use as quantum sensors in high-pressure sensing.

They report their current work in National Science Review.

The post Quantum sensors reveal ‘smoking gun’ of superconductivity in pressurized bilayer nickelates appeared first on Physics World.

Quantum foundations: towards a coherent view of physical reality

3 septembre 2025 à 12:00

One hundred years after its birth, quantum mechanics remains one of the most powerful and successful theories in all of science. From quantum computing to precision sensors, its technological impact is undeniable – and one reason why 2025 is being celebrated as the International Year of Quantum Science and Technology.

Yet as we celebrate these achievements, we should still reflect on what quantum mechanics reveals about the world itself. What, for example, does this formalism actually tell us about the nature of reality? Do quantum systems have definite properties before we measure them? Do our observations create reality, or merely reveal it?

These are not just abstract, philosophical questions. Having a clear understanding of what quantum theory is all about is essential to its long-term coherence and its capacity to integrate with the rest of physics. Unfortunately, there is no scientific consensus on these issues, which continue to provoke debate in the research community.

That uncertainty was underlined by a recent global survey of physicists about quantum foundational issues, conducted by Nature (643 1157). It revealed a persistent tension between “realist” views, which seek an objective, visualizable account of quantum phenomena, and “epistemic” views that regard the formalism as merely a tool for organizing our knowledge and predicting measurement outcomes.

Only 5% of the 1100 people who responded to the Nature survey expressed full confidence in the Copenhagen interpretation, which is still prevalent in textbooks and laboratories. Further divisions were revealed over whether the wavefunction is a physical entity, a mere calculation device, or a subjective reflection of belief. The lack of agreement on such a central feature underscores the theoretical fragility underlying quantum mechanics.

The willingness to explore alternatives reflects the intellectual vitality of the field but also underscores the inadequacy of current approaches

More broadly, 75% of respondents believe that quantum theory will eventually be replaced, at least partially, by a more complete framework. Encouragingly, 85% agree that attempts to interpret the theory in intuitive or physical terms are valuable. This willingness to explore alternatives reflects the intellectual vitality of the field but also underscores the inadequacy of current approaches.

Beyond interpretation

We believe that this interpretative proliferation stems from a deeper problem, which is that quantum mechanics lacks a well-defined physical foundation. It describes the statistical outcomes of measurements, but it does not explain the mechanisms behind them. The concept of causality has been largely abandoned in favour of operational prescriptions such that quantum theory works impressively in practice but remains conceptually opaque.

In our view, the way forward is not to multiply interpretations or continue debating them, but to pursue a deeper physical understanding of quantum phenomena. One promising path is stochastic electrodynamics (SED), a classical theory augmented by a random electromagnetic background field, the real vacuum or zero-point field discovered by Max Planck as early as 1911. This framework restores causality and locality by explaining quantum behaviour as the statistical response of particles to this omnipresent background field.

Over the years, several researchers from different lines of thought have contributed to SED. Since our early days with Trevor Marshall, Timothy Boyer and others, we have refined the theory to the point that it can now account for the emergence of features that are considered building blocks of quantum formalism, such as the basic commutator and Heisenberg inequalities.

Particles acquire wave-like properties not by intrinsic duality, but as a consequence of their interaction with the vacuum field. Quantum fluctuations, interference patterns and entanglement emerge from this interaction, without the need to resort to non-local influences or observer-dependent realities. The SED approach is not merely mechanical, but rather electrodynamic.

Coherent thoughts

We’re not claiming that SED is the final word. But it does offer a coherent picture of microphysical processes based on physical fields and forces. Importantly, it doesn’t abandon the quantum formalism but rather reframes it as an effective theory – a statistical summary of deeper dynamics. Such a perspective enables us to maintain the successes of quantum mechanics while seeking to explain its origins.

For us, SED highlights that quantum phenomena can be reconciled with concepts central to the rest of physics, such as realism, causality and locality. It also shows that alternative approaches can yield testable predictions and provide new insights into long-standing puzzles. One phenomenon lying beyond current quantum formalism that we could now test, thanks to progress in experimental physics, is the predicted violation of Heisenberg’s inequalities over very short time periods.

As quantum science continues to advance, we must not lose sight of its conceptual foundations. Indeed, a coherent, causally grounded understanding of quantum mechanics is not a distraction from technological progress but a prerequisite for its full realization. By turning our attention once again to the foundations of the theory, we may finally complete the edifice that began to rise a century ago.

The centenary of quantum mechanics should be a time not just for celebration but critical reflection too.

This article forms part of Physics World’s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the year for more coverage of the IYQ.

Find out more on our quantum channel.

The post Quantum foundations: towards a coherent view of physical reality appeared first on Physics World.

Twisted graphene reveals a new type of chirality

3 septembre 2025 à 10:32

Structural chirality refers to the geometric property of objects that are not superimposable on their mirror images, a concept that is central to organic chemistry. In contrast, topological chirality in physics involves quantum properties like spin and is essential for understanding topological edge states. The connection between these two forms of chirality remains an open question.

Traditionally, topological phenomena have been studied in spinful systems, where the presence of spin allows for chiral interactions and symmetry-breaking effects. This new study challenges that paradigm by demonstrating that topological chirality can arise even in spinless systems, purely from the three-dimensional structural arrangement of otherwise featureless units.

The researchers mathematically investigated two types of twisted 3D graphite systems composed of stacked 2D graphene layers, using a large twist angle of 21.8°. In one configuration, the layers are twisted into a helical screw-like structure, while in the other, the twist angles alternate between layers, forming a periodic chiral pattern. These structural designs give rise to novel topological phases.

A key mechanism underlying these effects is intervalley Umklapp scattering. This scattering captures the chirality of the twisted interfaces and induces a sign-flipped interlayer hopping, by introducing a π-flux lattice gauge field. This field alters the symmetry algebra of the system, enabling the emergence of spinless topological chirality.

This research opens up a new design principle for topological materials. By engineering the spatial patterning of structureless units, researchers can induce topological chirality without relying on spin. This has significant implications for the development of topological photonic and acoustic devices, potentially leading to simpler, more tunable materials for applications in quantum computing, sensing, and waveguiding technologies.

Read the full article

Spinless topological chirality from Umklapp scattering in twisted 3D structures

Cong Chen et al 2025 Rep. Prog. Phys. 88 018001

Do you want to learn more about this topic?

Interacting topological insulators: a review by Stephan Rachel (2018)

The post Twisted graphene reveals a new type of chirality appeared first on Physics World.

Unveiling topological edge states with attosecond precision

3 septembre 2025 à 10:31

In condensed matter physics, topological phase transitions are a key area of research because they lead to unusual and potentially useful states of matter. One example is the Floquet topological insulator, which can switch from a non-topological to a topological phase when exposed to a laser pulse. However, detecting these transitions is difficult due to the extremely fast timescales involved and interference from infrared fields, which can distort the photoelectron signals.

A Chern insulator is a unique material that acts as an insulator in its bulk but conducts electricity along its edges. These edge states arise from the crystal structure of the material’s bulk. Unlike other topological materials, Chern insulators do not require magnetic fields. Their edge conduction is topologically protected, meaning it is highly resistant to defects and noise. This makes them promising candidates for quantum technologies, spintronics, and energy-efficient electronics.
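For reference, the invariant behind this protection (a textbook definition, not taken from the paper itself) is the Chern number, the Berry curvature Ω(k) integrated over the Brillouin zone:

```latex
C = \frac{1}{2\pi} \int_{\mathrm{BZ}} \Omega(\mathbf{k})\, \mathrm{d}^2 k \;\in\; \mathbb{Z}
```

By the bulk-boundary correspondence, a nonzero integer C guarantees |C| chiral edge channels, which is why the edge conduction survives defects and noise.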

In this study, researchers developed a new method to detect phase changes in Chern insulators. Using numerical simulations, they demonstrated that attosecond x-ray absorption spectroscopy, combined with polarization-dependent dichroism, can effectively reveal these transitions. Their semi-classical approach isolates the intra-band Berry connection, providing deeper insight into how topological edge states form and how electrons behave in these systems.

This work represents a significant advance in topological materials research. It offers a new way to observe changes in quantum materials in real time, expands the use of attosecond spectroscopy from simple atoms and molecules to complex solids, and opens the door to studying dynamic systems like Floquet topological insulators.

Read the full article

Topological phase transitions via attosecond x-ray absorption spectroscopy

Juan F P Mosquera et al 2024 Rep. Prog. Phys. 87 117901

Do you want to learn more about this topic?

Strong-laser-field physics, non-classical light states and quantum information science by U Bhattacharya, Th Lamprou, A S Maxwell, A Ordóñez, E Pisanty, J Rivera-Dean, P Stammer, M F Ciappina, M Lewenstein and P Tzallas (2023)

The post Unveiling topological edge states with attosecond precision appeared first on Physics World.

Broadband wireless gets even broader thanks to integrated transmitter

3 septembre 2025 à 10:00

Researchers in China have unveiled an ultrabroadband system that uses the same laser and resonator to process signals at frequencies ranging from below 1 GHz up to more than 100 GHz. The system, which is based on a thin-film lithium niobate resonator developed in 2018 by members of the same team, could facilitate the spread of the so-called “Internet of things” in which huge numbers of different devices are networked together at different frequency bands to avoid interference.

Modern complementary metal oxide semiconductor (CMOS) electronic devices generally produce signals at frequencies of a few GHz. These signals are then often shifted into other frequency bands for processing and transmission. For example, sending signals long distances down optical fibres generally means using a frequency of around 200 THz, as the silica glass in the fibres is transparent at the corresponding “telecoms” wavelength of 1550 nm.

One of the most popular materials for performing this conversion is lithium niobate. This material has been called “the silicon of photonics” because it is highly nonlinear, allowing optical signals to be generated efficiently at a wide range of frequencies.

In integrated devices, however, bulk lithium niobate modulators are impractically large. In 2018 Cheng Wang and colleagues led by Marko Lončar of Harvard University in Massachusetts, US, developed a miniaturized, thin-film version that used an interferometric design to create a much stronger electro-optic effect over a shorter distance. “Usually, the bandwidth limit is set by the radiofrequency loss,” explains Wang, who is now at the City University of Hong Kong, China. “Being shorter means you can go to much higher frequencies.”

A broadband data transmission system

In the new work, Wang, together with researchers at Peking University in China and the University of California, Santa Barbara in the US, used an optimized version of this setup to make a broadband data transmission system. They divided the output of a telecom-wavelength oscillator into two arms. In one of these arms, optical signal modulation software imprinted a complex amplitude-phase pattern on the wave. The other arm was exposed to the data signal and a lithium niobate microring resonator. The two arms were then recombined at a photodetector, and the frequency difference between the two arms (in the GHz range) was transmitted using an antenna to a detector, where the process was reversed.

Crucially, the offset between the centre frequencies of the two arms (the frequency of the beat note at the photodetector when the two arms are recombined) is determined solely by the frequency shift imposed by the lithium niobate resonator. This can be tuned anywhere between 0.5 GHz and 115 GHz via the thermo-optic effect – essentially, incorporating a small electronic heater and using it to tune the refractive index. The signal is then encoded in modulations of the beat frequency, with additional information imprinted into the phase of the waves.
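The carrier generation described above is classic heterodyning: a square-law photodetector turns two tones into a beat note at the difference of their centre frequencies. A toy numerical sketch with drastically scaled-down, hypothetical frequencies (not the experiment’s 193 THz optics):

```python
import numpy as np

# Toy sketch of the heterodyne principle (scaled-down, hypothetical frequencies):
# a square-law photodetector converts two tones at f1 and f2 into a beat note
# at the difference frequency |f1 - f2|.
fs = 10_000.0                      # sample rate, Hz
t = np.arange(10_000) / fs         # 1 s of samples
f1, f2 = 1_000.0, 1_040.0          # stand-ins for the two optical arms
intensity = (np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)) ** 2

spectrum = np.abs(np.fft.rfft(intensity))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
spectrum[0] = 0.0                  # discard the DC term
beat = freqs[np.argmax(spectrum[freqs < 500])]  # strongest line below both tones
print(beat)  # 40.0 -- the carrier sits at the offset between the two arms
```

In the real device this offset is set by the microring’s frequency shift and swept across 0.5–115 GHz via the thermo-optic effect.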

The researchers say this system is an improvement on standard electronic amplifiers because such devices usually operate in relatively narrow bands. Using them to make large jumps in frequency therefore means that signals need to be shifted multiple times. This introduces cumulative noise into the signal and is also problematic for applications such as robotic surgery, where the immediate arrival of a signal can literally be a matter of life and death.

Internet of things applications

The researchers demonstrated wireless data transfer across a distance of 1.3 m, achieving speeds of up to 100 gigabits per second. In the present setup, they used three different horn antennas to transmit microwaves of different frequencies through free space, but they hope to improve this: “That is our next goal – to get a fully frequency-tuneable link,” says Peking University’s Haowen Shu.

The researchers believe such a wideband setup could be central to the development of the “Internet of things” in which all sorts of different electronic devices are networked together without unwanted interference. Atmospheric transparency windows below 6 GHz, where loss is lower and propagation lengths are longer, are likely to be crucial for providing wireless Internet access to rural areas. Meanwhile, higher frequencies – with higher data rates – will probably be needed for augmented reality and remote surgery applications.

Alan Willner, an electrical engineer and optical scientist at the University of Southern California, US, who was not involved in the research, thinks the team is on the right track. “You have lots of spectrum in various radio bands for wireless communications,” he says. “But how are you going to take advantage of these bands to transmit high data rates in a cost-effective and flexible way? Are you going to use multiple different systems – one each for microwave, millimetre wave, and terahertz? Using one tuneable and reconfigurable integrated platform to cover these bands is significantly better. This research is a great step in that direction.”

The research is published in Nature.

The post Broadband wireless gets even broader thanks to integrated transmitter appeared first on Physics World.

From ‘rewarding and exciting’ to ‘challenging and overwhelming’: what it means to have a career in intelligence and cyber security

2 septembre 2025 à 14:00
GCHQ spelt out in Scrabble pieces on a chess board
Your next move? A career in intelligence can suit physicists with the right mindset. (Courtesy: Shutterstock/shaneinswedenx)

As a physics graduate or an early-career researcher looking for a job, you might not think of the UK’s primary intelligence and security agency – Government Communications Headquarters (GCHQ) – as a potential employer. But GCHQ, which covers counter-terrorism, cybersecurity, organized crime and defence support for the UK, hires a large number of physicists. Indeed, to celebrate the 2025 International Year of Quantum Science and Technology, the agency has hosted internal talks and run information campaigns on quantum topics.

GCHQ works with the Secret Intelligence Service (MI6) and MI5, as well as the armed forces, a number of international partners, and firms in the private sector and academia. To find out more about a career at GCHQ – working with cutting-edge technology to identify, analyse and disrupt threats to the UK – Physics World speaks to two people with academic backgrounds who have had long careers at the organization. They tell us about the benefits, the difficulties and the complexity of working at an intelligence agency.

Nia is the deputy director for science at GCHQ, where she has worked for the past 15 years. After studying physics at university, she joined GCHQ as a graduate and has since contributed to a wide range of scientific and technological initiatives in support of national security. She is a Fellow of both the Institute of Physics (IOP), which publishes Physics World, and the Institution of Engineering and Technology (IET).

Cheryl leads GCHQ’s adoption of quantum technologies. Following a degree in engineering, her career began as an apprentice at an avionics company. Since then, she has had many roles across research and development at GCHQ and across broader UK government departments, with a focus on understanding and implementing emerging technology. Cheryl is a Fellow of the IET and a Member of the IOP. 

When did your interest in science first develop?

Nia My fascination with science was nurtured from a young age, largely inspired by my parents. My mum was a physics teacher, and my dad is a passionate historian with an insatiable curiosity about the world. Growing up in an environment rich with books, experiments, and discussions about how things work – whether exploring astrophysics, geology or ancient Egypt – instilled in me a lifelong desire to understand our universe. My mum’s electronics, mechanics and physics lessons meant there were always breadboards, crocodile clips and even a Van de Graaff generator in the house, transforming learning into an exciting tangible experience.

Cheryl As a child I was always interested in nature and in how things work. I used to build bug farms in the garden and still have my old Observer’s books with the butterflies, etc, ticked off when spotted. Leaning towards my practical side of constantly making things (and foolishly believing my careers teacher that a physics degree would only lead to teaching), I took physics, chemistry and maths A-levels and a degree in engineering.

Could you tell us a bit about your educational background and the career path that led you to work at GCHQ?

Nia I was born and grew up in South Wales and attended a Welsh-language school, where I studied physics, maths and chemistry at A-level. I then studied physics at Durham University for four years before starting at GCHQ as a graduate. My first role was in an area that is now the National Cyber Security Centre (NCSC). As the cyber security arm of GCHQ, it researches the reliability of semiconductors in national security applications and uses that research to shape policy and security standards. This suited me well, as my final year at university focused on materials science and condensed-matter physics, which came in very useful.

Cheryl My engineering degree apprenticeship was through an aerospace company in Cheltenham, and I worked there afterwards designing test kits for the RAF. It was almost natural that I should at least try a few years at GCHQ as a local employer and I had plans to then move to other R&D labs.

What’s it like to work here – what are some of the stresses of working in this kind of an environment and not being able to discuss your job with friends and family? What are some of the best aspects of working at GCHQ?

Nia Working at GCHQ is rewarding and exciting, especially as we look at the latest developments in emerging technologies. It can also be challenging, especially when navigating the complexities of global security amid an unpredictable geopolitical landscape. There are days when media reports or international events feel overwhelming, but knowing that my work contributes towards safeguarding the UK’s interests today and into the future offers a strong sense of purpose.

The most rewarding aspect, by far, is the people. We have some of the brightest, most dedicated experts – mentors, colleagues, friends – whose commitment inspires me daily. Their support and collaboration make even the most demanding days manageable.

Cheryl At GCHQ I found that I have been able to enjoy several very different “careers” within the organization, including opportunities to travel and to develop diverse skills. This, together with a flexibility to change working patterns to suit stages of family life, has meant I have stayed for most of my career.

I’ve had some amazing and unique opportunities and experiences

Cheryl, GCHQ

I’ve had some amazing and unique opportunities and experiences. In the Cheltenham area it’s accepted that many people work here, and it’s widely respected that we cannot talk about the detail of what we do.

Fingerprint on circuitboard illustration
Safety net Maintaining secure communication and anticipating new threats are key to the work carried out at GCHQ. (Shutterstock/S and V Design)

What role does physics and especially quantum science play in what you do? And what role does physics play when it comes to the national security of the UK?

Nia As deputy director of science at GCHQ, my role involves collaborating with experts to understand how emerging technologies, including quantum science, impact national security. Quantum offers extraordinary potential for secure communication and advanced sensing – but it equally threatens to upend existing security protocols if adversaries harness it maliciously. A deep understanding of physics is crucial – not only to spot opportunities but also to anticipate and counter threats.

Quantum science is just one example of how a fundamental understanding of physics and maths gives you the foundations to understand the broad waterfront of emerging technologies coming our way. We work closely with government departments, academia, industry and start-ups to ensure the UK remains at the forefront of this field, shaping a resilient and innovative security ecosystem.

Cheryl I first came across quantum science, technologies and quantum computing around 15 years ago, through an emerging-technology analysis role in R&D, and I watched and learned keenly as I could see that these would be game-changing. Little did I know at the time that I would later be leading our adoption of quantum, or just how significant these emerging technologies for sensing, timing and computing would grow to be.

The UK national ecosystem developing around quantum technologies is a great mix of minds from academia, industry and government departments and is one of the most collegiate, inspiring and well-motivated communities that I have interacted with.

For today’s physics graduates who might be interested in a career at GCHQ, what are some of the key skills they require?

Nia Many people will have heard historic tales of the tap on the shoulder inviting people to work in intelligence agencies, but the reality is that people can find out about careers at GCHQ in much the same way they would with any other kind of job.

Maintaining a hunger to learn and adapt is what will set you apart

Nia, GCHQ

I would emphasize qualities like curiosity, problem-solving and resilience as being key. The willingness to roll up your sleeves, a genuine care for collaborative work, and empathy are equally important – particularly because much of what we do is sensitive and demands trust and discretion. Maintaining a hunger to learn and adapt is what will set you apart.

Cheryl We have roles where you will be helping to solve complex problems – doing work you simply won’t find anywhere else. It’s key to have curiosity and an open mind – and don’t be put off by the fact that you can’t ask too many questions in advance!

What sort of equality, diversity and inclusion initiatives do you have at GCHQ and how are you looking to get more women and minorities working there?

Nia Diversity and inclusion are mission-critical for us at GCHQ: they bring together the right mix of minds to find innovative solutions to the toughest of problems. We’re committed to building on our work to better represent the communities we serve, including increasing the number of people from ethnic minority backgrounds and the number of women in senior roles.

Cheryl We are committed to having a workforce that reflects the communities we serve. Our locations in the north-west, in both Manchester and now Lancashire, are part of the mission to find the right mix of minds.

What is your advice to today’s physics grads? What is it that you know today that you wish you knew at the start of your career?

Nia One key lesson is that career paths are rarely linear. When starting out, uncertainty can feel daunting, but it’s an opportunity for growth. Embrace challenges and seize opportunities that excite you – whether they seem narrowly related to your studies or not. Every experience contributes to your development. Additionally, don’t underestimate the importance of work–life balance. GCHQ offers a supportive environment – remember, careers are marathons, not sprints. Patience and curiosity will serve you well.

Cheryl It takes multidisciplinary teams to deliver game-changers and new ecosystems. Your initial “career choices” are just a stepping stone from which you can forge your own path and follow your instincts.

The post From ‘rewarding and exciting’ to ‘challenging and overwhelming’: what it means to have a career in intelligence and cyber security appeared first on Physics World.

Desert dust helps freeze clouds in the northern hemisphere

2 septembre 2025 à 10:07

Micron-sized dust particles in the atmosphere could trigger the formation of ice in certain types of clouds in the Northern Hemisphere. This is the finding of researchers in Switzerland and Germany, who used 35 years of satellite data to show that nanoscale defects on the surface of these aerosol particles are responsible for the effect. Their results, which agree with laboratory experiments on droplet freezing, could be used to improve climate models and to advance studies of cloud seeding for geoengineering.

In the study, which was led by environmental scientist Diego Villanueva of ETH Zürich, the researchers focused on clouds in the so-called mixed-phase regime, which form at temperatures between −39 °C and 0 °C and are commonly found in mid- and high-latitudes, particularly over the North Atlantic, Siberia and Canada. These mixed-phase regime clouds (MPRCs) are often topped by a liquid or ice layer, and their makeup affects how much sunlight they reflect back into space and how much water they can release as rain or snow. Understanding them is therefore important for forecasting weather and making projections of future climate.

Researchers have known for a while that MPRCs are extremely sensitive to the presence of ice-nucleating particles in their environment. Such particles mainly come from mineral dust aerosols (such as K-feldspar, quartz, albite and plagioclase) that get swept up into the upper atmosphere from deserts. The Sahara Desert in northern Africa, for example, is a prime source of such dust in the Northern Hemisphere.

More dust leads to more ice clouds

Using 35 years of satellite data collected as part of the Cloud_cci project and MERRA-2 aerosol reanalyses, Villanueva and colleagues looked for correlations between dust levels and the formation of ice-topped clouds. They found that at temperatures between −15 °C and −30 °C, the more dust there was, the more frequent the ice clouds were. What is more, their calculated increase in ice-topped clouds with increasing dust loading agrees well with previous laboratory experiments that predicted how dust triggers droplet freezing.
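The correlation analysis described above amounts to binning cloud scenes by dust loading and asking how the ice-topped fraction changes from bin to bin. A sketch of that statistic on synthetic data (the dust distribution and the dust–ice response below are assumptions for illustration, not the Cloud_cci or MERRA-2 data):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for the satellite record: a dust loading per cloud scene,
# plus an assumed monotonic dust-to-ice response, just to illustrate the method.
dust = rng.lognormal(mean=0.0, sigma=1.0, size=50_000)
p_ice = 0.2 + 0.6 * dust / (dust + 1.0)      # assumed response, for illustration
ice_topped = rng.random(dust.size) < p_ice

# Bin scenes by dust loading and compute the ice-topped fraction per bin --
# the kind of statistic used to test whether more dust means more ice clouds.
edges = np.quantile(dust, np.linspace(0, 1, 6))
fracs = []
for lo, hi in zip(edges[:-1], edges[1:]):
    in_bin = (dust >= lo) & (dust < hi)
    fracs.append(ice_topped[in_bin].mean())
    print(f"dust {lo:6.2f}-{hi:6.2f}: ice-topped fraction {fracs[-1]:.2f}")
```

With a monotonic response, the ice-topped fraction rises steadily across the dust bins, mirroring the trend the team extracted from the real data.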

The new study, which is detailed in Science, shows that there is a connection between aerosols in the micrometre-size range and cloud ice observed over distances of several kilometres, Villanueva says. “We found that it is the nanoscale defects on the surface of dust aerosols that trigger ice clouds, so the process of ice glaciation spans more than 15 orders of magnitude in length,” he explains.

Thanks to this finding, Villanueva tells Physics World that climate modellers can use the team’s dataset to better constrain aerosol-cloud processes, potentially helping them to construct better estimates of cloud feedback and global temperature projections.

The result also shows how sensitive clouds are to varying aerosol concentrations, he adds. “This could help bring forward the field of cloud seeding and include this in climate geoengineering efforts.”

The researchers say they have successfully replicated their results using a climate model and are now drafting a new manuscript to further explore the implications of dust-driven cloud glaciation for climate, especially for the Arctic.

The post Desert dust helps freeze clouds in the northern hemisphere appeared first on Physics World.

Radioactive ion beams enable simultaneous treatment and imaging in particle therapy

2 septembre 2025 à 10:00

Researchers in Germany have demonstrated the first cancer treatment using a radioactive carbon ion beam (11C), on a mouse with a bone tumour close to the spine. Performing particle therapy with radioactive ion beams enables simultaneous treatment and visualization of the beam within the body.

Particle therapy using beams of protons or heavy ions is a highly effective cancer treatment, with the favourable depth–dose deposition – the Bragg peak – providing extremely conformal tumour targeting. This conformality, however, makes particle therapy particularly sensitive to range uncertainties, which can impact the Bragg peak position.

One way to reduce such uncertainties is to use positron emission tomography (PET) to map the isotopes generated as the treatment beam interacts with tissues in the patient. For therapy with carbon (12C) ions, currently performed at 17 centres worldwide, this involves detecting the beta decay of 10C and 11C projectile fragments. Unfortunately, such fragments generate a small PET signal, while their lower mass shifts the measured activity peak away from the Bragg peak.

The researchers – working within the ERC-funded BARB (Biomedical Applications of Radioactive ion Beams) project – propose that treatment with positron-emitting ions such as 11C could overcome these obstacles. Radioactive ion beams have the same biological effectiveness as their corresponding stable ion beams, but generate an order of magnitude larger PET signal. They also reduce the shift between the activity and dose peaks, enabling precise localization of the ion beam in vivo.

“Range uncertainty remains the main problem of particle therapy, as we do not know exactly where the Bragg peak is,” explains Marco Durante, head of biophysics at the GSI Helmholtz Centre for Heavy Ion Research and principal investigator of the BARB project. “If we ‘aim-and-shoot’ using a radioactive beam and PET imaging, we can see where the beam is and can then correct it. By doing this, we can reduce the margins around the target that spoil the precision of particle therapy.”

In vivo experiments

To test this premise, Durante and colleagues performed in vivo experiments at the GSI/FAIR accelerator facility in Darmstadt. For online range verification, they used a portable small-animal in-beam PET scanner built by Katia Parodi and her team at LMU Munich. The scanner, initially designed for the ERC project SIRMIO (Small-animal proton irradiator for research in molecular image-guided radiation-oncology), contains 56 depth-of-interaction detectors – based on scintillator blocks of pixelated LYSO crystals – arranged spherically with an inner diameter of 72 mm.

LMU researchers with small-animal PET scanner
LMU researchers Members of the LMU team involved in the BARB project (left to right: Peter Thirolf, Giulio Lovatti, Angelica Noto, Francesco Evangelista, Munetaka Nitta and Katia Parodi) with the small-animal PET scanner. (Courtesy: Katia Parodi/Francesco Evangelista, LMU)

“Not only does our spherical in-beam PET scanner offer unprecedented sensitivity and spatial resolution, but it also enables on-the-fly monitoring of the activity implantation for direct feedback during irradiation,” says Parodi, co-principal investigator of the BARB project.

The researchers used a radioactive 11C-ion beam – produced at the GSI fragment separator – to treat 32 mice with an osteosarcoma tumour implanted in the neck near the spinal cord. To encompass the full target volume, they employed a range modulator to produce a spread-out Bragg peak (SOBP) and a plastic compensator collar, which also served to position and immobilize the mice. The anaesthetized animals were placed vertically inside the PET scanner and treated with either 20 or 5 Gy at a dose rate of around 1 Gy/min.

For each irradiation, the team compared the measured activity with Monte Carlo-simulated activity based on pre-treatment microCT scans. The activity distributions were shifted by about 1 mm, attributed to anatomical changes between the scans (with mice positioned horizontally) and irradiation (vertical positioning). After accounting for this anatomical shift, the simulation accurately matched the measured activity. “Our findings reinforce the necessity of vertical CT planning and highlight the potential of online PET as a valuable tool for upright particle therapy,” the researchers write.

With the tumour so close to the spine, even small range uncertainties risk damage to the spinal cord, so the team used the online PET images generated during the irradiation to check that the SOBP did not cover the spine. While overlap was not seen in any of the animals, Durante notes that if it had occurred, the beam could be moved to enable “truly adaptive” particle therapy. Assessing the mice for signs of radiation-induced myelopathy (which can lead to motor deficits and paralysis) revealed that no mice exhibited severe toxicity, further demonstrating that the spine was not exposed to high doses.

PET imaging in a mouse
PET imaging in a mouse (a) Simulation showing the expected 11C-ion dose distribution in the pre-treatment microCT scan. (b) Corresponding simulated PET activity. (c) Online PET image of the activity during 11C irradiation, overlaid on the same microCT used for simulations. The target is outlined in black, the spine in red. (Courtesy: CC BY 4.0/Nat. Phys. 10.1038/s41567-025-02993-8)

Following treatment, tumour measurements revealed complete tumour control after 20 Gy irradiation and prolonged tumour growth delay after 5 Gy, suggesting complete target coverage in all animals.

The researchers also assessed the washout of the signal from the tumour, which includes a slow activity decrease due to the decay of 11C (which has a half-life of 20.34 min), plus a faster decrease as blood flow removes the radioactive isotopes from the tumour. The results showed that the biological washout was dose-dependent, with the fast component visible at 5 Gy but disappearing at 20 Gy.
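The washout analysis can be pictured as radioactive decay multiplied by a two-component biological clearance term. In this sketch only the 20.34 min half-life of 11C comes from the study; the biological half-lives and the fast-component fraction are hypothetical placeholders:

```python
import math

# Two-component washout sketch. Only the 20.34 min half-life of 11C comes from
# the article; the biological half-lives and fractions below are hypothetical.
T_HALF_11C = 20.34                       # minutes
LAMBDA_PHYS = math.log(2) / T_HALF_11C   # physical decay constant, 1/min

def activity(t_min: float, fast_fraction: float,
             t_half_fast: float = 2.0, t_half_slow: float = 60.0) -> float:
    """Relative PET activity: radioactive decay times biological washout."""
    lam_fast = math.log(2) / t_half_fast
    lam_slow = math.log(2) / t_half_slow
    biological = (fast_fraction * math.exp(-lam_fast * t_min)
                  + (1 - fast_fraction) * math.exp(-lam_slow * t_min))
    return math.exp(-LAMBDA_PHYS * t_min) * biological

# A 5 Gy-like curve (fast component present) falls faster at early times than
# a 20 Gy-like curve (fast component absent).
print(activity(10, fast_fraction=0.4), activity(10, fast_fraction=0.0))
```

Setting the fast fraction to zero mimics the 20 Gy case, where the fast blood-flow component of the washout disappears.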

“We propose that this finding is due to damage to the blood vessel feeding the tumour,” says Durante. “If this is true, high-dose radiotherapy may work in a completely different way from conventional radiotherapy: rather than killing all the cancer stem cells, we just starve the tumour by damaging the blood vessels.”

Future plans

Next, the team intends to investigate the use of 10C or 15O treatment beams, which should provide stronger signals and increased temporal resolution. A new Super-FRS fragment separator at the FAIR accelerator facility will provide the high-intensity beams required for studies with 10C.

Looking further ahead, clinical translation will require a realistic and relatively cheap design, says Durante. “CERN has proposed a design [the MEDICIS-Promed project] based on ISOL [isotope separation online] that can be used as a source of radioactive beams in current accelerators,” he tells Physics World. “At GSI we are also working on a possible in-flight device for medical accelerators.”

The findings are reported in Nature Physics.

The post Radioactive ion beams enable simultaneous treatment and imaging in particle therapy appeared first on Physics World.

Garbage in, garbage out: why the success of AI depends on good data

1 septembre 2025 à 14:00

Artificial intelligence (AI) is fast becoming the new “Marmite”. Like the salty spread that polarizes taste-buds, you either love AI or you hate it. To some, AI is miraculous, to others it’s threatening or scary. But one thing is for sure – AI is here to stay, so we had better get used to it.

In many respects, AI is very similar to other data-analytics solutions in that how it works depends on two things. One is the quality of the input data. The other is the integrity of the user to ensure that the outputs are fit for purpose.

Previously a niche tool for specialists, AI is now widely available for general-purpose use, in particular through generative AI (GenAI) tools. Built on large language models (LLMs), these tools are now accessible through, for example, OpenAI’s ChatGPT, Microsoft Copilot, Anthropic’s Claude, Adobe Firefly or Google Gemini.

GenAI has become possible thanks to the availability of vast quantities of digitized data and significant advances in computing power. Neural-network models of this size would in fact have been impossible without these two fundamental ingredients.

GenAI is incredibly powerful when it comes to searching and summarizing large volumes of unstructured text. It exploits unfathomable amounts of data and is getting better all the time, offering users significant benefits in terms of efficiency and labour saving.

Many people now use it routinely for writing meeting minutes, composing letters and e-mails, and summarizing the content of multiple documents. AI can also tackle complex problems that would be difficult for humans to solve, such as climate modelling, drug discovery and protein-structure prediction.

I’d also like to give a shout out to tools such as Microsoft Live Captions and Google Translate, which help people from different locations and cultures to communicate. But like all shiny new things, AI comes with caveats, which we should bear in mind when using such tools.

User beware

LLMs, by their very nature, have been trained on historical data. They can’t therefore tell you exactly what may happen in the future, or indeed what may have happened since the model was originally trained. Models can also be constrained in their answers.

Take the Chinese AI app DeepSeek. When the BBC asked it what had happened at Tiananmen Square in Beijing on 4 June 1989 – when Chinese troops cracked down on protestors – the chatbot’s answer was suppressed. Now, this is a very obvious piece of information control, but subtler instances of censorship will be harder to spot.

Trouble is, we can’t know all the nuances of the data that models have been trained on

We also need to be conscious of model bias. At least some of the training data will probably come from social media and public chat forums such as X, Facebook and Reddit. Trouble is, we can’t know all the nuances of the data that models have been trained on – or the inherent biases that may arise from this.

One example of unfair gender bias was when Amazon developed an AI recruiting tool. Based on 10 years’ worth of CVs – mostly from men – the tool was found to favour men. Thankfully, Amazon ditched it. But then there was Apple’s gender-biased credit-card algorithm, which led to men being given higher credit limits than women with similar credit ratings.
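The mechanism behind such biases is easy to reproduce in miniature: a model fitted to skewed historical outcomes simply learns the skew. A synthetic sketch (the data and the naive “model” are invented for illustration and bear no relation to the actual Amazon or Apple systems):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic historical hiring data: two groups with identical skill
# distributions, but past decisions that favoured group A.
n = 10_000
group = rng.random(n) < 0.8                  # True = group A, 80% of past CVs
skill = rng.normal(0.0, 1.0, n)              # same distribution in both groups
hired = skill + np.where(group, 1.0, -1.0) * 0.5 + rng.normal(0, 0.5, n) > 0

# Naive "model": score candidates by the historical hire rate of their group.
rate_a = hired[group].mean()
rate_b = hired[~group].mean()
print(f"learned prior, group A: {rate_a:.2f}, group B: {rate_b:.2f}")
# Equal-skill candidates inherit unequal scores purely from the biased history.
```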

Another problem with AI is that it sometimes acts as a black box, making it hard for us to understand how, why or on what grounds it arrived at a certain decision. Think about those online Captcha tests we have to take when accessing online accounts. They often present us with a street scene and ask us to select those parts of the image containing a traffic light.

The tests are designed to distinguish between humans and computers or bots – the expectation being that AI can’t consistently recognize traffic lights. However, AI-based advanced driver assist systems (ADAS) presumably perform this function seamlessly on our roads. If not, surely drivers are being put at risk?

A colleague of mine, who drives an electric car that happens to share its name with a well-known physicist, confided that the ADAS in his car becomes unresponsive, especially when at traffic lights with filter arrows or multiple sets of traffic lights. So what exactly is going on with ADAS? Does anyone know?

Caution needed

My message when it comes to AI is simple: be careful what you ask for. Many GenAI applications will store user prompts and conversation histories and will likely use this data for training future models. Once you enter your data, there’s no guarantee it’ll ever be deleted. So think carefully before sharing any personal data, such as medical or financial information. It also pays to keep prompts non-specific (avoiding using your name or date of birth) so that they cannot be traced directly to you.

Democratization of AI is a great enabler and it’s easy for people to apply it without an in-depth understanding of what’s going on under the hood. But we should be checking AI-generated output before we use it to make important decisions and we should be careful of the personal information we divulge.

It’s easy to become complacent when we are not doing all the legwork. We are reminded under the terms of use that “AI can make mistakes”, but I wonder what will happen if models start consuming AI-generated erroneous data. Just as with other data-analytics problems, AI suffers from the old adage of “garbage in, garbage out”.

But sometimes I fear it’s even worse than that. We’ll need a collective vigilance to avoid AI being turned into “garbage in, garbage squared”.

The post Garbage in, garbage out: why the success of AI depends on good data appeared first on Physics World.

Why foamy heads on Belgian beers last so long

29 août 2025 à 16:44

It’s well documented that a frothy head on a beverage can stop the liquid from sloshing around and onto the floor – it’s one reason why coffee, which usually has no foam, swills around more than beer when you carry it.

When it comes to beer, a clear sign of a good brew is a big head of foam at the top of a poured glass.

Beer foam is made of many small bubbles of air, separated from each other by thin films of liquid. These thin films must remain stable, or the bubbles will pop, and the foam will collapse.

What holds these thin films together is not completely understood and is likely conglomerates of proteins, surface viscosity or the presence of surfactants – molecules that reduce surface tension and are found in soaps and detergents.

To find out more, researchers from ETH Zurich and Eindhoven University of Technology (EUT) investigated beer-foam stability for different types of beers at varying stages of the fermentation process.

They found that for single-fermentation beers, the foams are mostly held together by the surface viscosity of the beer. This is influenced by proteins in the beer: the more proteins it contains, the more viscous the film and the more stable the foam.

“We can directly visualize what’s happening when two bubbles come into close proximity,” notes EUT material scientist Emmanouil Chatzigiannakis. “We can directly see the bubble’s protein aggregates, their interface, and their structure.”

When it comes to double-fermented beers, however, the proteins in the beer are altered slightly by yeast cells and come together to form a two-dimensional membrane that keeps foam intact longer.

The head was found to be even more stable for triple-fermented beers, which include Belgian Trappist beers. The proteins change further and behave like a surfactant that stabilizes the bubbles.

The team says that the finding of how the fermentation process alters the stability of bubbles could be used to produce more efficient ways of creating foams – or identify ways to control the amount of froth so that everyone can pour a perfect glass of beer every time. Cheers!

The post Why foamy heads on Belgium beers last so long appeared first on Physics World.

Making molecules with superheavy elements could shake up the periodic table

29 août 2025 à 14:00

Nuclear scientists at the Lawrence Berkeley National Laboratory (LBNL) in the US have produced and identified molecules containing nobelium for the first time. This element, which has an atomic number of 102, is the heaviest ever to be observed in a directly identified molecule, and team leader Jennifer Pore says the knowledge gained from such work could lead to a shake-up at the bottom of the periodic table.

“We compared the chemical properties of nobelium side-by-side to simultaneously produced molecules containing actinium (element number 89),” says Pore, a research scientist at LBNL. “The success of these measurements demonstrates the possibility to further improve our understanding of heavy and superheavy-element chemistry and so ensure that these elements are placed correctly on the periodic table.”

The periodic table currently lists 118 elements. As well as vertical “groups” containing elements with similar properties and horizontal “periods” in which the number of protons (atomic number Z) in the nucleus increases from left to right, these elements are arranged in three blocks. The block that contains actinides such as actinium (Ac) and nobelium (No), as well as the slightly lighter lanthanide series, is often shown offset, below the bottom of the main table.

The end of a predictive periodic table?

Arranging the elements this way is helpful because it gives scientists an intuitive feel for the chemical properties of different elements. It has even made it possible to predict the properties of new elements as they are discovered in nature or, more recently, created in the laboratory.

The problem is that the traditional patterns we’ve come to know and love may start to break down for elements at the bottom of the table, putting an end to the predictive periodic table as we know it. The reason, Pore explains, is that these heavy nuclei have a very large number of protons. In the actinides (Z > 88), for example, the intense charge of these “extra” protons exerts such a strong pull on the inner electrons that relativistic effects come into play, potentially changing the elements’ chemical properties.

“As some of the electrons are sucked towards the centre of the atom, they shield some of the outer electrons from the pull,” Pore explains. “The effect is expected to be even stronger in the superheavy elements, and this is why they might potentially not be in the right place on the periodic table.”
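The scale of these relativistic effects can be estimated with a standard textbook back-of-the-envelope calculation (this is an illustrative sketch, not the team's analysis): in a hydrogen-like picture, a 1s electron's speed scales as v/c ≈ Zα, where α is the fine-structure constant, so the innermost electrons of the heaviest elements move at a substantial fraction of the speed of light.

```python
# Illustrative estimate (Bohr-model scaling for a hydrogen-like 1s
# electron, not the paper's calculation): v/c ≈ Z * alpha, so inner
# electrons of heavy elements are strongly relativistic.

ALPHA = 1 / 137.036  # fine-structure constant

def relativistic_1s(Z):
    """Return (v/c, Lorentz factor) for a 1s electron of charge number Z."""
    beta = Z * ALPHA                  # v/c in the Bohr-scaling picture
    gamma = 1 / (1 - beta**2) ** 0.5  # Lorentz factor -> orbital contraction
    return beta, gamma

for name, Z in [("Fe", 26), ("Ac", 89), ("No", 102), ("Og", 118)]:
    beta, gamma = relativistic_1s(Z)
    print(f"{name} (Z={Z}): v/c ≈ {beta:.2f}, gamma ≈ {gamma:.2f}")
```

For nobelium (Z = 102) this crude estimate already gives v/c ≈ 0.74, which is why the inner orbitals contract and shield the outer electrons as Pore describes.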

Understanding the full impact of these relativistic effects is difficult because elements heavier than fermium (Z = 100) need to be produced and studied atom by atom. This means resorting to complex equipment such as accelerated ion beams and the FIONA (For the Identification Of Nuclide A) device at LBNL’s 88-Inch Cyclotron Facility.

Producing and directly identifying actinide molecules

The team chose to study Ac and No in part because they represent the extremes of the actinide series. As the first in the series, Ac has no electrons in its 5f shell and is so rare that the crystal structure of an actinium-containing molecule was only determined recently. The chemistry of No, which contains a full complement of 14 electrons in its 5f shell and is the heaviest of the actinides, is even less well known.

In the new work, which is described in Nature, Pore and colleagues produced and directly identified molecular species containing Ac and No ions. To do this, they first had to produce Ac and No. They achieved this by accelerating beams of 48Ca with the 88-Inch Cyclotron and directing them onto targets of 169Tm and 208Pb, respectively. They then used the Berkeley Gas-filled Separator to separate the resulting actinide ions from unreacted beam material and reaction by-products.

The next step was to inject the ions into a chamber in the FIONA spectrometer known as a gas catcher. This chamber was filled with high-purity helium, as well as trace amounts of H2O and N2, at a pressure of approximately 150 torr. After interactions with the helium gas reduced the actinide ions to their 2+ charge state, so-called “coordination compounds” were able to form between the 2+ actinide ions and the H2O and N2 impurities. This compound-formation step took place either in the gas buffer cell itself or as the gas-ion mixture exited the chamber via a 1.3-mm opening and entered a low-pressure (several torr) environment. This transition caused the gas to expand at supersonic speeds, cooling it rapidly and allowing the molecular species to stabilize.

Once the actinide molecules formed, the researchers transferred them to a radio-frequency quadrupole cooler-buncher ion trap. This trap confined the ions for up to 50 ms, during which time they continued to collide with the helium buffer gas, eventually reaching thermal equilibrium. After they had cooled, the molecules were reaccelerated using FIONA’s mass spectrometer and identified according to their mass-to-charge ratio.
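The identification step rests on a simple quantity: each ion species has a distinct mass-to-charge ratio m/z. As a purely illustrative sketch (the specific molecular assignments and nominal mass numbers below are hypothetical examples chosen by us, not the team's reported species), the separation works roughly like this:

```python
# Illustrative sketch of mass-to-charge (m/z) identification, as used by
# mass spectrometers such as FIONA. Masses are nominal mass numbers and
# the molecular species are hypothetical examples, not reported results.

def mass_to_charge(mass_u, charge):
    """Return m/z for an ion of given mass (atomic mass units) and charge."""
    return mass_u / charge

# Nominal masses (u) of illustrative building blocks (assumed values)
NO_254 = 254   # a nobelium isotope
AC_225 = 225   # an actinium isotope
H2O = 18
N2 = 28

species = {
    "No(2+)":      mass_to_charge(NO_254, 2),
    "[No(H2O)]2+": mass_to_charge(NO_254 + H2O, 2),
    "[Ac(H2O)]2+": mass_to_charge(AC_225 + H2O, 2),
    "[No(N2)]2+":  mass_to_charge(NO_254 + N2, 2),
}

for name, mz in sorted(species.items(), key=lambda kv: kv[1]):
    print(f"{name:12s} m/z = {mz:.1f}")
```

Because a bare 2+ ion and the same ion carrying an H2O or N2 ligand differ in m/z by half the ligand mass, resolving these peaks is enough to pin down which molecule actually formed.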

A fast and sensitive instrument

FIONA is much faster than previous such instruments and more sensitive. Both properties are important when studying the chemistry of heavy and superheavy elements, which Pore notes are difficult to make, and which decay quickly. “Previous experiments measured the secondary particles made when a molecule with a superheavy element decayed, but they couldn’t identify the exact original chemical species,” she explains. “Most measurements reported a range of possible molecules and were based on assumptions from better-known elements. Our new approach is the first to directly identify the molecules by measuring their masses, removing the need for such assumptions.”

As well as improving our understanding of heavy and superheavy elements, Pore says the new work might also aid research on radioactive isotopes used in medical treatment. For example, the 225Ac isotope shows promise for treating certain metastatic cancers, but it is difficult to make and only available in small quantities, which limits access for clinical trials and treatment. “This means that researchers have had to forgo fundamental chemistry experiments to figure out how to get it into patients,” Pore notes. “But if we could understand such radioactive elements better, we might have an easier time producing the specific molecules needed.”

The post Making molecules with superheavy elements could shake up the periodic table appeared first on Physics World.

Super sticky underwater hydrogels designed using data mining and AI

29 août 2025 à 10:00

The way in which new materials are designed is changing, with data becoming ever more important in the discovery and design process. Designing soft materials is a particularly tricky task that requires selection of different “building blocks” (monomers in polymeric materials, for example) and optimization of their arrangement in molecular space.

Soft materials also exhibit many complex behaviours that need to be balanced, and their molecular and structural complexities make it difficult for computational methods to help in the design process – often requiring costly trial and error experimental approaches instead. Now, researchers at Hokkaido University in Japan have combined artificial intelligence (AI) with data mining methods to develop an ultra-sticky hydrogel material suitable for very wet environments – a difficult design challenge because the properties that make materials soft don’t usually promote adhesion. They report their findings in Nature.

Challenges of designing sticky hydrogels

Hydrogels are permeable soft materials composed of interlinked polymer networks that hold water within the network. They are highly versatile, with properties controlled by altering the chemical makeup and structure of the material.

Designing hydrogels computationally to perform a specific function is difficult, however, because the polymers used to build the hydrogel network can contain a plethora of chemical functional groups, complicating the discovery of suitable polymers and the structural makeup of the hydrogel. The properties of hydrogels are also influenced by factors including the molecular arrangement and intermolecular interactions between molecules (such as van der Waals forces and hydrogen bonds). There are further challenges for adhesive hydrogels in wet environments, as hydrogels will swell in the presence of water, which needs to be factored into the material design.

Data-driven methods provide breakthrough

To develop a hydrogel with a strong and lasting underwater adhesion, the researchers mined data from the National Center for Biotechnology Information (NCBI) Protein database. This database contains the amino acid sequences responsible for adhesion in underwater biological systems – such as those found in bacteria, viruses, archaea and eukaryotes. The protein sequences were synthetically mimicked and adapted for the polymer strands in hydrogels.
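The mining step can be pictured as a toy filter over candidate sequences. The sketch below is illustrative only – the sequences are invented placeholders and the scoring rule (counting residues often associated with wet adhesion, such as tyrosine and lysine) is our crude stand-in, not the study's actual criteria:

```python
# Toy sketch of sequence mining (illustrative only): rank hypothetical
# protein sequences by the fraction of residues associated with wet
# adhesion. The residue set and sequences are assumptions, not the
# study's actual selection rules or database entries.

ADHESIVE_RESIDUES = set("YK")  # assumed proxy: tyrosine (Y), lysine (K)

def adhesion_score(sequence):
    """Fraction of residues in the sequence that are adhesion-associated."""
    return sum(aa in ADHESIVE_RESIDUES for aa in sequence) / len(sequence)

# Hypothetical candidate sequences (placeholders, not real entries)
candidates = {
    "seq_A": "MKYYKGGYKPYKAAY",
    "seq_B": "MAAGLLVIAAGPST",
    "seq_C": "MYKKPYYGKSYYKA",
}

ranked = sorted(candidates, key=lambda s: adhesion_score(candidates[s]),
                reverse=True)
for name in ranked:
    print(f"{name}: score = {adhesion_score(candidates[name]):.2f}")
```

A real pipeline would of course work with curated motifs from the NCBI Protein database rather than a two-letter residue set, but the principle – score, rank, then synthesize the most promising designs – is the same.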

“We were inspired by nature’s adhesive proteins, but we wanted to go beyond mimicking a few examples. By mining the entire protein database, we aimed to systematically explore new design rules and see how far AI could push the boundaries of underwater adhesion,” says co-lead author Hailong Fan.

The researchers used information from the database to initially design and synthesize 180 bioinspired hydrogels, each with a unique polymer network and all of which showed adhesive properties beyond other hydrogels. To improve them further, the team employed machine learning to create hydrogels demonstrating the strongest underwater adhesive properties to date, with instant and repeatable adhesive strengths exceeding 1 MPa – an order-of-magnitude improvement over previous underwater adhesives. In addition, the AI-designed hydrogels were found to be functional across many different surfaces in both fresh and saline water.
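To get a feel for what an adhesive strength of 1 MPa means, a back-of-the-envelope calculation helps (this is our illustration, not a figure from the paper): since pressure is force per unit area, the force a bonded patch can resist scales with its contact area.

```python
# Back-of-the-envelope sketch (not from the paper): what 1 MPa of
# adhesive strength means in practice. Pressure = force / area.

def holding_force(adhesive_strength_pa, area_m2):
    """Force (N) a bond of given strength and contact area can resist."""
    return adhesive_strength_pa * area_m2

strength = 1e6   # 1 MPa, the reported order of magnitude
area = 1e-4      # a 1 cm^2 patch, chosen for illustration

force = holding_force(strength, area)
mass_equivalent = force / 9.81  # mass whose weight equals that force

print(f"A 1 cm^2 patch resists ~{force:.0f} N (~{mass_equivalent:.0f} kg)")
```

So even a fingernail-sized patch of such a hydrogel could, in principle, support a hanging mass of roughly 10 kg underwater.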

“The key achievement is not just creating a record-breaking underwater adhesive hydrogel but demonstrating a new pathway – moving from biomimetic experience to data-driven, AI-guided material design,” says Fan.

A versatile adhesive

The researchers took the three best performing hydrogels and tested them in different wet environments to show that they could maintain their adhesive properties for long time periods. One hydrogel was used to stick a rubber duck to a rock by the sea, which remained in place despite continuous wave impacts over many tide cycles. A second hydrogel was used to patch up a 20 mm hole on a pipe filled with water and instantly stopped a high-pressure leak. This hydrogel remained in place for five months without issue. The third hydrogel was placed under the skin of mice to demonstrate biocompatibility.

The super-strong adhesive properties in wet environments could have wide-ranging applications, from biomedical engineering (prosthetic coatings or wearable biosensors) to deep-sea exploration and marine farming. The researchers also note that this data-driven approach could be adapted for designing other functional soft materials.

When asked about what’s next for this research, Fan says that “our next step is to study the molecular mechanisms behind these adhesives in more depth, and to expand this data-driven design strategy to other soft materials, such as self-healing and biomedical hydrogels”.

The post Super sticky underwater hydrogels designed using data mining and AI appeared first on Physics World.

From a laser lab to The Economist: physicist Jason Palmer on his move to journalism

28 août 2025 à 15:55

My guest in this episode of the Physics World Weekly podcast is the journalist Jason Palmer, who co-hosts “The Intelligence” podcast at The Economist.

Palmer did a PhD in chemical physics at Imperial College London before turning his hand to science writing with stints at the BBC and New Scientist.

He explains how he made the transition from the laboratory to the newsroom and offers tips for scientists planning to make the same career journey. We also chat about how artificial intelligence is changing how journalists work.

The post From a laser lab to <em>The Economist</em>: physicist Jason Palmer on his move to journalism appeared first on Physics World.

Crainio’s Panicos Kyriacou explains how their light-based instrument can help diagnose brain injury

28 août 2025 à 12:55

Traumatic brain injury (TBI), caused by a sudden impact to the head, is a leading cause of death and disability. After such an injury, the most important indicator of severity is intracranial pressure – the pressure inside the skull. But currently, the only way to assess this is by inserting a pressure sensor into the patient’s brain. UK-based startup Crainio aims to change this by developing a non-invasive method to measure intracranial pressure using a simple optical probe attached to the patient’s forehead.

Can you explain why diagnosing TBI is such an important clinical challenge?

Every three minutes in the UK, someone is admitted to hospital with a head injury – it’s a very common problem. But when someone has a blow to the head, nobody knows how bad it is until they actually reach the hospital. TBI is something that, at the moment, cannot be assessed at the point of injury.

The period from the time of impact to the time the patient is assessed by a neurosurgical expert is known as the golden hour. And nobody knows what’s happening to the brain during this time – you don’t know how best to manage the patient, whether they have a severe TBI with intracranial pressure rising in the head, or just a concussion or a moderate TBI.

Once at the hospital, the neurosurgeons have to assess the patient’s intracranial pressure, to determine whether it is above the threshold that classifies the injury as severe. And to do that, they have to drill a hole in the head – literally – and place an electrical probe into the brain. This really is one of the most invasive non-therapeutic procedures, and you obviously can’t do this to every patient that comes in with a blow to the head. It has its risks: there is a risk of haemorrhage or of infection.

Therefore, there’s a need to develop technologies that can measure intracranial pressure more effectively, earlier and in a non-invasive manner. For many years, this was almost like a dream: “How can you access the brain and see if the pressure is rising in the brain, just by placing an optical sensor on the forehead?”

Crainio has now created such a non-invasive sensor; what led to this breakthrough?

The research goes back to 2016, at the Research Centre for Biomedical Engineering at City, University of London (now City St George’s, University of London), when the National Institute for Health Research (NIHR) gave us our first grant to investigate the feasibility of a non-invasive intracranial sensor based on light technologies. We developed a prototype, secured the intellectual property and conducted a feasibility study on TBI patients at the Royal London Hospital, the biggest trauma hospital in the UK.

It was back in 2021, before Crainio was established, that we first discovered that when we shone certain frequencies of light, such as near-infrared, into the brain through the forehead, the optical signals coming back – known as the photoplethysmogram, or PPG – contained information about the physiology and haemodynamics of the brain.

When the pressure in the brain rises, the brain swells up, but it cannot go anywhere because the skull is like concrete. Therefore, the arteries and vessels in the brain are compressed by that pressure. PPG measures changes in blood volume as it pulses through the arteries during the cardiac cycle. If you have a viscoelastic artery that is opening and closing, the volume of blood changes and this is captured by the PPG. Now, if you have an artery that is compromised, pushed down because of pressure in the brain, that viscoelastic property is affected, and that in turn affects the PPG.

Changes in the PPG signal arising from compression of the vessels in the brain can therefore give us information about the intracranial pressure. And we developed algorithms to interrogate this optical signal and machine learning models to estimate intracranial pressure.
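The general idea – extracting morphological features from a PPG pulse that a model can then map to pressure – can be sketched as follows. This is a hedged illustration of the concept, not Crainio's actual pipeline; the synthetic waveform and the choice of features are our assumptions:

```python
# Hedged sketch (not Crainio's pipeline): extract simple morphological
# features from a PPG-like pulse, of the kind a machine-learning model
# could be trained to map to intracranial pressure.

import numpy as np

def pulse_features(signal, fs):
    """Return simple morphological features of one PPG pulse."""
    t = np.arange(len(signal)) / fs
    peak_idx = int(np.argmax(signal))
    amplitude = float(signal[peak_idx] - signal.min())
    # Pulse width at half the peak-to-trough amplitude
    half = signal.min() + amplitude / 2
    above = np.where(signal >= half)[0]
    width = (above[-1] - above[0]) / fs
    rise_time = t[peak_idx] - t[0]
    return {"amplitude": amplitude, "width_s": width, "rise_time_s": rise_time}

# Synthetic single pulse: fast rise, slower decay (roughly PPG-shaped)
fs = 250  # sampling rate in Hz, an illustrative choice
t = np.arange(0, 1.0, 1 / fs)
pulse = t * np.exp(-5 * t)  # gamma-like shape standing in for a real pulse

feats = pulse_features(pulse, fs)
print({k: round(v, 3) for k, v in feats.items()})
```

A compressed, less viscoelastic artery would change exactly these kinds of features (smaller amplitude, altered width and rise time), which is what the machine learning models look for.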

How did the establishment of Crainio help to progress the sensor technology?

Following our research within the university, Crainio was set up in 2022. It brought together a team of experts in medical devices and optical sensors to lead the further development and commercialization of this device. And this small team worked tirelessly over the last few years to generate funding to progress the development of the optical sensor technology and bring it to a level that is ready for further clinical trials.

Panicos Kyriacou “At Crainio we want to create a technology that could be used widely, because there is a massive need, but also because it’s affordable.” (Courtesy: Crainio)

In 2023, Crainio was successful with an Innovate UK biomedical catalyst grant, which will enable the company to engage in a clinical feasibility study, optimize the probe technology and further develop the algorithms. The company was later awarded another NIHR grant to move into a validation study.

The interest in this project has been overwhelming. We’ve had very positive feedback from the neurocritical care community. But we also see a lot of interest from communities where injury to the brain is significant, such as rugby associations, for example.

Could the device be used in the field, at the site of an accident?

While Crainio’s primary focus is to deliver a technology for use in critical care, the system could also be used in ambulances, in helicopters, during patient transfers and beyond. The device is non-invasive, the sensor is just like a sticking plaster on the forehead and the backend is a small box containing all the electronics. In the past few years, working in a research environment, the technology was connected to a laptop computer. But we are now transferring everything into a graphical interface, with a monitor to be able to see the signals and the intracranial pressure values in a portable device.

Following preliminary tests on patients, Crainio is now starting a new clinical trial. What do you hope to achieve with the next measurements?

The first study, a feasibility study on the sensor technology, was done during the time when the project was within the university. The second round is led by Crainio using a more optimized probe. Learning from the technical challenges we had in the first study, we tried to mitigate them with a new probe design. We’ve also learned more about the challenges associated with the acquisition of signals, the type of patients, how long we should monitor.

We are now at the stage where Crainio has redeveloped the sensor and it looks amazing. The technology has received approval by MHRA, the UK regulator, for clinical studies and ethical approvals have been secured. This will be an opportunity to work with the new probe, which has more advanced electronics that enable more detailed acquisition of signals from TBI patients.

We are again partnering with the Royal London Hospital, as well as collaborators from the traumatic brain injury team at Cambridge and we’re expecting to enter clinical trials soon. These are patients admitted into neurocritical trauma units and they all have an invasive intracranial pressure bolt. This will allow us to compare the physiological signal coming from our intracranial pressure sensor with the gold standard.

The signals will be analysed by Crainio’s data science team, with machine learning algorithms used to look at changes in the PPG signal, extract morphological features and build models to develop the technology further. So we’re enriching the study with a more advanced technology, and this should lead to more accurate machine learning models for correctly capturing dynamic changes in intracranial pressure.


This time around, we will also record more information from the patients. We will look at CT scans to see whether scalp density and thickness have an impact. We will also collect data from commercial medical monitors within neurocritical care to see the relation between intracranial pressure and other physiological data acquired in the patients. We aim to expand our knowledge of what happens when a patient’s intracranial pressure rises – what happens to their blood pressures? What happens to other physiological measurements?

How far away is the system from being used as a standard clinical tool?

Crainio is very ambitious. We’re hoping that within the next couple of years we will progress enough to achieve CE marking and meet all the standards that are necessary to launch a medical device.

The primary motivation of Crainio is to create solutions for healthcare, developing a technology that can help clinicians to diagnose TBI effectively, faster, accurately and earlier. This can only yield better outcomes and improve patients’ quality-of-life.

Of course, as a company we’re interested in being successful commercially. But the ambition here is, first of all, to keep the cost affordable. We live in a world where medical technologies need to be affordable, not only for Western nations, but for nations that cannot afford state-of-the-art technologies. So this is another of Crainio’s primary aims, to create a technology that could be used widely, because there is a massive need, but also because it’s affordable.

The post Crainio’s Panicos Kyriacou explains how their light-based instrument can help diagnose brain injury appeared first on Physics World.

Extremely stripped star reveals heavy elements as it explodes

28 août 2025 à 10:00
Artist's impression of a star just before it explodes
Stripped star Artist’s impression of the star that exploded to create SN 2021yfj. Shown are the ejection of silicon (grey), sulphur (yellow) and argon (purple) just before the final explosion. (Courtesy: WM Keck Observatory/Adam Makarenko)

For the first time, astronomers have observed clear evidence for a heavily stripped star that has shed many of its outer layers before its death in a supernova explosion. Led by Steve Schulze at Northwestern University, the team has spotted the spectral signatures of heavier elements that are usually hidden deep within stellar interiors.

Inside a star, atomic nuclei fuse together to form heavier elements in a process called nucleosynthesis. This releases a vast amount of energy that offsets the crushing force of gravity.

As stars age, different elements are consumed and produced. “Observations and models of stars tell us that stars are enormous balls of hydrogen when they are born,” Schulze explains. “The temperature and density at the core are so high that hydrogen is fused into helium. Subsequently, helium fuses into carbon, and this process continues until iron is produced.”

Ageing stars are believed to have an onion-like structure, with a hydrogen outer shell enveloping deeper layers of successively heavier elements. Near the end of a star’s life, inner-shell elements including silicon, sulphur and argon fuse to form a core of iron. Unlike lighter elements, iron does not release energy as it fuses, but instead consumes energy from its surroundings. As a result, the star can no longer withstand its own gravity: it collapses rapidly and then explodes in a dramatic supernova.
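Why fusion stops paying off at iron can be illustrated with the semi-empirical mass formula, a standard textbook approximation (this worked example is ours, not from the article): binding energy per nucleon peaks near iron, so fusing beyond it absorbs energy rather than releasing it.

```python
# Illustrative calculation (semi-empirical mass formula with standard
# textbook coefficients, not from the article): binding energy per
# nucleon peaks near iron, so fusion beyond iron consumes energy.

def binding_energy_per_nucleon(A, Z):
    """Approximate B/A in MeV for a nucleus with mass number A, charge Z."""
    aV, aS, aC, aA, aP = 15.8, 18.3, 0.714, 23.2, 12.0
    B = (aV * A                          # volume term
         - aS * A ** (2 / 3)             # surface term
         - aC * Z * (Z - 1) / A ** (1 / 3)  # Coulomb repulsion
         - aA * (A - 2 * Z) ** 2 / A)    # symmetry term
    N = A - Z
    if Z % 2 == 0 and N % 2 == 0:        # even-even nuclei: extra binding
        B += aP / A ** 0.5
    elif Z % 2 == 1 and N % 2 == 1:      # odd-odd nuclei: less binding
        B -= aP / A ** 0.5
    return B / A

for name, A, Z in [("He-4", 4, 2), ("O-16", 16, 8), ("Si-28", 28, 14),
                   ("Fe-56", 56, 26), ("Pb-208", 208, 82)]:
    print(f"{name:7s} B/A ≈ {binding_energy_per_nucleon(A, Z):.2f} MeV")
```

The curve rises through silicon (around 8.3 MeV per nucleon in this approximation) to a maximum near iron-56 (around 8.8 MeV), then falls again for heavier nuclei such as lead, which is why an iron core marks the end of a star's energy-producing fusion.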

Hidden elements

Rarely, astronomers can observe an old star that has blown out its outer layers before exploding. When the explosion finally occurs, heavier elements that are usually hidden within deeper shells create absorption lines in the supernova’s light spectrum, allowing astronomers to determine the compositions of these inner layers. So far, inner-layer elements as heavy as carbon and oxygen have been observed, but direct evidence for elements in deeper layers has been lacking.

Yet in 2021, a mysterious new observation was made by a programme of the Zwicky Transient Facility headed by Avishay Gal-Yam at the Weizmann Institute of Science in Israel. The team was scanning the sky for signs of infant supernovae at the very earliest stages following their initial explosion.

“On 7 September 2021 it was my duty to look for infant supernovae,” Schulze recounts. “We discovered SN 2021yfj due to its rapid increase in brightness. We immediately contacted Alex Filippenko’s group at the University of California Berkeley to ask whether they could obtain a spectrum of this supernova.”

When the results arrived, the team realised that the absorption lines in the supernova’s spectrum were unlike anything they had encountered previously. “We initially had no idea that most of the features in the spectrum were produced by silicon, sulphur, and argon,” Schulze continues. Gal-Yam took up the challenge of identifying the mysterious features in the spectrum.

Shortly before death

In the meantime, the researchers examined simultaneous observations of SN 2021yfj, made by a variety of ground- and space-based telescopes. When Gal-Yam’s analysis was complete, all of the team’s data confirmed the same result. “We had detected a supernova embedded in a shell of material rich in silicon, sulphur, and argon,” Schulze describes. “These elements are formed only shortly before a star dies, and are often hidden beneath other materials – therefore, they are inaccessible under normal circumstances.”

The result provided clear evidence that the star had been more heavily stripped towards the end of its life than any observed previously, shedding many of its outer layers before the final explosion.

“SN 2021yfj demonstrates that stars can die in far more extreme ways than previously imagined,” says Schulze. “It reveals that our understanding of how stars evolve and die is still not complete, despite billions of them having already been studied.” By studying their results, the team now hopes that astronomers can better understand the later stages of stellar evolution, and the processes leading up to these dramatic ends.

The research is described in Nature.

The post Extremely stripped star reveals heavy elements as it explodes appeared first on Physics World.
