How it works Researchers use twisted surfaces to manipulate mechanical waves, enabling new technologies for imaging, electronics and sensors. (Courtesy: A Alù)
Simply by stacking two identical elastic metasurfaces and then rotating them relative to each other, physicists can change the topology of the elastic waves travelling through the combined structure – from elliptic to hyperbolic. This new control technique, developed at the CUNY Advanced Science Research Center in the US, works over a broad frequency range and has been dubbed “twistelastics”. It could allow for advanced reconfigurable phononic devices with potential applications in microelectronics, ultrasound sensing and microfluidics.
The researchers, led by Andrea Alù, say they were inspired by the recent advances in “twistronics” and its “profound impact” on electronic and photonic systems. “Our goal in this work was to explore whether similar twist-induced topological phenomena could be harnessed in elastodynamics in which phonons (vibrations of the crystal lattice) play a central role,” says Alù.
In twistelastics, the rotations between layers of identical, elastic engineered surfaces are used to manipulate how mechanical waves travel through the materials. The new approach, say the CUNY researchers, allows them to reconfigure the behaviour of these waves and precisely control them. “This opens the door to new technologies for sensing, communication and signal processing,” says Alù.
From elliptic to hyperbolic
In their work, the researchers used computer simulations to design metasurfaces patterned with micron-sized pillars. When they stacked one such metasurface atop another and rotated the pair through different angles, the combined structure changed the way phonons spread: their dispersion topology went from elliptic to hyperbolic.
At a specific rotation angle, known as the “magic angle” (just like in twistronics), the waves become highly focused and begin to travel in one direction. This effect could allow for more efficient signal processing, says Alù, with the signals being easier to control over a wide range of frequencies.
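The elliptic-to-hyperbolic transition can be sketched with a toy dispersion relation in which one curvature coefficient changes sign at the magic angle. The coefficients and the 20° threshold below are illustrative placeholders, not values from the CUNY simulations:

```python
# Toy model of a twist-tuned dispersion topology. Iso-frequency contours of
#   omega^2 = a*kx^2 + b(theta)*ky^2
# are ellipses when a and b share a sign and hyperbolas when they differ.
# Here b simply flips sign at a made-up "magic angle" of 20 degrees.

def iso_frequency_topology(theta_deg, magic_angle_deg=20.0):
    """Classify the contour topology at a given twist angle (degrees)."""
    a = 1.0                              # fixed positive curvature along kx
    b = theta_deg - magic_angle_deg      # illustrative: sign flips at the magic angle
    if b > 0:
        return "elliptic"    # closed contours: waves spread in all directions
    if b < 0:
        return "hyperbolic"  # open contours: energy channels along preferred axes
    return "transition"      # flat contour: highly directional propagation

print(iso_frequency_topology(30.0))  # elliptic
print(iso_frequency_topology(10.0))  # hyperbolic
print(iso_frequency_topology(20.0))  # transition
```

At the transition itself the contour flattens, which is the toy-model analogue of the highly focused, unidirectional waves described above.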
“The new twistelastic platform offers broadband, reconfigurable and robust control over phonon propagation,” he tells Physics World. “This may be highly useful for a wide range of application areas, including surface acoustic wave (SAW) technologies, ultrasound imaging and sensing, microfluidic particle manipulation and on-chip phononic signal processing.”
New frontiers
Since the twist-induced transitions are topologically protected, again like in twistronics, the system is resilient to fabrication imperfections, meaning it can be miniaturized and integrated into real-world devices, he adds. “We are part of an exciting science and technology centre called ‘New Frontiers of Sound’, of which I am one of the leaders. The goal of this ambitious centre is to develop new acoustic platforms for the above applications enabling disruptive advances for these technologies.”
Looking ahead, the researchers say they are looking into miniaturizing their metasurface design for integration into microelectromechanical systems (MEMS). They will also be studying multi-layer twistelastic architectures to improve how they can control wave propagation and investigating active tuning mechanisms, such as electromechanical actuation, to dynamically control twist angles. “Adding piezoelectric phenomena for further control and coupling to the electromagnetic waves,” is also on the agenda says Alù.
Crystal structure In the new high-Tc superconductor, lanthanum and scandium atoms constitute the MgB2-type sublattice, while the surrounding hydrogen atoms form two types of cage-like configurations. (Courtesy: Guangtao Liu, Jilin University)
Researchers in China claim to have made the first ever room-temperature superconductor by compressing an alloy of lanthanum-scandium (La-Sc) and the hydrogen-rich material ammonia borane (NH3BH3) together at pressures of 250–260 GPa, observing superconductivity with a maximum onset temperature of 298 K. While these high pressures are akin to those at the centre of the Earth, the work marks a milestone in the field of superconductivity, they say.
Superconductors conduct electricity without resistance and many materials do this when cooled below a certain transition temperature, Tc. In most cases this temperature is very low – for example, solid mercury, the first superconductor to be discovered, has a Tc of 4.2 K. Researchers have therefore been looking for superconductors that operate at higher temperatures – perhaps even at room temperature. Such materials could revolutionize a host of application areas, including increasing the efficiency of electrical generators and transmission lines through lossless electricity transmission. They would also greatly simplify technologies, such as MRI, that rely on the generation or detection of magnetic fields.
Researchers made considerable progress towards this goal in the 1980s and 1990s with the discovery of the “high-temperature” copper oxide superconductors, which have Tc values between 30 and 133 K. Fast-forward to 2015 and the maximum known critical temperature rose even higher thanks to the discovery of a sulphur hydride, H3S, that has a Tc of 203 K when compressed to pressures of 150 GPa.
This result sparked much interest in solid materials containing hydrogen atoms bonded to other elements and in 2019, the record was broken again, this time by lanthanum decahydride (LaH10), which was found to have a Tc of 250–260 K, albeit again at very high pressures. Then in 2021, researchers observed high-temperature superconductivity in the cerium hydrides, CeH9 and CeH10, which are remarkable because they are stable and boast high-temperature superconductivity at lower pressures (about 80 GPa, or 0.8 million atmospheres) than the other so-called “superhydrides”.
Ternary hydrides
In recent years, researchers have started turning their attention to ternary hydrides – substances that comprise three different atomic species rather than just two. Compared with binary hydrides, ternary hydrides are more structurally complex, which may allow them to have higher Tc values. Indeed, Li2MgH16 has been predicted to exhibit “hot” superconductivity with a Tc of 351–473 K under multimegabar pressures and several other high-Tc hydrides, including MBxHy, MBeH8 and Mg2IrH6-7, have been predicted to be stable under comparatively lower pressures.
In the new work, a team led by physicist Yanming Ma of Jilin University studied LaSc2H24 – a compound made by doping Sc into the well-known La-H binary system. Ma and colleagues had already predicted in theory – using the crystal structure prediction (CALYPSO) method – that this ternary material should feature hexagonal P6/mmm symmetry. Introducing Sc into the La-H system results in the formation of two novel interlinked H24 and H30 hydrogen clathrate “cages”, with the H24 surrounding Sc and the H30 surrounding La.
The researchers predicted that these two novel hydrogen frameworks should produce an exceptionally large hydrogen-derived density of states at the Fermi level (the highest energy level that electrons can occupy in a solid at a temperature of absolute zero), as well as enhancing coupling between electrons and phonons (vibrations of the crystal lattice) in the material, leading to an exceptionally high Tc of up to 316 K at high pressure.
To characterize their material, the researchers placed it in a diamond-anvil cell, a device that generates extreme pressures as it squeezes the sample between two tiny, gem-grade crystals of diamond (one of the hardest substances known) while heating it with a laser. In situ X-ray diffraction experiments revealed that the compound crystallizes into a hexagonal structure, in excellent agreement with the predicted P6/mmm LaSc2H24 structure.
A key piece of experimental evidence for superconductivity in the La-Sc-H ternary system, says co-author Guangtao Liu, came from measurements that repeatedly demonstrated the onset of zero electrical resistance below the Tc.
Another significant proof, Liu adds, is that the Tc decreases monotonically with the application of an external magnetic field in a number of independently synthesized samples. “This behaviour is consistent with the conventional theory of superconductivity since an external magnetic field disrupts Cooper pairs – the charge carriers responsible for the zero-resistance state – thereby suppressing superconductivity.”
“These two main observations demonstrate the superconductivity in our synthesized La-Sc-H compound,” he tells Physics World.
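The monotonic suppression Liu describes can be illustrated with a simple linearized form, Tc(B) = Tc0(1 − B/Bc2). Both the zero-field Tc and the critical field below are placeholder numbers chosen for illustration, not measured values from the La-Sc-H study:

```python
# Toy illustration of field-induced suppression of superconductivity.
# A linearized Ginzburg-Landau-style form is used:
#   Tc(B) = Tc0 * (1 - B / Bc2_0)
# Tc0 and Bc2_0 are placeholder values, not data from the experiment.

def tc_under_field(B_tesla, Tc0=298.0, Bc2_0=200.0):
    """Transition temperature (K) under an applied field B; clipped at zero."""
    return max(0.0, Tc0 * (1.0 - B_tesla / Bc2_0))

for B in (0, 5, 10):
    print(B, tc_under_field(B))
# Tc falls monotonically as the field grows, consistent with the field
# progressively breaking up Cooper pairs.
```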
Difficult experiments
The experiments were not easy, Liu recalls. The first six months of attempting to synthesize LaSc2H24 below 200 GPa yielded no obvious Tc enhancement. “We then tried higher pressure and above 250 GPa, we had to manually deposit three precursor layers and ensure that four electrodes (for subsequent conductance measurements) were properly connected to the alloy in an extremely small sample chamber, just 10 to 15 µm in size,” he says. “This required hundreds of painstaking repetitions.”
And that was not all: to synthesize the LaSc2H24, the researchers had to prepare the correct molar ratios of a precursor alloy. The Sc and La elements cannot form a solid solution because of their different atomic radii, so using a normal melting method makes it hard to control this ratio. “After about a year of continuous investigations, we finally used the magnetron sputtering method to obtain films of LaSc2H24 with the molar ratios we wanted,” Liu explains. “During the entire process, most of our experiments failed and we ended up damaging at least 70 pairs of diamonds.”
Sven Friedemann of the University of Bristol, who was not involved in this work, says that the study is “an important step forward” for the field of superconductivity with a new record transition temperature of 295 K. “The new measurements show zero resistance (within resolution) and suppression in magnetic fields, thus strongly suggesting superconductivity,” he comments. “It will be exciting to see future work probing other signatures of superconductivity. The X-ray diffraction measurements could be more comprehensive and leave some room for uncertainty as to whether it is indeed the claimed LaSc2H24 structure giving rise to the superconductivity.”
Ma and colleagues say they will continue to study the properties of this compound – and in particular, verify the isotope effect (a signature of conventional superconductors) or measure the superconducting critical current. “We will also try to directly detect the Meissner effect – a key goal for high-temperature superhydride superconductors in general,” says Ma. “Guided by rapidly advancing theoretical predictions, we will also synthesize new multinary superhydrides to achieve better superconducting properties under much lower pressures.”
Due to government shutdown restrictions currently in place in the US, the researchers who headed up this study have not been able to comment on their work.
Laser plasma acceleration (LPA) may be used to generate multi-gigaelectronvolt muon beams, according to physicists at the Lawrence Berkeley National Laboratory (LBNL) in the US. Their work might help in the development of ultracompact muon sources for applications such as muon tomography – which images the interior of large objects that are inaccessible to X-ray radiography.
Muons are charged subatomic particles that are produced in large quantities when cosmic rays collide with atoms 15–20 km high up in the atmosphere. Muons have the same properties as electrons but are around 200 times heavier. This means they can travel much further through solid structures than electrons. This property is exploited in muon tomography, which analyses how muons penetrate objects and then exploits this information to produce 3D images.
The technique is similar to X-ray tomography used in medical imaging, with the cosmic-ray radiation taking the place of artificially generated X-rays and muon trackers the place of X-ray detectors. Indeed, depending on their energy, muons can traverse metres of rock or other materials, making them ideal for imaging thick and large structures. As a result, the technique has been used to peer inside nuclear reactors, pyramids and volcanoes.
As many as 10,000 muons from cosmic rays reach each square metre of the Earth’s surface every minute. These naturally produced particles have unpredictable properties, however, and they arrive predominantly from above. This fixed directionality means that it can take months to accumulate enough data for tomography.
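A back-of-the-envelope estimate shows why cosmic-ray campaigns are slow. The detector area and required muon count below are hypothetical illustrative inputs; only the ~10,000 muons per square metre per minute flux comes from the text:

```python
# Rough exposure-time estimate for cosmic-ray muon tomography.

FLUX_PER_M2_PER_MIN = 10_000   # cosmic-ray muon flux at the surface (from text)

def exposure_minutes(area_m2, muons_needed):
    """Minutes of raw flux needed to collect the requested number of muons."""
    return muons_needed / (FLUX_PER_M2_PER_MIN * area_m2)

# Hypothetical example: a 1 m^2 tracker collecting 10 million muon tracks.
minutes = exposure_minutes(1.0, 10_000_000)
print(f"{minutes / (60 * 24):.1f} days of raw flux")   # 0.7 days
# In practice only a small fraction of muons traverse the target along useful
# paths, which stretches real imaging campaigns to months.
```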
Another option is to use the large numbers of low-energy muons that can be produced in proton accelerator facilities by smashing a proton beam onto a fixed carbon target. However, these accelerators are large and expensive facilities, limiting their use in muon tomography.
A new compact source
Physicists led by Davide Terzani have now developed a new compact muon source based on LPA-generated electron beams. Such a source, if optimized, could be deployed in the field and could even produce muon beams in specific directions.
In LPA, an ultra-intense, ultra-short, and tightly focused laser pulse propagates into an “under-dense” gas. The pulse’s extremely high electric field ionizes the gas atoms, freeing the electrons from the nuclei, so generating a plasma. The ponderomotive force, or radiation pressure, of the intense laser pulse displaces these electrons and creates an electrostatic wave that produces accelerating fields orders of magnitude higher than what is possible in the traditional radio-frequency cavities used in conventional accelerators.
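The scale of those accelerating fields can be estimated from the cold wave-breaking limit, E0 = me·c·ωp/e. The plasma density used below is a typical LPA value chosen for illustration, not a figure reported by the LBNL team:

```python
import math

# Order-of-magnitude estimate of the accelerating gradient in a laser
# plasma accelerator: the cold, non-relativistic wave-breaking field
#   E0 = m_e * c * omega_p / e,  omega_p = sqrt(n_e e^2 / (eps0 m_e)).

E_CHARGE = 1.602e-19      # elementary charge, C
M_E      = 9.109e-31      # electron mass, kg
C        = 2.998e8        # speed of light, m/s
EPS0     = 8.854e-12      # vacuum permittivity, F/m

def wave_breaking_field(n_e_per_m3):
    """Cold wave-breaking field (V/m) for electron density n_e (m^-3)."""
    omega_p = math.sqrt(n_e_per_m3 * E_CHARGE**2 / (EPS0 * M_E))
    return M_E * C * omega_p / E_CHARGE

E0 = wave_breaking_field(1e24)      # n_e = 1e18 cm^-3, a typical LPA density
print(f"{E0 / 1e9:.0f} GV/m")       # 96 GV/m, vs ~0.1 GV/m for RF cavities
```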
LPAs offer the advantages of an ultra-compact electron accelerator, allowing muon production in a small facility such as BELLA, where Terzani and his colleagues work. Indeed, in their experiment, they succeeded in generating a 10 GeV electron beam in a 30 cm gas target for the first time.
The researchers collided this beam with a dense target, such as tungsten. This slows the beam down so that it emits bremsstrahlung, or braking radiation, which interacts with the material to produce secondary products that include lepton–antilepton pairs, such as electron–positron and muon–antimuon pairs. Behind the converter target, a short-lived burst of muons propagates roughly along the same axis as the incoming electron beam. Thick concrete shielding then filters out most of the secondary products while letting the majority of muons pass through.
Crucially, Terzani and colleagues were able to separate the muon signal from the large background radiation – something that can be difficult to do because of the inherent inefficiency of the muon production process. This allowed them to identify two different muon populations coming from the accelerator: a collimated, forward-directed population generated by pair production, and a low-energy, isotropic population generated by meson decay.
Many applications
Muons can be used in a range of fields, from imaging to fundamental particle physics. As mentioned, muons from cosmic rays are currently used to inspect large and thick objects not accessible to regular X-ray radiography – a recent example of this is the discovery of a hidden chamber in Khufu’s Pyramid. They can also be used to image the core of a burning blast furnace or nuclear waste storage facilities.
While the new LPA-based technique cannot yet produce muon fluxes suitable for particle physics experiments – to replace a muon injector, for example – it could offer the accelerator community a convenient way to test and develop essential elements towards making a future muon collider.
The experiment in this study, which is detailed in Physical Review Accelerators and Beams, focused on detecting the passage of muons, unequivocally proving their signature. The researchers conclude that they now have a much better understanding of the source of these muons.
Unfortunately, the original programme that funded this research has ended, so future studies are limited at the moment. Undeterred, the researchers say they strongly believe in the potential of LPA-generated muons and are working on resuming some of their experiments. For example, they aim to measure the flux and spectrum of the resulting muon beam using entirely different detection techniques, such as ultra-fast particle trackers.
The LBNL team also wants to explore different applications, such as imaging deep ore deposits – something that will be quite challenging because it poses strict limitations on the minimum muon energy required to penetrate soil. Therefore, they are looking into how to increase the muon energy of their source.
When a star rapidly accumulates gas and dust during its early growth phase, it’s called an accretion burst. Now, for the first time, astronomers have observed a planet doing the same thing. The discovery, made using the European Southern Observatory’s Very Large Telescope (VLT) and the James Webb Space Telescope (JWST), shows that the infancy of certain planetary-mass objects and that of newborn stars may share similar characteristics.
Like other rogue planets, Cha1107-7626 – a free-floating object with a mass comparable to a giant planet’s – was known to be surrounded by a disc of dust and gas. When material from this disc spirals, or accretes, onto the planet, the planet grows.
What Almendros-Abad and colleagues discovered is that this process is not uniform. Using the VLT’s XSHOOTER and the NIRSpec and MIRI instruments on JWST, they found that Cha1107-7626 experienced a burst of accretion beginning in June 2025. This is the first time anyone has seen an accretion burst in an object with such a low mass, and the peak accretion rate of six billion tonnes per second makes it the strongest accretion episode ever recorded in a planetary-mass object. It may not be over, either. At the end of August, when the observing campaign ended, the burst was still ongoing.
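To put the peak rate in context, six billion tonnes per second can be converted into more familiar units. The comparison below is our own back-of-the-envelope calculation using a standard value for the Moon’s mass; it is not taken from the study:

```python
# Converting the quoted peak accretion rate into masses per year.

RATE_KG_PER_S = 6e9 * 1e3        # six billion tonnes per second, in kg/s
SECONDS_PER_YEAR = 3.156e7
MOON_MASS_KG = 7.35e22           # standard reference value

kg_per_year = RATE_KG_PER_S * SECONDS_PER_YEAR
print(f"{kg_per_year:.1e} kg/yr")                             # 1.9e+20 kg/yr
print(f"{kg_per_year / MOON_MASS_KG:.4f} lunar masses/yr")    # ~0.0026
# Sustained for a year, the burst would deliver roughly a quarter of
# a percent of a lunar mass.
```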
An infancy similar to a star’s
The team identified several parallels between Cha1107-7626’s accretion burst and those that young stars experience. Among them were clear signs that gas is being funnelled onto the planet. “This indicates that magnetic fields structure the flow of gas, which is again something well known from stars,” explains Scholz. “Overall, our discovery is establishing interesting, perhaps surprising parallels between stars and planets, which I’m not sure we fully understand yet.”
The astronomers also found that the chemistry of the disc around the planet changed during accretion, with water being present in this phase even though it hadn’t been before. This effect has previously been spotted in stars, but never in a planet until now.
“We’re struck by quite how much the infancy of free-floating planetary-mass objects resembles that of stars like the Sun,” Jayawardhana says. “Our new findings underscore that similarity and imply that some objects comparable to giant planets form the way stars do, from contracting clouds of gas and dust accompanied by disks of their own, and they go through growth episodes just like newborn stars.”
The researchers have been studying similar objects for many years and earlier this year published results based on JWST observations that featured a small sample of planetary-mass objects. “This particular study is part of that sample,” Scholz tells Physics World, “and we obtained the present results because Victor wanted to look in detail at the accretion flow onto Cha1107-7626, and in the process discovered the burst.”
The researchers say they are “keeping an eye” on Cha1107-7626 and other such objects that are still growing because their environment is dynamic and unstable. “More to the point, we really don’t understand what drives these accretion events, and we need detailed follow-up to figure out the underlying reasons for these processes,” Scholz says.
Quantum manipulation: The squeezer – an optical parametric oscillator (OPO) that uses a nonlinear crystal inside an optical cavity to manipulate the quantum fluctuations of light – is responsible for the entanglement. (Courtesy: Jonas Schou Neergaard-Nielsen)
Physicists at the Technical University of Denmark have demonstrated what they describe as a “strong and unconditional” quantum advantage in a photonic platform for the first time. Using entangled light, they were able to reduce the number of measurements required to characterize their system by a factor of 10¹¹, with a correspondingly huge saving in time.
“We reduced the time it would take from 20 million years with a conventional scheme to 15 minutes using entanglement,” says Romain Brunel, who co-led the research together with colleagues Zheng-Hao Liu and Ulrik Lund Andersen.
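A quick sanity check shows the two quoted times correspond to a speed-up of roughly eleven orders of magnitude (the reduction factor quoted above counts measurements, so the time ratio need not match it exactly):

```python
# Sanity check: 20 million years of conventional measurements versus
# 15 minutes with entanglement.

MINUTES_PER_YEAR = 365.25 * 24 * 60

conventional_minutes = 20e6 * MINUTES_PER_YEAR
entangled_minutes = 15
speedup = conventional_minutes / entangled_minutes
print(f"time ratio ~ {speedup:.0e}")   # 7e+11, i.e. eleven-plus orders of magnitude
```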
Although the research, which is described in Science, is still at a preliminary stage, Brunel says it shows that major improvements are achievable with current photonic technologies. In his view, this makes it an important step towards practical quantum-based protocols for metrology and machine learning.
From individual to collective measurement
Quantum devices are hard to isolate from their environment and extremely sensitive to external perturbations. That makes it a challenge to learn about their behaviour.
To get around this problem, researchers have tried various “quantum learning” strategies that replace individual measurements with collective, algorithmic ones. These strategies have already been shown to reduce the number of measurements required to characterize certain quantum systems, such as superconducting electronic platforms containing tens of quantum bits (qubits), by as much as a factor of 10⁵.
A photonic platform
In the new study, Brunel, Liu, Andersen and colleagues obtained a quantum advantage in an alternative “continuous-variable” photonic platform. The researchers note that such platforms are far easier to scale up than superconducting qubits, which they say makes them a more natural architecture for quantum information processing. Indeed, photonic platforms have already been crucial to advances in boson sampling, quantum communication, computation and sensing.
The team’s experiment works with conventional, “imperfect” optical components and consists of a channel containing multiple light pulses that share the same pattern, or signature, of noise. The researchers began by performing a procedure known as quantum squeezing on two beams of light in their system. This caused the beams to become entangled – a quantum phenomenon that creates such a strong linkage that measuring the properties of one instantly affects the properties of the other.
The team then measured the properties of one of the beams (the “probe” beam) in an experiment known as a 100-mode bosonic displacement process. According to Brunel, one can imagine this experiment as being like tweaking the properties of 100 independent light modes, which are packets or beams of light. “A ‘bosonic displacement process’ means you slightly shift the amplitude and phase of each mode, like nudging each one’s brightness and timing,” he explains. “So, you then have 100 separate light modes, and each one is shifted in phase space according to a specific rule or pattern.”
By comparing the probe beam to the second (“reference”) beam in a single joint measurement, Brunel explains that he and his colleagues were able to cancel out much of the uncertainty in these measurements. This meant they could extract more information per trial than they could have by characterizing the probe beam alone. This information boost, in turn, allowed them to significantly reduce the number of measurements – in this case, by a factor of 10¹¹.
While the DTU researchers acknowledge that they have not yet studied a practical, real-world system, they emphasize that their platform is capable of “doing something that no classical system will ever be able to do”, which is the definition of a quantum advantage. “Our next step will therefore be to study a more practical system in which we can demonstrate a quantum advantage,” Brunel tells Physics World.
Future versions of the Laser Interferometer Gravitational Wave Observatory (LIGO) will be able to run at much higher laser powers thanks to a sophisticated new system that compensates for temperature changes in optical components. Known as FROSTI (for FROnt Surface Type Irradiator) and developed by physicists at the University of California Riverside, US, the system will enable next-generation machines to detect gravitational waves emitted when the universe was just 0.1% of its current age, before the first stars had even formed.
Gravitational waves are distortions in spacetime that occur when massive astronomical objects accelerate and collide. When these distortions pass through the four-kilometre-long arms of the two LIGO detectors, they create a tiny difference in the (otherwise identical) distance that light travels between the centre of the observatory and the mirrors located at the end of each arm. The problem is that detecting and studying gravitational waves requires these differences in distance to be measured with an accuracy of 10⁻¹⁹ m, which is 1/10,000th the size of a proton.
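That comparison is easy to check. The calculation below uses the standard ~0.84 fm proton charge radius (taking the proton’s ~1.7 fm diameter instead gives a number closer to the one-part-in-ten-thousand figure above):

```python
# Comparing LIGO's displacement sensitivity with the size of a proton.

LIGO_SENSITIVITY_M = 1e-19
PROTON_RADIUS_M = 0.84e-15     # standard proton charge radius

ratio = LIGO_SENSITIVITY_M / PROTON_RADIUS_M
print(f"1/{1 / ratio:.0f} of a proton radius")   # 1/8400
```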
Observing waves at lower and higher frequencies in the gravitational wave spectrum remains challenging, however. At lower frequencies (around 10–30 Hz), the problem stems from vibrational noise in the mirrors. Although these mirrors are hefty objects – each one measures 34 cm across, is 20 cm thick and has a mass of around 40 kg – the incredible precision required to detect gravitational waves at these frequencies means that even the minute amount of energy they absorb from the laser beam is enough to knock them out of whack.
At higher frequencies (150–2000 Hz), measurements are instead limited by quantum shot noise. This is caused by the random arrival time of photons at LIGO’s output photodetectors and is a fundamental consequence of the fact that the laser field is quantized.
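The link between laser power and shot noise is statistical: with N photons arriving independently, relative power fluctuations scale as 1/√N, so more circulating power means a lower noise floor. The powers and the 1064 nm photon energy below are illustrative assumptions, not LIGO specifications:

```python
import math

# Shot noise falls as 1/sqrt(N) with the number of detected photons.

PHOTON_ENERGY_J = 1.87e-19     # energy of a ~1064 nm photon (assumed wavelength)

def relative_shot_noise(power_watts, integration_s=1.0):
    """Fractional power fluctuation for shot-noise-limited detection."""
    n_photons = power_watts * integration_s / PHOTON_ENERGY_J
    return 1.0 / math.sqrt(n_photons)

# Illustrative powers: a sub-megawatt level versus a >1 MW target.
for power in (0.2e6, 1.0e6):
    print(f"{power / 1e6:.1f} MW -> {relative_shot_noise(power):.2e}")
# Five times the power cuts the shot-noise floor by a factor of sqrt(5) ~ 2.2.
```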
A novel adaptive optics device
Jonathan Richardson, the physicist who led this latest study, explains that FROSTI is designed to reduce quantum shot noise by allowing the mirrors to cope with much higher levels of laser power. At its heart is a novel adaptive optics device that is designed to precisely reshape the surfaces of LIGO’s main mirrors under laser powers exceeding 1 megawatt (MW), which is nearly five times the power used at LIGO today.
Though its name implies cooling, FROSTI actually uses heat to restore the mirror’s surface to its original shape. It does this by projecting infrared radiation onto test masses in the interferometer to create a custom heat pattern that “smooths out” distortions and so allows for fine-tuned, higher-order corrections.
The single most challenging aspect of FROSTI’s design, and one that Richardson says shaped its entire concept, is the requirement that it cannot introduce even more noise into the LIGO interferometer. “To meet this stringent requirement, we had to use the most intensity-stable radiation source available – that is, an internal blackbody emitter with a long thermal time constant,” he tells Physics World. “Our task, from there, was to develop new non-imaging optics capable of reshaping the blackbody thermal radiation into a complex spatial profile, similar to one that could be created with a laser beam.”
Richardson anticipates that FROSTI will be a critical component for future LIGO upgrades – upgrades that will themselves serve as blueprints for even more sensitive next-generation observatories like the proposed Cosmic Explorer in the US and the Einstein Telescope in Europe. “The current prototype has been tested on a 40-kg LIGO mirror, but the technology is scalable and will eventually be adapted to the 440-kg mirrors envisioned for Cosmic Explorer,” he says.
Jan Harms, a physicist at Italy’s Gran Sasso Science Institute who was not involved in this work, describes FROSTI as “an ingenious concept to apply higher-order corrections to the mirror profile.” Though it still needs to pass the final test of being integrated into the actual LIGO detectors, Harms notes that “the results from the prototype are very promising”.
Richardson and colleagues are continuing to develop extensions to their technology, building on the successful demonstration of their first prototype. “In the future, beyond the next upgrade of LIGO (A+), the FROSTI radiation will need to be shaped into an even more complex spatial profile to enable the highest levels of laser power (1.5 MW) ultimately targeted,” explains Richardson. “We believe this can be achieved by nesting two or more FROSTI actuators together in a single composite, with each targeting a different radial zone of the test mass surfaces. This will allow us to generate extremely finely-matched optical wavefront corrections.”