
Fingerprint method can detect objects hidden in complex scattering media

27 October 2025 at 15:00
Imaging buried objects Left: artistic impression of metal spheres buried in small glass beads; centre: conventional ultrasound image; right: the new technology can precisely determine the positions of the metal spheres. (Courtesy: TU Wien/Arthur Le Ber)

Physicists have developed a novel imaging technique for detecting and characterizing objects hidden within opaque, highly scattering material. The researchers, from France and Austria, showed that their new mathematical approach, which utilizes the fact that hidden objects generate their own complex scattering pattern, or “fingerprint”, can work on biological tissue.

Viewing the inside of the human body is challenging due to the scattering nature of tissue. With ultrasound, when waves propagate through tissue they are reflected, bounce around and scatter chaotically, creating noise that obscures the signal from the object that the medical practitioner is trying to see. The further you delve into the body the more incoherent the image becomes.

There are techniques for overcoming these issues, but as scattering increases – in more complex media or as you push deeper through tissue – they struggle and unpicking the required signal becomes too complex.

The scientists behind the latest research, from the Institut Langevin in Paris, France and TU Wien in Vienna, Austria, say that rather than compensating for scattering, their technique instead relies on detecting signals from the hidden object in the disorder.

Objects buried in a material create their own complex scattering pattern, and the researchers found that if you know an object’s specific acoustic signal it’s possible to find it in the noise created by the surrounding environment.

“We cannot see the object, but the backscattered ultrasonic wave that hits the microphones of the measuring device still carries information about the fact that it has come into contact with the object we are looking for,” explains Stefan Rotter, a theoretical physicist at TU Wien.

Rotter and his colleagues examined how a series of objects scattered ultrasound waves in an interference-free environment. This created what they refer to as fingerprint matrices: measurements of the specific, characteristic way in which each object scattered the waves.

The team then developed a mathematical method that allowed them to calculate the position of each object when hidden in a scattering medium, based on its fingerprint matrix.

“From the correlations between the measured reflected wave and the unaltered fingerprint matrix, it is possible to deduce where the object is most likely to be located, even if the object is buried,” explains Rotter.
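
In outline, the localization step compares the measured reflection data against each object's pre-recorded fingerprint and picks the position where the two agree best. The sketch below is only a schematic illustration of that idea, not the authors' published algorithm: the normalized matrix correlation, the candidate-position grid and all variable names are assumptions made for the example.

```python
import numpy as np

def correlation(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized correlation between two matrices."""
    a, b = a.ravel(), b.ravel()
    return abs(np.vdot(a, b)) / (np.linalg.norm(a) * np.linalg.norm(b))

def locate_object(measured: np.ndarray, fingerprints: dict) -> tuple:
    """Return the candidate position whose fingerprint matrix best
    matches the measured (noisy) reflection data."""
    scores = {pos: correlation(measured, f) for pos, f in fingerprints.items()}
    return max(scores, key=scores.get)

# Toy demonstration with random stand-in data
rng = np.random.default_rng(0)
n = 32  # number of transducer elements
fingerprints = {(x, 10): rng.standard_normal((n, n)) for x in range(5)}
true_position = (3, 10)
measured = fingerprints[true_position] + 5.0 * rng.standard_normal((n, n))
print(locate_object(measured, fingerprints))  # best-matching position
```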

The team tested the technique in three different scenarios. The first experiment trialled the ultrasound imaging of metal spheres in a dense suspension of glass beads in water. Conventional ultrasound failed in this setup and the spheres were completely invisible, but with their novel fingerprint method the researchers were able to accurately detect them.

Next, to examine a medical application for the technique, the researchers embedded lesion markers often used to monitor breast tumours in a foam designed to mimic the ultrasound scattering of soft tissue. These markers can be challenging to detect due to scatterers randomly distributed in human tissue. With the fingerprint matrix, however, the researchers say that the markers were easy to locate.

Finally, the team successfully mapped muscle fibres in a human calf using the technique. They claim this could be useful for diagnosing and monitoring neuromuscular diseases.

According to Rotter and his colleagues, their fingerprint matrix method is a versatile and universal technique that could be applied beyond ultrasound to all fields of wave physics. They highlight radar and sonar as examples of sensing techniques where target identification and detection in noisy environments are long-standing challenges.

“The concept of the fingerprint matrix is very generally applicable – not only for ultrasound, but also for detection with light,” Rotter says. “It opens up important new possibilities in all areas of science where a reflection matrix can be measured.”

The researchers report their findings in Nature Physics.

A SMART approach to treating lung cancers in challenging locations

24 October 2025 at 14:00

Radiation treatment for patients with lung cancer represents a balancing act, particularly if malignant lesions are centrally located near to critical structures. The radiation may destroy the tumour, but vital organs may be seriously damaged as well.

The standard treatment for non-small cell lung cancer (NSCLC) is stereotactic ablative body radiotherapy (SABR), which delivers intense radiation doses in just a few treatment sessions and achieves excellent local control. For ultracentral lung lesions, however – defined as having a planning target volume (PTV) that abuts or overlaps the proximal bronchial tree, oesophagus or pulmonary vessels – the high risk of severe radiation toxicity makes SABR highly challenging.

A research team at GenesisCare UK, an independent cancer care provider operating nine treatment centres in the UK, has now demonstrated that stereotactic MR-guided adaptive radiotherapy (SMART)-based SABR may be a safer and more effective option for treating ultracentral metastatic lesions in patients with histologically confirmed NSCLC. They report their findings in Advances in Radiation Oncology.

SMART uses diagnostic-quality MR scans to provide real-time imaging, 3D multiplanar soft-tissue tracking and automated beam control of an advanced linear accelerator. The idea is to use daily online volume adaptation and plan re-optimization to account for any changes in tumour size and position relative to organs-at-risk (OAR). Real-time imaging enables treatment in breath-hold with gated beam delivery (automatically pausing delivery if the target moves outside a defined boundary), eliminating the need for an internal target volume and enabling smaller PTV margins.
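
The gated delivery described here can be pictured as a simple rule: the beam stays on only while the MR-tracked target remains inside a predefined gating boundary around its planned position. The sketch below is purely conceptual and is not vendor software; the spherical boundary and the 3 mm tolerance are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class GatingBoundary:
    """Spherical gating envelope around the planned target position (mm)."""
    centre: tuple[float, float, float]
    radius_mm: float

    def contains(self, position: tuple[float, float, float]) -> bool:
        dx, dy, dz = (p - c for p, c in zip(position, self.centre))
        return (dx * dx + dy * dy + dz * dz) ** 0.5 <= self.radius_mm

def beam_enabled(tracked_position: tuple[float, float, float],
                 boundary: GatingBoundary) -> bool:
    """Hold the beam on only while the tracked target sits inside the
    gating boundary; pause delivery otherwise."""
    return boundary.contains(tracked_position)

# Illustrative check: the target drifts 4 mm from its planned position
boundary = GatingBoundary(centre=(0.0, 0.0, 0.0), radius_mm=3.0)
print(beam_enabled((4.0, 0.0, 0.0), boundary))  # False -> delivery paused
```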

The approach offers potential to enhance treatment precision and target coverage while improving sparing of adjacent organs compared with conventional SABR, first author Elena Moreno-Olmedo and colleagues contend.

A safer treatment option

The team conducted a study to assess the incidence of SABR-related toxicities in patients with histologically confirmed NSCLC undergoing SMART-based SABR. The study included 11 patients with 18 ultracentral lesions, the majority of whom had oligometastatic or oligoprogressive disease.

Patients received five to eight treatment fractions, to a median dose of 40 Gy (ranging from 30 to 60 Gy). The researchers generated fixed-field SABR plans with dosimetric aims including a PTV V100% (the volume receiving at least 100% of the prescription dose) of 95% or above, a PTV V95% of 98% or above and a maximum dose of between 110% and 140%. PTV coverage was compromised where necessary to meet OAR constraints, with a minimum PTV V100% of at least 70%.
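
These dose–volume aims (and the similar SUNSET plan metrics discussed later) are standard statistics computed over the dose values inside the PTV. The sketch below shows how such metrics are typically evaluated; the synthetic dose array and prescription are illustrative, not data from the study.

```python
import numpy as np

def v_percent(ptv_dose: np.ndarray, prescription: float, level: float) -> float:
    """V_level: percentage of the PTV receiving at least `level`% of the
    prescription dose (e.g. V100% with level=100)."""
    return float(np.mean(ptv_dose >= prescription * level / 100.0) * 100.0)

def d_percent(ptv_dose: np.ndarray, volume_percent: float) -> float:
    """D_x: dose that at least x% of the PTV receives (e.g. D95)."""
    return float(np.percentile(ptv_dose, 100.0 - volume_percent))

# Illustrative example: 40 Gy prescription with synthetic PTV dose values
prescription = 40.0
ptv_dose = np.random.default_rng(1).normal(42.0, 2.0, size=10_000)

print(f"V100% = {v_percent(ptv_dose, prescription, 100):.1f}%")  # aim: >= 95%
print(f"V95%  = {v_percent(ptv_dose, prescription, 95):.1f}%")   # aim: >= 98%
print(f"Dmax  = {100 * ptv_dose.max() / prescription:.0f}% of prescription")
print(f"D95   = {d_percent(ptv_dose, 95):.1f} Gy")
```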

SABR was performed using a 6 MV 0.35 T MRIdian linac with gated delivery during repeated breath-holds, under continuous MR guidance. Based on daily MRI scans, online plan adaptation was performed for all of the 78 delivered fractions.

The researchers report that both the PTV volume and PTV overlap with ultracentral OARs were reduced in SMART treatments compared with conventional SABR. The median SMART PTV was 10.1 cc, compared with 30.4 cc for the simulated SABR PTV, while the median PTV overlap with OARs was 0.85 cc for SMART (8.4% of the PTV) and 4.7 cc for conventional SABR.

In terms of treatment-related side effects for SMART, the rates of acute and late grade 1–2 toxicities were 54% and 18%, respectively, with no grade 3–5 toxicities observed. This demonstrates the technique’s increased safety compared with non-adaptive SABR treatments, which have exhibited high rates of severe toxicity, including treatment-related deaths, in ultracentral tumours.

Two-thirds of patients were alive at the median follow-up point of 28 months, and 93% were free from local progression at 12 months. The median progression-free survival was 5.8 months and median overall survival was 20 months.

Acknowledging the short follow-up time frame, the researchers note that additional late toxicities may occur. However, they are hopeful that SMART will be considered as a favourable treatment option for patients with ultracentral NSCLC lesions.

“Our analysis demonstrates that hypofractionated SMART with daily online adaptation for ultracentral NSCLC achieved comparable local control to conventional non-adaptive SABR, with a safer toxicity profile,” they write. “These findings support the consideration of SMART as a safer and effective treatment option for this challenging subgroup of thoracic tumours.”

The SUNSET trial

SMART-based SABR remains an emerging cancer treatment that is not yet available in many cancer treatment centres. Despite the high risk it poses for patients with ultracentral tumours, SABR is the standard treatment for inoperable NSCLC.

The phase 1 clinical trial, Stereotactic radiation therapy for ultracentral NSCLC: a safety and efficacy trial (SUNSET), assessed the use of SBRT for ultracentral tumours in 30 patients with early-stage NSCLC treated at five Canadian cancer centres. In all cases, the PTVs touched or overlapped the proximal bronchial tree, the pulmonary artery, the pulmonary vein or the oesophagus. Led by Meredith Giuliani of the Princess Margaret Cancer Centre, the trial aimed to determine the maximum tolerated radiation dose associated with a less than 30% rate of grade 3–5 toxicity within two years of treatment.

All patients received 60 Gy in eight fractions. Dose was prescribed to deliver a PTV V100% of 95%, a PTV V90% of 99% and a maximum dose of no more than 120% of the prescription dose, with OAR constraints prioritized over PTV coverage. All patients had daily cone-beam CT imaging to verify tumour position before treatment.

At a median follow-up of 37 months, two patients (6.7%) experienced dose-limiting grade 3–5 toxicities – an adverse event rate within the prespecified acceptability criteria. The three-year overall survival was 72.5% and the three-year progression-free survival was 66.1%.

In a subsequent dosimetric analysis, the researchers report that they did not identify any relationship between OAR dose and toxicity, within the dose constraints used in the SUNSET trial. They note that 73% of patients could be treated without compromise of the PTV, and where compromise was needed, the mean PTV D95 (the minimum dose delivered to 95% of the PTV) remained high at 52.3 Gy.

As expected, plans that overlapped with central OARs were associated with worse local control, but PTV undercoverage was not. “[These findings suggest] that the approach of reducing PTV coverage to meet OAR constraints does not appear to compromise local control, and that acceptable toxicity rates are achievable using 60 Gy in eight fractions,” the team writes. “In the future, use of MRI or online adaptive SBRT may allow for safer treatment delivery by limiting dose variation with anatomic changes.”

Spiral catheter optimizes drug delivery to the brain

24 October 2025 at 10:00

Researchers in the United Arab Emirates have designed a new catheter that can deliver drugs to entire regions of the brain. Developed by Batoul Khlaifat and colleagues at New York University Abu Dhabi, the catheter’s helical structure and multiple outflow ports could make it both safer and more effective for treating a wide range of neurological disorders.

Modern treatments for brain-related conditions including Parkinson’s disease, epilepsy, and tumours often involve implanting microfluidic catheters that deliver controlled doses of drug-infused fluids to highly localized regions of the brain. Today, these implants are made from highly flexible materials that closely mimic the soft tissue of the brain. This makes them far less invasive than previous designs.

However, there is still much room for improvement, as Khlaifat explains. “Catheter design and function have long been limited by the neuroinflammatory response after implantation, as well as the unequal drug distribution across the catheter’s outlets,” she says.

A key challenge with this approach is that each of the brain’s distinct regions has a highly irregular shape, which makes it incredibly difficult to target with a single drug dose. Instead, doses must be delivered either through repeated insertions from a single port at the end of a catheter, or through single insertions across multiple co-implanted catheters. Either way, the approach is highly invasive and runs the risk of further trauma to the brain.

Multiple ports

In their study, Khlaifat’s team traced many of these problems to existing catheter designs, which tend to be simple tubes with a single input port at one end and a single output port at the other. Using fluid dynamics simulations, they started by investigating how drug outflow would change when multiple output ports are positioned along the length of the catheter.

To ensure this outflow is delivered evenly, they carefully adjusted the diameter of each port to account for the change in fluid pressure along the catheter’s length – so that four evenly spaced ports could each deliver roughly one quarter of the total flow. Building on this innovation, the researchers then explored how the shape of the catheter itself could be adjusted to optimize delivery even further.
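
One simple way to picture this port-sizing step is as a Hagen–Poiseuille resistance network: viscous losses along the lumen change the pressure available at each port, so the port openings must be sized differently if every port is to pass the same flow. The sketch below is an illustrative model with made-up dimensions and fluid properties; it is not the team’s actual computational framework.

```python
import math

MU = 1.0e-3        # fluid viscosity in Pa*s (water-like; illustrative)
N_PORTS = 4        # number of evenly spaced outlet ports
Q_TOTAL = 1.0e-9   # total infusion rate in m^3/s (~1 uL/s; illustrative)
SEG_LEN = 2.0e-3   # lumen length between neighbouring ports, m
LUMEN_D = 0.3e-3   # inner lumen diameter, m
PORT_LEN = 0.1e-3  # wall thickness each port passes through, m
DISTAL_D = 60e-6   # chosen diameter of the most distal port, m

def hydraulic_resistance(length: float, diameter: float) -> float:
    """Hagen-Poiseuille resistance of a cylindrical channel."""
    return 128.0 * MU * length / (math.pi * diameter ** 4)

target_flow = Q_TOTAL / N_PORTS                 # equal flow through every port
r_segment = hydraulic_resistance(SEG_LEN, LUMEN_D)

# Work upstream from the distal port: the lumen pressure at each node must
# drive the same target flow through that node's port.
pressures = [target_flow * hydraulic_resistance(PORT_LEN, DISTAL_D)]
diameters = [DISTAL_D]
for m in range(1, N_PORTS):                     # segments counted from the tip
    q_segment = m * target_flow                 # feeds the m most distal ports
    p_node = pressures[-1] + q_segment * r_segment
    r_port = p_node / target_flow               # resistance this port needs
    d_port = (128.0 * MU * PORT_LEN / (math.pi * r_port)) ** 0.25
    pressures.append(p_node)
    diameters.append(d_port)

# Proximal ports see higher lumen pressure, so they come out slightly smaller
for k, d in enumerate(reversed(diameters), start=1):
    print(f"port {k} (proximal -> distal): diameter = {d * 1e6:.1f} um")
```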

“We varied the catheter design from a straight catheter to a helix of the same small diameter, allowing for a larger area of drug distribution in the target implantation region with minimal invasiveness,” explains team member Khalil Ramadi. “This helical shape also allows us to resist buckling on insertion, which is a major problem for miniaturized straight catheters.”

Helical catheter

Based on their simulations, the team fabricated a helical catheter they call Strategic Precision Infusion for Regional Administration of Liquid, or SPIRAL. In a first set of experiments under controlled lab conditions, they verified their prediction of even outflow rates across the catheter’s outlets.

“Our helical device was also tested in mouse models alongside its straight counterpart to study its neuroinflammatory response,” Khlaifat says. “There were no significant differences between the two designs.”

Having validated the safety of their approach, the researchers are now hopeful that SPIRAL could pave the way for new and improved methods for targeted drug delivery within the brain. With the ability to target entire regions of the brain with smaller, more controlled doses, this future generation of implanted catheters could ultimately prove to be both safer and more effective than existing designs.

“These catheters could be optimized for each patient through our computational framework to ensure only regions that require dosing are exposed to therapy, all through a single insertion point in the skull,” describes team member Mahmoud Elbeh. “This tailored approach could improve therapies for brain disorders such as epilepsy and glioblastomas.”

The research is described in the Journal of Neural Engineering.

Performance metrics and benchmarks point the way to practical quantum advantage

23 October 2025 at 17:35
Quantum connections Measurement scientists are seeking to understand and quantify the relative performance of quantum computers from different manufacturers as well as across the myriad platform technologies. (Courtesy: iStock/Bartlomiej Wroblewski)

From quantum utility today to quantum advantage tomorrow: incumbent technology companies – among them Google, Amazon, IBM and Microsoft – and a wave of ambitious start-ups are on a mission to transform quantum computing from applied research endeavour to mainstream commercial opportunity. The end-game: quantum computers that can be deployed at scale to perform computations significantly faster than classical machines while addressing scientific, industrial and commercial problems beyond the reach of today’s high-performance computing systems.

Meanwhile, as technology translation gathers pace across the quantum supply chain, government laboratories and academic scientists must maintain their focus on the “hard yards” of precompetitive research. That means prioritizing foundational quantum hardware and software technologies, underpinned by theoretical understanding, experimental systems, device design and fabrication – and pushing out along all these R&D pathways simultaneously.

Bringing order to disorder

Equally important is the requirement to understand and quantify the relative performance of quantum computers from different manufacturers as well as across the myriad platform technologies – among them superconducting circuits, trapped ions, neutral atoms as well as photonic and semiconductor processors. A case study in this regard is a broad-scope UK research collaboration that, for the past four years, has been reviewing, collecting and organizing a holistic taxonomy of metrics and benchmarks to evaluate the performance of quantum computers against their classical counterparts as well as the relative performance of competing quantum platforms.

Funded by the National Quantum Computing Centre (NQCC), which is part of the UK National Quantum Technologies Programme (NQTP), and led by scientists at the National Physical Laboratory (NPL), the UK’s National Metrology Institute, the cross-disciplinary consortium has taken on an endeavour that is as sprawling as it is complex. The challenge lies in the diversity of quantum hardware platforms in the mix; also the emergence of two different approaches to quantum computing – one being a gate-based framework for universal quantum computation, the other an analogue approach tailored to outperforming classical computers on specific tasks.

“Given the ambition of this undertaking, we tapped into a deep pool of specialist domain knowledge and expertise provided by university colleagues at Edinburgh, Durham, Warwick and several other centres-of-excellence in quantum,” explains Ivan Rungger, a principal scientist at NPL, professor in computer science at Royal Holloway, University of London, and lead scientist on the quantum benchmarking project. That core group consulted widely within the research community and with quantum technology companies across the nascent supply chain. “The resulting study,” adds Rungger, “positions transparent and objective benchmarking as a critical enabler for trust, comparability and commercial adoption of quantum technologies, aligning closely with NPL’s mission in quantum metrology and standards.”

Not all metrics are equal – or mature

Made to measure NPL’s Institute for Quantum Standards and Technology (above) is the UK’s national metrology institute for quantum science. (Courtesy: NPL)

For context, a number of performance metrics used to benchmark classical computers can also be applied directly to quantum computers, such as the speed of operations and the number of processing units, as well as the probability of errors occurring in the computation. That only goes so far, though, with all manner of dedicated metrics emerging in the past decade to benchmark the performance of quantum computers – ranging from their individual hardware components to entire applications.

Complexity reigns, it seems, and navigating the extensive literature can prove overwhelming, while the levels of maturity of different metrics vary significantly. Objective comparisons aren’t straightforward either – not least because variations of the same metric are commonly deployed, and because the data disclosed together with a reported metric value are often not sufficient to reproduce the results.

“Many of the approaches provide similar overall qualitative performance values,” Rungger notes, “but the divergence in the technical implementation makes quantitative comparisons difficult and, by extension, slows progress of the field towards quantum advantage.”

The task, then, is to rationalize the metrics used to evaluate the performance of a given quantum hardware platform into a minimal yet representative set agreed across manufacturers, algorithm developers and end-users. These benchmarks also need to follow agreed common approaches so that quantum computers from different equipment vendors can be evaluated fairly and objectively.

With these objectives in mind, Rungger and colleagues conducted a deep-dive review that has yielded a comprehensive collection of metrics and benchmarks to allow holistic comparisons of quantum computers, assessing the quality of hardware components all the way to system-level performance and application-level metrics.

Drill down further and there’s a consistent format for each metric that includes its definition, a description of the methodology, the main assumptions and limitations, and a linked open-source software package implementing the methodology. The software transparently demonstrates the methodology and can also be used in practical, reproducible evaluations of all metrics.
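
That per-metric format maps naturally onto a small machine-readable record. The sketch below is a hypothetical schema written for illustration; the field names and the example entry are assumptions, not the consortium’s actual data model or repository content.

```python
from dataclasses import dataclass, field

@dataclass
class MetricRecord:
    """One entry in a benchmark collection: definition, methodology,
    assumptions and limitations, plus a link to reference software."""
    name: str
    definition: str
    methodology: str
    assumptions: list[str] = field(default_factory=list)
    limitations: list[str] = field(default_factory=list)
    software_url: str = ""

# Hypothetical example entry (content chosen for illustration only)
example = MetricRecord(
    name="two-qubit gate error",
    definition="Average error per two-qubit gate on a given device.",
    methodology="Estimated from randomized benchmarking sequences.",
    assumptions=["errors are approximately independent of the gate sequence"],
    limitations=["does not capture crosstalk with spectator qubits"],
    software_url="https://example.org/benchmark-suite",
)
print(f"{example.name}: {example.definition}")
```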

“As research on metrics and benchmarks progresses, our collection of metrics and the associated software for performance evaluation are expected to evolve,” says Rungger. “Ultimately, the repository we have put together will provide a ‘living’ online resource, updated at regular intervals to account for community-driven developments in the field.”

From benchmarking to standards

Innovation being what it is, those developments are well under way. For starters, the importance of objective and relevant performance benchmarks for quantum computers has led several international standards bodies to initiate work on specific areas that are ready for standardization – work that, in turn, will give manufacturers, end-users and investors an informed evaluation of the performance of a range of quantum computing components, subsystems and full-stack platforms.

What’s evident is that the UK’s voice on metrics and benchmarking is already informing the collective conversation around standards development. “The quantum computing community and international standardization bodies are adopting a number of concepts from our approach to benchmarking standards,” notes Deep Lall, a quantum scientist in Rungger’s team at NPL and lead author of the study. “I was invited to present our work to a number of international standardization meetings and scientific workshops, opening up widespread international engagement with our research and discussions with colleagues across the benchmarking community.”

He continues: “We want the UK effort on benchmarking and metrics to shape the broader international effort. The hope is that the collection of metrics we have pulled together, along with the associated open-source software provided to evaluate them, will guide the development of standardized benchmarks for quantum computers and speed up the progress of the field towards practical quantum advantage.”

That’s a view echoed – and amplified – by Cyrus Larijani, NPL’s head of quantum programme. “As we move into the next phase of NPL’s quantum strategy, the importance of evidence-based decision making becomes ever-more critical,” he concludes. “By grounding our strategic choices in robust measurement science and real-world data, we ensure that our innovations not only push the boundaries of quantum technology but also deliver meaningful impact across industry and society.”

Further reading

Deep Lall et al. 2025 A review and collection of metrics and benchmarks for quantum computers: definitions, methodologies and software https://arxiv.org/abs/2502.06717

The headline take from NQCC

Quantum computing technology has reached the stage where a number of methods for performance characterization are backed by a large body of real-world implementation and use, as well as by theoretical proofs. These mature benchmarking methods will benefit from commonly agreed-upon approaches that are the only way to fairly, unambiguously and objectively benchmark quantum computers from different manufacturers.

“Performance benchmarks are a fundamental enabler of technology innovation in quantum computing,” explains Konstantinos Georgopoulos, who heads up the NQCC’s quantum applications team and is responsible for the centre’s liaison with the NPL benchmarking consortium. “How do we understand performance? How do we compare capabilities? And, of course, what are the metrics that help us to do that? These are the leading questions we addressed through the course of this study.”

If the importance of benchmarking is a given, so too is collaboration and the need to bring research and industry stakeholders together from across the quantum ecosystem. “I think that’s what we achieved here,” says Georgopoulos. “The long list of institutions and experts who contributed their perspectives on quantum computing was crucial to the success of this project. What we’ve ended up with are better metrics, better benchmarks, and a better collective understanding to push forward with technology translation that aligns with end-user requirements across diverse industry settings.”

End note: NPL retains copyright on this article.

Master’s programme takes microelectronics in new directions

23 October 2025 at 10:28
Professor Zhao Jiong, who leads a Master’s programme in microelectronics technology and materials, has been recognized for his pioneering research in 2D ferroelectrics (Courtesy: PolyU)

The microelectronics sector is known for its relentless drive for innovation, continually delivering performance and efficiency gains within ever more compact form factors. Anyone aspiring to build a career in this fast-moving field needs not just a thorough grounding in current tools and techniques, but also an understanding of the next-generation materials and structures that will propel future progress.

That’s the premise behind a Master’s programme in microelectronics technology and materials at the Hong Kong Polytechnic University (PolyU). Delivered by the Department of Applied Physics – globally recognized for its pioneering research in technologies such as two-dimensional materials, nanoelectronics and artificial intelligence – the programme aims to provide students with both the fundamental knowledge and practical skills they need to kickstart their professional future, whether they choose to pursue further research or to find a job in industry.

“The programme provides students with all the key skills they need to work in microelectronics, such as circuit design, materials processing and failure analysis,” says programme leader Professor Zhao Jiong, whose research focuses on 2D ferroelectrics. “But they also have direct access to more than 20 faculty members who are actively investigating novel materials and structures that go beyond silicon-based technologies.”

The course is also unusual in providing a combined focus on electronics engineering and materials science, giving students a thorough understanding of the underlying semiconductors and device structures as well as their use in mass-produced integrated circuits. That fundamental knowledge is reinforced through regular experimental work, which provides hands-on experience of fabricating and testing electronic devices. “Our cleanroom laboratory is equipped with many different instruments for microfabrication, including thin-film deposition, etching and photolithography, as well as advanced characterization tools for understanding their operating mechanisms and evaluating their performance,” adds Zhao.

In a module focusing on thin-film materials, for example, students gain valuable experience from practical sessions that enable them to operate the equipment for different growth techniques, such as sputtering, molecular beam epitaxy, and both physical and chemical vapour deposition. In another module on materials analysis and characterization, the students are tasked with analysing the layered structure of a standard computer chip by making cross-sections that can be studied with a scanning electron microscope.

During the programme students have access to a cleanroom laboratory that gives them hands-on experience of using advanced tools for fabricating and characterizing electronic materials and structures (Courtesy: PolyU)

That practical experience extends to circuit design, with students learning how to use state-of-the-art software tools for configuring, simulating and analysing complex electronic layouts. “Through this experimental work students gain the technical skills they need to design and fabricate integrated circuits, and to optimize their performance and reliability through techniques like failure analysis,” says Professor Dai Jiyan, PolyU Associate Dean of Students, who also teaches the module on thin-film materials. “This hands-on experience helps to prepare them for working in a manufacturing facility or for continuing their studies at the PhD level.”

Also integrated into the teaching programme is the use of artificial intelligence to assist key tasks, such as defect analysis, materials selection and image processing. Indeed, PolyU has established a joint laboratory with Huawei to investigate possible applications of AI tools in electronic design, providing the students with early exposure to emerging computational methods that are likely to shape the future of the microelectronics industry. “One of our key characteristics is that we embed AI into our teaching and laboratory work,” says Dai. “Two of the modules are directly related to AI, while the joint lab with Huawei helps students to experiment with using AI in circuit design.”

Now in its third year, the Master’s programme was designed in collaboration with Hong Kong’s Applied Science and Technology Research Institute (ASTRI), established in 2000 to enhance the competitiveness of the region through the use of advanced technologies. Researchers at PolyU already pursue joint projects with ASTRI in areas like chip design, microfabrication and failure analysis. As part of the programme, these collaborators are often invited to give guest lectures or to guide the laboratory work. “Sometimes they even provide some specialized instruments for the students to use in their experiments,” says Zhao. “We really benefit from this collaboration.”

Once primed with the knowledge and experience from the taught modules, the students have the opportunity to work alongside one of the faculty members on a short research project. They can choose whether to focus on a topic that is relevant to present-day manufacturing, such as materials processing or advanced packaging technologies, or to explore the potential of emerging materials and devices across applications ranging from solar cells and microfluidics to next-generation memories and neuromorphic computing.

“It’s very interesting for the students to get involved in these projects,” says Zhao. “They learn more about the research process, which can make them more confident to take their studies to the next level. All of our faculty members are engaged in important work, and we can guide the students towards a future research field if that’s what they are interested in.”

There are also plenty of progression opportunities for those who are more interested in pursuing a career in industry. As well as providing support and advice through its joint lab in AI, Huawei arranges visits to its manufacturing facilities and offers some internships to interested students. PolyU also organizes visits to Hong Kong’s Science Park, home to multinational companies such as Infineon as well as a large number of start-up companies in the microelectronics sector. Some of these might support a student’s research project, or offer an internship in areas such as circuit design or microfabrication.

The international outlook offered by PolyU has made the Master’s programme particularly appealing to students from mainland China, but Zhao and Dai believe that the forward-looking ethos of the course should make it an appealing option for graduates across Asia and beyond. “Through the programme, the students gain knowledge about all aspects of the microelectronics industry, and how it is likely to evolve in the future,” says Dai. “The knowledge and technical skills gained by the students offer them a competitive edge for building their future career, whether they want to find a job in industry or to continue their research studies.”

NASA’s Jet Propulsion Lab lays off a further 10% of staff

22 October 2025 at 14:02

NASA’s Jet Propulsion Laboratory (JPL) is to lay off some 550 employees as part of a restructuring that began in July. The action affects about 11% of JPL’s employees and represents the lab’s third downsizing in the past 20 months. When the layoffs are complete by the end of the year, the lab will have roughly 4500 employees, down from about 6500 at the start of 2024. A further 4000 employees have already left NASA during the past six months via sacking, retirement or voluntary buyouts.

Managed by the California Institute of Technology in Pasadena, JPL oversees scientific missions such as the Psyche asteroid probe, the Europa Clipper and the Perseverance rover on Mars. The lab also operates the Deep Space Network that keeps Earth in communication with unmanned space missions. JPL bosses already laid off about 530 staff – and 140 contractors – in February last year followed by another 325 people in November 2024.

JPL director Dave Gallagher insists, however, that the new layoffs are not related to the current US government shutdown that began on 1 October. “[They are] essential to securing JPL’s future by creating a leaner infrastructure, focusing on our core technical capabilities, maintaining fiscal discipline, and positioning us to compete in the evolving space ecosystem,” he says in a message to employees.

Judy Chu, Democratic Congresswoman for the constituency that includes JPL, is less optimistic. “Every layoff devastates the highly skilled and uniquely talented workforce that has made these accomplishments possible,” she says. “Together with last year’s layoffs, this will result in an untold loss of scientific knowledge and expertise that threatens the very future of American leadership in space exploration and scientific discovery.”

John Logsdon, professor emeritus at George Washington University and founder of the university’s Space Policy Institute, says that the cuts are a direct result of the Trump administration’s approach to science and technology. “The administration gives low priority to robotic science and exploration, and has made draconic cuts to the science budget; that budget supports JPL’s work,” he told Physics World. “With these cuts, there is not enough money to support a JPL workforce sized for more ambitious activities. Ergo, staff cuts.”

How to solve the ‘future of physics’ problem

22 October 2025 at 12:00

I hugely enjoyed physics when I was a youngster. I had the opportunity both at home and school to create my own projects, which saw me make electronic circuits, crazy flying models like delta-wings and autogiros, and even a gas chromatograph with a home-made chart recorder. Eventually, this experience made me good enough to repair TV sets, and work in an R&D lab in the holidays devising new electronic flow controls.

That enjoyment continued beyond school. I ended up doing a physics degree at the University of Oxford before working on the discovery of the gluon at the DESY lab in Hamburg for my PhD. Since then I have used physics in industry – first with British Oxygen/Linde and later with Air Products & Chemicals – to solve all sorts of different problems, build innovative devices and file patents.

While some students have a similarly positive school experience and subsequent career path, not enough do. Quite simply, physics at school is the key to so many important, useful developments, both within and beyond physics. But we have a physics education problem, or to put it another way – a “future of physics” problem.

There are just not enough school students enjoying and learning physics. On top of that there are not enough teachers enjoying physics and not enough students doing practical physics. The education problem is bad for physics and for many other subjects that draw on physics. Alas, it’s not a new problem but one that has been developing for years.

Problem solving

Many good points about the future of physics learning were made by the Institute of Physics in its 2024 report Fundamentals of 11 to 19 Physics. The report called for more physics lessons to have a practical element and encouraged more 16-year-old students in England, Wales and Northern Ireland to take AS-level physics at 17 so that they carry their GCSE learning at least one step further.

Doing so would furnish students who are aiming to study another science or a technical subject with the necessary skills and give them the option to take physics A-level. Another recommendation is to link physics more closely to T-levels – two-year vocational courses in England for 16–19 year olds that are equivalent to A-levels – so that students following that path get a background in key aspects of physics, for example in engineering, construction, design and health.

But do all these suggestions solve the problem? I don’t think they are enough and we need to go further. The key change to fix the problem, I believe, is to have student groups invent, build and test their own projects. Ideally this should happen before GCSE level so that students have the enthusiasm and background knowledge to carry them happily forward into A-level physics. They will benefit from “pull learning” – pulling in knowledge and active learning that they will remember for life. And they will acquire wider life skills too.

Developing skillsets

During my time in industry, I did outreach work with schools every few weeks and gave talks with demonstrations at the Royal Institution and the Franklin Institute. For many years I also ran a Saturday Science club in Guildford, Surrey, for pupils aged 8–15.

Based on this, I wrote four Saturday Science books about the many playful and original demonstrations and projects that came out of it. Then at the University of Surrey, as a visiting professor, I had small teams of final-year students who devised extraordinary engineering – designing superguns for space launches, 3D printers for full-size buildings and volcanic power plants inter alia. A bonus was that other staff working with the students got more adventurous too.

But that was working with students already committed to a scientific path. So lately I’ve been working with teachers to get students to devise and build their own innovative projects. We’ve had 14–15-year-old state-school students in groups of three or four, brainstorming projects, sketching possible designs, and gathering background information. We help them, and A-level students help too (gaining teaching experience in the process). Students not only learn physics better but also pick up important life skills like brainstorming, team-working, practical work, analysis and presentations.

We’ve seen lots of ingenuity and some great projects such as an ultrasonic scanner to sense wetness of cloth; a system to teach guitar by lighting up LEDs along the guitar neck; and measuring breathing using light passing through a band of Lycra around the patient below the ribs. We’ve seen the value of failure, both mistakes and genuine technical problems.

Best of all, we’ve also noticed what might be dubbed the “combination bonus” – students having to think about how they combine their knowledge of one area of physics with another. A project involving a sensor, for example, will often involve electronics as well as the physics of the sensor, and so students’ knowledge of both areas is enhanced.

Some teachers may question how you mark such projects. The answer is don’t mark them! Project work and especially group work is difficult to mark fairly and accurately, and the enthusiasm and increased learning by students working on innovative projects will feed through into standard school exam results.

Not trying to grade such projects will mean more students go on to study physics further, potentially to do a physics-related extended project qualification – equivalent to half an A-level where students research a topic to university level – and do it well. Long term, more students will take physics with them into the world of work, from physics to engineering or medicine, from research to design or teaching.

Such projects are often fun for students and teachers. Teachers are often intrigued and amazed by students’ ideas and ingenuity. So, let’s choose to do student-invented project work at school and let’s finally solve the future of physics problem.

‘Science needs all perspectives – male, female and everything in-between’: Brazilian astronomer Thaisa Storchi Bergmann

21 October 2025 at 15:30

As a teenager in her native Rio Grande do Sul, a state in Southern Brazil, Thaisa Storchi Bergmann enjoyed experimenting in an improvised laboratory her parents built in their attic. They didn’t come from a science background – her father was an accountant, her mother a primary school teacher – but they encouraged her to do what she enjoyed. With a friend from school, Storchi Bergmann spent hours looking at insects with a microscope and running experiments from a chemistry toy kit. “We christened the lab Thasi-Cruz after a combination of our names,” she chuckles.

At the time, Storchi Bergmann could not have imagined that one day this path would lead to cosmic discoveries and international recognition at the frontiers of astrophysics. “I always had the curiosity inside me,” she recalls. “It was something I carried since adolescence.”

That curiosity almost got lost to another discipline. By the time Storchi Bergmann was about to enter university, she was swayed by a cousin living with her family who was passionate about architecture. In 1974 she began studying architecture at the Federal University of Rio Grande do Sul (UFRGS). “But I didn’t really like technical drawing. My favourite part of the course were physics classes,” she says. Within a semester, she switched to physics.

There she met Edemundo da Rocha Vieira, the first astrophysicist UFRGS ever hired – who later went on to structure the university’s astronomy department. He nurtured Storchi Bergmann’s growing fascination with the universe and introduced her to research.

In 1977, newly married after graduation, Storchi Bergmann followed her husband to Rio de Janeiro, where she did a master’s degree and worked with William Kunkel, an American astronomer who was in Rio to help establish Brazil’s National Astrophysics Laboratory. She began working on data from a photometric system to measure star radiation. “But Kunkel said galaxies were a lot more interesting to study, and that stuck in my head,” she says.

Three years after moving to Rio, she returned to Porto Alegre, in Rio Grande do Sul, to start her doctoral research and teach at UFRGS. Vital to her career was her decision to join the group of Miriani Pastoriza, one of the pioneers of extragalactic astrophysics in Latin America. “She came from Argentina, where [in the late 1970s and early 1980s] scientists were being strongly persecuted [by the country’s military dictatorship] at the time,” she recalls. Pastoriza studied galaxies with “peculiar nuclei” – objects later known to harbour supermassive black holes. Under Pastoriza’s guidance, she moved from stars to galaxies, laying the foundation for her career.

Between 1986 and 1987, Storchi Bergmann often travelled to Chile to make observations and gather data for her PhD, using some of the largest telescopes available at the time. Then came a transformative period – a postdoc fellowship in Maryland, US, just as the Hubble Space Telescope was launched in 1990. “Each Thursday, I would drive to Baltimore for informal bag-lunch talks at the Space Telescope Science Institute, absorbing new results on active galactic nuclei (AGN) and supermassive black holes,” Storchi Bergmann recalls.

Discoveries and insights

In 1991, during an observing campaign, she and a collaborator saw something extraordinary in the galaxy NGC 1097: gas moving at immense speeds, captured by the galaxy’s central black hole. The work, published in 1993, became one of the earliest documented cases of what are now called “tidal disruption events”, in which a star or cloud gets too close to a black hole and is torn apart.

Her research also contributed to one of the defining insights of the Hubble era: that every massive galaxy hosts a central black hole. “At first, we didn’t know if they were rare,” she explains. “But gradually it became clear: these objects are fundamental to galaxy evolution.”

Another collaboration brought her into contact with Daniela Calzetti, whose work on the effects of interstellar dust led to the formulation of the widely used “Calzetti law”. These and other contributions placed Storchi Bergmann among the most cited scientists worldwide, recognition of which came in 2015 when she received the L’Oréal-UNESCO Award for Women in Science.

Her scientific achievements, however, unfolded against personal and structural obstacles. As a young mother, she often brought her baby to observatories and conferences so she could breastfeed. This kind of juggling is familiar to many women in science.

“It was never easy,” Storchi Bergmann reflects. “I was always running, trying to do 20 things at once.” The lack of childcare infrastructure in universities compounded the challenge. She recalls colleagues who succeeded by giving up on family life altogether. “That is not sustainable,” she insists. “Science needs all perspectives – male, female and everything in-between. Otherwise, we lose richness in our vision of the universe.”

When she attended conferences early in her career, she was often the only woman in the room. Today, she says, the situation has greatly improved, even if true equality remains distant.

Now a tenured professor at UFRGS and a member of the Brazilian Academy of Sciences, Storchi Bergmann continues to push at the cosmic frontier. Her current focus is the Legacy Survey of Space and Time (LSST), about to begin at the Vera Rubin Observatory in Chile.

Her group is part of the AGN science collaboration, developing methods to analyse the characteristic flickering of accreting black holes. With students, she is experimenting with automated pipelines and artificial intelligence to make sense of and manage the massive amounts of data.

Challenges ahead

Yet this frontier science is not guaranteed. Storchi Bergmann is frustrated by the recent collapse in research scholarships. Historically, her postgraduate programme enjoyed a strong balance of grants from both of Brazil’s federal research funding agencies, CNPq (from the Ministry of Science) and CAPES (from the Ministry of Education). But cuts at CNPq, she says, have left students without support, and CAPES has not filled the gap.

“The result is heartbreaking,” she says. “I have brilliant students ready to start, including one from Piauí (a state in north-eastern Brazil), but without a grant, they simply cannot continue. Others are forced to work elsewhere to support themselves, leaving no time for research.”

She is especially critical of the policy of redistributing scarce funds away from top-rated programmes to newer ones without expanding the overall budget. “You cannot build excellence by dismantling what already exists,” she argues.

For her, the consequences go beyond personal frustration. They risk undermining decades of investment that placed Brazil on the international astrophysics map. Despite these challenges, Storchi Bergmann remains driven and continues to mentor master’s and PhD students, determined to prepare them for the LSST era.

At the heart of her research is a question as grand as any in cosmology: which came first – the galaxy or its central black hole? The answer, she believes, will reshape our understanding of how the universe came to be. And it will carry with it the fingerprint of her work: the persistence of a Brazilian scientist who followed her curiosity from a home-made lab to the centres of galaxies, overcoming obstacles along the way.

Illuminating quantum worlds: a Diwali conversation with Rupamanjari Ghosh

21 October 2025 at 18:25

Homes and cities around the world are this week celebrating Diwali or Deepavali – the Indian “festival of lights”. For Indian physicist Rupamanjari Ghosh, who is the former vice chancellor of Shiv Nadar University Delhi-NCR, this festival sheds light on the quantum world. Known for her work on nonlinear optics and entangled photons, Ghosh finds a deep resonance between the symbolism of Diwali and the ongoing revolution in quantum science.

“Diwali comes from Deepavali, meaning a ‘row of lights’. It marks the triumph of light over dark; good over evil; and knowledge over ignorance,” Ghosh explains. “In science too, every discovery is a Diwali –  a victory of knowledge over ignorance.”

With 2025 being marked by the International Year of Quantum Science and Technology, a victory of knowledge over ignorance couldn’t ring truer. “It has taken us a hundred years since the birth of quantum mechanics to arrive at this point, where quantum technologies are poised to transform our lives,” says Ghosh.

Ghosh has another reason to celebrate, having been named as this year’s Institute of Physics (IOP) Homi Bhabha lecturer. The IOP and the Indian Physical Association (IPA) jointly host the Homi Bhabha and Cockcroft Walton bilateral exchange of lecturers. Running since 1998, these international programmes aim to promote dialogue on global challenges through physics and provide physicists with invaluable opportunities for global exposure and professional growth. Ghosh’s online lecture, entitled “Illuminating quantum frontiers: from photons to emerging technologies”, will be aired at 3 p.m. GMT on Wednesday 22 October.

From quantum twins to quantum networks

Ghosh’s career in physics took off in the mid-1980s, when she and American physicist Leonard Mandel – who is often referred to as one of the founding fathers of quantum optics – demonstrated a new quantum source of twin photons through spontaneous parametric down-conversion: a process where a high-energy photon splits into two lower-energy, correlated photons (Phys. Rev. Lett. 59, 1903).
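
Energy conservation links the three photons involved in this process: the pump photon’s energy equals the sum of the signal and idler energies, so 1/λ_pump = 1/λ_signal + 1/λ_idler. A quick illustrative calculation (the wavelengths are examples, not those of the original experiment):

```python
def idler_wavelength_nm(pump_nm: float, signal_nm: float) -> float:
    """Idler wavelength fixed by energy conservation in parametric
    down-conversion: 1/pump = 1/signal + 1/idler."""
    return 1.0 / (1.0 / pump_nm - 1.0 / signal_nm)

# A 405 nm pump photon splitting into an 810 nm signal photon must
# produce an 810 nm idler photon (the degenerate case).
print(idler_wavelength_nm(405.0, 810.0))  # -> 810.0
```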

“Before that,” she recalls, “no-one was looking for quantum effects in this nonlinear optical process. The correlations between the photons defied classical explanation. It was an elegant early verification of quantum nonlocality.”

Those entangled photon pairs are now the building blocks of quantum communication and computation. “We’re living through another Diwali of light,” she says, “where theoretical understanding and experimental innovation illuminate each other.”

Entangled light

During Diwali, lamps unite households in a shimmering network of connection – and so too does the entanglement of photons. “Quantum entanglement reminds us that connection transcends locality,” Ghosh says. “In the same way, the lights of Diwali connect us across borders and cultures through shared histories.”

Her own research extends that metaphor further. Ghosh’s team has worked on mapping quantum states of light onto collective atomic excitations. These “slow-light” techniques – using electromagnetically induced transparency or Raman interactions – allow photons to be stored and retrieved, forming the backbone of long-distance quantum communication (Phys. Rev. A 88 023852; EPL 105 44002).

“Symbolically,” she adds, “it’s like passing the flame from one diya (lamp) to another. We’re not just spreading light –  we’re preserving, encoding and transmitting it. Success comes through connection and collaboration.”

Beyond the shadows: Ghosh calls for the bright light of inclusivity in science. (Courtesy: Rupamanjari Ghosh)

The dark side of light

Ghosh is quick to note that in quantum physics, “darkness” is far from empty. “In quantum optics, even the vacuum is rich –  with fluctuations that are essential to our understanding of the universe.”

Her group studies the transition from quantum to classical systems, using techniques such as error correction, shielding and coherence-preserving materials. “Decoherence –  the loss of quantum behaviour through environmental interaction –  is a constant threat. To build reliable quantum technologies, we must engineer around this fragility,” Ghosh explains.

There are also human-engineered shadows: some weaknesses in quantum communication devices aren’t due to the science itself – they come from mistakes or flaws in how humans built them. Hackers can exploit these “side channels” to get around security. “Security,” she warns, “is only as strong as the weakest engineering link.”

Beyond the lab, Ghosh finds poetic meaning in these challenges. “Decoherence isn’t just a technical problem –  it helps us understand the arrows of time, why the universe evolves irreversibly. The dark side has its own lessons.”

Lighting every corner

For Ghosh, Diwali’s illumination is also a call for inclusivity in science. “No corner should remain dark,” she says. “Science thrives on diversity. Diverse teams ask broader questions and imagine richer answers. It’s not just morally right – it’s good for science.”

She argues that equity is not sameness but recognition of uniqueness. “Innovation doesn’t come from conformity. Gender diversity, for example, brings varied cognitive and collaborative styles – essential in a field like quantum science, where intuition is constantly stretched.”

The shadows she worries most about are not in the lab, but in academia itself. “Unconscious biases in mentorship or gatekeeping in opportunity can accumulate to limit visibility. Institutions must name and dismantle these hidden shadows through structural and cultural change.”

Her vision of inclusion extends beyond gender. “We shouldn’t think of work and life as opposing realms to ‘balance’,” she says. “It’s about creating harmony among all dimensions of life – work, family, learning, rejuvenation. That’s where true brilliance comes from.”

As the rows of diyas are lit this Diwali, Ghosh’s reflections remind us that light –  whether classical or quantum –  is both a physical and moral force: it connects, illuminates and endures. “Each advance in quantum science,” she concludes, “is another step in the age-old journey from darkness to light.”

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the year for more coverage of the IYQ.

Find out more on our quantum channel.

Wearable UVA sensor warns about overexposure to sunlight

17 October 2025 at 10:09
Transparent healthcare Illustration of the fully transparent sensor that reacts to sunlight and allows real-time monitoring of UVA exposure on the skin. The device could be integrated into wearable items, such as glasses or patches. (Courtesy: Jnnovation Studio)

A flexible and wearable sensor that allows the user to monitor their exposure to ultraviolet (UV) radiation has been unveiled by researchers in South Korea. Based on a heterostructure of four different oxide semiconductors, the sensor’s flexible, transparent design could vastly improve the real-time monitoring of skin health.

UV light in the A band has wavelengths of 315–400 nm and comprises about 95% of the UV radiation that reaches Earth’s surface. Because of its relatively long wavelength, UVA can penetrate deep into the skin. There it can alter biological molecules, damaging tissue and even causing cancer.

While covering up with clothing and using sunscreen are effective at reducing UVA exposure, researchers are keen on developing wearable sensors that can monitor UVA levels in real time. These can alert users when their UVA exposure reaches a certain level. So far, the most promising advances towards these designs have come from oxide semiconductors.

Many challenges

“For the past two decades, these materials have been widely explored for displays and thin-film transistors because of their high mobility and optical transparency,” explains Seong Jun Kang at Soongsil University, who led the research. “However, their application to transparent ultraviolet photodetectors has been limited by high persistent photocurrent, poor UV–visible discrimination, and instability under sunlight.”

While these problems can be avoided in more traditional UV sensors, such as those based on gallium nitride and zinc oxide, these materials are opaque and rigid – making them unsuitable for use in wearable sensors.

In their study, Kang’s team addressed these challenges by introducing a multi-junction heterostructure, made by stacking multiple ultrathin layers of different oxide semiconductors. The four semiconductors they selected each had wide bandgaps, which made them more transparent in the visible spectrum but responsive to UV light.

The structure included zinc and tin oxide layers as n-type semiconductors (doped with electron-donating atoms) and cobalt and hafnium oxide layers as p-type semiconductors (doped with electron-accepting atoms that create positively charged holes). Within the heterostructure, this selection created three types of interface: p–n junctions between hafnium and tin oxide; n–n junctions between tin and zinc oxide; and p–p junctions between cobalt and hafnium oxide.

Efficient transport

When the team illuminated their heterostructure with UVA photons, the electron–hole charge separation was enhanced by the p–n junction, while the n–n and p–p junctions allowed for more efficient transport of electrons and holes respectively, improving the design’s response speed. When the illumination was removed, the electron–hole pairs could quickly decay, avoiding any false detections.

To test their design’s performance, the researchers integrated their heterostructure into a wearable detector. “In collaboration with UVision Lab, we developed an integrated Bluetooth circuit and smartphone application, enabling real-time display of UVA intensity and warning alerts when an individual’s exposure reaches the skin-type-specific minimal erythema dose (MED),” Kang describes. “When connected to the Bluetooth circuit and smartphone application, it successfully tracked real-time UVA variations and issued alerts corresponding to MED limits for various skin types.”
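
To make the alert logic concrete, here is a minimal sketch of how such an exposure monitor might work, assuming the app simply integrates the measured UVA irradiance over time and compares the accumulated dose with a skin-type-specific MED. The function names and MED values are illustrative placeholders, not the team’s published implementation.

```python
# Minimal sketch (not the team's actual firmware): accumulate UVA dose from
# periodic sensor readings and flag when a skin-type-specific threshold is reached.
# The MED values below are placeholders, not clinical recommendations.

MED_J_PER_M2 = {"I": 2.0e5, "II": 3.0e5, "III": 4.5e5, "IV": 6.0e5}  # hypothetical thresholds

def update_dose(dose_j_per_m2, irradiance_w_per_m2, dt_s):
    """Add the dose accumulated over one sampling interval (irradiance x time)."""
    return dose_j_per_m2 + irradiance_w_per_m2 * dt_s

def check_alert(dose_j_per_m2, skin_type):
    """Return True once the accumulated dose reaches the MED for this skin type."""
    return dose_j_per_m2 >= MED_J_PER_M2[skin_type]

# Example: one reading per minute at an assumed constant 30 W/m2 of UVA
dose = 0.0
for minute in range(180):
    dose = update_dose(dose, 30.0, 60.0)
    if check_alert(dose, "II"):
        print(f"UVA alert after {minute + 1} min, dose = {dose:.0f} J/m2")
        break
```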

As well as maintaining over 80% transparency, the sensor proved highly stable and responsive, even in direct outdoor sunlight and across repeated exposure cycles. Based on this performance, the team is now confident that their design could push the capabilities of oxide semiconductors beyond their typical use in displays and into the fast-growing field of smart personal health monitoring.

“The proposed architecture establishes a design principle for high-performance transparent optoelectronics, and the integrated UVA-alert system paves the way for next-generation wearable and Internet-of-things-based environmental sensors,” Kang predicts.

The research is described in Science Advances.

The post Wearable UVA sensor warns about overexposure to sunlight appeared first on Physics World.

Researchers visualize blood flow in pulsating artificial heart

16 octobre 2025 à 10:00

A research team in Sweden has used real-time imaging technology to visualize the way that blood pumps around a pulsating artificial heart – moving medicine one step closer to the safe use of such devices in people waiting for donor transplants.

The Linköping University (LiU) team used 4D flow MRI to examine the internal processes of a mechanical heart prototype created by Västerås-based technology company Scandinavian Real Heart. The researchers evaluated blood flow patterns and compared them with similar measurements taken in a native human heart, outlining their results in Scientific Reports.

“As the pulsatile total artificial heart contains metal parts, like the motor, we used 3D printing [to replace most metal parts] and a physiological flow loop so we could run it in the MRI scanner under representable conditions,” says first author Twan Bakker, a PhD student at the Center for Medical Image Science and Visualization at LiU.

No elevated risk

According to Bakker, this is the first time that a 3D-printed, MRI-compatible artificial heart has been built and successfully evaluated using 4D flow MRI. The team was pleased to discover that the results corroborate the findings of previous computational fluid dynamics simulations indicating “low shear stress and low stagnation”. Overall flow patterns also suggest there is no elevated risk of blood complications compared with hearts in healthy humans and those suffering from valvular disease.

“[The] patterns of low blood flow, a risk for thrombosis, were in the same range as for healthy native human hearts. Patterns of turbulent flow, a risk for activation of blood platelets, which can contribute to thrombosis, were lower than those found in patients with valvular disease,” says Bakker.

“4D flow MRI allows us to measure the flow field without altering the function of the total artificial heart, which is therefore a valuable tool to complement computer simulations and blood testing during the development of the device. Our measurements provided valuable information to the design team that could improve the artificial heart prototype further,” he adds.

Improved diagnostics

A key advantage of 4D flow MRI over alternative measurement techniques – such as particle image velocimetry and laser Doppler anemometry – is that it doesn’t require the creation of a fully transparent model. This is an important distinction for Bakker, since some components in the artificial heart are made with materials possessing unique mechanical properties, meaning that replicating them in a see-through version would be extremely challenging.

Visualizing blood flow The central image shows a representation of the full cardiac cycle in the artificial heart, with circulating flow patterns in various locations highlighted at specified time points. (Courtesy: CC BY 4.0/Sci. Rep. 10.1038/s41598-025-18422-y)

“With 4D flow MRI we had to move the motor away from the scanner bore, but the material in contact with the blood and the motion of the device remained as the original design,” says Bakker.

According to Bakker, the velocity measurements can also be used for visualization and analysis of hemodynamic parameters such as turbulent kinetic energy and wall shear stress, both in the heart and in the larger vessels of the body.
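
As an illustration of one such parameter, the sketch below estimates a turbulent kinetic energy (TKE) map from per-voxel velocity standard deviations, using the standard definition TKE = ½ρ(σu² + σv² + σw²). The numbers and array sizes are synthetic; this is not the LiU team’s analysis code.

```python
import numpy as np

# Toy sketch (not the LiU team's code): estimate turbulent kinetic energy per voxel
# from the standard deviations of the three velocity components,
# TKE = 0.5 * rho * (sigma_u^2 + sigma_v^2 + sigma_w^2), with rho the blood density.

RHO_BLOOD = 1060.0  # kg/m^3, typical value for blood

def tke_map(sigma_u, sigma_v, sigma_w, rho=RHO_BLOOD):
    """Return TKE density (J/m^3) for arrays of velocity standard deviations in m/s."""
    return 0.5 * rho * (sigma_u**2 + sigma_v**2 + sigma_w**2)

# Example on a small synthetic 3D grid of velocity fluctuations (m/s)
rng = np.random.default_rng(0)
sig = [np.abs(rng.normal(0.05, 0.02, size=(4, 4, 4))) for _ in range(3)]
tke = tke_map(*sig)
print(f"median TKE = {np.median(tke):.1f} J/m^3, max TKE = {tke.max():.1f} J/m^3")
```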

“By studying the flow dynamics in patients and healthy subjects, we can better understand its role in health and disease, which can then support improved diagnostics, interventions and surgical therapies,” he explains.

Moving forward, Bakker says that the research team will continue to evaluate the improved heart design, which was recently granted designation as a Humanitarian Use Device (HUD) by the US Food and Drug Administration (FDA).

“This makes it possible to apply for designation as a Humanitarian Device Exemption (HDE) – which may grant the device limited marketing rights and paves the way for the pre-clinical and clinical studies,” he says.

“In addition, we are currently developing tools to compute blood flow using simulations. This may provide us with a deeper understanding of the mechanisms that cause the formation of thrombosis and haemolysis,” he tells Physics World.

The post Researchers visualize blood flow in pulsating artificial heart appeared first on Physics World.

Quantum computing on the verge: a look at the quantum marketplace of today

14 octobre 2025 à 17:40

“I’d be amazed if quantum computing produces anything technologically useful in ten years, twenty years, even longer.” So wrote University of Oxford physicist David Deutsch – often considered the father of the theory of quantum computing – in 2004. But, as he added in a caveat, “I’ve been amazed before.”

We don’t know how amazed Deutsch, a pioneer of quantum computing, would have been had he attended a meeting at the Royal Society in London in February on “the future of quantum information”. But it was tempting to conclude from the event that quantum computing has now well and truly arrived, with working machines that harness quantum mechanics to perform computations being commercially produced and shipped to clients. Serving as the UK launch of the International Year of Quantum Science and Technology (IYQ) 2025, it brought together some of the key figures of the field to spend two days discussing quantum computing as something like a mature industry, even if one in its early days.

Werner Heisenberg – who worked out the first proper theory of quantum mechanics 100 years ago – would surely have been amazed to find that the formalism he and his peers developed to understand the fundamental behaviour of tiny particles had generated new ways of manipulating information to solve real-world problems in computation. So far, quantum computing – which exploits phenomena such as superposition and entanglement to potentially achieve greater computational power than the best classical computers can muster – hasn’t tackled any practical problems that can’t be solved classically.

Although the fundamental quantum principles are well-established and proven to work, there remain many hurdles that quantum information technologies have to clear before this industry can routinely deliver resources with transformative capabilities. But many researchers think that moment of “practical quantum advantage” is fast approaching, and an entire industry is readying itself for that day.

Entangled marketplace

So what are the current capabilities and near-term prospects for quantum computing?

The first thing to acknowledge is that a booming quantum-computing market exists. Devices are being produced for commercial use by a number of tech firms, from the likes of IBM, Google, Canada-based D-Wave and Rigetti, which have been in the field for a decade or more, to relative newcomers such as Nord Quantique (Canada), IQM (Finland), Quantinuum (UK and US), Orca (UK), PsiQuantum (US) and Silicon Quantum Computing (Australia) – see the box below, “The global quantum ecosystem”.

The global quantum ecosystem

Map showing the investments globally into quantum computing
(Courtesy: QURECA)

We are on the cusp of a second quantum revolution, with quantum science and technologies growing rapidly across the globe. This includes quantum computers; quantum sensing (ultra-high precision clocks, sensors for medical diagnostics); as well as quantum communications (a quantum internet). Indeed, according to the State of Quantum 2024 report, a total of 33 countries around the world currently have government initiatives in quantum technology, of which more than 20 have national strategies with large-scale funding. As of this year, worldwide investments in quantum tech – by governments and industry – exceed $55.7 billion, and the market is projected to reach $106 billion by 2040. With the multitude of ground-breaking capabilities that quantum technologies bring globally, it’s unsurprising that governments all over the world are eager to invest in the industry.

With data from a number of international reports and studies, quantum education and skills firm QURECA has summarized key programmes and efforts around the world. These include total government funding spent through 2025, as well as future commitments spanning 2–10 year programmes, varying by country. These initiatives generally represent government agencies’ funding announcements, related to their countries’ advancements in quantum technologies, excluding any private investments and revenues.

A supply chain is also organically developing, which includes manufacturers of specific hardware components, such as Oxford Instruments and Quantum Machines, and software developers like Riverlane, based in Cambridge, UK, and QC Ware in Palo Alto, California. Supplying the last link in this chain are a range of eager end-users, from finance companies such as J P Morgan and Goldman Sachs to pharmaceutical companies such as AstraZeneca and engineering firms like Airbus. Quantum computing is already big business, with around 400 active companies and current global investment estimated at around $2 billion.

But the immediate future of all this buzz is hard to assess. When the chief executive of computer giant Nvidia announced at the start of 2025 that “truly useful” quantum computers were still two decades away, the previously burgeoning share prices of some leading quantum-computing companies plummeted. They have since recovered somewhat, but such volatility reflects the fact that quantum computing has yet to prove its commercial worth.

The field is still new and firms need to manage expectations and avoid hype while also promoting an optimistic enough picture to keep investment flowing in. “Really amazing breakthroughs are being made,” says physicist Winfried Hensinger of the University of Sussex, “but we need to get away from the expectancy that [truly useful] quantum computers will be available tomorrow.”

The current state of play is often called the “noisy intermediate-scale quantum” (NISQ) era. That’s because the “noisy” quantum bits (qubits) in today’s devices are prone to errors for which no general and simple correction process exists. Current quantum computers can’t therefore carry out practically useful computations that could not be done on classical high-performance computing (HPC) machines. It’s not just a matter of better engineering either; the basic science is far from done.

IBM quantum computer cryogenic chandelier
Building up Quantum computing behemoth IBM says that by 2029, its fault-tolerant system should accurately run 100 million gates on 200 logical qubits, thereby truly achieving quantum advantage. (Courtesy: IBM)

“We are right on the cusp of scientific quantum advantage – solving certain scientific problems better than the world’s best classical methods can,” says Ashley Montanaro, a physicist at the University of Bristol who co-founded the quantum software company Phasecraft. “But we haven’t yet got to the stage of practical quantum advantage, where quantum computers solve commercially important and practically relevant problems such as discovering the next lithium-ion battery.” It’s no longer if or how, but when that will happen.

Pick your platform

As the quantum-computing business is such an emerging area, today’s devices use wildly different types of physical systems for their qubits (see the box below, “Comparing computing modalities: from qubits to architectures”). There is still no clear sign as to which of these platforms, if any, will emerge as the winner. Indeed, many researchers believe that no single qubit type will ever dominate.

The top-performing quantum computers, like those made by Google (with its 105-qubit Willow chip) and IBM (which has made the 1121-qubit Condor), use qubits in which information is encoded in the wavefunction of a superconducting material. Until recently, the strongest competing platform seemed to be trapped ions, where the qubits are individual ions held in electromagnetic traps – a technology being developed into working devices by the US company IonQ, spun out from the University of Maryland, among others.

Comparing computing modalities: from qubits to architectures

Table listing out the different types of qubit, the advantages of each and which company uses which qubit
(Courtesy: PatentVest)

Much like classical computers, quantum computers have a core processor and a control stack – the difference being that the core depends on the type of qubit being used. Currently, quantum computing is not based on a single platform, but rather a set of competing hardware approaches, each with its own physical basis for creating and controlling qubits and keeping them stable.

The data above – taken from the August 2025 report Quantum Computing at an Inflection Point: Who’s Leading, What They Own, and Why IP Decides Quantum’s Future by US firm PatentVest – shows the key “quantum modalities”, which refers to the different types of qubits and architectures used to build these quantum systems. Differing qubits each have their own pros and cons, with varying factors including the temperature at which they operate, coherence time, gate speed, and how easy they might be to scale up.

But over the past few years, neutral trapped atoms have emerged as a major contender, thanks to advances in controlling the positions and states of these qubits. Here the atoms are prepared in highly excited electronic states, becoming “Rydberg atoms” that can be entangled with one another over distances of a few microns. A Harvard startup called QuEra is developing this technology, as is the French start-up Pasqal. In September a team from the California Institute of Technology announced a 6100-qubit array made from neutral atoms. “Ten years ago I would not have included [neutral-atom] methods if I were hedging bets on the future of quantum computing,” says Deutsch’s Oxford colleague, the quantum information theorist Andrew Steane. But like many, he thinks differently now.

Some researchers believe that optical quantum computing, using photons as qubits, will also be an important platform. One advantage here is that photonic signals travelling to or from the processing units need no complex conversion to pass through existing telecommunications networks, which is also handy for photonic interconnections between chips. What’s more, photonic circuits can work at room temperature, whereas trapped ions and superconducting qubits need to be cooled. Photonic quantum computing is being developed by firms like PsiQuantum, Orca and Xanadu.

Other efforts, for example at Intel and Silicon Quantum Computing in Australia, make qubits from either quantum dots (Intel) or precision-placed phosphorus atoms (SQC), both in good old silicon, which benefits from a very mature manufacturing base. “Small qubits based on ions and atoms yield the highest quality processors”, says Michelle Simmons of the University of New South Wales, who is the founder and CEO of SQC. “But only atom-based systems in silicon combine this quality with manufacturability.”

Intel's silicon spin qubits are now being manufactured on an industrial scale
Spinning around Intel’s silicon spin qubits are now being manufactured on an industrial scale. (Courtesy: Intel Corporation)

And it’s not impossible that entirely new quantum computing platforms might yet arrive. At the start of 2025, researchers at Microsoft’s laboratories in Washington State caused a stir when they announced that they had made topological qubits from semiconducting and superconducting devices, which are expected to be less error-prone than those currently in use. The announcement left some scientists disgruntled because it was not accompanied by a peer-reviewed paper providing the evidence for these long-sought entities. But in any event, most researchers think it would take a decade or more for topological quantum computing to catch up with the platforms already out there.

Each of these quantum technologies has its own strengths and weaknesses. “My personal view is that there will not be a single architecture that ‘wins’, certainly not in the foreseeable future,” says Michael Cuthbert, founding director of the UK’s National Quantum Computing Centre (NQCC), which aims to facilitate the transition of quantum computing from basic research to an industrial concern. Cuthbert thinks the best platform will differ for different types of computation: cold neutral atoms might be good for quantum simulations of molecules, materials and exotic quantum states, say, while superconducting and trapped-ion qubits might be best for problems involving machine learning or optimization.

Measures and metrics

Given these pros and cons of different hardware platforms, one difficulty in assessing their merits is finding meaningful metrics for making comparisons. Should we be comparing error rates, coherence times (basically how long qubits remain entangled), gate speeds (how fast a single computational step can be conducted), circuit depth (how many steps a single computation can sustain), number of qubits in a processor, or what? “The metrics and measures that have been put forward so far tend to suit one or other platform more than others,” says Cuthbert, “such that it becomes almost a marketing exercise rather than a scientific benchmarking exercise as to which quantum computer is better.”

The NQCC evaluates the performance of devices using a factor known as the “quantum operation” (QuOp). This is simply the number of quantum operations that can be carried out in a single computation, before the qubits lose their coherence and the computation dissolves into noise. “If you want to run a computation, the number of coherent operations you can run consecutively is an objective measure,” Cuthbert says. If we want to get beyond the NISQ era, he adds, “we need to progress to the point where we can do about a million coherent operations in a single computation. We’re now at the level of maybe a few thousand. So we’ve got a long way to go before we can run large-scale computations.”
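
As a rough illustration of why the gap is so large, the sketch below estimates how many sequential coherent operations fit into one coherence time for two assumed sets of platform parameters. This is a back-of-envelope reading of the metric, not the NQCC’s formal QuOp definition, and the numbers are illustrative assumptions.

```python
# Back-of-envelope sketch, not the NQCC's formal definition: if a computation must
# finish before the qubits decohere, the number of coherent operations is roughly
# the coherence time divided by the time per gate.

def quop_estimate(coherence_time_s, gate_time_s):
    """Rough number of sequential operations that fit within one coherence time."""
    return coherence_time_s / gate_time_s

# Illustrative (assumed) numbers for two platform types:
platforms = {
    "superconducting (assumed: 100 us coherence, 50 ns gates)": (100e-6, 50e-9),
    "trapped ion (assumed: 1 s coherence, 10 us gates)": (1.0, 10e-6),
}
for name, (t_coh, t_gate) in platforms.items():
    print(f"{name}: ~{quop_estimate(t_coh, t_gate):,.0f} coherent operations")
```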

One important issue is how amenable the platforms are to making larger quantum circuits. Cuthbert contrasts the issue of scaling up – putting more qubits on a chip – with “scaling out”, whereby chips of a given size are linked in modular fashion. Many researchers think it unlikely that individual quantum chips will have millions of qubits like the silicon chips of today’s machines. Rather, they will be modular arrays of relatively small chips linked at their edges by quantum interconnects.

Having made the Condor, IBM now plans to focus on modular architectures (scaling out) – a necessity anyway, since superconducting qubits are micron-sized, so a chip with millions of them would be “bigger than your dining room table”, says Cuthbert. But superconducting qubits are not easy to scale out because microwave frequencies that control and read out the qubits have to be converted into optical frequencies for photonic interconnects. Cold atoms are easier to scale up, as the qubits are small, while photonic quantum computing is easiest to scale out because it already speaks the same language as the interconnects.

To build so-called “fault-tolerant” quantum computers, quantum platforms must solve the issue of error correction, which will enable more extensive computations without the results degrading into mere noise.
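
The underlying idea can be illustrated with the simplest possible example: a classical simulation of the three-bit repetition code, in which redundancy plus majority voting suppresses the error rate from p to roughly 3p². This toy is generic and does not represent any particular vendor’s error-correction scheme.

```python
import random

# Toy illustration of the idea behind error correction (not any vendor's scheme):
# a three-bit repetition code corrects a single bit-flip by majority vote.
# With flip probability p per bit, the encoded bit fails only if two or more bits
# flip, giving a logical error rate of roughly 3p^2 - 2p^3, which is below p for p < 0.5.

def logical_error_rate(p, trials=100_000, n=3):
    failures = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(n))
        if flips > n // 2:          # majority vote fails
            failures += 1
    return failures / trials

p = 0.05
print(f"physical error rate: {p}")
print(f"simulated logical error rate: {logical_error_rate(p):.4f}")
print(f"analytic estimate 3p^2 - 2p^3: {3*p**2 - 2*p**3:.4f}")
```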

In part two of this feature, we will explore how this is being achieved and meet the various firms developing quantum software. We will also look into the potential high-value commercial uses for robust quantum computers – once such devices exist.

  • This article was updated with additional content on 22 October 2025.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the year for more coverage of the IYQ.

Find out more on our quantum channel.

The post Quantum computing on the verge: a look at the quantum marketplace of today appeared first on Physics World.

Phase shift in optical cavities could detect low-frequency gravitational waves

13 octobre 2025 à 10:00

A network of optical cavities could be used to detect gravitational waves (GWs) in an unexplored range of frequencies, according to researchers in the UK. Using technology already within reach, the team believes that astronomers could soon be searching for ripples in space–time across the milli-Hz frequency band, spanning 10⁻⁵ Hz to 1 Hz.

GWs were first observed a decade ago and since then the LIGO–Virgo–KAGRA detectors have spotted GWs from hundreds of merging black holes and neutron stars. These detectors work in the 10 Hz–30 kHz range. Researchers have also had some success at observing a GW background at nanohertz frequencies using pulsar timing arrays.

However, GWs have yet to be detected in the milli-Hz band, which should include signals from binary systems of white dwarfs, neutron stars, and stellar-mass black holes. Many of these signals would emanate from the Milky Way.

Several projects are now in the works to explore these frequencies, including the space-based interferometers LISA, Taiji, and TianQin; as well as satellite-borne networks of ultra-precise optical clocks. However, these projects are still some years away.

Multidisciplinary effort

Joining these efforts was a collaboration called QSNET, which was part of the UK’s Quantum Technology for Fundamental Physics (QTFP) programme. “The QSNET project was a network of clocks for measuring the stability of fundamental constants,” explains Giovanni Barontini at the University of Birmingham. “This programme brought together physics communities that normally don’t interact, such as quantum physicists, technologists, high energy physicists, and astrophysicists.”

QTFP ended this year, but not before Barontini and colleagues had made important strides in demonstrating how milli-Hz GWs could be detected using optical cavities.

Inside an ultrastable optical cavity, light at specific resonant frequencies bounces constantly between a pair of opposing mirrors. When this resonant light is produced by a specific atomic transition, the frequency of the light in the cavity is very precise and can act as the ticking of an extremely stable clock.

“Ultrastable cavities are a main component of modern optical atomic clocks,” Barontini explains. “We demonstrated that they have reached sufficient sensitivities to be used as ‘mini-LIGOs’ and detect gravitational waves.”

When such a GW passes through an optical cavity, the spacing between its mirrors does not change in any detectable way. However, QSNET results have led Barontini’s team to conclude that milli-Hz GWs alter the phase of the light inside the cavity. What is more, they calculate that this effect would be detectable in the most precise optical cavities currently available.

“Methods from precision measurement with cold atoms can be transferred to gravitational-wave detection,” explains team member Vera Guarrera. “By combining these toolsets, compact optical resonators emerge as credible probes in the milli-Hz band, complementing existing approaches.”

Ground-based network

Their compact detector would comprise two optical cavities at 90° to each other – each operating at a different frequency – and an atomic reference at a third frequency. The phase shift caused by a passing gravitational wave is revealed in a change in how the three frequencies interfere with each other. The team proposes linking multiple detectors to create a global, ground-based network. This, they say, could detect a GW and also locate the position of its source in the sky.
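
The paper’s signal-extraction scheme is described here only qualitatively, so the sketch below is just a toy signal-processing illustration: a small milli-Hz phase modulation, standing in for a gravitational-wave signal, is imposed on the beat between one cavity and a reference frequency and then recovered by demodulation. Every number in it is an arbitrary illustrative choice, not a value from the study.

```python
import numpy as np

# Toy signal sketch (not the QSNET analysis): impose a small sinusoidal phase
# modulation on the beat between a cavity and a reference oscillator, then
# recover it by demodulating at the known beat frequency.

fs = 10.0                         # sample rate in Hz, ample for milli-Hz signals
t = np.arange(0, 20_000, 1 / fs)  # about 5.5 hours of data
f_beat = 1.0                      # assumed cavity-reference beat frequency (Hz)
f_gw = 5e-3                       # assumed source frequency (Hz)
phi_gw = 1e-3 * np.sin(2 * np.pi * f_gw * t)  # assumed GW-induced phase (rad)

beat = np.exp(1j * (2 * np.pi * f_beat * t + phi_gw))      # complex beat note
recovered = np.unwrap(np.angle(beat * np.exp(-1j * 2 * np.pi * f_beat * t)))

print(f"injected amplitude: 1.0e-03 rad, recovered amplitude: {recovered.max():.1e} rad")
```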

By harnessing this existing technology, the researchers now hope that future studies could open up a new era of discovery of GWs in the milli-Hz range, far sooner than many projects currently in development.

“This detector will allow us to test astrophysical models of binary systems in our galaxy, explore the mergers of massive black holes, and even search for stochastic backgrounds from the early universe,” says team member Xavier Calmet at the University of Sussex. “With this method, we have the tools to start probing these signals from the ground, opening the path for future space missions.”

Barontini adds: “Hopefully this work will inspire the build-up of a global network of sensors that will scan the skies in a new frequency window that promises to be rich of sources – including many from our own galaxy.”

The research is described in Classical and Quantum Gravity.

 

The post Phase shift in optical cavities could detect low-frequency gravitational waves appeared first on Physics World.

Hints of a boundary between phases of nuclear matter found at RHIC

9 octobre 2025 à 17:30

In a major advance for nuclear physics, scientists on the STAR detector at the Relativistic Heavy Ion Collider (RHIC) in the US have spotted subtle but striking fluctuations in the number of protons emerging from high-energy gold–gold collisions. The observation might be the most compelling sign yet of the long-sought “critical point” marking a boundary separating different phases of nuclear matter. This is similar to how water can exist in liquid or vapour phases depending on temperature and pressure.

Team member Frank Geurts at Rice University in the US tells Physics World that these findings could confirm that the “generic physics properties of phase diagrams that we know for many chemical substances apply to our most fundamental understanding of nuclear matter, too.”

A phase diagram maps how a substance transforms between solid, liquid, and gas. For everyday materials like water, the diagram is familiar, but the behaviour of nuclear matter under extreme heat and pressure remains a mystery.

Atomic nuclei are made of protons and neutrons tightly bound together. These protons and neutrons are themselves made of quarks that are held together by gluons. When nuclei are smashed together at high energies, the protons and neutrons “melt” into a fluid of quarks and gluons called a quark–gluon plasma. This exotic high-temperature state is thought to have filled the universe just microseconds after the Big Bang.

Smashing gold ions

The quark–gluon plasma is studied by accelerating heavy ions like gold nuclei to nearly the speed of light and smashing them together. “The advantage of using heavy-ion collisions in colliders such as RHIC is that we can repeat the experiment many millions, if not billions, of times,” Geurts explains.

By adjusting the collision energy, researchers can control the temperature and density of the fleeting quark–gluon plasma they create. This allows physicists to explore the transition between ordinary nuclear matter and the quark–gluon plasma. Within this transition, theory predicts the existence of a critical point where gradual change becomes abrupt.

Now, the STAR Collaboration has focused on measuring the minute fluctuations in the number of protons produced in each collision. These “proton cumulants,” says Geurts, are statistical quantities that “help quantify the shape of a distribution – here, the distribution of the number of protons that we measure”.

In simple terms, the first two cumulants correspond to the average and width of that distribution, while higher-order cumulants describe its asymmetry and sharpness. Ratios of these cumulants are tied to fundamental properties known as susceptibilities, which become highly sensitive near a critical point.
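
For readers who want to see what these quantities look like in practice, the sketch below computes the first four cumulants and their ratios for a synthetic, Poisson-distributed set of event-by-event proton counts. The real STAR analysis involves detector-efficiency corrections and other subtleties that are omitted here.

```python
import numpy as np

# Minimal sketch on synthetic data (no efficiency corrections, unlike the real STAR
# analysis): compute the first four cumulants of an event-by-event proton-number
# distribution and the ratios commonly compared with theory.

def cumulants(n):
    """First four cumulants of a sample of proton counts."""
    mu = n.mean()
    d = n - mu
    m2, m3, m4 = (d**2).mean(), (d**3).mean(), (d**4).mean()
    return mu, m2, m3, m4 - 3 * m2**2   # C1, C2, C3, C4

# Stand-in data: Poisson-distributed counts, for which all cumulants equal the mean
rng = np.random.default_rng(1)
counts = rng.poisson(lam=20.0, size=1_000_000)
c1, c2, c3, c4 = cumulants(counts)
print(f"C1={c1:.2f}  C2/C1={c2/c1:.3f}  C3/C2={c3/c2:.3f}  C4/C2={c4/c2:.3f}")
# For a Poisson baseline these ratios are all close to 1; critical behaviour would
# show up as deviations of the higher-order ratios from such a baseline.
```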

Unexpected discovery

Over three years of experiments, the STAR team studied gold–gold collisions at a wide range of energies, using sophisticated detectors to track and identify the protons and antiprotons created in each event. By comparing how the number of these particles changed with energy, the researchers discovered something unexpected.

As the collision energy decreased, the fluctuations in proton numbers did not follow a smooth trend. “STAR observed what it calls non-monotonic behaviour,” Geurts explains. “While at higher energies the ratios appear to be suppressed, STAR observes an enhancement at lower energies.” Such irregular changes, he says, are consistent with what might happen if the collisions pass near the critical point – the boundary separating different phases of nuclear matter.

For Volodymyr Vovchenko, a physicist at the University of Houston who was not involved in the research, the new measurements represent “a major step forward”. He says that “the STAR Collaboration has delivered the most precise proton-fluctuation data to date across several collision energies”.

Still, interpretation remains delicate. The corrections required to extract pure physical signals from the raw data are complex, and theoretical calculations lag behind in providing precise predictions for what should happen near the critical point.

“The necessary experimental corrections are intricate,” Vovchenko says, and some theoretical models “do not yet implement these corrections in a fully consistent way.” That mismatch, he cautions, “can blur apples-to-apples comparisons.”

The path forward

The STAR team is now studying new data from lower-energy collisions, focusing on the range where the signal appears strongest. The results could reveal whether the observed pattern marks the presence of a nuclear matter critical point or stems from more conventional effects.

Meanwhile, theorists are racing to catch up. “The ball now moves largely to theory’s court,” Vovchenko says. He emphasizes the need for “quantitative predictions across energies and cumulants of various order that are appropriate for apples-to-apples comparisons with these data.”

Future experiments, including RHIC’s fixed-target program and new facilities such as the FAIR accelerator in Germany, will extend the search even further. By probing lower energies and producing vastly larger datasets, they aim to map the transition between ordinary nuclear matter and quark–gluon plasma with unprecedented precision.

Whether or not the critical point is finally revealed, the new data are a milestone in the exploration of the strong force and the early universe. As Geurts puts it, these findings trace “landmark properties of the most fundamental phase diagram of nuclear matter,” bringing physicists one step closer to charting how everything – from protons to stars – first came to be.

The research is described in Physical Review Letters.

The post Hints of a boundary between phases of nuclear matter found at RHIC appeared first on Physics World.

A low vibration wire scanner fork for free electron lasers

7 octobre 2025 à 16:51
High performance, proven, wire scanner for transverse beam profile measurement for the latest generation of low emittance accelerators and FELs. (Courtesy: UHV Design)

A new high-performance wire scanner fork that the latest generation of free electron lasers (FELs) can use for measuring beam profiles has been developed by UK-based firm UHV Design. Produced using technology licensed from the Paul Scherrer Institute (PSI) in Switzerland, the device could be customized for different FELs and low emittance accelerators around the world. It builds on the company’s PLSM range, which allows heavy objects to be moved very smoothly and with minimal vibrations.

The project began 10 years ago when the PSI was starting to build the Swiss Free Electron Laser and equipping the facility, explains Jonty Eyres. The remit for UHV Design was to provide a stiff, very smooth, bellows-sealed, ultra-high-vacuum-compatible linear actuator that could move a wire fork without vibrating it adversely. The fork, designed by PSI, can hold wires in two directions and can therefore scan the intensity of the beam profile in both the X and Y planes using just one device, as opposed to the two or more needed with previous such structures.

“We decided to employ an industrial integrated ball screw and linear slide assembly with a very stiff frame around it, the construction of which provides the support and super smooth motion,” he says. “This type of structure is generally not used in the ultra-high vacuum industry.”

The position of the wire fork is determined through a radiation-hard, side-mounted linear optical encoder in conjunction with the PSI’s own motor and gearbox assembly. A power-off brake is also incorporated to avoid any issues with back driving under vacuum load if electrical power were to be lost to the PLSM. All electrical connections are terminated with UTO-style connectors to PSI specification.

Long-term reliability was important to avoid costly and unnecessary downtime, particularly between planned FEL maintenance shutdowns. The industrial ball-screw and slide assembly was therefore an ideal choice, used in conjunction with a bellows assembly rated for 500,000 cycles, with an option to increase this to 1 million cycles.

Eyres and his UHV Design team began by building a prototype that the PSI tested with a high-speed camera. Once validated, the UHV engineers then built a batch of 20 identical units to prove that the device could be replicated to the same constraints and tolerances.

The real challenge in constructing this device, says Eyres, was to minimize the amount of vibration on the wire, which, for PSI, is typically between 5 and 25 microns thick. This is only possible if the vibration of the wire during a scan is small compared to the cross-section of the wire – that is, about a micron for a 25-micron wire. “Otherwise, you are just measuring noise,” explains Eyres. “The small vibration we achieved can be corrected for in calculations, so providing an accurate value for the beam profile intensity.”
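
As a simple illustration of how such a scan yields a beam profile (not PSI’s actual analysis), the sketch below fits a Gaussian to a synthetic record of signal versus wire position to extract the beam centre and r.m.s. size; all numbers are made up.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative sketch, not PSI's analysis code: a wire scan records a signal
# (for example, scattered radiation) as a function of wire position; fitting a
# Gaussian to that record gives the transverse beam size.

def gaussian(x, amp, centre, sigma, offset):
    return amp * np.exp(-0.5 * ((x - centre) / sigma) ** 2) + offset

# Synthetic scan: 50-micron steps across a 200-micron r.m.s. beam, with noise
rng = np.random.default_rng(2)
position_um = np.arange(-1000, 1000, 50.0)
signal = gaussian(position_um, 1.0, 30.0, 200.0, 0.02)
signal += rng.normal(0.0, 0.01, size=position_um.size)

popt, _ = curve_fit(gaussian, position_um, signal, p0=[1.0, 0.0, 150.0, 0.0])
print(f"fitted beam centre = {popt[1]:.0f} um, r.m.s. size = {abs(popt[2]):.0f} um")
```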

UHV Design holds the intellectual property rights for the linear actuator and PSI the rights for the fork. Following the success of the project and a subsequent agreement between the two parties, it was recently decided that UHV Design would buy the licence to promote the wire fork, allowing the company to sell the device, or a version of it, to any institution or company operating a FEL or low-emittance accelerator. “The device is customizable and can be adapted to different types of fork, wires, motors or encoders,” says Eyres. “The heart of the design remains the same: a very stiff structure and its integrated ball screw and linear slide assembly. But it can be tailored to meet the requirements of different beam lines in terms of stroke size, specific wiring and the components employed.”

UHV Design’s linear actuator was installed on the Swiss FEL in 2016 and has been performing very well since, says Eyres.

A final and important point to note, he adds, is that UHV Design built an identical copy of its actuator when it took on the licence agreement, so that it could prove the same performance could still be reproduced. “We built an exact copy of the wire scanner, including the PSI fork assembly, and sent it to the PSI, who then used the very same high-speed camera rig that they’d employed in 2015 to directly compare the new actuator with the original ones supplied. They reported that the results were indeed comparable, meaning that if fitted to the Swiss FEL today, it would perform in the same way.”

For more information: https://www.uhvdesign.com/products/linear-actuators/wire-scanner/

The post A low vibration wire scanner fork for free electron lasers appeared first on Physics World.

Rapid calendar life screening of electrolytes for silicon anodes using voltage holds

7 octobre 2025 à 15:48


Silicon-based lithium-ion batteries exhibit severe time-based degradation, resulting in poor calendar lives. In this webinar, we will talk about how calendar aging is measured, why the traditional measurement approaches are time intensive, and why new approaches are needed to optimize materials for next-generation silicon-based systems. Using this new approach, we also screen multiple new electrolyte systems that can lead to calendar-life improvements in Si-containing batteries.

An interactive Q&A session follows the presentation.

Ankit Verma

Ankit Verma’s expertise is in physics-based and data-driven modeling of lithium-ion and next-generation lithium-metal batteries. His interests lie in unraveling the coupled reaction-transport-mechanics behavior in these electrochemical systems with experiment-driven validation to provide predictive insights for practical advancements. Predominantly, he’s working on improving the energy density and calendar life of silicon anodes as part of the Silicon Consortium Project, and on understanding solid-state battery limitations and upcycling of end-of-life electrodes as part of the ReCell Center.

Verma’s past works include optimization of lithium-ion battery anodes and cathodes for high-power and fast-charge applications and understanding electrodeposition stability in metal anodes.

 


The post Rapid calendar life screening of electrolytes for silicon anodes using voltage holds appeared first on Physics World.

Radioactive BEC could form a ‘superradiant neutrino laser’

4 octobre 2025 à 14:48

Radioactive atoms in a Bose–Einstein condensate (BEC) could form a “superradiant neutrino laser” in which the atomic nuclei undergo accelerated beta decay. The hypothetical laser has been proposed by two researchers in the US, who say that it could be built and tested. While such a neutrino laser has no obvious immediate applications, further developments could potentially assist in the search for background neutrinos from the Big Bang – an important goal of neutrino physicists.

Neutrinos – the ghostly particles produced in beta decay – are notoriously difficult to detect or manipulate because of the weakness of their interaction with matter. They cannot be used to produce a conventional laser because they would pass straight through mirrors unimpeded. More fundamentally, neutrinos are fermions rather than bosons such as photons. This prevents neutrinos from forming a two-level system with a population inversion, as only one neutrino can occupy each quantum state in a system.

However, another quantum phenomenon called superradiance can also increase the intensity and coherence of emitted radiation. This occurs when the emitters are sufficiently close together to become indistinguishable. The emission then comes not from any single entity but from the collective ensemble. As it does not require the emitted particles to be quantum degenerate, this is not theoretically forbidden for fermions. “There are devices that use superradiance to make light sources, and people call them superradiant lasers – although that’s actually a misnomer,” explains neutrino physicist Benjamin Jones of the University of Texas at Arlington and a visiting professor at the University of Manchester. “There’s no stimulated emission.”

In their new work, Jones and colleague Joseph Formaggio of the Massachusetts Institute of Technology propose that, in a BEC of radioactive atoms, superradiance could enhance the neutrino emission rate and therefore speed up beta decay, with an initial burst before the expected exponential decay commences. “That has not been seen for nuclear systems so far – only for electronic ones,” says Formaggio. Rubidium was used to produce the first ever condensate in 1995 by Carl Wieman and Eric Cornell of the University of Colorado Boulder, and conveniently, one of its isotopes decays by beta emission with a half-life of 86 days.

Radioactive vapour

The presence of additional hyperfine states would make direct laser cooling of rubidium-83 more challenging than for the rubidium-87 isotope used by Wieman and Cornell, but not significantly more so than the condensation of rubidium-85, which has also been achieved. Alternatively, the researchers propose that a dual condensate could be created in which rubidium-83 is cooled sympathetically with rubidium-87. The bigger challenge, says Jones, is the Bose–Einstein condensation of a radioactive atom, which has yet to be achieved: “It’s difficult to handle in a vacuum system,” he explains. “You have to be careful to make sure you don’t contaminate your laboratory with radioactive vapour.”

If such a condensate were produced, the researchers predict that superradiance would increase with the size of the BEC. In a BEC of 10⁶ atoms, for example, more than half the atoms would decay within three minutes. The researchers now hope to test this prediction. “This is one of those experiments that does not require a billion dollars to fund,” says Formaggio. “It is done in university laboratories. It’s a hard experiment but it’s not out of reach, and I’d love to see it done and be proven right or wrong.”
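
To put that prediction in context, a quick calculation using only the numbers quoted above shows how little of the sample would decay in three minutes at the normal 86-day half-life:

```python
# Quick check of the contrast claimed in the proposal: at rubidium-83's normal
# 86-day half-life, the fraction decaying in three minutes is tiny, whereas the
# superradiant prediction is that more than half of a 10^6-atom BEC decays.

half_life_s = 86 * 24 * 3600
t = 3 * 60
fraction_normal = 1 - 2 ** (-t / half_life_s)
print(f"fraction decayed in 3 min at the normal rate: {fraction_normal:.2e}")
print("superradiant prediction for a 10^6-atom BEC:   > 0.5")
```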

If the prediction were proved correct, the researchers suggest it could eventually lead towards a benchtop neutrino source. As the same physics applies to neutrino capture, this could theoretically assist the detection of neutrinos that decoupled from the hot plasma of the universe just seconds after the Big Bang – hundreds of thousands of years before photons in the cosmic microwave background. The researchers emphasize, however, that this would not currently be feasible.

Sound proposal

Neutrino physicist Patrick Huber of Virginia Tech is impressed by the work. “I think for a first, theoretical study of the problem this is very good,” he says. “The quantum mechanics seems to be sound, so the question is if you try to build an experiment what kind of real-world obstacles are you going to encounter?” He predicts that, if the experiment works, other researchers would quite likely find hitherto unforeseen applications.

Atomic, molecular and optical physicist James Thompson of University of Colorado Boulder is sceptical, however. He says several important aspects are either glossed over or simply ignored. Most notably, he calculates that the de Broglie wavelength of the neutrinos would be below the Bohr radius – which would prevent a BEC from feasibly satisfying the superradiance criterion that the atoms be indistinguishable.

“I think it’s a really cool, creative idea to think about,” he concludes, “but I think there are things we’ve learned in atomic physics that haven’t really crept into [the neutrino physics] community yet. We learned them the hard way by building experiments, having them not work and then figuring out what it takes to make them work.”

The proposal is described in Physical Review Letters.

The post Radioactive BEC could form a ‘superradiant neutrino laser’ appeared first on Physics World.

US scientific societies blast Trump administration’s plan to politicize grants

2 octobre 2025 à 17:30

Almost 60 US scientific societies have signed a letter calling on the US government to “safeguard the integrity” of the peer-review process when distributing grants. The move is in response to an executive order issued by the Trump administration in August that places accountability for reviewing and awarding new government grants in the hands of agency heads.

The executive order – Improving Oversight of Federal Grantmaking – calls on each agency head to “designate a senior appointee” to review new funding announcements and to “review discretionary grants to ensure that they are consistent with agency priorities and the national interest.”

The order outlines several previous grants that it says have not aligned with the Trump administration’s current policies, claiming that in 2024 more than a quarter of new National Science Foundation (NSF) grants went to diversity, equity and inclusion efforts and what it calls “other far-left initiatives”.

“These NSF grants included those to educators that promoted Marxism, class warfare propaganda, and other anti-American ideologies in the classroom, masked as rigorous and thoughtful investigation,” the order states. “There is a strong need to strengthen oversight and coordination of, and to streamline, agency grantmaking to address these problems, prevent them from recurring, and ensure greater accountability for use of public funds more broadly.”

Increasing burdens

In response, the 58 societies – including the American Physical Society, the American Astronomical Society, the Biophysical Society, the American Geophysical Union and SPIE – have written to the majority and minority leaders of the US Senate and House of Representatives to voice their concerns that the order “raises the possibility of politicization” in federally funded research.

“Our nation’s federal grantmaking ecosystem serves as the gold standard for supporting cutting-edge research and driving technological innovation worldwide,” the letter states. “Without the oversight traditionally applied by appropriators and committees of jurisdiction, this [order] will significantly increase administrative burdens on both researchers and agencies, slowing, and sometimes stopping altogether, vital scientific research that our country needs.”

The letter says more review and oversight is required by the US Congress before the order should go into effect, adding that the scientific community “is eager” to work with Congress and the Trump administration “to strengthen our scientific enterprise”.

The post US scientific societies blast Trump administration’s plan to politicize grants appeared first on Physics World.

Kirigami-inspired parachute falls on target

1 octobre 2025 à 17:00
A Kirigami-inspired parachute
On target A Kirigami-inspired parachute deploying to slow down the delivery of a water bottle from a drone. (Courtesy: Frédérick Gosselin)

Inspired by the Japanese art of kirigami, researchers in Canada and France have designed a parachute that can safely and accurately deliver its payload when dropped directly above its target. In tests under realistic outdoor conditions, the parachute’s deformable design stabilized the airflow around its porous structure, removing the need for it to drift as it fell. With its simple and affordable design, the parachute could have especially promising uses in areas including drone delivery and humanitarian aid.

When a conventional parachute is deployed, it cannot simply fall vertically towards its target. To protect itself from turbulence, which can cause its canopy to collapse, it glides at an angle that breaks the symmetry of the airflow around it, stabilizing the parachute against small perturbations.

But this necessity comes at a cost. When dropping a payload from a drone or aircraft, this gliding angle means parachutes will often drift far from their intended targets. This can be especially frustrating and potentially dangerous for operations such as humanitarian aid delivery, where precisely targeted airdrops are often vital to success.

To address this challenge, researchers led by David Mélançon at Polytechnique Montréal looked to kirigami, whereby paper is cut and folded to create elaborate 3D designs. “Previously, kirigami has been used to morph flat sheets into 3D shapes with programmed curvatures,” Mélançon explains. “We proposed to leverage kirigami’s shape morphing capability under fluid flow to design new kinds of ballistic parachutes.”

Wind-dispersed seeds

As well as kirigami, the team drew inspiration from nature. Instead of relying on a gliding angle, many wind-dispersed seeds are equipped with structures that stabilize the airflow around them: including the feathery bristles of dandelion seeds, which create a stabilized vortex in their wake; and the wings of sycamore and maple seeds, which cause them to rapidly spin as they fall. In each case, these mechanisms provide plants with passive control over where their seeds land and germinate.

For their design, Mélançon’s team created a parachute that can deform into a shape pre-programmed by a pattern of kirigami cuts, etched into a flexible disc using a laser cutter. “Our parachutes are simple flat discs, with circumferential slits inspired by a kirigami motif called a closed loop,” Mélançon describes. “Instead of attaching the payload with strings at the outer edge of the disc, we directly mount it at its centre.”

When dropped, a combination of air resistance and the weight of the free-falling payload deformed the parachute into an inverted, porous bell shape. “The slits in the kirigami pattern are stretched, forcing air through its multitude of small openings,” Mélançon continues. “This ensures that the air flows in an orderly manner without any major chaotic turbulence, resulting in a predictable trajectory.”

The researchers tested their parachute extensively using numerical simulations combined with wind tunnel experiments and outdoor tests, where they used the parachute to drop a water bottle from a hovering drone. In this case, the parachute delivered its payload safely to the ground from a height of 60 m directly above its target.

Easy to make

Mélançon’s team tested their design with a variety of parachute sizes and kirigami patterns, demonstrating that designs with lower load-to-area ratios and more deformable patterns can reach comparable terminal velocity to conventional parachutes – with far greater certainty over where they will land. Compared with conventional parachutes, which are often both complex and costly to manufacture, kirigami-based designs will be far easier to fabricate.
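
The role of the load-to-area ratio can be seen from the standard drag-balance estimate of terminal velocity, sketched below with an assumed drag coefficient of 1; the masses and disc sizes are illustrative choices, not the team’s measured values.

```python
import math

# Standard drag-balance estimate (not the team's measured values): at terminal
# velocity the drag rho*Cd*A*v^2/2 balances the weight m*g, so
# v = sqrt(2*g*(m/A)/(rho*Cd)); for a given drag coefficient only the
# load-to-area ratio m/A matters. Cd = 1.0 is an assumed round number.

def terminal_velocity(mass_kg, area_m2, cd=1.0, rho_air=1.2, g=9.81):
    return math.sqrt(2 * g * (mass_kg / area_m2) / (rho_air * cd))

# Example: a 0.5 kg water bottle under discs of different diameters
for diameter in (0.3, 0.5, 0.8):
    area = math.pi * (diameter / 2) ** 2
    print(f"d = {diameter:.1f} m: terminal velocity ~ {terminal_velocity(0.5, area):.1f} m/s")
```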

“Little hand labour is necessary,” Mélançon says. “We have made parachutes out of sheets of plastic, paper or cardboard. We need a sheet of material with a certain rigidity, that’s all.”

By building on their design, the researchers hope that future studies will pave the way for new improvements in package home delivery. It could even advance efforts to deliver urgently needed aid during conflicts and natural disasters to those who need it most.

The parachute is described in Nature.

The post Kirigami-inspired parachute falls on target appeared first on Physics World.

Destroyers of the world: the physicists who built nuclear weapons

1 octobre 2025 à 12:00

The title of particle physicist Frank Close’s engaging new book, Destroyer of Worlds, refers to Robert Oppenheimer’s famous comment after he witnessed the first detonation of an atomic bomb, known as the Trinity test, in July 1945. Quoting the Hindu scripture Bhagavad Gita, he said “Now I am become death, the destroyer of worlds.” But although Close devotes much space to the Manhattan Project, which Oppenheimer directed between 1942 and 1945, his book has a much wider remit.

Aimed at non-physicist readers with a strong interest in science, though undoubtedly appealing to physicists too, the book seeks to explain the highly complex physics and chemistry that led to the atomic bomb – a term first coined by H G Wells in his 1914 science-fiction novel The World Set Free. It also describes the contributions of numerous gifted scientists to the development of those weapons.

Close draws mainly on numerous published sources from this deeply analysed period, including Richard Rhodes’s seminal 1988 study The Making of the Atomic Bomb. He starts with Wilhelm Röntgen’s discovery of X-rays in 1895, before turning to the discovery of radioactivity by Henri Becquerel in 1896 – described by Close as “the first pointer to nuclear energy [that was] so insignificant that it was almost missed”. Next, he highlights the work on radium by Marie and Pierre Curie in 1898.

After discussing the emergence of nuclear physics, Close goes on to talk about the Allies’ development of the nuclear bomb. A key figure in this history was Enrico Fermi, who abandoned Fascist Italy in 1938 and emigrated to the US, where he worked on the Manhattan Project and built the first nuclear reactor, in Chicago, in 1942.

Fermi showed his legendary ability to estimate a physical phenomenon’s magnitude by shredding a sheet of paper into small pieces and throwing them into the air

Within seconds of seeing Trinity’s blast in the desert in 1945, Fermi showed his legendary ability to estimate a physical phenomenon’s magnitude by shredding a sheet of paper into small pieces and throwing them into the air. The bomb’s shock wave blew this “confetti” (Close’s word) a few metres away. After measuring the exact distance, Fermi immediately estimated that the blast was equivalent to about 10,000 tonnes of TNT. This figure was not far off the 18,000 tonnes determined a week later following a detailed analysis by the project team.

The day after the Trinity test, a group of 70 scientists, led by Leo Szilard, sent a petition to US President Harry Truman, requesting him not to use the bomb against Japan. Albert Einstein agreed with the petition but did not sign it, having been excluded from the Manhattan Project on security grounds (though in 1939 he famously backed the bomb’s development, fearing that Nazi Germany might build its own device). Despite the protests, atomic bombs were dropped on Hiroshima and Nagasaki less than a month later – a decision that Close neither defends nor condemns.

Other key figures in the Manhattan Project were émigrés to the UK, who had fled Germany in the 1930s because of Nazi persecution of Jews and later joined the secret British Tube Alloys bomb project. The best known are probably the nuclear physicists Otto Frisch and Rudolf Peierls, who initially worked together at the University of Birmingham for Tube Alloys before joining the Manhattan Project. They both receive their due from Close.

Oddly, however, he neglects to mention their fellow émigré Franz (Francis) Simon by name, despite acknowledging the importance of his work in demonstrating a technique to separate fissionable uranium-235 from the far more abundant uranium-238. In 1940 Simon, then working at the Clarendon Laboratory in wartime Oxford, showed that separation could be achieved by gaseous diffusion of uranium hexafluoride through a porous barrier, a barrier he first improvised by hammering his wife’s kitchen sieve flat.

The Manhattan Project set an example for the future of science as a highly collaborative, increasingly international albeit sometimes dangerous adventure

As Close ably documents and explains, numerous individuals and groups eventually ensured the success of the Manhattan Project. There is an argument that, in addition to helping end the Second World War and preserve freedom against Fascism, it also set an example for the future of science as a highly collaborative, increasingly international albeit sometimes dangerous adventure.

Close finishes the book with a shorter discussion of the two decades of Cold War rivalry between scientists from the US and the Soviet Union to develop and test the hydrogen bomb. It features physicists such as Edward Teller and Andrei Sakharov, who led the efforts to build the American “Super Bomb” and the Soviet “Tsar Bomba”, respectively.

The book ends in around 1965, after the 1963 partial test-ban treaty signed by the US, the Soviet Union and the UK, which banned nuclear tests in the atmosphere, under water and in space for fear of their devastating effects on Earth’s atmosphere. As Close writes, the Tsar Bomba was more powerful than the meteorite impact 65 million years ago that wreaked global change and killed the dinosaurs, which had ruled for 150 million years.

“Within just one per cent of that time, humans have produced nuclear arsenals capable of replicating such levels of destruction,” Close warns. “The explosion of a gigaton weapon would signal the end of history. Its mushroom cloud ascending towards outer space would be humanity’s final vision.”

  • 2025 Allen Lane £25.00 hb 321pp

The post Destroyers of the world: the physicists who built nuclear weapons appeared first on Physics World.

NASA criticized over its management of $3.3bn Dragonfly mission to Titan

30 septembre 2025 à 16:40

An internal audit has slammed NASA over its handling of the Dragonfly mission to Saturn’s largest moon, Titan. The drone-like rotorcraft, which is designed to land on and gather samples from Titan, has been hit by a two-year delay, with costs surging by $1bn to $3.3bn. NASA now envisions a launch date of July 2028 with Dragonfly arriving at Titan in 2034.

NASA chose Dragonfly in June 2019 as the next mission under its New Frontiers programme. Managed by the Johns Hopkins University Applied Physics Laboratory, it is a nuclear-powered, car-sized craft with eight rotors. Dragonfly will spend over three years studying potential landing sites before collecting data on Titan’s unique liquid environment and looking for signs that it could support life.

The audit, carried out by NASA’s Inspector General, took no issue with NASA’s tests of the rotors’ performance, which were carried out via simulations. Indeed, the mission team is already planning formal testing of the system to start in January. But the audit criticized NASA for letting Dragonfly’s development “proceed under less than ideal circumstances”, including “lower than optimum project cost reserves”.

The report now aims to prevent those problems from affecting future New Frontiers missions. Specifically, it calls on Nicky Fox, NASA’s associate administrator for its science mission directorate, to document lessons learned from NASA’s decision to start work on the project before establishing a baseline commitment.

It also says that NASA should maintain adequate levels of “unallocated future expenses” for the project and make sure that “the science community is informed of updates to the expected scope and cadence for future New Frontier missions”. A NASA spokesperson told Physics World that NASA management agrees with the recommendations in the report, adding that the agency “will use existing resources to address [them]”.

The post NASA criticized over its management of $3.3bn Dragonfly mission to Titan appeared first on Physics World.

How the slowest experiment in the world became a fast success

30 septembre 2025 à 12:00

Nothing is really known about the origin of the world-famous “pitch-drop experiment” at the School of Physics, Trinity College Dublin. Discovered in the 1980s during a clear-out of dusty cupboards, this curious glass funnel contains a black, tar-like substance. All we do know is that it was prepared in October 1944 (assuming you trust the writing on it). We don’t know who filled the funnel, with what exactly, or why.

Placed on a shelf at Trinity, the funnel was largely ignored by generations of students passing by. But anyone who looked closely would have seen a drop forming slowly at the bottom of the funnel, preparing to join older drops that had fallen roughly once a decade. Then, in 2013 this ultimate example of “slow science” went viral when a webcam recorded a video of a tear-drop blob of pitch falling into the beaker below.

The video attracted more than two million hits on YouTube (a huge figure back then) and the story was covered on the main Irish evening TV news. We also had a visit from German news magazine Der Spiegel, while Discover named it as one of the top 100 science stories of 2013. As one of us (SH) described in a 2014 Physics World feature, the iconic experiment became “the drop heard round the world”.

Pitching the idea

Inspired by that interest, we decided to create custom-made replicas of the experiment to send to secondary schools across Ireland as an outreach initiative. It formed part of our celebrations of 300 years of physics at Trinity, which dates back to 1724 when the college established the Erasmus Smith’s Professorship in Natural and Experimental Philosophy.

An outreach activity in which nothing happens for 10 years is obviously never going to work. Technical staff at Trinity’s School of Physics, who initiated the project, therefore experimented for months with different tar samples. Their goal was a material that appears solid but will lead to a falling drop every few months – not every decade.

After hitting upon a special mix of two types of bitumen in just the right proportion, the staff also built a robust experimental set-up consisting of a stand, a funnel and a flask to hold any fallen drops. Each set-up was placed on a wooden base and enclosed in a glass bell jar, and came with a thermometer and a ruler for data-taking, along with a set of instructions.

On 27 November 2024 we held a Zoom call with all participating schools, culminating in the official call to remove the funnel stopper

Over 100 schools – scattered all over Ireland – applied for one of the set-ups, with a total of 37 selected to take part. Most kits were personally hand-delivered to schools, which were also given a video explaining how to unpack and assemble the set-ups. On 27 November 2024 we held a Zoom call with all participating schools, culminating in the official call to remove the funnel stopper. The race was on.

Joining the race

Each school was asked to record the temperature and length of the thread of pitch slowly emerging from the funnel. They were also given a guide to making a time-lapse video of the drop and provided with information about additional experiments to explore the viscosity of other materials.

To process incoming data, we set up a website, maintained by yet another one of our technical staff. It contained interactive graphs showing the increase in drop length for every school, together with the temperature when each measurement was taken. All data were shared between schools.
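To give a flavour of what that processing might look like, here is a minimal sketch that charts one school’s log of drop length against date, coloured by the recorded temperature. The file name and column layout are illustrative assumptions; this is not the code behind the project website.

```python
# Minimal sketch: chart one school's pitch-drop log.
# Assumes a CSV with columns date (YYYY-MM-DD), length_mm and temperature_C.
import csv
from datetime import datetime

import matplotlib.pyplot as plt

dates, lengths, temps = [], [], []
with open("pitch_drop_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        dates.append(datetime.strptime(row["date"], "%Y-%m-%d"))
        lengths.append(float(row["length_mm"]))
        temps.append(float(row["temperature_C"]))

fig, ax = plt.subplots()
points = ax.scatter(dates, lengths, c=temps, cmap="coolwarm")
fig.colorbar(points, ax=ax, label="Temperature (°C)")
ax.set_xlabel("Date")
ax.set_ylabel("Drop length (mm)")
ax.set_title("Growth of the pitch drop")
fig.autofmt_xdate()
plt.show()
```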

After about four months, four schools had recorded a pitch drop and we decided to take stock at a half-day event at Trinity in March 2025. The event was attended by more than 80 pupils aged 12–18, along with teachers from 17 schools, and we were amazed by how much excitement our initiative had created. It spawned huge levels of engagement, with lots of colourful posters.

By the end of the school year, most schools had recorded a drop, showing our tar mix had worked well. Some schools had also done experiments testing other viscous materials, such as syrup, honey, ketchup and oil, examining the effect of temperature on flow rate. Others had studied the flow of granular materials, such as salt and seeds. One school had even captured on video the moment their drop fell, although sadly nobody was around to see it in person.

Some schools displayed the kits in their school entrance, others in their trophy cabinet. One group of students appeared on their local radio station; another streamed the set-up live on YouTube. The pitch-drop experiment has been a great way for students to learn basic scientific skills, such as observation, data-taking, data analysis and communication.

As for teachers, the experiment is an innovative way for them to introduce concepts such as viscosity and surface tension. It lets them explore the notion of multiple variables, measurement uncertainty and long-time-scale experiments. Some are now planning future projects on statistical analysis using the publicly available dataset or by observing the pitch drop in a more controlled environment.

Wouldn’t it be great if other physics departments followed our lead?

The post How the slowest experiment in the world became a fast success appeared first on Physics World.

Cosmic muons monitor river sediments surrounding Shanghai tunnel

25 septembre 2025 à 17:00
Photograph of the portable muon detector in the Shanghai tunnel
Trundling along A portable version of the team’s muon detector was used along the length of the tunnel. (Courtesy: Kim Siang Khaw et al/Journal of Applied Physics/CC BY 4.0)

Researchers in China say that they are the first to use cosmic-ray muography to monitor the region surrounding a tunnel. Described as a lightweight, robust and affordable scintillator setup, the technology was developed by Kim Siang Khaw at Shanghai Jiao Tong University and colleagues. They hope that their approach could provide a reliable and non-invasive method for the real-time monitoring of subterranean infrastructure.

Monitoring the structural health of tunnels and other underground infrastructure is challenging because of the lack of access. Inspection often relies on techniques such as borehole drilling, sonar scanning, and multibeam echo sounders to determine when maintenance is needed. These methods can be invasive, low resolution and involve costly and disruptive shutdowns. As a result there is often a trade-off between the quality of inspections and the frequency at which they are done.

This applies to the Shanghai Outer Ring Tunnel: a major travel artery in China’s largest city, which runs for almost 3 km beneath the Huangpu River. Completed in 2023, the submerged section of the tunnel is immersed in water-saturated sediment, creating a unique set of challenges for structural inspection.

Time-varying stresses

In particular, different layers of sediment surrounding the tunnel can vary widely in their density, permeability, and cohesion. As they build up above the tunnel, they can impart uneven, time-varying stresses, making it incredibly challenging for existing techniques to accurately assess when maintenance is needed.

To address these challenges, a multi-disciplinary team was formed to explore possible solutions. “During these talks, the [Shanghai Municipal Bureau of Planning and Natural Resources] emphasized the practical challenges of monitoring sediment build-up around critical infrastructure, such as the Shanghai Outer Ring Tunnel, without causing disruptive and costly shutdowns,” Khaw describes.

Among the most promising solutions they discussed was muography, which involves detecting the muons created when high-energy cosmic rays interact with Earth’s upper atmosphere. These muons can penetrate deep beneath Earth’s surface and are absorbed at highly predictable rates depending on the density of the material they pass through.

A simple version of muography involves placing a muon detector on the surface of an object and another detector beneath the object. By comparing the muon fluxes in the two detectors, the density of the object can be determined. By measuring the flux attenuation along different paths through the object, an image of the interior density of the object can be obtained.
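To make the flux-comparison idea concrete, here is a toy calculation that turns hypothetical counts from the two detectors into a mean density, assuming a simple exponential (Beer–Lambert-style) attenuation with an assumed effective attenuation length. Real muography analyses use energy-dependent muon range tables, so this is a sketch of the principle rather than the researchers’ method.

```python
import math

def density_from_flux(flux_above, flux_below, path_length_m,
                      attenuation_length_kg_m2=2.0e3):
    """Toy two-detector muography estimate of mean density (kg/m^3).

    Assumes the transmitted muon flux falls off exponentially with areal
    density, with an assumed effective attenuation length. This is a
    deliberate simplification of the real, energy-dependent physics.
    """
    transmission = flux_below / flux_above
    areal_density = -attenuation_length_kg_m2 * math.log(transmission)  # kg/m^2
    return areal_density / path_length_m

# Hypothetical counts: 1000 muons/hour above the object, 400/hour below it,
# separated by 5 m of material.
print(f"Estimated mean density: {density_from_flux(1000, 400, 5.0):.0f} kg/m^3")
```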

Muography has been used for several decades in areas as diverse as archaeology, volcanology and monitoring riverbanks. So far, however, its potential for monitoring underground infrastructure has gone largely untapped.

“We took this ‘old-school’ technique and pioneered its use in a completely new scenario: dynamically monitoring low-density, watery sediment build-up above a submerged, operational tunnel,” Khaw explains. “Our approach was not just in the hardware, but in integrating the detector data with a simplified tunnel model and validating it against environmental factors like river tides.”

With its durable, lightweight, and affordable design, the scintillator features a dual-layer configuration that suppresses background noise while capturing cosmic muons over a broad range of angles. Crucially, it is portable and could be discreetly positioned inside an underground tunnel to carry out real-time measurements, even as traffic flows.

Sediment profiles

To test the design, Khaw’s team took measurements along the full length of the Shanghai Outer Ring Tunnel while it was undergoing maintenance, allowing them to map out a profile of the sediment surrounding the tunnel. They then compared their muon flux measurements with model predictions based on sediment profiles for the Huangpu River measured in previous years. They were pleased to obtain results that were better than anticipated.

“We didn’t know the actual tidal height until we completed the measurement and checked tidal gauge data,” Khaw describes. “The most surprising and exciting discovery was a clear anti-correlation between muon flux and the tidal height of the Huangpu River.” Unexpectedly, the detector was also highly effective at measuring the real-time height of water above the tunnel, with its detected flux closely following the ebb and flow of the tides.
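A minimal way to quantify such an anti-correlation is to compute a Pearson correlation coefficient between the muon counts and the tidal height, as in the sketch below; the numbers are synthetic placeholders, not the team’s data.

```python
import numpy as np

# Synthetic placeholder data: hourly muon counts and tidal heights (m).
# More water overhead attenuates the flux, so the coefficient comes out
# strongly negative for these made-up values.
muon_counts = np.array([520, 505, 480, 460, 475, 500, 525, 540])
tide_height = np.array([1.2, 1.6, 2.1, 2.5, 2.2, 1.7, 1.1, 0.8])

r = np.corrcoef(muon_counts, tide_height)[0, 1]
print(f"Pearson correlation coefficient: {r:.2f}")
```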

Reassuringly, the team’s measurements showed that there are no as-yet unmapped obstructions or gaps in the sediment above the tunnel, thereby confirming the structure’s safety.

“Additionally, we have effectively shown a dual-purpose technology: it offers a reliable, non-invasive method for sediment monitoring and also reveals a new technique for tidal monitoring,” says Khaw. “This opens the possibility of using muon detectors as multi-functional sensors for comprehensive urban infrastructure and environmental oversight.”

The research is described in the Journal of Applied Physics.

The post Cosmic muons monitor river sediments surrounding Shanghai tunnel appeared first on Physics World.

Gyroscopic backpack improves balance for people with movement disorder

25 septembre 2025 à 10:00

A robotic backpack equipped with gyroscopes can enhance stability for people with severe balance issues and may eventually remove the need for mobility walkers. Designed to dampen unintended torso motion and improve balance, the backpack employs similar gyroscopic technology to that used by satellites and space stations to maintain orientation. Individuals with the movement disorder ataxia put the latest iteration of the device – the GyroPack – through its paces in a series of standing, walking and body motion exercises.

In development for over a decade, GyroPack is the brainchild of a team of neurologists, biomechanical engineers and rehabilitation specialists at the Radboud University Medical Centre, Delft University of Technology (TU Delft) and Erasmus Medical Centre. The first tests of its ability to improve balance in adults with ataxia, described in npj Robotics, produced results encouraging enough to continue the GyroPack’s development as a portable robotic wearable for individuals with neurological conditions.

Degenerative ataxias, a group of diseases of the nervous system, cause progressive cerebellar dysfunction that manifests as symptoms including lack of coordination, imbalance when standing and difficulty walking. Ataxia can afflict people of all ages, including young children. Managing the progressive symptoms may require lifetime use of cumbersome, heavily weighted walkers, both as mobility aids and to prevent falls.

GyroPack design

The 6 kg version of the GyroPack tested in this study contains two control moment gyroscopes (CMGs) – attitude-control devices that maintain orientation relative to a specific inertial frame of reference. Each CMG consists of a flywheel and a gimbal, which together generate the change in angular momentum that is exerted onto the wearer to resist unintended torso rotations. Each CMG also contains an inertial measurement unit to determine its orientation and angular rate of change.

The backpack also holds two independent, 1.5 kg miniaturized actuators, designed by the team, that convert energy into motion. The system is controlled by a laptop and powered through a separate power box that filters and electrically isolates the signals for safety. All activities can be immediately terminated by pushing an emergency stop button.

Lead researcher Jorik Nonnekes of Radboud UMC describes how the system works: “The change of orientation imposed by the gimbal motor, combined with the angular momentum of the flywheels, causes a free moment, or torque, that is exerted onto the system the CMG is attached to – which in this study is the human upper body,” he explains. “A cascaded control scheme reliably deals with actuator limitations without causing undesired disturbances on the user. The gimbals are controlled in such a way that the torque exerted on the trunk is proportional and opposite to the trunk’s angular velocity, which effectively lets the system damp rotational motion of the wearer. This damping has been shown to make balancing easier for unimpaired subjects and individuals post-stroke.”
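As a rough sketch of the damping law Nonnekes describes (a torque commanded proportional and opposite to the trunk’s angular velocity, realised by steering the gimbal), consider the toy controller below. The gain, flywheel angular momentum and saturation limit are hypothetical placeholders; the real GyroPack uses a more sophisticated cascaded control scheme.

```python
def damping_controller(trunk_angular_velocity, damping_coeff=15.0,
                       flywheel_momentum=2.5, max_gimbal_rate=3.0):
    """Sketch of a CMG damping law: command a torque proportional and
    opposite to the trunk's angular velocity, then convert it to a gimbal
    rate via the flywheel's angular momentum (torque ~ h * gimbal rate).
    All parameter values are illustrative, not the GyroPack's.

    trunk_angular_velocity : rad/s, measured by the IMU
    damping_coeff          : N m s/rad, virtual damping gain
    flywheel_momentum      : N m s, angular momentum of the spinning flywheel
    max_gimbal_rate        : rad/s, crude stand-in for actuator limits
    """
    desired_torque = -damping_coeff * trunk_angular_velocity
    gimbal_rate = desired_torque / flywheel_momentum
    # Saturate to respect actuator limits (handled far more gracefully by a
    # cascaded control scheme in the real system).
    return max(-max_gimbal_rate, min(max_gimbal_rate, gimbal_rate))

# Example: the wearer's trunk rotates at 0.4 rad/s; the controller commands
# a gimbal rate corresponding to an opposing torque of about 6 N m.
print(f"Commanded gimbal rate: {damping_controller(0.4):.2f} rad/s")
```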

Performance assessment

Study participant wearing the GyroPack
Exercise study A participant wearing the GyroPack. (Courtesy: npj Robot. 10.1038/s44182-025-00041-4)

For the study, 14 recruits diagnosed with degenerative ataxia performed five tasks: standing still with feet together and arms crossed for up to 30 s; walking on a treadmill for 2 min without using the handrail; making a clockwise and a counterclockwise 360° turn-in-place; performing a tandem stance with the heel of one foot touching the toes of the other for up to 30 s; and testing reactive balance by applying two forward and two backward treadmill perturbations.

The participants performed these tasks under three conditions, two whilst wearing the backpack and one without as a baseline. In one scenario, the backpack was operated in assistive mode to investigate its damping power and torque profiles. In the other, the backpack was in “sham mode”, without assistive control but with sound and motor vibrations indistinguishable from normal operation.

The researchers report that when fully operational, the GyroPack increased the user’s average standing time compared with not wearing the backpack at all. When used during walking, it reduced the variability of trunk angular velocity and of the extrapolated centre-of-mass, two common indicators of gait stability. The trunk angular velocity variability also showed a significant reduction when comparing assistive with sham GyroPack modes. However, performance in the turn-in-place and perturbation recovery tasks was similar across all three scenarios.
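For readers unfamiliar with the measure, one widely used definition of the extrapolated centre of mass (following Hof and colleagues’ inverted-pendulum model of balance) adds the centre-of-mass velocity, scaled by the pendulum eigenfrequency, to its position. The sketch below uses illustrative numbers, not the study’s data.

```python
import math

def extrapolated_com(com_position, com_velocity, pendulum_length=0.9, g=9.81):
    """Extrapolated centre of mass (XCoM), a common gait-stability measure:
    the CoM position plus its velocity scaled by the inverted-pendulum
    eigenfrequency omega0 = sqrt(g / l). Parameter values are illustrative.
    """
    omega0 = math.sqrt(g / pendulum_length)
    return com_position + com_velocity / omega0

# Hypothetical sample: CoM 2 cm ahead of the stance foot, moving at 0.1 m/s.
print(f"XCoM: {extrapolated_com(0.02, 0.1):.3f} m")
```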

Interestingly, wearing the backpack in the sham scenario improved performance in the walking tasks compared with not wearing a backpack at all. The researchers attributed this either to the extra weight on the torso improving body stabilization or to a placebo effect.

Next, the team plans to redesign the device to make it lighter and quieter. “It’s not yet suitable for everyday use,” says Nonnekes in a press statement. “But in the future, it could help people with ataxia participate more freely in daily life, like attending social events without needing a walker, which many find bulky and inconvenient. This could greatly enhance their mobility and overall quality of life.”

The post Gyroscopic backpack improves balance for people with movement disorder appeared first on Physics World.
