
The physics of George R R Martin’s Wild Card virus revealed

It’s not every day that a well-known author writes a physics paper. But George R R Martin, who is best known for his Song of Ice and Fire series of fantasy novels, has co-authored a paper in the American Journal of Physics with the title “Ergodic Lagrangian dynamics in a superhero universe”.

Working with Los Alamos National Laboratory theoretical physicist Ian Tregillis, who is also a science-fiction author of several books, Martin has helped derive a mathematical model of the so-called Wild Card virus.

The Wild Cards universe is a series of novels created by a consortium of writers including Martin and Tregillis.

Set largely during an alternate history of the US following the Second World War, the series follows events after an extraterrestrial virus, known as the Wild Card virus, has spread worldwide. It mutates human DNA causing profound changes in human physiology and society at large.

The virus follows a fixed statistical distribution of outcomes: 90% of those infected die, 9% become physically mutated (referred to as “jokers”) and 1% gain superhuman abilities (known as “aces”). Such abilities range from flight to moving between dimensions. The stories in the series then follow the individuals affected by the virus.

Tregillis and Martin have now derived a formula for the viral behaviour of the Wild Card virus. “Like any physicist, I started with back-of-the-envelope estimates, but then I went off the deep end,” notes Tregillis. “Being a theoretician, I couldn’t help but wonder if a simple underlying model might tidy up the canon.”

The model takes into consideration the severity of the changes (for the 10% that don’t instantly die) and the mix of joker/ace traits. After all, those infected can also become crypto-jokers or crypto-aces – undetected cases where individuals have subtle changes or powers – as well as joker-aces, in which a human develops both mutations and superhuman abilities.

The result is a dynamical system in which a carrier’s state vector constantly evolves through the model space — until their “card” turns. At that point the state vector becomes fixed and its permanent location determines the fate of the carrier. “The time-averaged behavior of this system generates the statistical distribution of outcomes,” adds Tregillis.
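To get a feel for the idea, here is a toy illustration of ergodic sampling (written for this summary; it is not the Lagrangian model in the paper). A carrier’s state wanders over the unit interval, whose regions have measure 0.90, 0.09 and 0.01; when the card turns, the state freezes and the region it occupies fixes the outcome.

```python
import random

# Toy ergodic "wild card" sketch: the carrier's state performs a random walk
# on the unit interval (with wraparound), which it explores uniformly. When
# the card turns, the state freezes and the region it lands in decides the
# outcome: 90% of the interval is fatal, 9% gives jokers and 1% gives aces.
def simulate_carrier(rng, max_steps=200, step=0.07):
    x = rng.random()
    for _ in range(rng.randrange(1, max_steps)):   # evolve until the card turns
        x = (x + rng.uniform(-step, step)) % 1.0
    if x < 0.90:
        return "dead"
    elif x < 0.99:
        return "joker"
    return "ace"

rng = random.Random(1)
counts = {"dead": 0, "joker": 0, "ace": 0}
for _ in range(20_000):
    counts[simulate_carrier(rng)] += 1
print(counts)   # close to the canonical 90% / 9% / 1% split
```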

The purpose of the paper, and the model, is also to provide an exercise in demonstrating how “whimsical” scenarios can be used to explore concepts in physics and mathematics.

“The fictional virus is really just an excuse to justify the world of Wild Cards, the characters who inhabit it, and the plot lines that spin out from their actions,” says Tregillis.

Fast radio burst came from a neutron star’s magnetosphere, say astronomers

The exact origins of cosmic phenomena known as fast radio bursts (FRBs) are not fully understood, but scientists at the Massachusetts Institute of Technology (MIT) in the US have identified a fresh clue: at least one of these puzzling cosmic discharges got its start very close to the object that emitted it. This result, which is based on measurements of a fast radio burst called FRB 20221022A, puts to rest a long-standing debate about whether FRBs can escape their emitters’ immediate surroundings. The conclusion: they can.

“Competing theories argued that FRBs might instead be generated much farther away in shock waves that propagate far from the central emitting object,” explains astronomer Kenzie Nimmo of MIT’s Kavli Institute for Astrophysics and Space Research. “Our findings show that, at least for this FRB, the emission can escape the intense plasma near a compact object and still be detected on Earth.”

As their name implies, FRBs are brief, intense bursts of radio waves. The first was detected in 2007, and since then astronomers have spotted thousands of others, including some within our own galaxy. They are believed to originate from cataclysmic processes involving compact celestial objects such as neutron stars, and they typically last a few milliseconds. However, astronomers have recently found evidence for bursts a thousand times shorter, further complicating the question of where they come from.

Nimmo and colleagues say they have now conclusively demonstrated that FRB 20221022A, which was detected by the Canadian Hydrogen Intensity Mapping Experiment (CHIME) in 2022, comes from a region only 10 000 km in size. This, they claim, means it must have originated in the highly magnetized region that surrounds a compact object such as a neutron star: the magnetosphere.
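For a sense of scale (a standard pulsar-physics yardstick rather than a figure from the study), a rotating neutron star’s magnetosphere extends out to its light cylinder,

$$R_{\mathrm{LC}} = \frac{cP}{2\pi} \approx 4.8\times10^{4}\ \mathrm{km}\,\left(\frac{P}{1\ \mathrm{s}}\right),$$

so for spin periods of order a second the magnetosphere spans tens of thousands of kilometres – easily large enough to contain a 10 000 km emission region.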

“Fairly intuitive” concept

The researchers obtained their result by measuring the FRB’s scintillation, which Nimmo explains is conceptually similar to the twinkling of stars in the night sky. The reason stars twinkle is that because they are so far away, they appear to us as point sources. This means that their apparent brightness is more affected by the Earth’s atmosphere than is the case for planets and other objects that are closer to us and appear larger.

“We applied this same principle to FRBs using plasma in their host galaxy as the ‘scintillation screen’, analogous to Earth’s atmosphere,” Nimmo tells Physics World. “If the plasma causing the scintillation is close to the FRB source, we can use this to infer the apparent size of the FRB emission region.”

According to Nimmo, different models of FRB origins predict very different sizes for this region. “Emissions originating within the magnetized environments of compact objects (for example, magnetospheres) would produce a much smaller apparent size compared to emission generated in distant shocks propagating far from the central object,” she explains. “By constraining the emission region size through scintillation, we can determine which physical model is more likely to explain the observed FRB.”

Challenge to existing models

The idea for the new study, Nimmo says, stemmed from a conversation with another astronomer, Pawan Kumar of the University of Texas at Austin, early last year. “He shared a theoretical result showing how scintillation could be used as a ‘probe’ to constrain the size of the FRB emission region, and, by extension, the FRB emission mechanism,” Nimmo says. “This sparked our interest and we began exploring the FRBs discovered by CHIME to search for observational evidence for this phenomenon.”

The researchers say that their study, which is detailed in Nature, shows that at least some FRBs originate from magnetospheric processes near compact objects such as neutron stars. This finding is a challenge for models of conditions in these extreme environments, they say, because if FRB signals can escape the dense plasma expected to exist near such objects, the plasma may be less opaque than previously assumed. Alternatively, unknown factors may be influencing FRB propagation through these regions.

A diagnostic tool

One advantage of studying FRB 20221022A is that it is relatively conventional in terms of its brightness and the duration of its signal (around 2 milliseconds). It does have one special property, however, as discovered by Nimmo’s colleagues at McGill University in Canada: its light is highly polarized. What is more, the pattern of its polarization implies that its emitter must be rotating in a way that is reminiscent of pulsars, which are highly magnetized, rotating neutron stars. This result is reported in a separate paper in Nature.

In Nimmo’s view, the MIT team’s study of this (mostly) conventional FRB establishes scintillation as a “powerful diagnostic tool” for probing FRB emission mechanisms. “By applying this method to a larger sample of FRBs, which we now plan to investigate, future studies could refine our understanding of their underlying physical processes and the diverse environments they occupy.”

Explore the quantum frontier: all about the International Year of Quantum Science and Technology 2025

In June 1925, a relatively unknown physics postdoc by the name of Werner Heisenberg developed the basic mathematical framework that would be the basis for the first quantum revolution. Heisenberg, who would later win the Nobel Prize for Physics, famously came up with quantum mechanics on a two-week vacation on the tiny island of Helgoland off the coast of Germany, where he had gone to cure a bad bout of hay fever.

Now, a century later, we are on the cusp of a second quantum revolution, with quantum science and technologies growing rapidly across the globe. According to the State of Quantum 2024 report, a total of 33 countries around the world currently have government initiatives in quantum technology, of which more than 20 have national strategies with large-scale funding. The report estimates that up to $50bn in public cash has already been committed.

It’s a fitting tribute, then, that the United Nations (UN) has chosen 2025 to be the International Year of Quantum Science and Technology (IYQ). They hope that the year will raise global awareness of the impact that quantum physics and its applications have already had on our world. The UN also aims to highlight to the global public the myriad potential future applications of quantum technologies and how they could help tackle universal issues – from climate and clean energy to health and infrastructure – while also addressing the UN’s sustainable development goals.

The Institute of Physics (IOP), which publishes Physics World, is one of the IYQ’s six “founding partners” alongside the German and American physical societies, SPIE, Optica and the Chinese Optical Society. “The UNESCO International Year of Quantum is a wonderful opportunity to spread the word about quantum research and technology and the transformational opportunities it is opening up,” says Tom Grinyer, chief executive of the IOP. “The Institute of Physics is co-ordinating the UK and Irish elements of the year, which mark the 100th anniversary of the first formulation of quantum mechanics, and we are keen to celebrate the milestone, making sure that as many people as possible get the opportunity to find out more about this fascinating area of science and technology,” he adds.

“IYQ provides the opportunity for societies and organizations around the world to come together in marking both the 100-year history of the field, as well as the longer-term real-world impact that quantum science is certain to have for decades to come,” says Tim Smith, head of portfolio development at IOP Publishing. “Quantum science and technology represents one of the most exciting and rapidly developing areas of science today, encompassing the global physical-sciences community in a way that connects scientific wonder with fundamental research, technological innovation, industry, and funding programmes worldwide.”

Taking shape

The official opening ceremony for IYQ takes place on 4–5 February at the UNESCO headquarters in Paris, France, although several countries, including Germany and India, held their own launches in advance of the main event. Working together, the IOP and IOP Publishing have planned a wide array of quantum resources, talks, conferences, festivals and public events as part of the UK’s celebrations for IYQ.

In late February, meanwhile, the Royal Society – the world’s oldest continuously active learned society – will host a two-day quantum conference. Dubbed “Quantum Information”, it will bring together scientists, industry leaders and public-sector stakeholders to discuss the current challenges involved in quantum computing, networks and sensing systems.

In Scotland, the annual Edinburgh Science Festival, which takes place in April, will include a special “quantum explorers” exhibit and workshop by the UK’s newly launched National Quantum Computing Centre. Elsewhere, the Quantum Software Lab at the School of Informatics at the University of Edinburgh is hosting a month-long “Quantum Fringe 2025” event across Scotland. It will include a quantum machine-learning school on the Isle of Skye as well as the annual UK Quantum Hackathon, which brings together teams of aspiring coders with industry mentors to tackle practical challenges and develop solutions using quantum computing.

In June, the Institution of Engineering and Technology is hosting a Quantum Engineering and Technologies conference, as part of its newly launched Quantum technologies and 6G and Future Networks events. The event’s themes include everything from information processing and memories to photon sources and cryptography.

Further IYQ-themed events will take place at QuAMP, the IOP’s biennial international conference on quantum, atomic and molecular physics in September. Activities culminate in a three-part celebration in November, with a quantum community event led by the IOP’s History of Physics and Quantum Business and Innovation Growth (qBIG) special interest groups, a schools event at the Royal Institution, and a public celebration with a keynote speech from University of Surrey quantum physicist and broadcaster Jim Al-Khalili. “The UK and Ireland already have a globally important position in many areas of quantum research, with the UK, for instance, having established one of the world’s first National Quantum Technology Programmes,” explains Grinyer. “We will also be using the focus this year gives us to continue to make the case for the investment in research and development, and support for physics skills, which will be crucial if we are to fully unlock the economic and social potential of what is both a fascinating area of research, and a fast growing physics-powered business sector,” he adds.

Quantum careers

With the quantum marketplace booming, it’s no surprise that employers are on the hunt for skilled physicists to join the workforce – indeed, there is a significant shortage of skilled quantum professionals for the many roles across industry and academia. And with quantum research advancing everything from software and machine learning to materials science and drug discovery, your skills will be transferable across the board.

If you plan to join the quantum workforce, then choosing the right PhD programme, having the right skills for a specific role and managing risk and reward in the emerging quantum industry are all crucial. A number of careers events on the IYQ calendar offer the chance to learn more about the career prospects for physicists in the sector. In April, for example, the University of Bristol’s Quantum Engineering Centre for Doctoral Training is hosting a Careers in Quantum event, while the Economist magazine is hosting its annual Commercialising Quantum conference in May.

There will also be a special quantum careers panel discussion, including top speakers from the UK and the US, as part of our newly launched Physics World Live panel discussions in April. This year’s Physics World Careers 2025 guide has a special quantum focus, and there’ll also be a bumper, quantum-themed issue of the Physics World Briefing in June. The Physics World quantum channel will be regularly updated throughout the year so you don’t miss a thing.

Read all about it

IOP Publishing’s journals will include specially curated content, starting with a series of Perspectives articles – personal viewpoints from leading quantum scientists – in Quantum Science and Technology. The journal will also be publishing roadmaps in quantum computing, sensing and communication, as well as focus issues on topics such as quantum machine learning and technologies for quantum gravity and thermodynamics in quantum coherent platforms.

“Going right to the core of IOP Publishing’s own historic coverage we’re excited to be celebrating the IYQ through a year-long programme of articles in Physics World and across our journals, that will hopefully show a wide audience just why everyone should care about quantum science and the people behind it,” says Smith.

Of course, we at Physics World have a Schrödinger’s box full of fascinating quantum articles for the coming year – from historical features to the latest cutting-edge developments in quantum tech. So keep your eyes peeled.

Helgoland: leading physicists to gather on the tiny island where quantum mechanics was born

By James Dacey

In this episode of Physics World Stories, we celebrate the 100th anniversary of Werner Heisenberg’s trip to the North Sea island of Helgoland, where he developed the first formulation of quantum theory. Listen to the podcast as we delve into the latest advances in quantum science and technology with three researchers who will be attending a 6-day workshop on Helgoland in June 2025.

Featuring in the episode are: Nathalie De Leon of Princeton University, Ana Maria Rey from the University of Colorado Boulder, and Jack Harris from Yale University, a member of the programme committee. These experts share their insights on the current state of quantum science and technology: discussing the latest developments in quantum sensing, quantum information and quantum computing.

They also reflect on the significance of attending a conference at a location that is so deeply ingrained in the story of quantum mechanics. Talks at the event will span the science and the history of quantum theory, as well as the nature of scientific revolutions.

This episode is part of Physics World’s quantum coverage throughout 2025, designated by the UN as the International Year of Quantum Science and Technology (IYQ). Check out this article for all you need to know about IYQ.

Terahertz light produces a metastable magnetic state in an antiferromagnet

Physicists in the US, Europe and Korea have produced a long-lasting light-driven magnetic state in an antiferromagnetic material for the first time. While their project started out as a fundamental study, they say the work could have applications for faster and more compact memory and processing devices.

Antiferromagnetic materials are promising candidates for future high-density memory devices. This is because in antiferromagnets, the spins used as the bits or data units flip quickly, at frequencies in the terahertz range. Such rapid spin flips are possible because, by definition, the spins in antiferromagnets align antiparallel to each other, leading to strong interactions among the spins. This is different from ferromagnets, which have parallel electron spins and are used in today’s memory devices such as computer hard drives.

Another advantage is that antiferromagnets display almost no macroscopic magnetization. This means that bits can be packed more densely onto a chip than is the case for the ferromagnets employed in conventional magnetic memory, which do have a net magnetization.

A further attraction is that the values of bits in antiferromagnetic memory devices are generally unaffected by the presence of stray magnetic fields. However, Nuh Gedik of the Massachusetts Institute of Technology (MIT), who led the latest research effort, notes that this robustness can be a double-edged sword: the fact that antiferromagnet spins are insensitive to weak magnetic fields also makes them difficult to control.

Antiferromagnetic state lasts for more than 2.5 milliseconds

In the new work, Gedik and colleagues studied FePS3, which becomes an antiferromagnet below a critical temperature of around 118 K. By applying intense pulses of terahertz-frequency light to this material, they were able to control this transition, placing the material in a metastable magnetic state that lasts for more than 2.5 milliseconds even after the light source is switched off. While such light-induced transitions have been observed before, Gedik notes that they typically only last for picoseconds.
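To put those timescales in context, a quick back-of-the-envelope calculation helps (the 3 THz figure below is an assumed order of magnitude for the driven lattice resonance, not a value reported by the team):

```python
# Rough numbers for a terahertz lattice resonance. The 3 THz frequency is an
# assumed order of magnitude, not a value reported for FePS3.
h = 6.62607015e-34        # Planck constant, J s
eV = 1.602176634e-19      # joules per electronvolt

f = 3e12                  # assumed phonon resonance frequency, Hz
print(f"photon energy ~ {h * f / eV * 1e3:.1f} meV")   # ~12 meV
print(f"oscillation period ~ {1e12 / f:.2f} ps")       # ~0.33 ps
```

Measured against a lattice period of a fraction of a picosecond, a 2.5 millisecond lifetime corresponds to billions of oscillation cycles, which is what makes the metastable state so striking.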

The technique works because the terahertz source stimulates the atoms in the FePS3 at the same frequency at which the atoms collectively vibrate (the resonance frequency). When this happens, Gedik explains that the atomic lattice undergoes a unique form of stretching. This stretching cannot be achieved with external mechanical forces, and it pushes the spins of the atoms out of their magnetically alternating alignment.

The result is a state in which spins pointing in one direction outweigh those pointing in the other, transforming the originally antiferromagnetic material into one with a net magnetization. This metastable state becomes increasingly robust as the temperature of the material approaches the antiferromagnetic transition point. That, Gedik says, is a sign that critical fluctuations near the phase transition are a key factor in enhancing both the magnitude and lifetime of the new magnetic state.

A new experimental setup

The team, which includes researchers from the Max Planck Institute for the Structure and Dynamics of Matter in Germany, the University of the Basque Country in Spain, Seoul National University and the Flatiron Institute in New York, wasn’t originally aiming to produce long-lived magnetic states. Instead, its members were investigating nonlinear interactions among low-energy collective modes, such as phonons (vibrations of the atomic lattice) and spin excitations called magnons, in layered magnetic materials like FePS3. It was for this purpose that they developed a new experimental setup capable of generating strong terahertz pulses with a wide spectral bandwidth.

“Since nonlinear interactions are generally weak, we chose a family of materials known for their strong coupling between magnetic spins and phonons,” Gedik says. “We also suspected that, under such intense resonant excitation in these particular materials, something intriguing might occur – and indeed, we discovered a new magnetic state with an exceptionally long lifetime.”

While the researchers’ focus remains on fundamental questions, they say the new findings may enable a “significant step” toward practical applications for ultrafast science. “The antiferromagnetic nature of the material holds great potential for potentially enabling faster and more compact memory and processing devices,” says Gedik’s MIT colleague Batyr Ilyas. He adds that the observed long lifetime of the induced state means that it can be explored further using conventional experimental probes used in spintronic technologies.

The team’s next step will be to study the nonlinear interactions between phonons and magnons more closely using two-dimensional spectroscopy experiments. “Second, we plan to demonstrate the feasibility of probing this metastable state through electrical transport experiments,” Ilyas tells Physics World. “Finally, we aim to investigate the generalizability of this phenomenon in other materials, particularly those exhibiting enhanced fluctuations near room temperature.”

The work is detailed in Nature.

Why electrochemistry lies at the heart of modern technology

This episode of the Physics World Weekly podcast features a conversation with Colm O’Dwyer, who is professor of chemical energy at University College Cork in Ireland and president of the Electrochemical Society.

He talks about the role that electrochemistry plays in the development of modern technologies including batteries, semiconductor chips and pharmaceuticals. O’Dwyer chats about the role that the Electrochemical Society plays in advancing the theory and practice of electrochemistry and solid-state science and technology. He also explains how electrochemists collaborate with scientists and engineers in other fields including physics – and he looks forward to the future of electrochemistry.

Courtesy: American Elements

This podcast is supported by American Elements. Trusted by researchers and industries the world over, American Elements is helping shape the future of battery and electrochemistry technology.

China’s Experimental Advanced Superconducting Tokamak smashes fusion confinement record

A fusion tokamak in China has smashed its own record for maintaining a steady-state plasma. This week, scientists working on the Experimental Advanced Superconducting Tokamak (EAST) announced that they had produced a steady-state high-confinement plasma for 1066 seconds, breaking the machine’s previous record of 403 seconds, set in 2023.

EAST is an experimental superconducting tokamak fusion device located in Hefei, China. Operated by the Institute of Plasma Physics (ASIPP) at the Hefei Institutes of Physical Science, it began operations in 2006. It is the first tokamak to contain a deuterium plasma using superconducting niobium-titanium toroidal and poloidal magnets.

EAST has recently undergone several upgrades, notably with new plasma diagnostic tools and a doubling in the power of the plasma heating system. EAST is also acting as a testbed for the ITER fusion reactor that is currently being built in Cadarache, France.

The EAST tokamak is able to maintain a plasma in the so-called “H‐mode”. This is the high-confinement regime that modern tokamaks, including ITER, employ. It occurs when the plasma undergoes intense heating by a neutral beam and results in a sudden improvement of plasma confinement by a factor of two.

In 2017 scientists at EAST broke the 100-second barrier for a steady-state H-mode plasma, and in 2023 they achieved 403 seconds, a world record at the time. On Monday, EAST officials announced that they had almost tripled that time, delivering H-mode operation for 1066 seconds.

ASIPP director Song Yuntao notes that the new record is “monumental” and represents a “critical step” toward realizing a functional fusion reactor. “A fusion device must achieve stable operation at high efficiency for thousands of seconds to enable the self-sustaining circulation of plasma,” he says, “which is essential for the continuous power generation of future fusion plants”.

New candidate emerges for a universal quantum electrical standard

Physicists in Germany have developed a new way of defining the standard unit of electrical resistance. The advantage of the new technique is that because it is based on the quantum anomalous Hall effect rather than the ordinary quantum Hall effect, it does not require the use of applied magnetic fields. While the method in its current form requires ultracold temperatures, an improved version could allow quantum-based voltage and resistance standards to be integrated into a single, universal quantum electrical reference.

Since 2019, all base units in the International System of Units (SI) have been defined with reference to fundamental constants of nature. For example, the definition of the kilogram, which was previously based on a physical artefact (the international prototype kilogram), is now tied to Planck’s constant, h.

These new definitions do come with certain challenges. For example, today’s gold-standard way to experimentally determine the value of h (as well as the elementary charge e, another of the SI’s defining constants) is to measure a quantized electrical resistance (the von Klitzing constant RK = h/e²) and a quantized voltage (the Josephson constant KJ = 2e/h). With RK and KJ pinned down, scientists can then calculate e and h.
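Spelling out the algebra behind that last step, the two measured constants invert directly to give the fundamental ones:

$$R_K = \frac{h}{e^{2}}, \qquad K_J = \frac{2e}{h} \quad\Longrightarrow\quad e = \frac{2}{R_K K_J}, \qquad h = \frac{4}{R_K K_J^{2}}$$

Numerically, RK ≈ 25 812.807 Ω and KJ ≈ 483 597.8 GHz V⁻¹.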

To measure RK with high precision, physicists use the fact that it is related to the quantized values of the Hall resistance of a two-dimensional electron system (such as the ones that form in semiconductor heterostructures) in the presence of a strong magnetic field. This quantized change in resistance is known as the quantum Hall effect (QHE), and in semiconductors like GaAs or AlGaAs, it shows up at fields of around 10 Tesla. In graphene, a two-dimensional carbon sheet, fields of about 5 T are typically required.

The problem with this method is that KJ is measured by means of a separate phenomenon known as the AC Josephson effect, and the large external magnetic fields that are so essential to the QHE measurement render Josephson devices inoperable. According to Charles Gould of the Institute for Topological Insulators at the University of Würzburg (JMU), who led the latest research effort, this makes it difficult to integrate a QHE-based resistance standard with the voltage standard.

A way to measure RK at zero external magnetic field

Relying on the quantum anomalous Hall effect (QAHE) instead would solve this problem. This variant of the QHE arises from electron transport phenomena recently identified in a family of materials known as ferromagnetic topological insulators. Such quantum anomalous Hall insulators, as they are also known, conduct electricity along their (quantized) edge channels or surfaces, but act as insulators in their bulk. In these materials, spontaneous magnetization means the QAHE manifests as a quantization of resistance even at weak (or indeed zero) magnetic fields.

In the new work, Gould and colleagues made Hall resistance quantization measurements in the QAHE regime on a device made from V-doped (Bi,Sb)2Te3. These measurements showed that the relative deviation of the Hall resistance from RK at zero external magnetic field is just (4.4 ± 8.7) nΩ Ω⁻¹. The method thus makes it possible to determine RK at zero magnetic field with the needed precision — something Gould says was not previously possible.

The snag is that the measurement only works under demanding experimental conditions: extremely low temperatures (below about 0.05 K) and low electrical currents (below 0.1 µA). “Ultimately, both these parameters will need to be significantly improved for any large-scale use,” Gould explains. “To compare, the QHE works at temperatures of 4.2 K and electrical currents of about 10 µA; making its detection much easier and cheaper to operate.”

Towards a universal electrical reference instrument

The new study, which is detailed in Nature Electronics, was made possible thanks to a collaboration between two teams, he adds. The first is at Würzburg, which has pioneered studies on electron transport in topological materials for some two decades. The second is at the Physikalisch-Technische Bundesanstalt (PTB) in Braunschweig, which has been establishing QHE-based resistance standards for even longer. “Once the two teams became aware of each other’s work, the potential of a combined effort was obvious,” Gould says.

Because the project brings together two communities with very different working methods and procedures, they first had to find a window of operations where their work could co-exist. “As a simple example,” explains Gould, “the currents of ~100 nA used in the present study are considered extremely low for metrology, and extreme care was required to allow the measurement instrument to perform under such conditions. At the same time, this current is some 200 times larger than that typically used when studying topological properties of materials.”

As well as simplifying access to the constants h and e, Gould says the new work could lead to a universal electrical reference instrument based on the QAHE and the Josephson effect. Beyond that, it could even provide a quantum standard of voltage, resistance, and (by means of Ohm’s law) current, all in one compact experiment.

The possible applications of the QAHE in metrology have attracted a lot of attention from the European Union, he adds. “The result is a Europe-wide EURAMET metrology consortium QuAHMET aimed specifically at further exploiting the effect and operation of the new standard at more relaxed experimental conditions.”

Nanocrystals measure tiny forces on tiny length scales

Two independent teams in the US have demonstrated the potential of using the optical properties of nanocrystals to create remote sensors that measure tiny forces on tiny length scales. One team is based at Stanford University and used nanocrystals to measure the micronewton-scale forces exerted by a worm as it chewed bacteria. The other team is based at several institutes and used the photon avalanche effect in nanocrystals to measure sub-nanonewton to micronewton forces. The latter technique could potentially be used to study forces involved in processes such as stem cell differentiation.

Remote sensing of forces at small scales is challenging, especially inside living organisms. Optical tweezers cannot make remote measurements inside the body, while fluorophores – molecules that absorb and re-emit light – can measure forces in organisms, but have limited range, problematic stability or, in the case of quantum dots, toxicity. Nanocrystals with optical properties that change when subjected to external forces offer a way forward.

At Stanford, materials scientist Jennifer Dionne led a team that used nanocrystals doped with ytterbium and erbium. When two ytterbium atoms absorb near-infrared photons, they can then transfer energy to a nearby erbium atom. In this excited state, the erbium can either decay directly to its lowest energy state by emitting red light, or become excited to an even higher-energy state that decays by emitting green light. These processes are called upconversion.

Colour change

The ratio of green to red emission depends on the separation between the ytterbium and erbium atoms, and on the separation between the erbium atoms themselves, explains Dionne’s PhD student Jason Casar, lead author of a paper describing the Stanford research. Forces on the nanocrystal can change these separations and therefore alter the ratio.

The researchers encased their nanocrystals in polystyrene vessels approximately the size of an E. coli bacterium. They then mixed the encased nanoparticles with E. coli bacteria and fed the mixture to tiny nematode worms. To extract the nutrients, the worm’s pharynx needs to break open the bacterial cell wall. “The biological question we set out to answer is how much force is the bacterium generating to achieve that breakage?” explains Stanford’s Miriam Goodman.

The researchers shone near-infrared light on the worms, allowing them to monitor the flow of the nanocrystals. By measuring the colour of the emitted light when the particles reached the pharynx, they determined the force it exerted with micronewton-scale precision.

Meanwhile, a collaboration of scientists at Columbia University, Lawrence Berkeley National Laboratory and elsewhere has shown that a process called photon avalanche can be used to measure even smaller forces on nanocrystals. The team’s avalanching nanoparticles (ANPs) are sodium yttrium fluoride nanocrystals doped with thulium – and were discovered by the team in 2021.

The fun starts here

The sensing process uses a laser tuned off-resonance from any transition from the ground state of the ANP. “We’re bathing our particles in 1064 nm light,” explains James Schuck of Columbia University, whose group led the research. “If the intensity is low, that all just blows by. But if, for some reason, you do eventually get some absorption – maybe a non-resonant absorption in which you give up a few phonons…then the fun starts. Our laser is resonant with an excited state transition, so you can absorb another photon.”

This creates a doubly excited state that can decay radiatively directly to the ground state, producing an upconverted photon. Alternatively, the energy can be transferred to a nearby thulium atom, which becomes resonant with the excited state transition and can excite more thulium atoms into resonance with the laser. “That’s the avalanche,” says Schuck. “We find on average you get 30 or 40 of these events – it’s analogous to a chain reaction in nuclear fission.”

Now, Schuck and colleagues have shown that the exact number of photons produced in each avalanche decreases when the nanoparticle experiences compressive force. One reason is that the phonon frequencies are raised as the lattice is compressed, making non-radiative decay energetically more favourable.

The thulium-doped nanoparticles decay by emitting either red or near infrared photons. As the force increases, the red dims more quickly, causing a change in the colour of the emitted light. These effects allowed the researchers to measure forces from the sub-nanonewton to the micronewton range – at which point the light output from the nanoparticles became too low to detect.

Not just for forces

Schuck and colleagues are now seeking practical applications of their discovery, and not just for measuring forces.

“We’re discovering that this avalanching process is sensitive to a lot of things,” says Schuck. “If we put these particles in a cell and we’re trying to measure a cellular force gradient, but the cell also happened to change its temperature, that would also affect the brightness of our particles, and we would like to be able to differentiate between those things. We think we know how to do that.”

If the technique could be made to work in a living cell, it could be used to measure tiny forces such as those involved in the extra-cellular matrix that dictate stem cell differentiation.

Andries Meijerink of Utrecht University in the Netherlands believes both teams have done important work that is impressive in different ways: Schuck and colleagues for unveiling a fundamentally new force-sensing technique, and Dionne’s team for demonstrating a remarkable practical application.

However, Meijerink is sceptical that photon avalanching will be useful for sensing in the short term. “It’s a very intricate process,” he says, adding, “There’s a really tricky balance between this first absorption step, which has to be slow and weak, and this resonant absorption”. Nevertheless, he says that researchers are discovering other systems that can avalanche. “I’m convinced that many more systems will be found,” he says.

Both studies are described in Nature.

IOP president Keith Burnett outlines a ‘pivotal’ year ahead for UK physics

Last year was the year of elections and 2025 is going to be the year of decisions.

After many countries, including the UK, Ireland and the US, went to the polls in 2024, the start of 2025 will see governments at the beginning of new terms, forced to respond swiftly to mounting economic, social, security, environmental and technological challenges.

These issues would be difficult to address at any given time, but today they come amid a turbulent geopolitical context. Governments are often judged against short milestones – the first 100 days or a first budget – but urgency should not come at the cost of thinking long-term, because the decisions over the next few months will shape outcomes for years, perhaps decades, to come. This is no less true for science than it is for health and social care, education or international relations.

In the UK, the first half of the year will be dominated by the government’s spending review. Due in late spring, it could be one of the toughest political tests for UK science, as the implications of the tight spending plans announced in the October budget become clear. Decisions about departmental spending will have important implications for physics funding, from research to infrastructure, facilities and teaching.

One of the UK government’s commitments is to establish 10-year funding cycles for key R&D activities – a policy that could be a positive improvement. Physics discoveries often take time to realise in full, but their transformational nature is indisputable. From fibre-optic communications to magnetic resonance imaging, physics has been indispensable to many of the world’s most impactful and successful innovations.

Emerging technologies, enabled by physicists’ breakthroughs in fields such as materials science and quantum physics, promise to transform the way we live and work, and create new business opportunities and open up new markets. A clear, comprehensive and long-term vision for R&D would instil confidence among researchers and innovators, and long-term and sustainable R&D funding would enable people and disruptive ideas to flourish and drive tomorrow’s breakthroughs.

Alongside the spending review, we are also expecting the publication of the government’s industrial strategy. The focus of the green paper published last year was an indication of how the strategy will place significance on science and technology in positioning the UK for economic growth.

Physics-based industries are a foundation stone for the UK economy and are highly productive, as highlighted by research commissioned by the Institute of Physics, which publishes Physics World. Across the UK, the physics sector generates £229bn gross value added, or 11% of total UK gross domestic product. It creates a collective turnover of £643bn, or £1380bn when indirect and induced turnover is included.

Labour productivity in physics-based businesses is also strong at £84 300 per worker, per year. So, if physics is not at the heart of this effort, then the government’s mission of economic revival is in danger of failing to get off the launch pad.

A pivotal year

Another of the new government’s policy priorities is the strategic defence review, which is expected to be published later this year. It could have huge implications for physics given its core role in many of the technologies that contribute to the UK’s defence capabilities. The changing geopolitical landscape, and potential for strained relations between global powers, may well bring research security to the front of the national mind.

Intellectual property, and scientific innovation, are some of the UK’s greatest strengths and it is right to secure them. But physics discoveries in particular can be hampered by overzealous security measures. So much of the important work in our discipline comes from years of collaboration between researchers across the globe. Decisions about research security need to protect, not hamper, the future of UK physics research.

This year could also be pivotal for UK universities, as securing their financial stability and future will be one of the major challenges. Last year, the pressures faced by higher education institutions became apparent, with announcements of course closures, redundancies and restructures as a way of saving money. The rise in tuition fees has far from solved the problem, so we need to be prepared for more turbulence coming for the higher education sector.

These things matter enormously. We have heard that universities are facing a tough situation, and it’s getting harder for physics departments to exist. But if we don’t recognise the need to fund more physicists, we will miss so many of the opportunities that lie ahead.

As we celebrate the International Year of Quantum Science and Technology that marks the centenary of the initial development of quantum mechanics by Werner Heisenberg, 2025 is a reminder of how the benefits of physics span over decades.

We need to enhance all the vital and exciting developments that are happening in physics departments. The country wants and needs a stronger scientific workforce – just think about all those individuals who studied physics and now work in industries that are defending the country – and that workforce will be strongly dependent on physics skills. So our priority is to make sure that physics departments keep doing world-leading research and preparing the next generation of physicists that they do so well.

Why telling bigger stories is the only way to counter misinformation

If aliens came to Earth and tried to work out how we earthlings make sense of our world, they’d surely conclude that we take information and slot it into pre-existing stories – some true, some false, some bizarre. Ominously, these aliens would be correct. You don’t need to ask earthling philosophers, just look around.

Many politicians and influencers, for instance, are convinced that scientific evidence does not reveal the truth about autism or AIDS, the state of the atmosphere or the legitimacy of elections, or even about aliens. Truth comes to light only when you “know the full story”, which will eventually reveal the scientific data to be deceptive or irrelevant.

To see how this works in practice, suppose you hear someone say that a nearby lab is leaking x picocuries of a radioactive substance, potentially exposing you to y millirems of dose. How do you know if you’re in danger? Well, you’ll instinctively start going through a mental checklist of questions.

Who’s speaking – scientist, politician, reporter or activist? If it’s a scientist, are they from the government, a university, or an environmental or anti-nuclear group? You might then wonder: how trustworthy are the agencies that regulate the substance? Is the lab a good neighbour, or did it cover up past incidents? How much of the substance is truly harmful?

Your answers to all these questions will shape the story you tell yourself. You might conclude: “The lab is a responsible organization and will protect me”. Or perhaps you’ll think: “The lab is a thorn in the side of the community and is probably doing weapons-related work. The leak’s a sign of something far worse.”

Perhaps your story will be: “Those environmentalists are just trying to scare us and the data indicate the leak is harmless”. Or maybe it’ll be: “I knew it! The lab’s sold out, the data are terrifying, and the activists are revealing the real truth”. Such stories determine the meaning of the picocuries and millirems for humans, not the other way around.

Acquiring data

Humans gain a sense of what’s happening in several ways. Three of them, to use philosophical language, are deferential, civic and melodramatic epistemology.

In “deferential epistemology”, citizens habitually take the word of experts and institutions about things like the dangers of picocuries and exposures of millirems. In his 1624 book New Atlantis, the philosopher Francis Bacon famously crafted a fictional portrait of an island society where deferential epistemology rules and people instinctively trust the scientific infrastructure.

We may think this is how people ought to behave. But Bacon, who was also a politician, understood that deference to experts is not automatic and requires constantly curating the public face of the scientific infrastructure. Earthlings haven’t seen deferential epistemology in a while.

“Civic epistemology”, meanwhile, is how people acquire knowledge in the absence of that curation. Such people don’t necessarily reject experts but hear their voices alongside many others claiming to know best how to pursue our interests and values. Civic epistemology is when we negotiate daily life not by first consulting scientists but by pursuing our concerns with a mix of habit, trust, experience and friendly advice.

We sometimes don’t, in fact, take scientific advice when it collides with how we already behave; we may smoke or drink, for instance, despite warnings not to. Or we might seek guidance from non-scientists about things like the harms of radiation.

Finally, what I call “melodramatic epistemology” draws on the word “melodrama”, a genre of theatre involving extreme plots, obvious villains, emotional appeal, sensational language, and moral outrage (the 1939 film Gone with the Wind comes to mind).

Melodramas were once considered culturally insignificant, but scholars such as Peter Brooks from Yale University have shown that a melodramatic lens can be a powerful and irresistible way for humans to digest difficult and emotionally charged events. The clarity, certainty and passion provided by a melodramatic read on a situation tends to displace the complexities, uncertainties and dispassion of scientific evaluation and evidence.

One example from physics occurred at the Lawrence Berkeley Laboratory in the late 1990s when activists fought, successfully, for the closing of its National Tritium Labeling Facility (NTLF). As I have written before, the NTLF had successfully developed techniques for medical studies while releasing tritium emissions well below federal and state environmental standards.

Activists, however, used melodramatic epistemology to paint the NTLF’s scientists as villains spreading breast cancer throughout the area, and denounced them as making “a terrorist attack on the citizens of Berkeley”. One activist called the scientists “piano players in a nuclear whorehouse.”

The critical point

The aliens studying us would worry most about melodramatic epistemology. Though dangerous, it is nearly impervious to being altered, for any contrary data, studies and expert judgment are considered to spring from the villain’s allies and therefore to incite rather than allay fear.

Two US psychologists – William Brady from Northwestern University and Molly Crockett from Princeton University – recently published a study of how and why misinformation spreads (Science 386 991). By analyzing data from Facebook and Twitter and by conducting real experiments with participants, they found that sources of misinformation evoke more outrage than trustworthy sources. Worse still, the outrage encourages us to share the misinformation even if we haven’t fully read the original source.

This makes it hard to counter misinformation. As the authors tactfully conclude: “Outrage-evoking misinformation may be difficult to mitigate with interventions that assume users want to share accurate information”.

In my view, the best, and perhaps only, way to challenge melodramatic stories is to write bigger, more encompassing stories that reveal that a different plot is unfolding. Such a story about the NTLF, for instance, would comprise story lines about the benefits of medical techniques, the testing of byproducts, the origin of regulations of toxins, the perils of our natural environment, the nature of fear and its manipulation, and so forth. In such a big story, those who promote melodramatic epistemology show up as an obvious, and dangerous, subplot.

If the aliens see us telling such bigger stories, they might not give up earthlings for lost.

SMART spherical tokamak produces its first plasma

A novel fusion device based at the University of Seville in Spain has achieved its first plasma. The SMall Aspect Ratio Tokamak (SMART) is a spherical tokamak that can operate with a “negative triangularity” – the first compact spherical tokamak to do so. Work performed on the machine could be useful when designing compact fusion power plants based on spherical tokamak technology.

SMART has been constructed by the University of Seville’s Plasma Science and Fusion Technology Laboratory. With a vessel dimension of 1.6 × 1.6 m, SMART has a 30 cm diameter solenoid wrapped around 12 toroidal field coils while eight poloidal field coils are used to shape the plasma.

Triangularity refers to the shape of the plasma relative to the tokamak. The cross section of the plasma in a tokamak is typically shaped like a “D”. When the straight part of the D faces the centre of the tokamak, it is said to have positive triangularity. When the curved part of the plasma faces the centre, however, the plasma has negative triangularity.
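Quantitatively, triangularity is usually defined (a standard tokamak convention, not spelled out in the article) as the normalized inboard shift of the top of the plasma boundary,

$$\delta = \frac{R_{\mathrm{geo}} - R_{\mathrm{top}}}{a},$$

where Rgeo is the major radius of the plasma centre, Rtop the major radius of the plasma’s highest point and a the minor radius; δ > 0 gives the conventional D-shape, while δ < 0 corresponds to the mirrored D of negative triangularity.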

It is thought that negative triangularity configurations can better suppress plasma instabilities that expel particles and energy from the plasma, helping to prevent damage to the tokamak wall.

Last year, researchers at the University of Seville began to prepare the tokamak’s inner walls for a high pressure plasma by heating argon gas with microwaves. When those tests were successful, engineers then worked toward producing the first plasma.

“This is an important achievement for the entire team as we are now entering the operational phase,” notes SMART principal investigator Manuel García Muñoz. “The SMART approach is a potential game changer with attractive fusion performance and power handling for future compact fusion reactors. We have exciting times ahead.”

When charging quantum batteries, decoherence is a friend, not a foe

Devices like lasers and other semiconductor-based technologies operate on the principles of quantum mechanics, but they only scratch the surface. To fully exploit quantum phenomena, scientists are developing a new generation of quantum-based devices. These devices are advancing rapidly, fuelling what many call the “second quantum revolution”.

One exciting development in this domain is the rise of next-generation energy storage devices known as quantum batteries (QBs). These devices leverage exotic quantum phenomena such as superposition, coherence, correlation and entanglement to store and release energy in ways that conventional batteries cannot. However, practical realization of QBs has its own challenges, such as reliance on fragile quantum states and difficulty in operating at room temperature.

A recent theoretical study by Rahul Shastri and colleagues from IIT Gandhinagar, India, in collaboration with researchers at China’s Zhejiang University and the China Academy of Engineering Physics takes significant strides towards understanding how QBs can be charged faster and more efficiently, thereby lowering some of the barriers restricting their use.

How does a QB work?

The difference between charging a QB and charging a mobile phone is that with a QB, both the battery and the charger are quantum systems. Shastri and colleagues focused on two such systems: a harmonic oscillator (HO) and a two-level system. While a two-level system can exist in just two energy states, a harmonic oscillator has an evenly spaced range of energy levels. These systems therefore represent two extremes – one with a discrete, bounded energy range and the other with a more complex, unbounded energy spectrum approaching a continuous limit – making them ideal for exploring the versatility of QBs.

In the quantum HO-based setup, a higher-energy HO acts as the charger and a lower-energy one as the battery. When the two are connected, or coupled, energy transfers from the charger to the battery. The two-level system follows the same working principle.  Such coupled quantum systems are routinely realized in experiments.

Using decoherence as a tool to improve QB performance

The study’s findings, which are published in npj Quantum Information, are both surprising and promising, illustrating how a phenomenon typically seen as a challenge in quantum systems – decoherence – can become a solution.

The term “decoherence” refers to the process where a quantum system loses its unique quantum properties (such as quantum correlation, coherence and entanglement). The key trigger for decoherence is quantum noise caused by interactions between a quantum system and its environment.

Since no real-world physical system is perfectly isolated, such noise is unavoidable, and even minute amounts of environmental noise can lead to decoherence. Maintaining quantum coherence is thus extremely challenging even in controlled laboratory settings, let alone industrial environments producing large-scale practical devices. For this reason, decoherence represents one of the most significant obstacles in advancing quantum technologies towards practical applications.

Shastri and colleagues, however, discovered a way to turn this foe into a friend. “Instead of trying to eliminate these naturally occurring environmental effects, we ask: why not use them to our advantage?” Shastri says.

The method they developed speeds up the charging process using a technique called controlled dephasing. Dephasing is a form of decoherence that usually involves the gradual loss of quantum coherence, but the researchers found that when managed carefully, it can actually boost the battery’s performance.

Dissipative effects, traditionally seen as a hindrance, can be harnessed to enhance performance

Rahul Shastri

To understand how this works, it’s important to note that at low levels of dephasing, the battery undergoes smooth energy oscillations. Too much dephasing, however, freezes these oscillations in what’s known as the quantum Zeno effect, essentially stalling the energy transfer. But with just the right amount of dephasing, the battery charges faster while maintaining stability. By precisely controlling the dephasing rate, therefore, it becomes possible to strike a balance that significantly improves charging speed while still preserving stability. This balance leads to quicker, more robust charging that could overcome challenges posed by environmental factors.
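To make this trade-off concrete, here is a minimal numerical sketch. It is not the authors’ model: it replaces the harmonic-oscillator charger with a second two-level system, and the coupling strength, dephasing rates and integration scheme are illustrative choices. The idea is simply to integrate a Lindblad master equation for a charger–battery pair and watch how the battery’s excited-state population evolves for no, moderate and strong dephasing.

```python
import numpy as np

# Toy charger-battery model (two two-level systems), illustrative only.
I2 = np.eye(2, dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sp = np.array([[0, 1], [0, 0]], dtype=complex)    # |e><g|
sm = sp.conj().T                                  # |g><e|
ne = np.array([[1, 0], [0, 0]], dtype=complex)    # excited-state projector

g = 0.1                                           # charger-battery coupling (arbitrary units)
H = g * (np.kron(sp, sm) + np.kron(sm, sp))       # resonant energy-exchange term
L = np.kron(I2, sz)                               # pure dephasing acting on the battery

def rhs(rho, gamma):
    """Lindblad right-hand side: unitary part plus battery dephasing (sz^2 = 1)."""
    return -1j * (H @ rho - rho @ H) + gamma * (L @ rho @ L - rho)

def battery_population(gamma, t_max=60.0, dt=0.01):
    """Battery excited-state population vs time, integrated with 4th-order Runge-Kutta."""
    psi0 = np.kron([1, 0], [0, 1]).astype(complex)   # charger excited, battery in ground state
    rho = np.outer(psi0, psi0.conj())
    pops = []
    for _ in range(int(t_max / dt)):
        pops.append(np.real(np.trace(rho @ np.kron(I2, ne))))
        k1 = rhs(rho, gamma)
        k2 = rhs(rho + 0.5 * dt * k1, gamma)
        k3 = rhs(rho + 0.5 * dt * k2, gamma)
        k4 = rhs(rho + dt * k3, gamma)
        rho = rho + (dt / 6) * (k1 + 2 * k2 + 2 * k3 + k4)
    return np.array(pops)

for gamma in [0.0, 0.05, 1.0]:    # no, moderate and strong dephasing
    p = battery_population(gamma)
    print(f"gamma = {gamma:.2f}: max population {p.max():.2f}, final {p[-1]:.2f}")
```

In this toy model, zero dephasing gives large but fragile oscillations of the stored energy, strong dephasing freezes the transfer (the Zeno regime), and intermediate values settle the battery into a steady charge – the qualitative balance the authors exploit.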

“Our study shows how dissipative effects, traditionally seen as a hindrance, can be harnessed to enhance performance,” Shastri notes. This opens the door to scalable, robust quantum battery designs, which could be extremely useful for energy management in quantum computing and other quantum-enabled applications.

Implications for scalable quantum technologies

The results of this study are encouraging for the quantum-technology industry. According to Shastri, using dephasing to optimize the charging speed and stability of QBs not only advances fundamental understanding but also addresses practical challenges in quantum energy storage.

“Our proposed method could be tested on existing platforms such as superconducting qubits and NMR systems, where dephasing control is already experimentally feasible,” he says. These platforms offer experimentalists a tangible starting point for verifying the study’s predictions and further refining QB performance.

Experimentalists testing this theory will face challenges. Examples include managing additional decoherence mechanisms like amplitude damping and achieving the ideal balance of controlled dephasing in realistic setups. However, Shastri says that these challenges present valuable opportunities to refine and expand the proposed theoretical model for optimizing QB performance under practical conditions. The second quantum revolution is already underway, and QBs might just be the power source that charges our quantum future.

The post When charging quantum batteries, decoherence is a friend, not a foe appeared first on Physics World.

Microbeams plus radiosensitizers could optimize brain cancer treatment

By Tami Freeman

Brain tumours are notoriously difficult to treat, resisting conventional treatments such as radiation therapy, where the deliverable dose is limited by normal tissue tolerance. To better protect healthy tissues, researchers are turning to microbeam radiation therapy (MRT), which uses spatially fractionated beams to spare normal tissue while effectively killing cancer cells.

MRT is delivered using arrays of ultrahigh-dose rate synchrotron X-ray beams tens of microns wide (high-dose peaks) and spaced hundreds of microns apart (low-dose valleys). A research team from the Centre for Medical Radiation Physics at the University of Wollongong in Australia has now demonstrated that combining MRT with targeted radiosensitizers – such as nanoparticles or anti-cancer drugs – can further boost treatment efficacy, reporting their findings in Cancers.

“MRT is famous for its healthy tissue-sparing capabilities with good tumour control, whilst radiosensitizers are known for their ability to deliver targeted dose enhancement to cancer,” explains first author Michael Valceski. “Combining these modalities just made sense, with their synergy providing the potential for the best of both worlds.”

Enhancement effects

Valceski and colleagues combined MRT with thulium oxide nanoparticles, the chemotherapy drug methotrexate and the radiosensitizer iododeoxyuridine (IUdR). They examined the response of monolayers of rodent brain cancer cells to various therapy combinations. They also compared conventional broadbeam orthovoltage X-ray irradiation with synchrotron broadbeam X-rays and synchrotron MRT.

Synchrotron irradiations were performed on the Imaging and Medical Beamline at the ANSTO Australian Synchrotron, using ultrahigh dose rates of 74.1 Gy/s for broadbeam irradiation and 50.3 Gy/s for MRT. The peak-to-valley dose ratio (PVDR, used to characterize an MRT field) of this set-up was measured as 8.9.

Using a clonogenic assay to measure cell survival, the team observed that synchrotron-based irradiation enhanced cell killing compared with conventional irradiation at the same 5 Gy dose (for MRT this is the valley dose; the peaks experience an 8.9 times higher dose), demonstrating the increased cell-killing effect of these ultrahigh-dose rate X-rays.

Adding radiosensitizers further increased the impact of synchrotron broadbeam irradiation, with DNA-localized IUdR killing more cells than cytoplasm-localized nanoparticles. Methotrexate, meanwhile, halved cell survival compared with conventional irradiation.

The team observed that at 5 Gy, MRT showed equivalent cell killing to synchrotron broadbeam irradiation. Valceski explains that this demonstrates MRT’s tissue-sparing potential, by showing how MRT can maintain treatment efficacy while simultaneously protecting healthy cells.

MRT also showed enhanced cell killing when combined with radiosensitizers, with the greatest effect seen for IUdR and IUdR plus methotrexate. This local dose enhancement, attributed to the DNA localization of IUdR, could further improve the tissue-sparing capabilities of MRT by enabling a lower per-fraction dose to reduce patient exposure whilst maintaining tumour control.

Imaging valleys and peaks

To link the biological effects with the physical collimation of MRT, the researchers performed confocal microscopy (at the Fluorescence Analysis Facility in Molecular Horizons, University of Wollongong) to investigate DNA damage following treatment at 0.5 and 5 Gy. Twenty minutes after irradiation, they imaged fixed cells to visualize double-strand DNA breaks (DSBs), as shown by γH2AX foci (representing a nuclear DSB site).

Spatially fractionated beams: imaging DNA damage following MRT confirms that the cells’ biological responses match the beam collimation. The images show double-strand DNA breaks (green) overlaid on a nuclear counterstain (blue). (Courtesy: CC BY/Cancers 10.3390/cancers16244231)

The images verified that the cells’ biological responses corresponded with the MRT beam patterns, with the 400 µm microbeam spacing clearly seen in all treated cells, both with and without radiosensitizers.

In the 0.5 Gy images, the microbeam tracks were consistent in width, while the 5 Gy MRT tracks were wider as DNA damage spread from peaks into the valleys. This radiation roll-off was also seen with IUdR and IUdR plus methotrexate, with numerous bright foci visible in the valleys, demonstrating dose enhancement and improved cancer-killing with these radiosensitizers.

The researchers also analysed the MRT beam profiles using the γH2AX foci intensity across the images. Cells treated with radiosensitizers had broadened peaks, with the largest effect seen with the nanoparticles. As nanoparticles can be designed to target tumours, this broadening (roughly 30%) can be used to increase the radiation dose to cancer cells in nearby valleys.

“Peak broadening adds a novel benefit to radiosensitizer-enhanced MRT. The widening of the peaks in the presence of nanoparticles could potentially ‘engulf’ the entire cancer, and only the cancer, whilst normal tissues without nanoparticles retain the protection of MRT tissue sparing,” Valceski explains. “This opens up the potential for MRT radiosurgery, something our research team has previously investigated.”

Finally, the researchers used γH2AX foci data for each peak and valley to determine a biological PVDR. The biological PVDR values matched the physical PVDR of 8.9, confirming for the first time a direct relationship between physical dose delivered and DSBs induced in the cancer cells. They note that adding radiosensitizers generally lowered the biological PVDRs from the physical value, likely due to additional DSBs induced in the valleys.
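In essence, the biological PVDR is the ratio of the mean γH2AX signal in the peak regions to that in the valley regions. The snippet below is a minimal sketch of that calculation on a synthetic intensity profile, assuming the 400 µm centre-to-centre spacing quoted above and a nominal 50 µm peak width; the profile, widths and noise level are placeholders, not the study’s measured data.

```python
import numpy as np

# Sketch of a "biological PVDR": mean gammaH2AX intensity in peaks divided by
# that in valleys, assuming 400 um microbeam spacing and a nominal 50 um peak
# width. The profile below is synthetic, not measured data.
spacing_um, peak_width_um, pixel_um = 400.0, 50.0, 2.0

x = np.arange(0, 4000, pixel_um)                                 # position along profile (um)
dist = np.minimum(x % spacing_um, spacing_um - x % spacing_um)   # distance to nearest peak centre
in_peak = dist < peak_width_um / 2

profile = 1.0 + 8.0 * in_peak                                    # valley ~1, peak ~9 (arbitrary units)
profile += np.random.default_rng(2).normal(0, 0.2, x.size)       # measurement noise

biological_pvdr = profile[in_peak].mean() / profile[~in_peak].mean()
print(f"biological PVDR ~ {biological_pvdr:.1f}")                # ~9 for this synthetic profile
```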

The next step will be to perform preclinical studies of MRT. “Trials to assess the efficacy of this multimodal therapy in treating aggressive cancers in vivo are key, especially given the theragnostic potential of nanoparticles for image-guided treatment and precision planning, as well as cancer-specific dose enhancement,” senior author Moeava Tehei tells Physics World. “Considering the radiosurgical potential of stereotactic, radiosensitizer-enhanced MRT fractions, we can foresee a revolutionary multimodal technique with curative potential in the near future.”

The post Microbeams plus radiosensitizers could optimize brain cancer treatment appeared first on Physics World.

Wrinkles in spacetime could remember the secrets of exploding stars


Permanent distortions in spacetime caused by the passage of gravitational waves could be detectable from Earth. Known as “gravitational memory”, such distortions are predicted to occur most prominently when the core of a supernova collapses. Observing them could therefore provide a window into the death of massive stars and the creation of black holes, but there’s a catch: the supernova might have to happen in our own galaxy.

Physicists have been detecting gravitational waves from colliding stellar-mass black holes and neutron stars for almost a decade now, and theory predicts that core-collapse supernovae should also produce them. The difference is that unlike collisions, supernovae tend to be lopsided – they don’t explode outwards equally in all directions. It is this asymmetry – in both the emission of neutrinos from the collapsing core and the motion of the blast wave itself – that produces the gravitational-wave memory effect.

“The memory is the result of the lowest frequency aspects of these motions,” explains Colter Richardson, a PhD student at the University of Tennessee in Knoxville, US and co-lead author (with Haakon Andresen of Sweden’s Oskar Klein Centre) of a Physical Review Letters paper describing how gravitational-wave memory detection might work on Earth.

Filtering out seismic noise

Previously, many physicists assumed it wouldn’t be possible to detect the memory effect from Earth. This is because it manifests at frequencies below 10 Hz, where noise from seismic events tends to swamp detectors. Indeed, Harvard astrophysicist Kiranjyot Gill argues that detecting gravitational memory “would require exceptional sensitivity in the millihertz range to separate it from background noise and other astrophysical signals” – a sensitivity that she says Earth-based detectors simply don’t have.

Anthony Mezzacappa, Richardson’s supervisor at Tennessee, counters this by saying that while the memory signal itself cannot be detected, the ramp-up to it can. “The signal ramp-up corresponds to a frequency of 20–30 Hz, which is well above 10 Hz, below which the detector response needs to be better characterized for what we can detect on Earth, before dropping down to virtually 0 Hz where the final memory amplitude is achieved,” he tells Physics World.

The key, Mezzacappa explains, is a “matched filter” technique in which templates of what the ramp-up should look like are matched to the signal to pick it out from low-frequency background noise. Using this technique, the team’s simulations show that it should be possible for Earth-based gravitational-wave detectors such as LIGO to detect the ramp-up even though the actual deformation effect would be tiny – around 10⁻¹⁶ cm “scaled to the size of a LIGO detector arm”, Richardson says.
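As a rough illustration of the matched-filter idea, the sketch below hides a weak, hypothetical ramp-like template in white noise and recovers its location by cross-correlating the data with the template. The sampling rate, template shape, amplitudes and noise model are all illustrative assumptions; real searches use physically motivated waveforms and coloured detector noise.

```python
import numpy as np

# Matched-filter sketch (illustrative assumptions throughout).
rng = np.random.default_rng(0)
fs = 512.0                                    # sample rate in Hz (assumed)
t = np.arange(0, 8, 1 / fs)                   # 8 s of data

def ramp(tau, t0=1.0, rise=0.3):
    """Hypothetical memory 'ramp-up': a smooth rise to a constant offset."""
    return 0.5 * (1 + np.tanh((tau - t0) / rise))

tau = np.arange(0, 2, 1 / fs)                 # 2 s template
template = ramp(tau) - ramp(tau).mean()       # remove DC so the filter keys on the rise

# Synthetic data: white noise plus a weak ramp injected at t = 3 s.
data = rng.normal(0, 1.0, t.size)
i0 = int(3.0 * fs)
data[i0:i0 + tau.size] += 0.5 * ramp(tau)

# Slide the template over the data (normalized cross-correlation).
out = np.correlate(data, template, mode="valid") / np.sqrt(np.sum(template**2))
peak = np.argmax(np.abs(out))
print(f"filter output peaks at t = {t[peak]:.2f} s (injection at 3.00 s)")
```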

The snag is that for the ramp-up to be detectable, the simulations suggest the supernova would need to be close – probably within 10 kiloparsecs (around 32,615 light years) of Earth. That would place it within our own galaxy, and galactic supernovae are not exactly common. The last to be observed in real time was spotted by Johannes Kepler in 1604; though there have been others since, we’ve only identified their remnants after the fact.

Going to the Moon

Mezzacappa and colleagues are optimistic that multi-messenger astronomy techniques such as gravitational-wave and neutrino detectors will help astronomers identify future Milky Way supernovae as they happen, even if cosmic dust (for example) hides their light for optical observers.

Gill, however, prefers to look towards the future. In a paper under revision at Astrophysical Journal Letters, and currently available as a preprint, she cites two proposals for detectors on the Moon that could transform gravitational-wave physics and extend the range at which gravitational memory signals can be detected.

The first, called the Lunar Gravitational Wave Antenna, would use inertial sensors to detect the Moon shaking as gravitational waves ripple through it. The other, known as the Laser Interferometer Lunar Antenna, would be like a giant, triangular version of LIGO with arms spanning tens of kilometres open to space. Both are distinct from the European Space Agency’s Laser Interferometer Space Antenna, which is due for launch in the 2030s, but is optimized to detect gravitational waves from supermassive black holes rather than supernovae.

“Lunar-based detectors or future space-based observatories beyond LISA would overcome the terrestrial limitations,” Gill argues. Such detectors, she adds, could register a memory effect from supernovae tens or even hundreds of millions of light years away. This huge volume of space would encompass many galaxies, making the detection of gravitational waves from core-collapse supernovae almost routine.

The memory of something far away

In response, Richardson points out that his team’s filtering method could also work at longer ranges – up to approximately 10 million light years, encompassing our own Local Group of galaxies and several others – in certain circumstances. If a massive star is spinning very quickly, or it has an exceptionally strong magnetic field, its eventual supernova explosion will be highly collimated and almost jet-like, boosting the amplitude of the memory effect. “If the amplitude is significantly larger, then the detection distance is also significantly larger,” he says.

Whatever technologies are involved, both groups agree that detecting gravitational-wave memory is important. It might, for example, tell us whether a supernova has left behind a neutron star or a black hole, which would be valuable because the reasons one forms and not the other remain a source of debate among astrophysicists.

“By complementing other multi-messenger observations in the electromagnetic spectrum and neutrinos, gravitational-wave memory detection would provide unparalleled insights into the complex interplay of forces in core-collapse supernovae,” Gill says.

Richardson agrees that a detection would be huge and hopes that his work and that of others “motivates new investigations into the low-frequency region of gravitational-wave astronomy”.

The post Wrinkles in spacetime could remember the secrets of exploding stars appeared first on Physics World.

‘Why do we have to learn this?’ A physics educator’s response to every teacher’s least favourite question


Several years ago I was sitting at the back of a classroom supporting a newly qualified science teacher. The lesson was going well, a pretty standard class on Hooke’s law, when a student leaned over to me and asked “Why are we doing this? What’s the point?”.

Having been a teacher myself, I had been asked this question many times before. I suspect that, back then, I went for the knee-jerk “it’s useful if you want to be an engineer” response, or something similar. This isn’t a very satisfying answer, but I never really had the time to formulate a real justification for studying Hooke’s law, or physics in general for that matter.

Who is the physics curriculum designed for? Should it be designed for the small number of students who will pursue the subject, or subjects allied to it, at the post-16 and post-18 level? Or should we be reflecting on the needs of the overwhelming majority who will never use most of the curriculum content again? Only about 10% of students pursue physics or physics-rich subjects post-16 in England, and at degree level, only around 4000 students graduate with physics degrees in the UK each year.

One argument often levelled at me is that learning this is “useful”, to which I retort – in a similar vein to the student from the first paragraph – “In what way?” In the 40 years or so since first learning Hooke’s law, I can’t remember ever explicitly using it in my everyday life, despite being a physicist. Whenever I give a talk on this subject, someone often pipes up with a tenuous example, but I suspect they are in the minority. An audience member once said they consider the elastic behaviour of wire when hanging pictures, but I suspect that many thousands of pictures have been successfully hung with no recourse to F = –kx.

Hooke’s law is incredibly important in engineering but, again, most students will not become engineers or rely on a knowledge of the properties of springs, unless they get themselves a job in a mattress factory.

From a personal perspective, Hooke’s law fascinates me. I find it remarkable that we can see the macroscopic properties of materials being governed by microscopic interactions and that this can be expressed in a simple linear form. There is no utilitarianism in this, simply awe, wonder and aesthetics. I would always share this “joy of physics” with my students, and it was incredibly rewarding when this was reciprocated. But for many, if not most, my personal perspective was largely irrelevant, and they knew that the curriculum content would not directly support them in their future careers.

At this point, I should declare my position – I don’t think we should take Hooke’s law, or physics, off the curriculum, but my reason is not the one often given to students.

A series of lessons on Hooke’s law is likely to include: experimental design; setting up and using equipment; collecting numerical data using a range of devices; recording and presenting data, including graphs; interpreting data; modelling data and testing theories; devising evidence-based explanations; communicating ideas; evaluating procedures; critically appraising data; collaborating with others; and working safely.

Science education must be about preparing young people to be active and critical members of a democracy, equipped with the skills and confidence to engage with complex arguments that will shape their lives. For most students, this is the most valuable lesson they will take away from Hooke’s law. We should encourage students to find our subject fascinating and relevant, and in doing so make them receptive to the acquisition of scientific knowledge throughout their lives.

At a time when pressures on the education system are greater than ever, we must be able to articulate and justify our position within a crowded curriculum. I don’t believe that students should simply accept that they should learn something because it is on a specification. But they do deserve a coherent reason that relates to their lives and their careers. As science educators, we owe it to our students to have an authentic justification for what we are asking them to do. As physicists, even those who don’t have to field tricky questions from bored teenagers, I think it’s worthwhile for all of us to ask ourselves how we would answer the question “What is the point of this?”.

The post ‘Why do we have to learn this?’ A physics educator’s response to every teacher’s least favourite question appeared first on Physics World.

New Journal of Physics seeks to expand its horizons


The New Journal of Physics (NJP) has long been a flagship journal for IOP Publishing. The journal published its first volume in 1998 and was an early pioneer of open-access publishing. Co-owned by the Institute of Physics, which publishes Physics World, and the Deutsche Physikalische Gesellschaft (DPG), after some 25 years the journal is now seeking to establish itself further as a journal that represents the entire range of physics disciplines.

A journal for all physics: the New Journal of Physics publishes research in a broad range of disciplines including quantum optics and quantum information, condensed-matter physics as well as high-energy physics. (Courtesy: IOP Publishing)

NJP publishes articles in pure, applied, theoretical and experimental research, as well as interdisciplinary topics. Research areas include optics, condensed-matter physics, quantum science and statistical physics, and the journal publishes a range of article types such as papers, topical reviews, fast-track communications, perspectives and special issues.

While NJP has been seen as a leading journal for quantum information, optics and condensed-matter physics, the journal is currently undergoing a significant transformation to broaden its scope to attract a wider array of physics disciplines. This shift aims to enhance the journal’s relevance, foster a broader audience and maintain NJP’s position as a leading publication in the global scientific community.

While quantum physics in general, and quantum optics and quantum information in particular, will remain crucial areas for the journal, researchers in other fields such as gravitational-wave research, condensed- and soft-matter physics, polymer physics, theoretical chemistry, statistical and mathematical physics are being encouraged to submit their articles to the journal. “It’s a reminder to the community that NJP is a journal for all kinds of physics and not just a select few,” says quantum physicist Andreas Buchleitner from the Albert-Ludwigs-Universität Freiburg who is NJP’s editor-in-chief.

Historically, NJP has had a strong focus on theoretical physics, particularly in quantum information. Another significant aspect of NJP’s new strategy is therefore the inclusion of more experimental research. Attracting high-quality experimental papers will balance the journal’s content, enhance its reputation as a comprehensive physics journal and allow it to compete with other leading physics journals. Part of this shift will also involve building a reliable and loyal group of authors who regularly publish their best work in NJP.

A broader scope

To aid this move, NJP has recently grown its editorial board to add expertise in subjects such as gravitational-wave physics. This diversity of capabilities is crucial to evaluate submissions from different areas of physics and maintain high standards of quality during the peer-review process. That point is particularly relevant for Buchleitner, who sees the expansion of the editorial board as helping to improve the journal’s handling of submissions to ensure that authors feel their work is being evaluated fairly and by knowledgeable and engaged individuals. “Increasing the editorial board was quite an important concept in terms of helping the journal expand,” adds Buchleitner. “What is important to me is that scientists who contact the journal feel that they are talking to people and not to artificial intelligence substitutes.”

While citation metrics such as impact factors are often debated in terms of their scientific value, they remain essential for a journal’s visibility and reputation. In the competitive landscape of scientific publishing, they can set a journal apart from its competitors. With that in mind, NJP, which has an impact factor of 2.8, is also focusing on improving its citation indices to compete with top-tier journals.

Yet that focus is not just on the impact factor but also on metrics that reflect efficient and constructive handling of submissions – performance that encourages researchers to publish with the journal again. To set it apart from competitors, the time taken to first decision before peer review, for example, is only six days, while the journal has a median of 50 days to first decision after peer review.

Society benefits

While NJP pioneered the open-access model of scientific publishing, that position is no longer unique given the huge increase in open-access journals over the past decade. Yet the publishing model continues to be an important aspect of the journal’s identity to ensure that the research it publishes is freely available to all. Another crucial factor to attract authors and set it apart from commercial entities is that NJP is published by learned societies – the IOP and DPG.

NJP has often been thought of as a “European journal”. Indeed, NJP’s role is significant in the context of the UK leaving the European Union, in that it serves as a bridge between the UK and mainland European research communities. “That’s one of the reasons why I like the journal,” says Buchleitner, who adds that with a wider scope NJP will not only publish the best research from around the world but also strengthen its identity as a leading European journal.

The post New Journal of Physics seeks to expand its horizons appeared first on Physics World.

Novel MRI technique can quantify lung function

By Tami Freeman

Assessing lung function is crucial for diagnosing and monitoring respiratory diseases. The most common way to do this is using spirometry, which measures the amount and speed of air that a person can inhale and exhale. Spirometry, however, is insensitive to early disease and cannot detect heterogeneity in lung function. Techniques such as chest radiography or CT provide more detailed spatial information, but are not ideal for long-term monitoring as they expose patients to ionizing radiation.

Now, a team headed up at Newcastle University in the UK has demonstrated a new lung MR imaging technique that provides quantitative and spatially localized assessment of pulmonary ventilation. The researchers also show that the MRI scans – recorded after the patient inhales a safe gas mixture – can track improvements in lung function following medication.

Although conventional MRI of the lungs is challenging, lung function can be assessed by imaging the distribution of an inhaled gas, most commonly hyperpolarized 3He or 129Xe. These gases can be expensive, however, and the magnetic preparation step requires extra equipment and manpower. Instead, project leader Pete Thelwall and colleagues are investigating 19F-MRI of inhaled perfluoropropane – an inert gas that does not require hyperpolarization to be visible in an MRI scan.

“Conventional MRI detects magnetic signals from hydrogen nuclei in water to generate images of water distribution,” Thelwall explains. “Perfluoropropane is interesting to us as we can also get an MRI signal from fluorine nuclei and visualize the distribution of inhaled perfluoropropane. We assess lung ventilation by seeing how well this MRI-visible gas moves into different parts of the lungs when it is inhaled.”

Testing the new technique

The researchers analysed 19F-MRI data from 38 healthy participants, 35 with asthma and 21 with chronic obstructive pulmonary disease (COPD), reporting their findings in Radiology. For the 19F-MRI scans, participants were asked to inhale a 79%/21% mixture of perfluoropropane and oxygen and then hold their breath. All subjects also underwent spirometry and an anatomical 1H-MRI scan, and those with respiratory disease withheld their regular bronchodilator medication before the MRI exams.

After co-registering each subject’s anatomical (1H) and ventilation (19F) images, the researchers used the perfluoropropane distribution in the images to differentiate ventilated and non-ventilated lung regions. They then calculated the ratio of non-ventilated lung to total lung volume, a measure of ventilation dysfunction known as the ventilation defect percentage (VDP).
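Conceptually, the VDP calculation is straightforward once the images are co-registered: count the lung voxels with negligible 19F signal and divide by the total lung volume. The sketch below shows that arithmetic on synthetic data; the threshold used to call a voxel “ventilated” is a placeholder, not the study’s segmentation criterion.

```python
import numpy as np

def ventilation_defect_percentage(f19_image, lung_mask, threshold):
    """VDP = non-ventilated lung volume / total lung volume x 100%."""
    lung = lung_mask.astype(bool)
    ventilated = (f19_image > threshold) & lung      # voxels where inhaled gas is seen
    return 100.0 * (lung.sum() - ventilated.sum()) / lung.sum()

# Tiny synthetic example: a "lung" in which 10% of voxels receive little gas.
rng = np.random.default_rng(1)
lung_mask = np.ones((32, 32, 32), dtype=bool)
f19 = rng.normal(10.0, 1.0, lung_mask.shape)         # well-ventilated baseline signal
defect = rng.random(lung_mask.shape) < 0.10          # 10% of voxels are defects
f19[defect] = rng.normal(1.0, 0.5, defect.sum())     # low 19F signal in defect voxels

print(f"VDP = {ventilation_defect_percentage(f19, lung_mask, threshold=5.0):.1f}%")
```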

Healthy subjects had a mean VDP of 1.8%, reflecting an even distribution of inhaled gas throughout their lungs and well-preserved lung function. In comparison, the patient groups showed elevated mean VDP values – 8.3% and 27.2% for those with asthma and COPD, respectively – reflecting substantial ventilation heterogeneity.

In participants with respiratory disease, the team also performed 19F-MRI after treatment with salbutamol, a common inhaler. They found that the MR images revealed changes in regional ventilation in response to this bronchodilator therapy.

Post-treatment images of patients with asthma showed an increase in lung regions containing perfluoropropane, reflecting the reversible nature of this disease. Participants with COPD generally showed less obvious changes following treatment, as expected for this less reversible disease. Bronchodilator therapy reduced the mean VDP by 33% in participants with asthma (from 8.3% to 5.6%) and by 14% in those with COPD (from 27.2% to 23.3%).

The calculated VDP values were negatively associated with standard spirometry metrics. However, the team note that some participants with asthma exhibited normal spirometry but an elevated mean VDP (6.7%) compared with healthy subjects. This finding suggests that the VDP acquired by 19F-MRI of inhaled perfluoropropane is more sensitive to subclinical disease than conventional spirometry.

Supporting lung transplants

In a separate study reported in JHLT Open, Thelwall and colleagues used dynamic 19F-MRI of inhaled perfluoropropane to visualize the function of transplanted lungs. Approximately half of lung transplant recipients experience organ rejection, known as chronic lung allograft dysfunction (CLAD), within five years of transplantation.

Early detection: lung function MRI showing areas of dysfunction in transplant recipients. (Courtesy: Newcastle University, UK)

Transplant recipients are monitored frequently using pulmonary function tests and chest X-rays. But by the time CLAD is diagnosed, irreversible lung damage may already have occurred. The team propose that 19F-MRI may find subtle early changes in lung function that could help detect rejection earlier.

The researchers studied 10 lung transplant recipients, six of whom were experiencing chronic rejection. They used a wash-in and washout technique, acquiring breath-hold 19F-MR images while the patient inhaled a perfluoropropane/oxygen mixture (wash-in acquisitions), followed by scans during breathing of room air (washout acquisitions).

The MR images revealed quantifiable differences in regional ventilation in participants with and without CLAD. In those with chronic rejection, scans showed poorer air movement to the edges of the lungs, likely due to damage to the small airways, a typical feature of CLAD. By detecting such changes in lung function, before signs of damage are seen in other tests, it’s possible that this imaging method might help inform patient treatment decisions to better protect the transplanted lungs from further damage.

The studies fall squarely within the field of clinical research, requiring non-standard MRI hardware to detect fluorine nuclei. But Thelwall sees a pathway towards introducing 19F-MRI in hospitals, noting that scanner manufacturers have brought systems to market that can detect nuclei other than 1H in routine diagnostic scans. Removing the requirement for hyperpolarization, combined with the lower relative cost of perfluoropropane inhalation (approximately £50 per study participant), could also help scale this method for use in the clinic.

The team is currently working on a study looking at how MRI assessment of lung function could help reduce the side effects associated with radiotherapy for lung cancer. The idea is to design a radiotherapy plan that minimizes dose to lung regions with good function, whilst maintaining effective cancer treatment.

“We are also looking at how better lung function measurements might help the development of new treatments for lung disease, by being able to see the effects of new treatments earlier and more accurately than current lung function measurements used in clinical trials,” Thelwall tells Physics World.

The post Novel MRI technique can quantify lung function appeared first on Physics World.

Astrophysicists reveal huge variation in the shape of exocomet belts

Astrophysicists have released a series of images of exocomet belts and the tiny “pebbles” that reside in them.

Exocomets are boulders of rock and ice, at least 1 km in size, that exist outside our solar system. Exocometary belts – regions containing many such icy bodies – are found in at least 20% of planetary systems. When the exocomets within these belts smash together they can also produce small pebbles.

The belts in the latest study orbit 74 nearby stars that cover a range of ages – from those that have just formed to those in more mature systems like our own Solar System. The belts typically lie tens to hundreds of astronomical units (one astronomical unit is the distance from the Earth to the Sun) from their central star.

At that distance, the temperature is between –250 and –150 degrees Celsius, meaning that most compounds on the exocomets are frozen as ice.

While most exocometary belts in the latest study are disks, some are narrow rings. Some even have multiple rings or disks that are eccentric, which provides evidence that as-yet-undetected planets are present, with their gravity affecting the distribution of the pebbles in these systems.

According to Sebastián Marino from the University of Exeter, the images reveal “a remarkable diversity in the structure” of the belts.

Indeed, Luca Matrà from Trinity College Dublin says that the “power” of such a large survey is to reveal population-wide properties and trends. “[The survey] confirmed that the number of pebbles decreases for older planetary systems as belts run out of larger exocomets smashing together, but showed for the first time that this decrease in pebbles is faster if the belt is closer to the central star,” Matrà adds. “It also indirectly showed – through the belts’ vertical thickness – that unobservable objects as large as 140 km to Moon-size are likely present in these belts.”

The researchers took the images using the Atacama Large Millimeter/submillimeter Array – an array of 66 radio telescopes in the Atacama Desert of northern Chile – as well as the eight-element Submillimeter Array based in Hawaii.

The post Astrophysicists reveal huge variation in the shape of exocomet belts appeared first on Physics World.

World’s darkest skies threatened by industrial megaproject in Chile, astronomers warn


The darkest, clearest skies anywhere in the world could suffer “irreparable damage” by a proposed industrial megaproject. That is the warning from the European Southern Observatory (ESO) in response to plans by AES Andes, a subsidiary of the US power company AES Corporation, to develop a green hydrogen project just a few kilometres from ESO’s flagship Paranal Observatory in Chile’s Atacama Desert.

The Atacama Desert is considered one of the most important astronomical research sites in the world due to its stable atmosphere and lack of light pollution. Sitting 2635 m above sea level, on Cerro Paranal, the Paranal Observatory is home to key astronomical instruments including the Very Large Telescope. The Extremely Large Telescope (ELT) – the largest visible and infrared light telescope in the world – is also being constructed at the observatory on Cerro Armazones with first light expected in 2028.

AES Chile submitted an Environmental Impact Assessment in Chile for an industrial-scale green hydrogen project at the end of December. The complex is expected to cover more than 3000 hectares – similar in size to 1200 football pitches. According to AES, the project is in the early stages of development, but could include green hydrogen and ammonia production plants, solar and wind farms as well as battery storage facilities.

ESO is calling for the development to be relocated to preserve “one of Earth’s last truly pristine dark skies” and “safeguard the future” of astronomy. “The proximity of the AES Andes industrial megaproject to Paranal poses a critical risk to the most pristine night skies on the planet,” says ESO director general Xavier Barcons. “Dust emissions during construction, increased atmospheric turbulence, and especially light pollution will irreparably impact the capabilities for astronomical observation.”

In a statement sent to Physics World, an AES spokesperson says they “understand there are concerns raised by ESO regarding the development of renewable energy projects in the area”. The spokesperson adds that the project would be in an area “designated for renewable energy development”. They also claim that the company is “dedicated to complying with all regulatory guidelines and rules” and “supporting local economic development while maintaining the highest environmental and safety standards”.

According to the statement, the proposal “incorporates the highest standards in lighting” to comply with Chilean regulatory requirements designed “to prevent light pollution, and protect the astronomical quality of the night skies”.

Yet Romano Corradi, director of the Gran Telescopio Canarias, which is located at the Roque de los Muchachos Observatory, La Palma, Spain, noted that it is “obvious” that the light pollution from such a large complex will negatively affect observations. “There are not many places left in the world with the dark and other climatic conditions necessary to do cutting-edge science in the field of observational astrophysics,” adds Corradi. “Light pollution is a global effect and it is therefore essential to protect sites as important as Paranal.”

The post World’s darkest skies threatened by industrial megaproject in Chile, astronomers warn appeared first on Physics World.

Could bubble-like microrobots be the drug delivery vehicles of the future?


Biomedical microrobots could revolutionize future cancer treatments, reliably delivering targeted doses of toxic cancer-fighting drugs to destroy malignant tumours while sparing healthy bodily tissues. Development of such drug-delivering microrobots is at the forefront of biomedical engineering research. However, there are many challenges to overcome before this minimally invasive technology moves from research lab to clinical use.

Microrobots must be capable of rapid, steady and reliable propulsion through various biological materials, while generating enhanced image contrast to enable visualization through thick body tissue. They require an accurate guidance system to precisely target diseased tissue. They also need to support sizable payloads of drugs, maintain their structure long enough to release this cargo, and then efficiently biodegrade – all without causing any harm to the body.

Aiming to meet this tall order, researchers at the California Institute of Technology (Caltech) and the University of Southern California have designed a hydrogel-based, image-guided, bioresorbable acoustic microrobot (BAM) with these characteristics and capabilities. Reporting their findings in Science Robotics, they demonstrated that the BAMs could successfully deliver drugs that decreased the size of bladder tumours in mice.

Microrobot design

The team, led by Caltech’s Wei Gao, fabricated the hydrogel-based BAMs using high-resolution two-photon polymerization. The microrobots are hollow spheres with an outer diameter of 30 µm and an 18 µm-diameter internal cavity to trap a tiny air bubble inside.

The BAMs have a hydrophobic inner surface to prolong microbubble retention within biofluids and a hydrophilic outer layer that prevents microrobot clustering and promotes degradation. Magnetic nanoparticles and therapeutic agents integrated into the hydrogel matrix enable wireless magnetic steering and drug delivery, respectively.

The entrapped microbubbles are key as they provide propulsion for the BAMs. When stimulated by focused ultrasound (FUS), the bubbles oscillate at their resonant frequencies. This vibration creates microstreaming vortices around the BAM, generating a propulsive force in the opposite direction of the flow. The microbubbles inside the BAMs also act as ultrasound contrast agents, enabling real-time, deep-tissue visualization.

The researchers designed the microrobots with two cylinder-like openings, which they found achieves faster propulsion speeds than single- or triple-opening spheres. They attribute this to propulsive forces that run parallel to the sphere’s boundary improving both speed and stability of movement when activated by FUS.

Numerical simulations: flow patterns generated by a BAM vibrating at its resonant frequency. The microrobot’s two openings are clearly visible. Scale bar, 15 µm. (Courtesy: Hong Han)

They also discovered that offsetting the centre of the microbubble from the centre of the sphere generated propulsion speeds more than twice those achieved by BAMs with a symmetric design.

To perform simultaneous imaging of BAM location and acoustic propulsion within soft tissue, the team employed a dual-probe design. An ultrasound imaging probe enabled real-time imaging of the bubbles, while the acoustic field generated by a FUS probe (at an excitation frequency of 480 kHz and an applied acoustic pressure of 626 kPa peak-to-peak) provided effective propulsion.
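As a rough back-of-the-envelope cross-check (not a calculation from the paper), the Minnaert resonance of a free air bubble in water with the BAM’s cavity radius lands in the same range as the 480 kHz drive used here; encapsulation in the hydrogel shell will shift the true resonance, so this is only an order-of-magnitude estimate.

```python
import math

# Minnaert resonance of a free air bubble in water (order-of-magnitude check only).
R = 9e-6        # bubble radius in metres, half the 18 um cavity diameter
kappa = 1.4     # polytropic exponent for air (adiabatic)
P0 = 101.3e3    # ambient pressure in Pa
rho = 1000.0    # water density in kg/m^3

f0 = math.sqrt(3 * kappa * P0 / rho) / (2 * math.pi * R)
print(f"Minnaert resonance ~ {f0 / 1e3:.0f} kHz")   # ~365 kHz
```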

In vitro and in vivo testing 

The team performed real-time imaging of the propulsion of BAMs in vitro, using an agarose chamber to simulate an artificial bladder. When exposed to an ultrasound field generated by the FUS probe, the BAMs demonstrated highly efficient motion, as observed in the ultrasound imaging scans. The propulsion direction of BAMs could be precisely controlled by an external magnetic field.

The researchers also conducted in vivo testing, using laboratory mice with bladder cancer and the anti-cancer drug 5-fluorouracil (5-FU). They treated groups of mice with either phosphate buffered saline, free drug, passive BAMs or active (acoustically actuated and magnetically guided) BAMs, at three day intervals over four sessions. They then monitored the tumour progression for 21 days, using bioluminescence signals emitted by cancer cells.

The active BAM group exhibited a 93% decrease in bioluminescence by the 14th day, indicating large tumour shrinkage. Histological examination of excised bladders revealed that mice receiving this treatment had considerably reduced tumour sizes compared with the other groups.

“Embedding the anticancer drug 5-FU into the hydrogel matrix of BAMs substantially improved the therapeutic efficiency compared with 5-FU alone,” the authors write. “These BAMs used a controlled-release mechanism that prolonged the bioavailability of the loaded drug, leading to sustained therapeutic activity and better outcomes.”

Mice treated with active BAMs experienced no weight changes, and no adverse effects to the heart, liver, spleen, lung or kidney compared with the control group. The researchers also evaluated in vivo degradability by measuring BAM bioreabsorption rates following subcutaneous implantation into both flanks of a mouse. Within six weeks, they observed complete breakdown of the microrobots.

Gao tells Physics World that the team has subsequently expanded the scope of its work to optimize the design and performance of the microbubble robots for broader biomedical applications.

“We are also investigating the use of advanced surface engineering techniques to further enhance targeting efficiency and drug loading capacity,” he says. “Planned follow-up studies include preclinical trials to evaluate the therapeutic potential of these robots in other tumour models, as well as exploring their application in non-cancerous diseases requiring precise drug delivery and tissue penetration.”

The post Could bubble-like microrobots be the drug delivery vehicles of the future? appeared first on Physics World.

Sustainability spotlight: PFAS unveiled


So-called “forever chemicals”, or per- and polyfluoroalkyl substances (PFAS), are widely used in consumer, commercial and industrial products, and have subsequently made their way into humans, animals, water, air and soil. Despite this ubiquity, there are still many unknowns regarding the potential human health and environmental risks that PFAS pose.

Join us for an in-depth exploration of PFAS with four leading experts who will shed light on the scientific advances and future challenges in this rapidly evolving research area.

Our panel will guide you through a discussion of PFAS classification and sources, the journey of PFAS through ecosystems, strategies for PFAS risk mitigation and remediation, and advances in the latest biotechnological innovations to address their effects.

Sponsored by Sustainability Science and Technology, a new journal from IOP Publishing that provides a platform for researchers, policymakers, and industry professionals to publish their research on current and emerging sustainability challenges and solutions.

Left to right: Jonas Baltrusaitis, Linda S. Lee, Clinton Williams, Sara Lupton, Jude Maul

Jonas Baltrusaitis, inaugural editor-in-chief of Sustainability Science and Technology, has co-authored more than 300 research publications on innovative materials. His work includes nutrient recovery from waste, their formulation and delivery, and renewable energy-assisted catalysis for energy carrier and commodity chemical synthesis and transformations.

Linda S Lee is a distinguished professor at Purdue University with joint appointments in the Colleges of Agriculture (COA) and Engineering, program head of the Ecological Sciences & Engineering Interdisciplinary Graduate Program and COA assistant dean of graduate education and research. She joined Purdue in 1993 with degrees in chemistry (BS), environmental engineering (MS) and soil chemistry/contaminant hydrology (PhD) from the University of Florida. Her research includes chemical fate, analytical tools, waste reuse, bioaccumulation, and contaminant remediation and management strategies with PFAS challenges driving much of her research for the last two decades. Her research is supported by a diverse funding portfolio. She has published more than 150 papers with most in top-tier environmental journals.

Clinton Williams is the research leader of Plant and Irrigation and Water Quality Research units at US Arid Land Agricultural Research Center. He has been actively engaged in environmental research focusing on water quality and quantity for more than 20 years. Clinton looks for ways to increase water supplies through the safe use of reclaimed waters. His current research is related to the environmental and human health impacts of biologically active contaminants (e.g. PFAS, pharmaceuticals, hormones and trace organics) found in reclaimed municipal wastewater and the associated impacts on soil, biota, and natural waters in contact with wastewater. His research is also looking for ways to characterize the environmental loading patterns of these compounds while finding low-cost treatment alternatives to reduce their environmental concentration using byproducts capable of removing the compounds from water supplies.

Sara Lupton has been a research chemist with the Food Animal Metabolism Research Unit at the Edward T Schafer Agricultural Research Center in Fargo, ND within the USDA-Agricultural Research Service since 2010. Sara’s background is in environmental analytical chemistry. She is the ARS lead scientist for the USDA’s Dioxin Survey and other research includes the fate of animal drugs and environmental contaminants in food animals and investigation of environmental contaminant sources (feed, water, housing, etc.) that contribute to chemical residue levels in food animals. Sara has conducted research on bioavailability, accumulation, distribution, excretion, and remediation of PFAS compounds in food animals for more than 10 years.

Jude Maul received a master’s degree in plant biochemistry from University of Kentucky and a PhD in horticulture and biogeochemistry from Cornell University in 2008. Since then he has been with the USDA-ARS as a research ecologist in the Sustainable Agriculture System Laboratory. Jude’s research focuses on molecular ecology at the plant/soil/water interface in the context of plant health, nutrient acquisition and productivity. Taking a systems approach to agroecosystem research, Jude leads the USDA-ARS-LTAR Soils Working group which is creating an national soils data repository which coincides with his research results contributing to national soil health management recommendations.

About this journal

Sustainability Science and Technology is an interdisciplinary, open access journal dedicated to advances in science, technology, and engineering that can contribute to a more sustainable planet. It focuses on breakthroughs in all science and engineering disciplines that address one or more of the three sustainability pillars: environmental, social and/or economic.
Editor-in-chief: Jonas Baltrusaitis, Lehigh University, USA

 

The post Sustainability spotlight: PFAS unveiled appeared first on Physics World.

String theory may be inevitable as a unified theory of physics, calculations suggest


Striking evidence that string theory could be the sole viable “theory of everything” has emerged in a new theoretical study of particle scattering that was done by a trio of physicists in the US. By unifying all fundamental forces of nature, including gravity, string theory could provide the long-sought quantum description of gravity that has eluded scientists for decades.

The research was done by Caltech’s Clifford Cheung and Aaron Hillman along with Grant Remmen at New York University. They have delved into the intricate mathematics of scattering amplitudes, which are quantities that encapsulate the probabilities of particles interacting when they collide.

Through a novel application of the bootstrap approach, the trio demonstrated that imposing general principles of quantum mechanics uniquely determines the scattering amplitudes of particles at the smallest scales. Remarkably, the results match the string scattering amplitudes derived in earlier works. This suggests that string theory may indeed be an inevitable description of the universe, even as direct experimental verification remains out of reach.

“A bootstrap is a mathematical construction in which insight into the physical properties of a system can be obtained without having to know its underlying fundamental dynamics,” explains Remmen. “Instead, the bootstrap uses properties like symmetries or other mathematical criteria to construct the physics from the bottom up, ‘effectively pulling itself up by its bootstraps’. In our study, we bootstrapped scattering amplitudes, which describe the quantum probabilities for the interactions of particles or strings.”

Why strings?

String theory posits that the elementary building blocks of the universe are not point-like particles but instead tiny, vibrating strings. The different vibrational modes of these strings give rise to the various particles observed in nature, such as electrons and quarks. This elegant framework resolves many of the mathematical inconsistencies that plague attempts to formulate a quantum description of gravity. Moreover, it unifies gravity with the other fundamental forces: electromagnetic, weak, and strong interactions.

However, a major hurdle remains. The characteristic size of these strings is estimated to be around 10⁻³⁵ m, which is roughly 15 orders of magnitude smaller than the resolution of today’s particle accelerators, including the Large Hadron Collider. This makes experimental verification of string theory extraordinarily challenging, if not impossible, for the foreseeable future.

Faced with the experimental inaccessibility of strings, physicists have turned to theoretical methods like the bootstrap to test whether string theory aligns with fundamental principles. By focusing on the mathematical consistency of scattering amplitudes, the researchers imposed constraints based on basic quantum mechanical requirements on the scattering amplitudes such as locality and unitarity.

“Locality means that forces take time to propagate: particles and fields in one place don’t instantaneously affect another location, since that would violate the rules of cause-and-effect,” says Remmen. “Unitarity is conservation of probability in quantum mechanics: the probability for all possible outcomes must always add up to 100%, and all probabilities are positive. This basic requirement also constrains scattering amplitudes in important ways.”

In addition to these principles, the team introduced further general conditions, such as the existence of an infinite spectrum of fundamental particles and specific high-energy behaviour of the amplitudes. These criteria have long been considered essential for any theory that incorporates quantum gravity.

Unique solution

Their result is a unique solution to the bootstrap equations, which turned out to be the Veneziano amplitude — a formula originally derived to describe string scattering. This discovery strongly indicates that string theory meets the most essential criteria for a quantum theory of gravity. However, the definitive answer to whether string theory is truly the “theory of everything” must ultimately come from experimental evidence.
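For reference, the Veneziano amplitude can be written in a common convention, with a linear Regge trajectory, as

\[
\mathcal{A}(s,t) \;=\; \frac{\Gamma\!\left(-\alpha(s)\right)\,\Gamma\!\left(-\alpha(t)\right)}{\Gamma\!\left(-\alpha(s)-\alpha(t)\right)},
\qquad \alpha(s) = \alpha(0) + \alpha' s,
\]

where s and t are the Mandelstam invariants of the 2 → 2 scattering process and Γ is the Euler gamma function. The poles of the gamma functions at non-negative integer values of α(s) and α(t) encode the infinite tower of string states that the bootstrap conditions demand.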

Cheung explains, “Our work asks: what is the precise math problem whose solution is the scattering amplitude of strings? And is it the unique solution?”. He adds, “This work can’t verify the validity of string theory, which like all questions about nature is a question for experiment to resolve. But it can help illuminate whether the hypothesis that the world is described by vibrating strings is actually logically equivalent to a smaller, perhaps more conservative set of bottom up assumptions that define this math problem.”

The trio’s study opens up several avenues for further exploration. One immediate goal for the researchers is to generalize their analysis to more complex scenarios. For instance, the current work focuses on the scattering of two particles into two others. Future studies will aim to extend the bootstrap approach to processes involving multiple incoming and outgoing particles.

Another direction involves incorporating closed strings, which are loops that are distinct from the open strings analysed in this study. Closed strings are particularly important in string theory because they naturally describe gravitons, the hypothetical particles responsible for mediating gravity. While closed string amplitudes are more mathematically intricate, demonstrating that they too arise uniquely from the bootstrap equations would further bolster the case for string theory.

The research is described in Physical Review Letters.

The post String theory may be inevitable as a unified theory of physics, calculations suggest appeared first on Physics World.

Ceryx Medical: company uses bioelectronics to coordinate the heart and lungs

Heart failure is a serious condition that occurs when a damaged heart loses its ability to pump blood around the body. It affects as many as 100 million people worldwide and it is a progressive disease: five years after diagnosis, half of all patients with heart failure will have died.

The UK-based company Ceryx Medical has created a new bioelectronic device called Cysoni, which is designed to adjust the pace of the heart as a patient breathes in and out. This mimics a normal physiological process called respiratory sinus arrhythmia, which can be absent in people with heart failure. The company has just begun the first trial of Cysoni on human subjects.

This podcast features the biomedical engineer Stuart Plant and the physicist Ashok Chauhan, who are Ceryx Medical’s CEO and senior scientist respectively. In a wide-ranging conversation with Physics World’s Margaret Harris, they talk about how bioelectronics could be used to treat heart failure and some other diseases. Chauhan and Plant also chat about the challenges and rewards of developing medical technologies within a small company.

The post Ceryx Medical: company uses bioelectronics to coordinate the heart and lungs appeared first on Physics World.
