Today — 30 January 2025 — Physics World

Mark Thomson looks to the future of CERN and particle physics

30 January 2025 at 15:27

This episode of the Physics World Weekly podcast features Mark Thomson, who will become the next director-general of CERN in January 2026. In a conversation with Physics World’s Michael Banks, Thomson shares his vision of the future of the world’s preeminent particle physics lab, which is home to the Large Hadron Collider (LHC).

They chat about the upcoming high-luminosity upgrade to the LHC (HL-LHC), which will be completed in 2030. The interview explores long-term strategies for particle physics research and the challenges of managing large international scientific organizations. Thomson also looks back on his career in particle physics and his involvement with some of the field’s biggest experiments.

This podcast is supported by Atlas Technologies, specialists in custom aluminium and titanium vacuum chambers as well as bonded bimetal flanges and fittings used everywhere from physics labs to semiconductor fabs.

The post Mark Thomson looks to the future of CERN and particle physics appeared first on Physics World.

Filter inspired by deep-sea sponge cleans up oil spills

By: No Author
30 January 2025 at 14:00

Oil spills can pollute large volumes of surrounding water – thousands of times greater than the spill itself – causing long-term economic, environmental, social and ecological damage. Effective methods for in situ capture of spilled oil are thus essential to minimize contamination from such disasters.

Many oil spill cleanup technologies, however, exhibit poor hydrodynamic stability under complex flow conditions, which leads to poor oil-capture efficiency. To address this shortfall, researchers from Harbin Institute of Technology in China have come up with a new approach to oil cleanup using a vortex-anchored filter (VAF).

“Since the 1979 Atlantic Empress disaster, interception and adsorption have been the primary methods for oil spill recovery, but these are sensitive to water-flow fluctuation,” explains lead author Shijie You. Oil-in-water emulsions from leaking pipelines and offshore industrial discharge are particularly challenging, says You, adding that “these problems inspire us to consider how we can address hydrodynamic stability of oil-capture devices under turbulent conditions”.

Inspired by the natural world

You and colleagues believe that the answers to oil spill challenges could come from nature – arguably the world’s greatest scientist. They found that the deep-sea glass sponge E. aspergillum, which lives at depths of up to 1000 m in the Pacific Ocean, filter feeds with high effectiveness, selectivity and robustness, and that its food particles are similar to oil droplets.

The anatomical structure of E. aspergillum – also known as Venus’ flower basket – provided inspiration for the researchers to design their VAF. By mimicking the skeletal architecture and filter-feeding patterns of the sponge, they created a filter that exhibits high mass transfer and hydrodynamic stability when cleaning up oil spills under turbulent flow.

“The E. aspergillum has a multilayered skeleton–flagellum architecture, which creates 3D streamlines with frequent collision, deflection, convergence and separation,” explains You. “This can dissipate macro-scale turbulent flows into small-scale swirling flow patterns called low-speed vortical flows within the body cavity, which reduces hydrodynamic load and enhances interfacial mass transfer.”

For the sponges, this allows them to maintain a high mechanical stability while absorbing nutrients from the water. The same principles can be applied to synthetic materials for cleaning up oil spills.

Design of the vortex-anchored filter
VAF design Skeletal motif of E. aspergillum and (right column) front and top views of the VAF with a bio-inspired hollow cylinder skeleton and flagellum adsorbent. (Courtesy: Y Yu et al. Nat. Commun. 10.1038/s41467-024-55587-y)

The VAF is a synthetic form of the sponge’s architecture and, according to You, “is capable of transferring kinematic energy from an external water flow into multiple small-scale low-speed vortical flows within the body cavity to enhance hydrodynamic stability and oil capture efficiency”.

The tubular outer skeleton of the VAF comprises a helical ridge and chequerboard lattice. It is this skeleton that creates a slow vortex field inside the cavity and enables mass transfer of oil during the filtering process. Once the oil has been forced into the filter, the internal area – composed of flagellum-shaped adsorbent materials – provides a large interfacial area for oil adsorption.

Using the VAF to clean up oil spills

The researchers used their nature-inspired VAF to clean up oil spills under complex hydrodynamic conditions. You states that “the VAF can retain the external turbulent-flow kinetic energy in the low-speed vortical flows – with a small Kolmogorov microscale (85 µm) [the size of the smallest eddy in a turbulent flow] – inside the cavity of the skeleton, leading to enhanced interfacial mass transfer and residence time”.
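
For readers unfamiliar with the term, the Kolmogorov microscale is the standard measure of the smallest eddies in a turbulent flow, below which viscosity takes over and dissipates the kinetic energy. As a rough textbook estimate (not a calculation taken from the paper), it is set by the kinematic viscosity ν of the fluid and the turbulent energy dissipation rate ε:

    \[ \eta = \left( \frac{\nu^{3}}{\varepsilon} \right)^{1/4} \]

Taking ν ≈ 10⁻⁶ m² s⁻¹ for water, a microscale of η ≈ 85 µm corresponds to a dissipation rate of ε = ν³/η⁴ ≈ 0.02 m² s⁻³ inside the cavity.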

“This led to an improvement in the hydrodynamic stability of the filter compared to other approaches by reducing the Reynolds stresses in nearly quiescent wake flows,” You explains. The filter was also highly resistant to bending stresses caused at the boundary of the filter when trying to separate viscous fluids. When put into practice, the VAF was able to capture more than 97% of floating, underwater and emulsified oils, even under strong turbulent flow.

When asked how the researchers plan to improve the filter further, You tells Physics World that they “will integrate the VAF with photothermal, electrothermal and electrochemical modules for environmental remediation and resource recovery”.

“We look forward to applying VAF-based technologies to solve sea pollution problems with a filter that has an outstanding flexibility and adaptability, easy-to-handle operability and scalability, environmental compatibility and life-cycle sustainability,” says You.

The research is published in Nature Communications.

The post Filter inspired by deep-sea sponge cleans up oil spills appeared first on Physics World.

Anomalous Hall crystal made from twisted graphene

By: No Author
30 January 2025 at 10:25

A topological electronic crystal (TEC) in which the quantum Hall effect emerges without the need for an external magnetic field has been unveiled by an international team of physicists. Led by Josh Folk at the University of British Columbia, the group observed the effect in a stack of bilayer and trilayer graphene that is twisted at a specific angle.

In a classical electrical conductor, the Hall voltage and its associated resistance appear perpendicular to both the direction of an applied electrical current and an applied magnetic field. A similar effect is also seen in 2D electron systems that have been cooled to ultra-low temperatures. But in this case, the Hall resistance becomes quantized in discrete steps.
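
For reference, the quantized values follow a simple relation (a standard textbook result, not something specific to this study):

    \[ R_{xy} = \frac{h}{\nu e^{2}} \approx \frac{25.8\ \mathrm{k\Omega}}{\nu} \]

where h is the Planck constant, e is the electron charge and the filling factor ν takes integer values (or fractional values in the fractional quantum Hall effect).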

This quantum Hall effect can emerge in electronic crystals, also known as Wigner crystals. These are arrays of electrons that are held in place by their mutual repulsion. Some researchers have considered the possibility of a similar effect occurring in structures called TECs, but without an applied magnetic field. This is called the “quantum anomalous Hall effect”.

Anomalous Hall crystal

“Several theory groups have speculated that analogues of these structures could emerge in quantized anomalous Hall systems, giving rise to a type of TEC termed an ‘anomalous Hall crystal’,” Folk explains. “This structure would be insulating, due to a frozen-in electronic ordering in its interior, with dissipation-free currents along the boundary.”

For Folk’s team, the possibility of anomalous Hall crystals emerging in real systems was not the original focus of their research. Initially, a team at the University of Washington had aimed to investigate the diverse phenomena that emerge when two or more flakes of graphene are stacked on top of each other and twisted relative to each other at different angles.

While many interesting behaviours emerged from these structures, one particular stack caught the attention of Washington’s Dacen Waters, which inspired his team to get in touch with Folk and his colleagues in British Columbia.

In the vast majority of cases, the twisted structures studied by the team had moiré patterns that were very disordered. Moiré patterns occur when two lattices are overlaid and rotated relative to each other. Yet out of tens of thousands of permutations of twisted graphene stacks, one structure appeared to be different.
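
For a sense of scale, a standard geometric estimate (not a figure quoted by the team) gives the moiré period λ of two identical lattices with lattice constant a twisted by a small angle θ as

    \[ \lambda \approx \frac{a}{2\sin(\theta/2)} \]

For graphene, a ≈ 0.246 nm, so the 1.5° twist described below corresponds to λ ≈ 9.4 nm – roughly 40 times the atomic spacing – which is why such small twists can so strongly modulate a stack’s electronic properties.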

Exceptionally low levels of disorder

“One of the stacks seemed to have exceptionally low levels of disorder,” Folk describes. “Waters shared that one with our group to explore in our dilution refrigerator, where we have lots of experience measuring subtle magnetic effects that appear at a small fraction of a degree above absolute zero.”

As they studied this highly ordered structure, the team found that its moiré pattern helped to modulate the system’s electronic properties, allowing a TEC to emerge.

“We observed the first clear example of a TEC, in a device made up of bilayer graphene stacked atop trilayer graphene with a small, 1.5° twist,” Folk explains. “The underlying topology of the electronic system, combined with strong electron-electron interactions, provide the essential ingredients for the crystal formation.”

After decades of theoretical speculation, Folk, Waters and colleagues have identified an anomalous Hall crystal, where the quantum Hall effect emerges from an in-built electronic structure, rather than an applied magnetic field.

Beyond confirming the theoretical possibility of TECs, the researchers are hopeful that their results could lay the groundwork for a variety of novel lines of research.

“One of the most exciting long-term directions this work may lead is that the TEC by itself – or perhaps a TEC coupled to a nearby superconductor – may host new kinds of particles,” Folk says. “These would be built out of the ‘normal’ electrons in the TEC, but totally unlike them in many ways: such as their fractional charge, and properties that would make them promising as topological qubits.”

The research is described in Nature.

The post Anomalous Hall crystal made from twisted graphene appeared first on Physics World.

Yesterday — 29 January 2025 — Physics World

Imaging reveals how microplastics may harm the brain

By: Tami Freeman
29 January 2025 at 13:00

Pollution from microplastics – small plastic particles less than 5 mm in size – poses an ongoing threat to human health. Independent studies have found microplastics in human tissues and within the bloodstream. And as blood circulates throughout the body and through vital organs, these microplastics can reach critical regions and lead to tissue dysfunction and disease. Microplastics can also cause functional irregularities in the brain, but exactly how they exert neurotoxic effects remains unclear.

A research collaboration headed up at the Chinese Research Academy of Environmental Sciences and Peking University has shed light on this conundrum. In a series of cerebral imaging studies reported in Science Advances, the researchers tracked the progression of fluorescent microplastics through the brains of mice. They found that microplastics entering the bloodstream become engulfed by immune cells, which then obstruct blood vessels in the brain and cause neurobehavioral abnormalities.

“Understanding the presence and the state of microplastics in the blood is crucial. Therefore, it is essential to develop methods for detecting microplastics within the bloodstream,” explains principal investigator Haipeng Huang from Peking University. “We focused on the brain due to its critical importance: if microplastics induce lesions in this region, it could have a profound impact on the entire body. Our experimental technology enables us to observe the blood vessels within the brain and detect microplastics present in these vessels.”

In vivo imaging

Huang and colleagues developed a microplastics imaging system by integrating a two-photon microscopy system with fluorescent plastic particles and demonstrated that it could image brain blood vessels in awake mice. They then fed five mice with water containing 5-µm diameter fluorescent microplastics. After a couple of hours, fluorescence images revealed microplastics within the animals’ cerebral vessels.

The microplastic flash
Lightning bolt The “MP-flash” observed as two plastic particles rapidly fly through the cerebral blood vessels. (Courtesy: Haipeng Huang)

As they move through rapidly flowing blood, the microplastics generate a fluorescence signal resembling a lightning bolt, which the researchers call a “microplastic flash” (MP-flash). This MP-flash was observed in four of the mice, with the entire MP-flash trajectory captured in a single imaging frame of less than 208 ms.

Three hours after administering the microplastics, the researchers observed fluorescent cells in the bloodstream. The signals from these cells were of comparable intensity to the MP-flash signal, suggesting that the cells had engulfed microplastics in the blood to create microplastic-labelled cells (MPL-cells). The team note that the microplastics did not directly attach to the vessel wall or cross into brain tissue.

To test this idea further, the researchers injected microplastics directly into the bloodstream of the mice. Within minutes, they saw the MP-flash signal in the brain’s blood vessels, and roughly 6 min later MPL-cells appeared. No fluorescent cells were seen in non-treated mice. Flow cytometry of mouse blood after microplastics injection revealed that the MPL-cells, which were around 21 µm in diameter, were immune cells, mostly neutrophils and macrophages.

Tracking these MPL-cells revealed that they sometimes became trapped within a blood vessel. Some cells exited the imaging field following a period of obstruction while others remained in cerebral vessels for extended durations, in some instances for nearly 2.5 h of imaging. The team also found that one week after injection, the MPL-cells had still not cleared, although the density of blockages was much reduced.

“[While] most MPL-cells flow rapidly with the bloodstream, a small fraction become trapped within the blood vessels,” Huang tells Physics World. “We provide an example where an MPL-cell is trapped at a microvascular turn and, after some time, is fortunate enough to escape. Many obstructed cells are less fortunate, as the blockage may persist for several weeks. Obstructed cells can also trigger a crash-like chain reaction, resulting in several MPL-cells colliding in a single location and posing significant risks.”

The MPL-cell blockages also impeded blood flow in the mouse brain. Using laser speckle contrast imaging to monitor blood flow, the researchers saw reduced perfusion in the cerebral cortical vessels, notably at 30 min after microplastics injection and particularly affecting smaller vessels.

Laser speckle contrast images showing blood flow in the mouse brain
Reduced blood flow These laser speckle contrast images show blood flow in the mouse brain at various times after microplastics injection. The images indicate that blockages of microplastic-labelled cells inhibit perfusion in the cerebral cortical vessels. (Courtesy: Huang et al. Sci. Adv. 11 eadr8243 (2025))

Changing behaviour

Lastly, Huang and colleagues investigated whether the reduced blood supply to the brain caused by cell blockages led to behavioural changes in the mice. In an open-field experiment (used to assess rodents’ exploratory behaviour), mice injected with microplastics travelled shorter distances at lower speeds than mice in the control group.

The Y-maze test for assessing memory also showed that microplastics-treated mice travelled smaller total distances than control animals, with a significant reduction in spatial memory. Tests to evaluate motor coordination and endurance revealed that microplastics additionally inhibited motor abilities. By day 28 after injection, these behavioural impairments had resolved, corresponding with the observed clearing of MPL-cell obstructions in the cerebral vasculature at 28 days.

The researchers conclude that their study demonstrates that microplastics harm the brain indirectly – via cell obstruction and disruption of blood circulation – rather than by directly penetrating tissue. They emphasize, however, that this mechanism may not necessarily apply to humans, who have roughly 1200 times the circulating blood volume of mice and significantly different vascular diameters.

“In the future, we plan to collaborate with clinicians,” says Huang. “We will enhance our imaging techniques for the detection of microplastics in human blood vessels, and investigate whether the ‘MPL-cell-car-crash’ happens in humans. We anticipate that this research will lead to exciting new discoveries.”

Huang emphasizes how the use of fluorescent microplastic imaging technology has fundamentally transformed research in this field over the past five years. “In the future, advancements in real-time imaging of depth and the enhanced tracking ability of microplastic particles in vivo may further drive innovation in this area of study,” he says.

The post Imaging reveals how microplastics may harm the brain appeared first on Physics World.

What ‘equity’ really means for physics

By: No Author
29 January 2025 at 12:09

If you have worked in a university, research institute or business during the past two decades you will be familiar with the term equality, diversity and inclusion (EDI). There is likely to be an EDI strategy that includes measures and targets to nurture a workforce that looks more like the wider population and a culture in which everyone can thrive. You may find a reasoned business case for EDI, which extends beyond the organization’s legal obligations, to reflect and understand the people that you work with.

Look more closely and it is possible that the “E” in EDI is not actually equality, but rather equity. Equity is increasingly being used as a more active commitment, not least by the Institute of Physics, which publishes Physics World.  How, though, is equity different to equality? What is causing this change of language and will it make any difference in practice?

These questions have become more pressing as discussions around equality and equity have become entwined in the culture wars. This is a particularly live issue in the US, where Donald Trump, at the start of his second term as president, has begun to withdraw funding from EDI activities. But it has also influenced science policy in the UK.

The distinction between equality and equity is often illustrated by a cartoon published in 2016 by the UK artist Angus Maguire (above). It shows a fence and people of different heights gaining an equal view of a baseball match thanks to the different numbers of crates they stand on. This has itself, however, resulted in arguments about other factors, such as the conditions necessary to watch the game in the stadium, or indeed to join in. That requires consideration of how the teams and the stadium could adapt to the needs of all potential participants, but also how these changes might affect the experience of others involved.

In terms of education, the Organization for Economic Co-operation and Development (OECD) states that equity “does not mean that all students obtain equal education outcomes, but rather that differences in students’ outcomes are unrelated to their background or to economic and social circumstances over which the students have no control”. This is an admirable goal, but there are questions about how to achieve it.

In OECD member countries, freedom of choice and competition yield social inequalities that flow through to education and careers. This means that governments are continually balancing the benefits of inspiring and rewarding individuals alongside concerns about group injustice.

In 2024, we hosted a multidisciplinary workshop about equity in science, and especially physics. Held at the University of Birmingham, it brought together physicists at different career stages with social scientists and people who had worked on science and education in government, charities and learned societies. At the event, social scientists told us that equality is commonly conceived as a basic right to be treated equally and not discriminated against, regardless of personal characteristics. This right provides a platform for “equality of opportunity” whereby barriers are removed so talent and effort can be rewarded.

In the UK, the promotion of equality of opportunity is enshrined within the country’s Equality Act 2010 and underpins current EDI work in physics. This includes measures to promote physics to young people in deprived areas, and to women and ethnic minorities, as well as mentoring and additional academic and financial support through all stages of education and careers.  It extends to re-shaping the content and promotion of physics courses in universities so they are more appealing and responsive to a wider constituency. In many organizations, there is also training for managers to combat discrimination and bias, whether conscious or not.

Actions like these have helped to improve participation and progression across physics education and careers, but there is still significant underrepresentation and marginalization due to gender, ethnicity and social background. This is not unusual in open and competitive societies where the effects of promoting equal opportunities are often outweighed by the resources and connections of people with characteristics that are highly represented. Talent and effort are crucial in “high-performance” sectors such as academia and industry, but they are not the only factors influencing success.

Physicists at the meeting told us that they are motivated by intellectual curiosity, fascination with the natural world and love for their subject. Yet there is also, in physics, a culture of “genius” and competition, in which confidence is crucial. Facilities and working conditions, which often involve short-term contracts and international mobility, are difficult to balance alongside other life commitments. Although inequalities and exclusions are recognized, they are often ascribed to broader social factors or the inherent requirements of research. As a result, physicists tend not to accept responsibility for inequities within the discipline.

Physics has a culture of “hyper-meritocracy” where being correct counts more than respecting others

Many physicists want merit to be a reflection of talent and effort. But we identified that physics has a culture of “hyper-meritocracy” where being correct counts more than respecting others. Across the community, some believe in positive action beyond the removal of discrimination, but others can be actively hostile to any measure associated with EDI. This is a challenging environment for any young researcher and we heard distressing stories of isolation from women and colleagues who had hidden disabilities or those who were the first in their family to go to university.

The experience, positive or not, when joining a research group as a postgraduate or postdoctoral researcher is often linked with the personality of leaders. Peer groups and networks have helped many physicists through this period of their career, but it is also where the culture in a research group or department can drive some to the margins and ultimately out of the profession. In environments like this, equal opportunities have proved insufficient to advance diversity, let alone inclusion.

Culture change

Organizations that have replaced equality with equity want to signal a commitment not just to equal treatment, but also more equitable outcomes. However, those who have worked in government told us that some people become disengaged, thinking such efforts can only be achieved by reducing standards and threatening cultures they value. Given that physics needs technical proficiency and associated resources and infrastructure, it is not a discipline where equity can mean an equal distribution of positions and resources.

Physics can, though, counter the influence of wider inequalities by helping colleagues who are under-represented to gain the attributes, experiences and connections that are needed to compete successfully for doctoral studentships, research contracts and academic positions. It can also face up to its cultural problems, so colleagues who are minoritized feel less marginalized and they are ultimately recognized for their efforts and contributions.

This will require physicists giving more prominence to marginalized voices as well as critically and honestly examining their culture and tackling unacceptable behaviour. We believe we can achieve this by collaborating with our social science colleagues. That includes gathering and interpreting qualitative data, so there is shared understanding of problems, as well as designing strategies with people who are most affected, so that everyone has a stake in success.

If this happens, we can look forward to a physics community that genuinely practices equity, rather than espousing equality of opportunity.

The post What ‘equity’ really means for physics appeared first on Physics World.

Watch this amazing quantum-inspired stained-glass artwork in all its glory

29 January 2025 at 10:29
This video has no voice over. (Video courtesy: Space Production)

The aim of the International Year of Quantum Science & Technology (IYQ) in 2025 is to help raise the public’s awareness of the importance and impact of quantum science and applications on all aspects of life.

Ukraine-born artist Oksana Kondratyeva has certainly taken that message to heart. A London-based designer and producer of architectural glass art, she has recently created an intriguing piece of stained glass inspired by the casing for a quantum computer.

In this video specially made by Kondratyeva for Physics World, you can see her artwork, which was displayed at the 2024 British Glass Biennale, and glimpse the artist in the protective gear she wears while working with the chemicals to make her piece.

To discover more on this topic, take a look at the recent Physics World article: A ‘quantum rose’ for the 21st century: Oksana Kondratyeva on her stained-glass art inspired by a quantum computer

In the feature, Kondratyeva describes how her work fuses science and art – and reveals how the collaboration with Rigetti came about. As it happens, it was an article in Physics World during another international year – devoted to glass – that inspired the project.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the next 12 months for more coverage of the IYQ.

Find out more on our quantum channel.

The post Watch this amazing quantum-inspired stained-glass artwork in all its glory appeared first on Physics World.

From the day before yesterday — Physics World

When Bohr got it wrong: the impact of a little-known paper on the development of quantum theory

By: No Author
28 January 2025 at 19:00
Niels Bohr, illustration
Brilliant mind Illustration of the Danish physicist and Nobel laureate Niels Bohr (1885-1962). Bohr made numerous contributions to physics during his career, but it was his work on atomic structure and quantum theory that won him the 1922 Nobel Prize for Physics. (Courtesy: Sam Falconer, Debut Art/Science Photo Library)

One hundred and one years ago, Danish physicist Niels Bohr proposed a radical theory together with two young colleagues – Hendrik Kramers and John Slater – in an attempt to resolve some of the most perplexing issues in fundamental physics at the time. Entitled “The Quantum Theory of Radiation”, and published in the Philosophical Magazine, their hypothesis was quickly proved wrong, and has since become a mere footnote in the history of quantum mechanics.

Despite its swift demise, their theory perfectly illustrates the sense of crisis felt by physicists at that moment, and the radical ideas they were prepared to contemplate to resolve it. For in their 1924 paper Bohr and his colleagues argued that the discovery of the “quantum of action” might require the abandonment of nothing less than the first law of thermodynamics: the conservation of energy.

As we celebrate the centenary of Werner Heisenberg’s 1925 quantum breakthrough with the International Year of Quantum Science and Technology (IYQ) 2025, Bohr’s 1924 paper offers a lens through which to look at how the quantum revolution unfolded. Most physicists at that time felt that if anyone was going to rescue the field from the crisis, it would be Bohr. Indeed, this attempt clearly shows signs of the early rift between Bohr and Albert Einstein about the quantum realm that would turn into a lifelong argument. Remarkably, the paper also drew on an idea that later featured in one of today’s most prominent alternatives to Bohr’s “Copenhagen” interpretation of quantum mechanics.

Genesis of a crisis

The quantum crisis began when German physicist Max Planck proposed the quantization of energy in 1900, as a mathematical trick for calculating the spectrum of radiation from a warm, perfectly absorbing “black body”. Later, in 1905, Einstein suggested taking this idea literally to account for the photoelectric effect, arguing that light consisted of packets or quanta of electromagnetic energy, which we now call photons.

Bohr entered the story in 1912 when, working in the laboratory of Ernest Rutherford in Manchester, he devised a quantum theory of the atom. In Bohr’s picture, the electrons encircling the atomic nucleus (which Rutherford had discovered in 1911) are constrained to specific orbits with quantized energies. The electrons can hop between orbits in “quantum jumps” by emitting or absorbing photons with the corresponding energy.
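
In modern notation (the standard textbook form rather than Bohr’s original 1913 presentation), the allowed energies of the electron in hydrogen and the frequency ν of the light emitted or absorbed in a jump between orbits are

    \[ E_{n} = -\frac{13.6\ \mathrm{eV}}{n^{2}}, \qquad h\nu = E_{n_i} - E_{n_f} \]

with n = 1, 2, 3, … labelling the orbits; it is this pair of relations that allowed Bohr to reproduce the hydrogen spectrum.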

Albert Einstein and Niels Bohr
Conflicting views Stalwart physicists Albert Einstein and Niels Bohr had opposing views on quantum fundamentals from early on, which turned into a lifelong scientific argument between the two. (Paul Ehrenfest/Wikimedia Commons)

Bohr had no theoretical justification for this ad hoc assumption, but he showed that, by accepting it, he could predict (more or less) the spectrum of the hydrogen atom. For this work Bohr was awarded the 1922 Nobel Prize for Physics, the same year that Einstein collected the prize for his work on light quanta and the photoelectric effect (he had been awarded it in 1921 but was unable to attend the ceremony).

After establishing an institute of theoretical physics (now the Niels Bohr Institute) in Copenhagen in 1917, Bohr’s mission was to find a true theory of the quantum: a mechanics to replace, at the atomic scale, the classical physics of Isaac Newton that worked at larger scales. It was clear that classical physics did not work at the scale of the atom, although Bohr’s correspondence principle asserted that quantum theory should give the same results as classical physics at a large enough scale.

Hendrik Kramers
Mathematical mind Dutch physicist Hendrik Kramers spent 10 years as Niels Bohr’s assistant in Copenhagen. (Wikimedia Commons)

Quantum theory was at the forefront of physics at the time, and so was the most exciting topic for any aspiring young physicist. Three groups stood out as the most desirable places to work for anyone seeking a fundamental mathematical theory to replace the makeshift and sometimes contradictory “old” quantum theory that Bohr had cobbled together: that of Arnold Sommerfeld in Munich, of Max Born in Göttingen, and of Bohr in Copenhagen.

Dutch physicist Hendrik Kramers had hoped to work on his doctorate with Born – but in 1916 the First World War ruled that out, and so he opted instead for Copenhagen, in politically neutral Denmark. There he became Bohr’s assistant for ten years: as was the case with several of Bohr’s students, Kramers did the maths (it was never Bohr’s forte) while Bohr supplied the ideas, philosophy and kudos. Kramers ended up working on an impressive range of problems, from chemical physics to pure mathematics.

Reckless and radical

One of the most vexing questions for Bohr and his Copenhagen circle in the early 1920s was how to think about electron orbits in atoms. Try as they might, they couldn’t find a way to make the orbits “fit” with experimental observations of atomic spectra.

Perhaps, in quantum systems like atoms, we have to abandon any attempt to construct a physical picture at all

Bohr and others, including Heisenberg, began to voice a possibility that seemed almost reckless: perhaps, in quantum systems like atoms, we have to abandon any attempt to construct a physical picture at all. Maybe we just can’t think of quantum particles as objects moving along trajectories in space and time.

This struck others, such as Einstein, as desperate, if not crazy. Surely the goal of science had always been to offer a picture of the world in terms of “things happening to objects in space”. What else could there be but that? How could we just give it all up?

But it was worse than that. For one thing, Bohr’s quantum jumps were supposed to happen instantaneously: an electron, say, jumping from one orbit to another in no time at all. In classical physics, everything happens continuously: a particle gets from here to there by moving smoothly across the intervening space, in some finite time. To some – like the Austrian physicist Erwin Schrödinger in Vienna – the discontinuities of quantum jumps seemed to border on the obscene.

Worse still was the fact that while the old quantum theory stipulated the energy of quantum jumps, there was nothing to dictate when they would happen – they simply did. In other words, there was no causal kick that instigated a quantum jump: the electron just seemed to make up its own mind about when to jump. As Heisenberg would later proclaim in his 1927 paper on the uncertainty principle (Zeitschrift für Physik 43 172),  quantum theory “establishes the final failure of causality”.

Such notions were not the only source of friction between the Copenhagen team and Einstein. Bohr didn’t like light quanta. While they seemed to explain the photoelectric effect, Bohr was convinced that light had to be fundamentally wave-like, so that photons (to use the anachronistic term) were only a way of speaking, not real entities.

To add to the turmoil in 1924, the French physicist Louis de Broglie had, in his doctoral thesis for the Sorbonne, turned the quantum idea on its head by proposing that particles such as electrons might show wave-like behaviour. Einstein had at first considered this too wild, but soon came round to the idea.

Go where the waves take you

In 1924 these virtually heretical ideas were only beginning to surface, but they were creating such a sense of crisis that it seemed anything was possible. In the early 1970s, science historian Paul Forman suggested that the feverish atmosphere in physics was part of an even wider cultural current. By rejecting causality and materialism, the German quantum physicists, Forman said, were attempting to align their ideas with a rejection of mechanistic thinking while embracing the irrational – as was the fashion in the philosophical and intellectual circles of the beleaguered Weimar republic. The idea has been hotly debated by historians and philosophers of science – but it was surely in Copenhagen, not Munich or Göttingen, that the most radical attitudes to quantum theory were developing.

John Clarke Slater
Particle pilot In 1923, US physicist John Clarke Slater moved to Copenhagen, and suggested the concept of a “virtual field” that spread throughout a quantum system. (Emilio Segrè Visual Archives General Collection/MIT News Office)

Then, just before Christmas in 1923, a new student arrived at Copenhagen. John Clarke Slater, who had a PhD in physics from Harvard, turned up at Bohr’s institute with a bold idea. “You know those difficulties about not knowing whether light is old-fashioned waves or Mr Einstein’s light particles”, he wrote to his family during a spell in Cambridge that November. “I had a really hopeful idea… I have both the waves and the particles, and the particles are sort of carried along by the waves, so that the particles go where the waves take them.” The waves were manifested in a “virtual field” of some kind that spread throughout the system, and they acted to “pilot” the particles.

Bohr was mostly not a fan of Slater’s idea, not least because it retained the light particles that he wished to dispose of. But he liked Slater’s notion of a virtual field that could put one part of a quantum system in touch with others. Together with Slater and Kramers, Bohr prepared a paper in a remarkably short time (especially for him) outlining what became known as the Bohr-Kramers-Slater (BKS) theory. They sent it off to the Philosophical Magazine (where Bohr had published his seminal papers on the quantum atom) at the end of January 1924, and it was published in May (47(281) 785). As was increasingly characteristic of Bohr’s style, it was free of any mathematics (beyond Einstein’s quantum relationship E=hν).

In the BKS picture, an excited atom about to emit light can “communicate continually” with the other atoms around it via the virtual field. The transition, with emission of a light quantum, is then not spontaneous but induced by the virtual field. This mechanism could solve the long-standing question of how an atom “knows” which frequency of light to emit in order to reach another energy level: the virtual field effectively puts the atom “in touch” with all the possible energy states of the system.

The problem was that this meant the emitting atom was in instant communication with its environment all around – which violated the law of causality. Well then, so much the worse for causality: BKS abandoned it. The trio’s theory also violated the conservation of energy and momentum – so they had to go too.

Causality and conservation, abandoned

But wait: hadn’t these conservation laws been proved? In 1923 the American physicist Arthur Compton, at Washington University in St Louis, had shown that when light is scattered by electrons, they exchange energy, and the frequency of the light decreases as it gives up energy to the electrons. The results of Compton’s experiments agreed perfectly with predictions made on the assumptions that light is a stream of quanta (photons) and that their collisions with electrons conserve energy and momentum.
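
The prediction Compton tested follows from treating each scattering event as a collision between a single photon and an electron in which energy and momentum are conserved. The resulting shift in the light’s wavelength – quoted here as the standard result, for context – is

    \[ \Delta\lambda = \frac{h}{m_{e}c}\left(1 - \cos\theta\right) \]

where θ is the angle through which the photon is scattered and h/(m_e c) ≈ 2.4 pm is the Compton wavelength of the electron.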

Ah, said BKS, but that’s only true statistically. The quantities are conserved on average, but not in individual collisions. After all, such statistical outcomes were familiar to physicists: that was the basis of the second law of thermodynamics, which presented the inexorable increase in entropy as a statistical phenomenon that need not constrain processes involving single particles.

The radicalism of the BKS paper got a mixed reception. Einstein, perhaps predictably, was dismissive. “Abandonment of causality as a matter of principle should be permitted only in the most extreme emergency”, he wrote. Wolfgang Pauli, who had worked in Copenhagen in 1922–23, confessed to being “completely negative” about the idea. Born and Schrödinger were more favourable.

But the ultimate arbiter is experiment. Was energy conservation really violated in single-particle interactions? The BKS paper motivated others to find out. In early 1925, German physicists Walther Bothe and Hans Geiger in Berlin looked more closely at Compton’s X-ray scattering by electrons. Having read the BKS paper, Bothe felt that “it was immediately obvious that this question would have to be decided experimentally, before definite progress could be made”.

Walther Bothe and Hans Geiger
Experimental arbitrators German physicists Walther Bothe and Hans Geiger (right) conducted an experiment, prompted by the BKS paper, that looked at X-ray scattering from electrons to determine whether energy is conserved at microscopic scales. (IPP/© Archives of the Max Planck Society)

Geiger agreed, and the duo devised a scheme for detecting both the scattered electron and the scattered photon in separate detectors. If causality and energy conservation were preserved, the detections should be simultaneous; any delay between them could indicate a violation. As Bothe would later recall: “The ‘question to Nature’ which the experiment was designed to answer could therefore be formulated as follows: is it exactly a scatter quantum and a recoil electron that are simultaneously emitted in the elementary process, or is there merely a statistical relationship between the two?”

It was incredibly painstaking work to seek such coincident detections using the resources then available. But in April 1925 Geiger and Bothe reported simultaneity within a millisecond – close enough to make a strong case that Compton’s treatment, which assumed energy conservation, was correct. Compton himself, working with Alfred Simon using a cloud chamber, confirmed that energy and momentum were conserved for individual events (Phys. Rev. 26 289).

Revolutionary defeat… singularly important

Bothe was awarded the 1954 Nobel Prize for Physics for the work. He shared it with Born for his work on quantum theory, and Geiger would surely have been a third recipient, if he had not died in 1945. In his Nobel speech, Bothe definitively stated that “the strict validity of the law of the conservation of energy even in the elementary process had been demonstrated, and the ingenious way out of the wave-particle problem discussed by Bohr, Kramers, and Slater was shown to be a blind alley.”

Bohr was gracious in his defeat, writing to a colleague in April 1925 that “It seems… there is nothing else to do than to give our revolutionary efforts as honourable a funeral as possible.” Yet he was soon to have no need of that particular revolution, for just a few months later Heisenberg, who had returned to Göttingen after working with Bohr in Copenhagen for six months, came up with the first proper theory of quantum mechanics, later called matrix mechanics.

“In spite of its short lifetime, the BKS theory was singularly important,” says historian of science Helge Kragh, now emeritus professor at the Niels Bohr Institute. “Its radically new approach paved the way for a greater understanding, that methods and concepts of classical physics could not be carried over in a future quantum mechanics.”

The Bothe-Geiger experiment that [the paper] inspired was not just an important milestone in early particle physics. It was also a crucial factor in Heisenberg’s argument [about] the probabilistic character of his matrix mechanics

The BKS paper was thus in a sense merely a mistaken curtain-raiser for the main event. But the Bothe-Geiger experiment that it inspired was not just an important milestone in early particle physics. It was also a crucial factor in Heisenberg’s argument that the probabilistic character of his matrix mechanics (and also of Schrödinger’s 1926 version of quantum mechanics, called wave mechanics) couldn’t be explained away as a statistical expression of our ignorance about the details, as it is in classical statistical mechanics.

Quantum concept
Radical approach Despite its swift defeat, the BKS proposal showed how classical concepts could not apply to a quantum reality. (Courtesy: Shutterstock/Vink Fan)

Rather, the probabilities that emerged from Heisenberg’s and Schrödinger’s theories applied to individual events: they were, Heisenberg said, fundamental to the way single particles behave. Schrödinger was never happy with that idea, but today it seems inescapable.

Over the next few years, Bohr and Heisenberg argued that the new quantum mechanics indeed smashed causality and shattered the conventional picture of reality as an objective world of objects moving in space–time with fixed properties. Assisted by Born, Wolfgang Pauli and others, they articulated the “Copenhagen interpretation”, which became the predominant vision of the quantum world for the rest of the century.

Failed connections

Slater wasn’t at all pleased with what became of the idea he took to Copenhagen. Bohr and Kramers had pressured him into accepting their take on it, “without the little lump carried along on the waves”, as he put it in mid-January. “I am willing to let them have their way”, he wrote at the time, but in retrospect he felt very unhappy about his time in Denmark. After the BKS theory was disproved, Bohr wrote to Slater saying “I have a bad conscience in persuading you to our views”.

Slater replied that there was no need for that. But in later life – after he had made a name for himself in solid-state physics – Slater admitted to a great deal of resentment. “I completely failed to make any connection with Bohr”, he said in a 1963 interview with the historian of science Thomas Kuhn. “I fought with them [Bohr and Kramers] so seriously that I’ve never had any respect for those people since. I had a horrible time in Copenhagen.” While most of Bohr’s colleagues and students expressed adulation, Slater’s was a rare dissenting voice.

But Slater might have reasonably felt more aggrieved at what became of his “pilot-wave” idea. Today, that interpretation of quantum theory is generally attributed to de Broglie – who intimated a similar notion in his 1924 thesis, before presenting the theory in more detail at the famous 1927 Solvay Conference – and to American physicist David Bohm, who revitalized the idea in the 1950s. Initially dismissed on both occasions, the de Broglie-Bohm theory has gained advocates in recent years, not least because it can be applied to a classical hydrodynamic analogue, in which oil droplets are steered by waves on an oil surface.

Whether or not it is the right way to think about quantum mechanics, the pilot-wave theory touches on the deep philosophical problems of the field. Can we rescue an objective reality of concrete particles with properties described by hidden variables, as Einstein had advocated, from the fuzzy veil that Bohr and Heisenberg seemed to draw over the quantum world? Perhaps Slater would at least be gratified to know that Bohr has not yet had the last word.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the next 12 months for more coverage of the IYQ.

Find out more on our quantum channel.

The post When Bohr got it wrong: the impact of a little-known paper on the development of quantum theory appeared first on Physics World.

Theorists propose a completely new class of quantum particles

By: No Author
28 January 2025 at 14:04

In a ground-breaking theoretical study, two physicists have identified a new class of quasiparticle called the paraparticle. Their calculations suggest that paraparticles exhibit quantum properties that are fundamentally different from those of familiar bosons and fermions, such as photons and electrons respectively.

Using advanced mathematical techniques, Kaden Hazzard at Rice University in the US and his former graduate student Zhiyuan Wang, now at the Max Planck Institute of Quantum Optics in Germany, have meticulously analysed the mathematical properties of paraparticles and proposed a real physical system that could exhibit paraparticle behaviour.

“Our main finding is that it is possible for particles to have exchange statistics different from those of fermions or bosons, while still satisfying the important physical principles of locality and causality,” Hazzard explains.

Particle exchange

In quantum mechanics, the behaviour of particles (and quasiparticles) is probabilistic in nature and is described by mathematical entities known as wavefunctions. These govern the likelihood of finding a particle in a particular state, as defined by properties like position, velocity, and spin. The exchange statistics of a specific type of particle dictates how its wavefunction behaves when two identical particles swap places.

For bosons such as photons, the wavefunction remains unchanged when particles are exchanged. This means that many bosons can occupy the same quantum state, enabling phenomena like lasers and superfluidity. In contrast, when fermions such as electrons are exchanged, the sign of the wavefunction flips from positive to negative or vice versa. This antisymmetric property prevents fermions from occupying the same quantum state. This underpins the Pauli exclusion principle and results in the electronic structure of atoms and the nature of the periodic table.
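
In symbols, exchange statistics is a statement about what happens to a two-particle wavefunction when identical particles are swapped:

    \[ \psi(x_{2}, x_{1}) = +\,\psi(x_{1}, x_{2}) \ \ \text{(bosons)}, \qquad \psi(x_{2}, x_{1}) = -\,\psi(x_{1}, x_{2}) \ \ \text{(fermions)} \]

Roughly speaking, paraparticles sit outside this binary: as described below, the swap can also act on additional hidden internal labels, transforming the wavefunction in a more general way than a simple change of sign.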

Until now, physicists believed that these two types of particle statistics – bosonic and fermionic – were the only possibilities in 3D space. This is the result of fundamental principles like locality, which states that events occurring at one point in space cannot instantaneously influence events at a distant location.

Breaking boundaries

Hazzard and Wang’s research overturns the notion that 3D systems are limited to bosons and fermions and shows that new types of particle statistics, called parastatistics, can exist without violating locality.

The key insight in their theory lies in the concept of hidden internal characteristics. Beyond the familiar properties like position and spin, paraparticles require additional internal parameters that enable more complex wavefunction behaviour. This hidden information allows paraparticles to exhibit exchange statistics that go beyond the binary distinction of bosons and fermions.

Paraparticles exhibit phenomena that resemble – but are distinct from – fermionic and bosonic behaviours. For example, while fermions cannot occupy the same quantum state, up to two paraparticles could be allowed to coexist at the same point in space. This behaviour strikes a balance between the exclusivity of fermions and the clustering tendency of bosons.

Bringing paraparticles to life

While no elementary particles are known to exhibit paraparticle behaviour, the researchers believe that paraparticles might manifest as quasiparticles in engineered quantum systems or certain materials. A quasiparticle is a particle-like collective excitation of a system. A familiar example is the hole, which is created in a semiconductor when a valence-band electron is excited to the conduction band. The vacancy (or hole) left in the valence band behaves as a positively charged particle that can travel through the semiconductor lattice.

Experimental systems of ultracold atoms created by collaborators of the duo could be one place to look for the exotic particles. “We are working with them to see if we can detect paraparticles there,” explains Wang.

In ultracold atom experiments, lasers and magnetic fields are used to trap and manipulate atoms at temperatures near absolute zero. Under these conditions, atoms can mimic the behaviour of more exotic particles. The team hopes that similar setups could be used to observe paraparticle-like behaviour in higher-dimensional systems, such as 3D space. However, further theoretical advances are needed before such experiments can be designed.

Far-reaching implications

The discovery of paraparticles could have far-reaching implications for physics and technology. Fermionic and bosonic statistics have already shaped our understanding of phenomena ranging from the stability of neutron stars to the behaviour of superconductors. Paraparticles could similarly unlock new insights into the quantum world.

“Fermionic statistics underlie why some systems are metals and others are insulators, as well as the structure of the periodic table,” Hazzard explains. “Bose-Einstein condensation [of bosons] is responsible for phenomena such as superfluidity. We can expect a similar variety of phenomena from paraparticles, and it will be exciting to see what these are.”

As research into paraparticles continues, it could open the door to new quantum technologies, novel materials, and deeper insights into the fundamental workings of the universe. This theoretical breakthrough marks a bold step forward, pushing the boundaries of what we thought possible in quantum mechanics.

The paraparticles are described in Nature.

The post Theorists propose a completely new class of quantum particles appeared first on Physics World.

The secret to academic success? Publish a top paper as a postdoc, study finds

By: No Author
28 January 2025 at 11:53

If you’re a postdoc who wants to nail down that permanent faculty position, it’s wise to publish a highly cited paper after your PhD. That’s the conclusion of a study by an international team of researchers, which finds that publication rates and performance during the postdoc period are key to academic retention and early-career success. Their analysis also reveals that more than four in 10 postdocs drop out of academia.

A postdoc is usually a temporary appointment that is seen as preparation for an academic career. Many researchers, however, end up doing several postdocs in a row as they hunt for a permanent faculty job. “There are many more postdocs than there are faculty positions, so it is a kind of systemic bottleneck,” says Petter Holme, a computer scientist at Aalto University in Finland, who led the study.

Previous research into academic career success has tended to overlook the role of a postdoc, focusing instead on, say, the impact of where researchers did their PhD. To tease out the effect of a postdoc, Holme and colleagues combined information about academics’ career stages from LinkedIn with their publication history obtained from Microsoft Academic Graph. The resulting global dataset covered 45,572 careers spanning 25 years across all academic disciplines.

Overall, they found, 41% of postdocs left academia. But researchers who publish a highly cited paper as a postdoc are much more likely to pursue a faculty career – whether or not they published a highly cited paper during their PhD. Publication rate is also vital: researchers who publish less as postdocs than they did during their PhD are more likely to drop out of academia. Conversely, as productivity increased, so did the likelihood of a postdoc gaining a faculty position.

Expanding horizons

Holme says their results suggest that a researcher only has a few years “to get on the positive feedback loop, where one success leads to another”. In fact, the team found that a “moderate” change in research topic when moving from PhD to postdoc could improve future success. “It is a good thing to change your research focus, but not too much,” says Holme, because it widens a researcher’s perspective without them having to learn an entirely new research topic from scratch.

Likewise, shifting perspective by moving abroad can also benefit postdocs. The analysis shows that a researcher moving abroad for a postdoc boosts their citations, but a move to a different institution in the same country has a negligible impact.

The post The secret to academic success? Publish a top paper as a postdoc, study finds appeared first on Physics World.

Alternative building materials could store massive amounts of carbon dioxide

27 January 2025 at 13:00

Replacing conventional building materials with alternatives that sequester carbon dioxide could allow the world to lock away up to half the CO2 generated by humans each year – about 16 billion tonnes. This is the finding of researchers at the University of California Davis and Stanford University, both in the US, who studied the sequestration potential of materials such as carbonate-based aggregates and biomass fibre in brick.

Despite efforts to reduce greenhouse gas emissions by decarbonizing industry and switching to renewable sources of energy, it is likely that humans will continue to produce significant amounts of CO2 beyond the target “net zero” date of 2050. Carbon storage and sequestration – either at source or directly from the atmosphere – are therefore worth exploring as an additional route towards this goal. Researchers have proposed several possible ways of doing this, including injecting carbon underground or deep under the ocean. However, all these scenarios are challenging to implement practically and pose their own environmental risks.

Modifying common building materials

In the present work, a team of civil engineers and earth systems scientists led by Elisabeth van Roijen (then a PhD student at UC Davis) calculated how much carbon could be stored in modified versions of several common building materials. These include concrete (cement) and asphalt containing carbonate-based aggregates; bio-based plastics; wood; biomass-fibre bricks (from waste biomass); and biochar filler in cement.

The researchers obtained the “16 billion tonnes of CO2” figure by assuming that all aggregates currently employed in concrete would be replaced with carbonate-based versions. They also supplemented 15% of cement with biochar and the remainder with carbonatable cements; increased the amount of wood used in all new construction by 20%; and supplemented 15% of bricks with biomass and the remainder with carbonatable calcium hydroxide. A final element in their calculation was to replace all plastics used in construction today with bio-based plastics and all bitumen with bio-oil in asphalt.

“We calculated the carbon storage potential of each material based on the mass ratio of carbon in each material,” explains van Roijen. “These values were then scaled up based on 2016 consumption values for each material.”
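
To make that arithmetic concrete, here is a minimal Python sketch of the same scaling logic: stored CO2 equals a material’s carbon mass fraction times its annual consumption, converted from carbon to CO2 by the molar-mass ratio 44/12. The material names, fractions and consumption figures below are placeholders invented for illustration, not the study’s inputs.

```python
# A minimal sketch of the scaling logic described above. The material names,
# carbon fractions and consumption figures are invented placeholders, not the
# study's inputs: stored CO2 = carbon mass fraction x annual consumption x 44/12.

C_TO_CO2 = 44.0 / 12.0  # molar-mass ratio converting stored carbon to CO2

# hypothetical data: carbon mass fraction, annual consumption (Gt per year)
materials = {
    "carbonate aggregate": (0.12, 20.0),
    "biomass-fibre brick": (0.25, 1.5),
    "bio-based plastic": (0.60, 0.3),
}

total = 0.0
for name, (carbon_fraction, consumption_gt) in materials.items():
    stored = carbon_fraction * consumption_gt * C_TO_CO2  # Gt CO2 per year
    total += stored
    print(f"{name}: {stored:.2f} Gt CO2 per year")

print(f"total: {total:.2f} Gt CO2 per year")
```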

“The sheer magnitude of carbon storage is pretty impressive”

While the production of some replacement materials would need to increase to meet the resulting demand, van Roijen and colleagues found that resources readily available today – for example, mineral-rich waste streams – would already let us replace 10% of conventional aggregates with carbonate-based ones. “These alone could store 1 billion tonnes of CO2,” she says. “The sheer magnitude of carbon storage is pretty impressive, especially when you put it in the context of the level of carbon dioxide removal needed to stay below the 1.5 and 2 °C targets set by the Intergovernmental Panel on Climate Change (IPCC).”

Indeed, even if the world doesn’t implement these technologies until 2075, we could still store enough carbon between 2075 and 2100 to stay below these targets, she tells Physics World. “This is assuming, of course, that all other decarbonization efforts outlined in the IPCC reports are also implemented to achieve net-zero emissions,” she says.

Building materials are a good option for carbon storage

The motivation for the study, she explains, came from the urgent need – as expressed by the IPCC – to not only reduce new carbon emissions through rapid and significant decarbonization, but to also remove large amounts of CO2 already present in the atmosphere. “Rather than burying it in geological, terrestrial or ocean reservoirs, we wanted to look into the possibility of leveraging existing technology – namely conventional building materials – as a way to store CO2. Building materials are a good option for carbon storage given the massive quantity (30 billion tonnes) produced each year, not to mention their durability.”

Van Roijen, who is now a postdoctoral researcher at the US Department of Energy’s National Renewable Energy Laboratory, hopes that this work, which is detailed in Science, will go beyond the reach of the research lab and attract the attention of policymakers and industrialists. While some of the technologies outlined in this study are new and require further research, others, such as bio-based plastics, are well established and simply need some economic and political support, she says. “That said, conventional building materials such as concrete and plastics are pretty cheap, so there will need to be some incentive for industries to make the switch over to these low-carbon materials.”

The post Alternative building materials could store massive amounts of carbon dioxide appeared first on Physics World.

Flexible tactile sensor reads braille in real time

Par : Tami Freeman
27 janvier 2025 à 10:00

Braille is a tactile writing system that helps people who are blind or partially sighted acquire information by touching patterns of tiny raised dots. Braille uses combinations of six dots (two columns of three) to represent letters, numbers and punctuation. But learning to read braille can be challenging, particularly for those who lose their sight later in life, prompting researchers to create automated braille recognition technologies.

One approach involves simply imaging the dots and using algorithms to extract the required information. This visual method, however, struggles with the small size of braille characters and can be impacted by differing light levels. Another option is tactile sensing; but existing tactile sensors aren’t particularly sensitive, with small pressure variations leading to incorrect readings.

To tackle these limitations, researchers from Beijing Normal University and Shenyang Aerospace University in China have employed an optical fibre ring resonator (FRR) to create a tactile braille recognition system that accurately reads braille in real time.

“Current braille readers often struggle with accuracy and speed, especially when it comes to dynamic reading, where you move your finger across braille dots in real time,” says team leader Zhuo Wang. “I wanted to create something that could read braille more reliably, handle slight variations in pressure and do it quickly. Plus, I saw an opportunity to apply cutting-edge technology – like flexible optical fibres and machine learning – to solve this challenge in a novel way.”

Flexible fibre sensor

At the core of the braille sensor is the optical FRR – a resonant cavity made from a loop of fibre containing circulating laser light. Wang and colleagues created the sensing region by embedding an optical fibre in flexible polymer and connecting it into the FRR ring. Three small polymer protrusions on top of the sensor act as probes to transfer the applied pressure to the optical fibre. Spaced 2.5 mm apart to align with the dot spacing, each protrusion responds to the pressure from one of the three braille dots (or absence of a dot) in a vertical column.

Sensor fabrication: the optical FRR is made by connecting ports of a 2×2 fibre coupler to form a loop, with the sensing region then connected into the loop. (Courtesy: Optics Express 10.1364/OE.546873)

As the sensor is scanned over the braille surface, the pressure exerted by the raised dots slightly changes the length and refractive index of the fibre, causing tiny shifts in the frequency of the light travelling through the FRR. The device employs a technique called Pound-Drever-Hall (PDH) demodulation to “lock” onto these shifts, amplify them and convert them into readable data.

“The PDH demodulation curve has an extremely steep linear slope, which means that even a very tiny frequency shift translates into a significant, measurable voltage change,” Wang explains. “As a result, the system can detect even the smallest variations in pressure with remarkable precision. The steep slope significantly enhances the system’s sensitivity and resolution, allowing it to pick up subtle differences in braille dots that might be too small for other sensors to detect.”

The eight possible configurations of three dots generate eight distinct pressure signals, with each braille character defined by two pressure outputs (one per column). Each protrusion has a slightly different hardness level, enabling the sensor to differentiate pressures from each dot. Rather than measuring each dot individually, the sensor reads the overall pressure signal and instantly determines the combination of dots and the character they correspond to.

The researchers note that, in practice, the contact force may vary slightly during the scanning process, resulting in the same dot patterns exhibiting slightly different pressure signals. To combat this, they used neural networks trained on large amounts of experimental data to correctly classify braille patterns, even with small pressure variations.
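
The team’s actual network and training data are not reproduced here, but the toy Python sketch below illustrates the idea: because each protrusion responds differently, every one of the eight dot patterns maps to a distinct combined pressure level, and a small multilayer perceptron learns to classify noisy readings of that level despite contact-force variation. All of the numbers (response weights, noise level, network size) are assumptions made for illustration.

```python
# A toy illustration of the classification step, not the authors' model or data.
# Each protrusion is given a different effective response, so every one of the
# eight dot patterns maps to a distinct combined pressure level; a small MLP then
# classifies noisy readings of that level despite contact-force variation.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
weights = np.array([1.0, 0.7, 0.45])  # hypothetical per-protrusion responses
patterns = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]  # 8 patterns
levels = np.array([np.dot(weights, p) for p in patterns])  # combined pressure levels

n = 300  # noisy readings per pattern
X = np.concatenate([lvl + rng.normal(0, 0.05, n) for lvl in levels]).reshape(-1, 1)
y = np.repeat(np.arange(8), n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=3000, random_state=0)
clf.fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.3f}")
```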

“This design makes the sensor incredibly efficient,” Wang explains. “It doesn’t just feel the braille, it understands it in real time. As the sensor slides over a braille board, it quickly decodes the patterns and translates them into readable information. This allows the system to identify letters, numbers, punctuation, and even words or poems with remarkable accuracy.”

Stable and accurate

Measurements on the braille sensor revealed that it responds to pressures of up to 3 N, as typically exerted by a finger when touching braille, with an average response time of below 0.1 s, suitable for fast dynamic braille reading. The sensor also exhibited excellent stability under temperature or power fluctuations.

To assess its ability to read braille dots, the team used the sensor to read eight different arrangements of three dots. Using a multilayer perceptron (MLP) neural network, the system effectively distinguished the eight different tactile pressures with a classification accuracy of 98.57%.

Next, the researchers trained a long short-term memory (LSTM) neural network to classify signals generated by five English words. Here, the system demonstrated a classification accuracy of 100%, implying that slight errors in classifying signals in each column will not affect the overall understanding of the braille.
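
The authors’ LSTM is likewise not reproduced here; the minimal PyTorch sketch below shows only the general shape of such a sequence classifier, in which the final hidden state of an LSTM run over a word’s pressure-signal sequence feeds a linear layer that predicts the word. All of the sizes (hidden units, sequence length, number of words) are arbitrary.

```python
# A minimal PyTorch sketch of a word-level sequence classifier, not the authors'
# network. Each "word" is a sequence of pressure samples; the LSTM's final hidden
# state feeds a linear layer that predicts the word label. All sizes are arbitrary.
import torch
import torch.nn as nn

class WordClassifier(nn.Module):
    def __init__(self, n_features=1, hidden=32, n_words=5):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_words)

    def forward(self, x):          # x: (batch, seq_len, n_features)
        _, (h, _) = self.lstm(x)   # h: (num_layers, batch, hidden)
        return self.head(h[-1])    # logits over the word classes

model = WordClassifier()
dummy = torch.randn(8, 12, 1)      # 8 sequences of 12 pressure samples each
print(model(dummy).shape)          # torch.Size([8, 5])
```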

Finally, they used the MLP-LSTM model to read short sentences, either sliding the sensor manually or scanning it electronically to maintain a consistent contact force. In both cases, the sensor accurately recognised the phrases.

The team concludes that the sensor can advance intelligent braille recognition, with further potential in smart medical care and intelligent robotics. The next phase of development will focus on making the sensor more durable, improving the machine learning models and making it scalable.

“Right now, the sensor works well in controlled environments; the next step is to test its use by different people with varying reading styles, or under complex application conditions,” Wang tells Physics World. “We’re also working on making the sensor more affordable so it can be integrated into devices like mobile braille readers or wearables.”

The sensor is described in Optics Express.

The post Flexible tactile sensor reads braille in real time appeared first on Physics World.

The physics of George R R Martin’s Wild Card virus revealed

24 janvier 2025 à 17:00

It’s not every day that a well-known author writes a physics paper. But George R R Martin, who is best known for his fantasy series A Song of Ice and Fire, has co-authored a paper in the American Journal of Physics with the title “Ergodic Lagrangian dynamics in a superhero universe”.

The paper, written with Los Alamos National Laboratory theoretical physicist Ian Tregillis, who is also a science-fiction author of several books, derives a mathematical model of the so-called Wild Card virus.

The Wild Cards universe is a series of novels created by a consortium of writers including Martin and Tregillis.

Set largely during an alternate history of the US following the Second World War, the series follows events after an extraterrestrial virus, known as the Wild Card virus, has spread worldwide. It mutates human DNA causing profound changes in human physiology and society at large.

The virus follows a fixed statistical distribution of outcomes in that 90% of those infected die, 9% become physically mutated (referred to as “jokers”) and 1% gain superhuman abilities (known as “aces”). Such capabilities include the ability to fly and to move between dimensions. The stories in the series then follow the individuals affected by the virus.
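
For readers who want to play with those numbers, the trivial Python sketch below simply samples the canonical 90/9/1 split; it reproduces the headline distribution quoted above, not the ergodic dynamical model that Tregillis and Martin derive.

```python
# A trivial sketch that samples the canonical 90/9/1 outcome split. It reproduces
# only the headline distribution quoted above, not the paper's dynamical model.
import random

rng = random.Random(42)

def draw_card():
    r = rng.random()
    if r < 0.90:
        return "dead"
    if r < 0.99:
        return "joker"
    return "ace"

counts = {"dead": 0, "joker": 0, "ace": 0}
for _ in range(100_000):
    counts[draw_card()] += 1

print(counts)  # roughly 90% dead, 9% jokers, 1% aces
```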

Tregillis and Martin have now derived a formula for the viral behaviour of the Wild Card virus. “Like any physicist, I started with back-of-the-envelope estimates, but then I went off the deep end,” notes Tregillis. “Being a theoretician, I couldn’t help but wonder if a simple underlying model might tidy up the canon.”

The model takes into consideration the severity of the changes (for the 10% that don’t instantly die) and the mix of joker/ace traits. After all, those infected can also become crypto-jokers or crypto-aces – undetected cases where individuals have subtle changes or powers – as well as joker-aces, in which a human develops both mutations and superhuman abilities.

The result is a dynamical system in which a carrier’s state vector constantly evolves through the model space — until their “card” turns. At that point the state vector becomes fixed and its permanent location determines the fate of the carrier. “The time-averaged behavior of this system generates the statistical distribution of outcomes,” adds Tregillis.

The purpose of the paper, and the model, is also to provide an exercise in demonstrating how “whimsical” scenarios can be used to explore concepts in physics and mathematics.

“The fictional virus is really just an excuse to justify the world of Wild Cards, the characters who inhabit it, and the plot lines that spin out from their actions,” says Tregillis.

The post The physics of George R R Martin’s Wild Card virus revealed appeared first on Physics World.

Fast radio burst came from a neutron star’s magnetosphere, say astronomers

24 janvier 2025 à 16:00

The exact origins of cosmic phenomena known as fast radio bursts (FRBs) are not fully understood, but scientists at the Massachusetts Institute of Technology (MIT) in the US have identified a fresh clue: at least one of these puzzling cosmic discharges got its start very close to the object that emitted it. This result, which is based on measurements of a fast radio burst called FRB 20221022A, puts to rest a long-standing debate about whether FRBs can escape their emitters’ immediate surroundings. The conclusion: they can.

“Competing theories argued that FRBs might instead be generated much farther away in shock waves that propagate far from the central emitting object,” explains astronomer Kenzie Nimmo of MIT’s Kavli Institute for Astrophysics and Space Research. “Our findings show that, at least for this FRB, the emission can escape the intense plasma near a compact object and still be detected on Earth.”

As their name implies, FRBs are brief, intense bursts of radio waves. The first was detected in 2007, and since then astronomers have spotted thousands of others, including some within our own galaxy. They are believed to originate from cataclysmic processes involving compact celestial objects such as neutron stars, and they typically last a few milliseconds. However, astronomers have recently found evidence for bursts a thousand times shorter, further complicating the question of where they come from.

Nimmo and colleagues say they have now conclusively demonstrated that FRB 20221022A, which was detected by the Canadian Hydrogen Intensity Mapping Experiment (CHIME) in 2022, comes from a region only 10 000 km in size. This, they claim, means it must have originated in the highly magnetized region that surrounds a star: the magnetosphere.

“Fairly intuitive” concept

The researchers obtained their result by measuring the FRB’s scintillation, which Nimmo explains is conceptually similar to the twinkling of stars in the night sky. Stars twinkle because they are so far away that they appear to us as point sources. This means that their apparent brightness is more affected by the Earth’s atmosphere than is the case for planets and other objects that are closer to us and appear larger.

“We applied this same principle to FRBs using plasma in their host galaxy as the ‘scintillation screen’, analogous to Earth’s atmosphere,” Nimmo tells Physics World. “If the plasma causing the scintillation is close to the FRB source, we can use this to infer the apparent size of the FRB emission region.”

According to Nimmo, different models of FRB origins predict very different sizes for this region. “Emissions originating within the magnetized environments of compact objects (for example, magnetospheres) would produce a much smaller apparent size compared to emission generated in distant shocks propagating far from the central object,” she explains. “By constraining the emission region size through scintillation, we can determine which physical model is more likely to explain the observed FRB.”

Challenge to existing models

The idea for the new study, Nimmo says, stemmed from a conversation with another astronomer, Pawan Kumar of the University of Texas at Austin, early last year. “He shared a theoretical result showing how scintillation could be used as a ‘probe’ to constrain the size of the FRB emission region, and, by extension, the FRB emission mechanism,” Nimmo says. “This sparked our interest and we began exploring the FRBs discovered by CHIME to search for observational evidence for this phenomenon.”

The researchers say that their study, which is detailed in Nature, shows that at least some FRBs originate from magnetospheric processes near compact objects such as neutron stars. This finding is a challenge for models of conditions in these extreme environments, they say, because if FRB signals can escape the dense plasma expected to exist near such objects, the plasma may be less opaque than previously assumed. Alternatively, unknown factors may be influencing FRB propagation through these regions.

A diagnostic tool

One advantage of studying FRB 20221022A is that it is relatively conventional in terms of its brightness and the duration of its signal (around 2 milliseconds). It does have one special property, however, as discovered by Nimmo’s colleagues at McGill University in Canada: its light is highly polarized. What is more, the pattern of its polarization implies that its emitter must be rotating in a way that is reminiscent of pulsars, which are highly magnetized, rotating neutron stars. This result is reported in a separate paper in Nature.

In Nimmo’s view, the MIT team’s study of this (mostly) conventional FRB establishes scintillation as a “powerful diagnostic tool” for probing FRB emission mechanisms. “By applying this method to a larger sample of FRBs, which we now plan to investigate, future studies could refine our understanding of their underlying physical processes and the diverse environments they occupy.”

The post Fast radio burst came from a neutron star’s magnetosphere, say astronomers appeared first on Physics World.

Explore the quantum frontier: all about the International Year of Quantum Science and Technology 2025

24 janvier 2025 à 12:13

In June 1925 a relatively unknown physics postdoc by the name of Werner Heisenberg developed the basic mathematical framework that would be the basis for the first quantum revolution. Heisenberg, who would later win the Nobel Prize for Physics, famously came up with quantum mechanics on a two-week vacation on the tiny island of Helgoland off the coast of Germany, where he had gone to cure a bad bout of hay fever.

Now, a century later, we are on the cusp of a second quantum revolution, with quantum science and technologies growing rapidly across the globe. According to the State of Quantum 2024 report, a total of 33 countries around the world currently have government initiatives in quantum technology, of which more than 20 have national strategies with large-scale funding. The report estimates that up to $50bn in public cash has already been committed.

It’s a fitting tribute, then, that the United Nations (UN) has chosen 2025 to be the International Year of Quantum Science and Technology (IYQ). They hope that the year will raise global awareness of the impact that quantum physics and its applications have already had on our world. The UN also aims to highlight to the global public the myriad potential future applications of quantum technologies and how they could help tackle universal issues – from climate and clean energy to health and infrastructure – while also addressing the UN’s sustainable development goals.

The Institute of Physics (IOP), which publishes Physics World, is one of the IYQ’s six “founding partners” alongside the German (DPG) and American physical societies (APS), SPIE, Optica and the Chinese Optical Society. “The UNESCO International Year of Quantum is a wonderful opportunity to spread the word about quantum research and technology and the transformational opportunities it is opening up,” says Tom Grinyer, chief executive of the IOP. “The Institute of Physics is co-ordinating the UK and Irish elements of the year, which mark the 100th anniversary of the first formulation of quantum mechanics, and we are keen to celebrate the milestone, making sure that as many people as possible get the opportunity to find out more about this fascinating area of science and technology,” he adds.

“IYQ provides the opportunity for societies and organizations around the world to come together in marking both the 100-year history of the field, as well as the longer-term real-world impact that quantum science is certain to have for decades to come,” says Tim Smith, head of portfolio development at IOP Publishing. “Quantum science and technology represents one of the most exciting and rapidly developing areas of science today, encompassing the global physical-sciences community in a way that connects scientific wonder with fundamental research, technological innovation, industry, and funding programmes worldwide.”

Taking shape

The official opening ceremony for IYQ takes place on 4–5 February at the UNESCO headquarters in Paris, France, although several countries, including Germany and India, held their own launches in advance of the main event. Working together, the IOP and IOP Publishing have developed a wide array of quantum resources, talks, conferences, festivals and public events as part of the UK’s celebrations for IYQ.

In late February, meanwhile, the Royal Society – the world’s oldest continuously active learned society – will host a two-day quantum conference. Dubbed “Quantum Information”, it will bring together scientists, industry leaders and public-sector stakeholders to discuss the current challenges involved in quantum computing, networks and sensing systems.

In Scotland, the annual Edinburgh Science Festival, which takes place in April, will likely include a special “quantum explorers” exhibit and workshop by the UK’s newly launched National Quantum Computing Centre. Elsewhere, the Quantum Software Lab at the School of Informatics at the University of Edinburgh is hosting a month-long “Quantum Fringe 2025” event across Scotland. It will include a quantum machine-learning school on the Isle of Skye as well as the annual UK Quantum Hackathon, which brings together teams of aspiring coders with industry mentors to tackle practical challenges and develop solutions using quantum computing.

In June, the Institution of Engineering and Technology is hosting a Quantum Engineering and Technologies conference, as part of its newly launched Quantum technologies and 6G and Future Networks events. The event’s themes include everything from information processing and memories to photon sources and cryptography.

The IOP will use the focus this year gives us to continue to make the case for the investment in research and development, and support for physics skills, which will be crucial if we are to fully unlock the economic and social potential of the quantum sector

Further IYQ-themed events will take place at  QuAMP, the IOP’s biennial international conference on quantum, atomic and molecular physics in September. Activities culminate in a three-part celebration in November, with a quantum community event led by the IOP’s History of Physics and quantum Business and Innovation Growth (qBIG) special interest groups, a schools event at the Royal Institution, and a public celebration with a keynote speech from University of Surrey quantum physicist and broadcaster Jim Al-Khalili. “The UK and Ireland already have a globally important position in many areas of quantum research, with the UK, for instance, having established one of the world’s first National Quantum Technology Programmes,” explains Grinyer. “We will also be using the focus this year gives us to continue to make the case for the investment in research and development, and support for physics skills, which will be crucial if we are to fully unlock the economic and social potential of what is both a fascinating area of research, and a fast growing physics-powered business sector,” he adds.

Quantum careers

With the booming quantum marketplace, it’s no surprise that employers are on the hunt for skilled physicists to join the workforce. Indeed, there is a significant scarcity of skilled quantum professionals for the many roles across industry and academia. And with quantum research advancing everything from software and machine learning to materials science and drug discovery, your skills will be transferable across the board.

If you plan to join the quantum workforce, then choosing the right PhD programme, having the right skills for a specific role and managing risk and reward in the emerging quantum industry are all crucial. There are a number of careers events on the IYQ calendar where you can learn more about the many career prospects for physicists in the sector. In April, for example, the University of Bristol’s Quantum Engineering Centre for Doctoral Training is hosting a Careers in Quantum event, while the Economist magazine is hosting its annual Commercialising Quantum conference in May.

There will also be a special quantum careers panel discussion, including top speakers from the UK and the US, as part of our newly launched Physics World Live panel discussions in April. This year’s Physics World Careers 2025 guide has a special quantum focus, and there’ll also be a bumper, quantum-themed issue of the Physics World Briefing in June. The Physics World quantum channel will be regularly updated throughout the year so you don’t miss a thing.

Read all about it

IOP Publishing’s journals will include specially curated content, including a series of Perspectives articles – personal viewpoints from leading quantum scientists – in Quantum Science and Technology. The journal will also be publishing roadmaps in quantum computing, sensing and communication, as well as focus issues on topics such as quantum machine learning and technologies for quantum gravity and thermodynamics in quantum coherent platforms.

“Going right to the core of IOP Publishing’s own historic coverage we’re excited to be celebrating the IYQ through a year-long programme of articles in Physics World and across our journals, that will hopefully show a wide audience just why everyone should care about quantum science and the people behind it,” says Smith.

Of course, we at Physics World have a Schrödinger’s box full of fascinating quantum articles for the coming year – from historical features to the latest cutting-edge developments in quantum tech. So keep your eyes peeled.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the next 12 months for more coverage of the IYQ.

Find out more on our quantum channel.

The post Explore the quantum frontier: all about the International Year of Quantum Science and Technology 2025 appeared first on Physics World.

Helgoland: leading physicists to gather on the tiny island where quantum mechanics was born

24 janvier 2025 à 10:20

In this episode of Physics World Stories, we celebrate the 100th anniversary of Werner Heisenberg’s trip to the North Sea island of Helgoland, where he developed the first formulation of quantum theory. Listen to the podcast as we delve into the latest advances in quantum science and technology with three researchers who will be attending a 6-day workshop on Helgoland in June 2025.

Featuring in the episode are: Nathalie De Leon of Princeton University, Ana Maria Rey from the University of Colorado Boulder, and Jack Harris from Yale University, a member of the programme committee. These experts share their insights on the current state of quantum science and technology, discussing the latest developments in quantum sensing, quantum information and quantum computing.

They also reflect on the significance of attending a conference at a location that is so deeply ingrained in the story of quantum mechanics. Talks at the event will span the science and the history of quantum theory, as well as the nature of scientific revolutions.

This episode is part of Physics World’s quantum coverage throughout 2025, designated by the UN as the International Year of Quantum Science and Technology (IYQ). Check out this article for all you need to know about IYQ.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the next 12 months for more coverage of the IYQ.

Find out more on our quantum channel.

The post Helgoland: leading physicists to gather on the tiny island where quantum mechanics was born appeared first on Physics World.

Terahertz light produces a metastable magnetic state in an antiferromagnet

24 janvier 2025 à 10:00

Physicists in the US, Europe and Korea have produced a long-lasting light-driven magnetic state in an antiferromagnetic material for the first time. While their project started out as a fundamental study, they say the work could have applications for faster and more compact memory and processing devices.

Antiferromagnetic materials are promising candidates for future high-density memory devices. This is because in antiferromagnets, the spins used as the bits or data units flip quickly, at frequencies in the terahertz range. Such rapid spin flips are possible because, by definition, the spins in antiferromagnets align antiparallel to each other, leading to strong interactions among the spins. This is different from ferromagnets, which have parallel electron spins and are used in today’s memory devices such as computer hard drives.

Another advantage is that antiferromagnets display almost no macroscopic magnetization. This means that bits can be packed more densely onto a chip than is the case for the ferromagnets employed in conventional magnetic memory, which do have a net magnetization.

A further attraction is that the values of bits in antiferromagnetic memory devices are generally unaffected by the presence of stray magnetic fields. However, Nuh Gedik of the Massachusetts Institute of Technology (MIT), who led the latest research effort, notes that this robustness can be a double-edged sword: the fact that antiferromagnet spins are insensitive to weak magnetic fields also makes them difficult to control.

Antiferromagnetic state lasts for more than 2.5 milliseconds

In the new work, Gedik and colleagues studied FePS3, which becomes an antiferromagnet below a critical temperature of around 118 K. By applying intense pulses of terahertz-frequency light to this material, they were able to control this transition, placing the material in a metastable magnetic state that lasts for more than 2.5 milliseconds even after the light source is switched off. While such light-induced transitions have been observed before, Gedik notes that they typically only last for picoseconds.

The technique works because the terahertz source stimulates the atoms in the FePS3 at the same frequency at which the atoms collectively vibrate (the resonance frequency). When this happens, Gedik explains that the atomic lattice undergoes a unique form of stretching. This stretching cannot be achieved with external mechanical forces, and it pushes the spins of the atoms out of their magnetically alternating alignment.

The result is a state in which the spins pointing along one direction outweigh those pointing along the other, transforming the originally antiferromagnetic material into a state with net magnetization. This metastable state becomes increasingly robust as the temperature of the material approaches the antiferromagnetic transition point. That is a sign that critical fluctuations near the phase transition point are a key factor in enhancing both the magnitude and lifetime of the new magnetic state, Gedik says.

A new experimental setup

The team, which includes researchers from the Max Planck Institute for the Structure and Dynamics of Matter in Germany, the University of the Basque Country in Spain, Seoul National University and the Flatiron Institute in New York, wasn’t originally aiming to produce long-lived magnetic states. Instead, its members were investigating nonlinear interactions among low-energy collective modes, such as phonons (vibrations of the atomic lattice) and spin excitations called magnons, in layered magnetic materials like FePS3. It was for this purpose that they developed a new experimental setup capable of generating strong terahertz pulses with a wide spectral bandwidth.

“Since nonlinear interactions are generally weak, we chose a family of materials known for their strong coupling between magnetic spins and phonons,” Gedik says. “We also suspected that, under such intense resonant excitation in these particular materials, something intriguing might occur – and indeed, we discovered a new magnetic state with an exceptionally long lifetime.”

While the researchers’ focus remains on fundamental questions, they say the new findings may enable a “significant step” toward practical applications for ultrafast science. “The antiferromagnetic nature of the material holds great potential for potentially enabling faster and more compact memory and processing devices,” says Gedik’s MIT colleague Batyr Ilyas. He adds that the observed long lifetime of the induced state means that it can be explored further using conventional experimental probes used in spintronic technologies.

The team’s next step will be to study the nonlinear interactions between phonons and magnons more closely using two-dimensional spectroscopy experiments. “Second, we plan to demonstrate the feasibility of probing this metastable state through electrical transport experiments,” Ilyas tells Physics World. “Finally, we aim to investigate the generalizability of this phenomenon in other materials, particularly those exhibiting enhanced fluctuations near room temperature.”

The work is detailed in Nature.

The post Terahertz light produces a metastable magnetic state in an antiferromagnet appeared first on Physics World.

Why electrochemistry lies at the heart of modern technology

23 janvier 2025 à 15:25

This episode of the Physics World Weekly podcast features a conversation with Colm O’Dwyer, who is professor of chemical energy at University College Cork in Ireland and president of the Electrochemical Society.

He talks about the role that electrochemistry plays in the development of modern technologies including batteries, semiconductor chips and pharmaceuticals. O’Dwyer chats about the role that the Electrochemical Society plays in advancing the theory and practice of electrochemistry and solid-state science and technology. He also explains how electrochemists collaborate with scientists and engineers in other fields including physics – and he looks forward to the future of electrochemistry.

This podcast is supported by American Elements. Trusted by researchers and industries the world over, American Elements is helping shape the future of battery and electrochemistry technology.

The post Why electrochemistry lies at the heart of modern technology appeared first on Physics World.

China’s Experimental Advanced Superconducting Tokamak smashes fusion confinement record

23 janvier 2025 à 14:35

A fusion tokamak in China has smashed its previous fusion record of maintaining a steady-state plasma. This week, scientists working on the Experimental Advanced Superconducting Tokamak (EAST) announced that they had produced a steady-state high-confinement plasma for 1066 seconds, breaking EAST’s previous 2023 record of 403 seconds.

EAST is an experimental superconducting tokamak fusion device located in Hefei, China. Operated by the Institute of Plasma Physics (ASIPP) at the Hefei Institutes of Physical Science, it began operations in 2006. It is the first tokamak to contain a deuterium plasma using superconducting niobium-titanium toroidal and poloidal magnets.

EAST has recently undergone several upgrades, notably with new plasma diagnostic tools and a doubling in the power of the plasma heating system. EAST is also acting as a testbed for the ITER fusion reactor that is currently being built in Cadarache, France.

The EAST tokamak is able to maintain a plasma in the so-called “H‐mode”. This is the high-confinement regime that modern tokamaks, including ITER, employ. It occurs when the plasma undergoes intense heating by a neutral beam and results in a sudden improvement of plasma confinement by a factor of two.

In 2017 scientists at EAST broke the 100-second barrier for a steady-state H-mode plasma and then in 2023 achieved 403 seconds, a world record at the time. On Monday, EAST officials announced that they had almost tripled that time, delivering H-mode operation for 1066 seconds.

ASIPP director Song Yuntao notes that the new record is “monumental” and represents a “critical step” toward realizing a functional fusion reactor. “A fusion device must achieve stable operation at high efficiency for thousands of seconds to enable the self-sustaining circulation of plasma,” he says, “which is essential for the continuous power generation of future fusion plants”.

The post China’s Experimental Advanced Superconducting Tokamak smashes fusion confinement record appeared first on Physics World.

New candidate emerges for a universal quantum electrical standard

23 janvier 2025 à 10:00

Physicists in Germany have developed a new way of defining the standard unit of electrical resistance. The advantage of the new technique is that because it is based on the quantum anomalous Hall effect rather than the ordinary quantum Hall effect, it does not require the use of applied magnetic fields. While the method in its current form requires ultracold temperatures, an improved version could allow quantum-based voltage and resistance standards to be integrated into a single, universal quantum electrical reference.

Since 2019, all base units in the International System of Units (SI) have been defined with reference to fundamental constants of nature. For example, the definition of the kilogram, which was previously based on a physical artefact (the international prototype kilogram), is now tied to Planck’s constant, h.

These new definitions do come with certain challenges. For example, today’s gold-standard way to experimentally determine the value of h (as well as the elementary charge e, another defining constant of the SI) is to measure a quantized electrical resistance (the von Klitzing constant RK = h/e²) and a quantized voltage (the Josephson constant KJ = 2e/h). With RK and KJ pinned down, scientists can then calculate e and h.
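
Concretely, the two definitions invert to h = 4/(RK KJ²) and e = 2/(RK KJ), so measured values of the two constants pin down both. The short Python check below illustrates the algebra only, using the exact 2019 SI values of e and h to generate the “measured” RK and KJ; it is not a description of the metrology itself.

```python
# A sketch of the algebra, not of a metrology procedure: R_K = h/e^2 and
# K_J = 2e/h invert to h = 4/(R_K K_J^2) and e = 2/(R_K K_J). Here the exact
# 2019 SI values of e and h generate "measured" R_K and K_J for the check.
e_si = 1.602176634e-19   # elementary charge (C), exact in the 2019 SI
h_si = 6.62607015e-34    # Planck constant (J s), exact in the 2019 SI

R_K = h_si / e_si**2     # von Klitzing constant, about 25812.807 ohm
K_J = 2 * e_si / h_si    # Josephson constant, about 483.6 THz per volt

h_recovered = 4 / (R_K * K_J**2)
e_recovered = 2 / (R_K * K_J)
print(h_recovered, e_recovered)  # matches h_si and e_si to rounding error
```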

To measure RK with high precision, physicists use the fact that it is related to the quantized values of the Hall resistance of a two-dimensional electron system (such as the ones that form in semiconductor heterostructures) in the presence of a strong magnetic field. This quantized change in resistance is known as the quantum Hall effect (QHE), and in semiconductors like GaAs or AlGaAs, it shows up at fields of around 10 T. In graphene, a two-dimensional carbon sheet, fields of about 5 T are typically required.

The problem with this method is that KJ is measured by means of a separate phenomenon known as the AC Josephson effect, and the large external magnetic fields that are so essential to the QHE measurement render Josephson devices inoperable. According to Charles Gould of the Institute for Topological Insulators at the University of Würzburg (JMU), who led the latest research effort, this makes it difficult to integrate a QHE-based resistance standard with the voltage standard.

A way to measure RK at zero external magnetic field

Relying on the quantum anomalous Hall effect (QAHE) instead would solve this problem. This variant of the QHE arises from electron transport phenomena recently identified in a family of materials known as ferromagnetic topological insulators. Such quantum spin Hall systems, as they are also known, conduct electricity along their (quantized) edge channels or surfaces, but act as insulators in their bulk. In these materials, spontaneous magnetization means the QAHE manifests as a quantization of resistance even at weak (or indeed zero) magnetic fields.

In the new work, Gould and colleagues made Hall resistance quantization measurements in the QAHE regime on a device made from V-doped (Bi,Sb)2Te3. These measurements showed that the relative deviation of the Hall resistance from RK at zero external magnetic field is just (4.4 ± 8.7) nΩ Ω−1. The method thus makes it possible to determine RK at zero magnetic field with the needed precision — something Gould says was not previously possible.

The snag is that the measurement only works under demanding experimental conditions: extremely low temperatures (below about 0.05 K) and low electrical currents (below 0.1 µA). “Ultimately, both these parameters will need to be significantly improved for any large-scale use,” Gould explains. “To compare, the QHE works at temperatures of 4.2 K and electrical currents of about 10 µA; making its detection much easier and cheaper to operate.”

Towards a universal electrical reference instrument

The new study, which is detailed in Nature Electronics, was made possible thanks to a collaboration between two teams, he adds. The first is at Würzburg, which has pioneered studies on electron transport in topological materials for some two decades. The second is at the Physikalisch-Technische Bundesanstalt (PTB) in Braunschweig, which has been establishing QHE-based resistance standards for even longer. “Once the two teams became aware of each other’s work, the potential of a combined effort was obvious,” Gould says.

Because the project brings together two communities with very different working methods and procedures, they first had to find a window of operations where their work could co-exist. “As a simple example,” explains Gould, “the currents of ~100 nA used in the present study are considered extremely low for metrology, and extreme care was required to allow the measurement instrument to perform under such conditions. At the same time, this current is some 200 times larger than that typically used when studying topological properties of materials.”

As well as simplifying access to the constants h and e, Gould says the new work could lead to a universal electrical reference instrument based on the QAHE and the Josephson effect. Beyond that, it could even provide a quantum standard of voltage, resistance, and (by means of Ohm’s law) current, all in one compact experiment.

The possible applications of the QAHE in metrology have attracted a lot of attention from the European Union, he adds. “The result is a Europe-wide EURAMET metrology consortium QuAHMET aimed specifically at further exploiting the effect and operation of the new standard at more relaxed experimental conditions.”

The post New candidate emerges for a universal quantum electrical standard appeared first on Physics World.

Nanocrystals measure tiny forces on tiny length scales

Par : No Author
22 janvier 2025 à 18:14

Two independent teams in the US have demonstrated the potential of using the optical properties of nanocrystals to create remote sensors that measure tiny forces on tiny length scales. One team is based at Stanford University and used nanocrystals to measure the micronewton-scale forces exerted by a worm as it chewed bacteria. The other team is based at several institutes and used the photon avalanche effect in nanocrystals to measure sub-nanonewton to micronewton forces. The latter technique could potentially be used to study forces involved in processes such as stem cell differentiation.

Remote sensing of forces at small scales is challenging, especially inside living organisms. Optical tweezers cannot make remote measurements inside the body, while fluorophores – molecules that absorb and re-emit light – can measure forces in organisms, but have limited range, problematic stability or, in the case of quantum dots, toxicity. Nanocrystals with optical properties that change when subjected to external forces offer a way forward.

At Stanford, materials scientist Jennifer Dionne led a team that used nanocrystals doped with ytterbium and erbium. When two ytterbium atoms absorb near-infrared photons, they can then transfer energy to a nearby erbium atom. In this excited state, the erbium can either decay directly to its lowest energy state by emitting red light, or become excited to an even higher-energy state that decays by emitting green light. These processes are called upconversion.

Colour change

The ratio of green to red emission depends on the separation between the ytterbium and erbium atoms, and on the separation between the erbium atoms, explains Dionne’s PhD student Jason Casar, who is lead author of a paper describing the Stanford research. Forces on the nanocrystal can change these separations and therefore affect that ratio.
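
In practice, this kind of ratiometric readout amounts to interpolating a measured green-to-red ratio on a calibration curve. The Python sketch below shows the idea with an entirely hypothetical calibration; the numbers are not the Stanford team’s data.

```python
# A hypothetical example of ratiometric readout: an unknown force is estimated by
# interpolating a measured green-to-red intensity ratio on a calibration curve.
# The calibration points are invented for illustration; they are not real data.
import numpy as np

force_uN = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])        # calibration forces (uN)
ratio_g_r = np.array([1.80, 1.62, 1.45, 1.30, 1.17, 1.05])  # green/red ratios

def force_from_ratio(measured_ratio):
    # np.interp needs increasing x values, so interpolate on the reversed arrays
    return np.interp(measured_ratio, ratio_g_r[::-1], force_uN[::-1])

print(force_from_ratio(1.40))  # about 4.7 uN for this made-up calibration
```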

The researchers encased their nanocrystals in polystyrene vessels approximately the size of an E. coli bacterium. They then mixed the encased nanoparticles with E. coli bacteria that were then fed to tiny nematode worms. To extract the nutrients, the worm’s pharynx needs to break open the bacterial cell wall. “The biological question we set out to answer is how much force is the bacterium generating to achieve that breakage?” explains Stanford’s Miriam Goodman.

The researchers shone near-infrared light on the worms, allowing them to monitor the flow of the nanocrystals. By measuring the colour of the emitted light when the particles reached the pharynx, they determined the force it exerted with micronewton-scale precision.

Meanwhile, a collaboration of scientists at Columbia University, Lawrence Berkeley National Laboratory and elsewhere has shown that a process called photon avalanche can be used to measure even smaller forces on nanocrystals. The team’s avalanching nanoparticles (ANPs) are sodium yttrium fluoride nanocrystals doped with thulium – and were discovered by the team in 2021.

The fun starts here

The sensing process uses a laser tuned off-resonance from any transition from the ground state of the ANP. “We’re bathing our particles in 1064 nm light,” explains James Schuck of Columbia University, whose group led the research. “If the intensity is low, that all just blows by. But if, for some reason, you do eventually get some absorption – maybe a non-resonant absorption in which you give up a few phonons…then the fun starts. Our laser is resonant with an excited state transition, so you can absorb another photon.”

This creates a doubly excited state that can decay radiatively directly to the ground state, producing an upconverted photon. Or its energy can be transferred to a nearby thulium atom, which becomes resonant with the excited state transition and can excite more thulium atoms into resonance with the laser. “That’s the avalanche,” says Schuck. “We find on average you get 30 or 40 of these events – it’s analogous to a chain reaction in nuclear fission.”
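
To see why such a chain reaction settles at a few tens of events on average, a toy branching-process simulation can help. The Python sketch below is purely illustrative and does not model the actual photophysics; its branching probability is an invented parameter tuned to give a mean cascade of around 30 events.

```python
# A toy branching-process illustration of the chain-reaction analogy, not the
# actual photophysics. Each event spawns two more with an invented probability
# chosen so that the mean cascade size is a few tens of events.
import random

rng = random.Random(1)

def avalanche(p_branch=0.485, max_events=10_000):
    """Count the events in one cascade of a simple branching process."""
    active, events = 1, 0
    while active and events < max_events:
        events += 1
        active -= 1
        if rng.random() < p_branch:
            active += 2
    return events

sizes = [avalanche() for _ in range(20_000)]
print(sum(sizes) / len(sizes))  # mean cascade size of roughly 30 events
```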

Now, Schuck and colleagues have shown that the exact number of photons produced in each avalanche decreases when the nanoparticle experiences compressive force. One reason is that the phonon frequencies are raised as the lattice is compressed, making non-radiative decay energetically more favourable.

The thulium-doped nanoparticles decay by emitting either red or near infrared photons. As the force increases, the red dims more quickly, causing a change in the colour of the emitted light. These effects allowed the researchers to measure forces from the sub-nanonewton to the micronewton range – at which point the light output from the nanoparticles became too low to detect.

Not just for forces

Schuck and colleagues are now seeking practical applications of their discovery, and not just for measuring forces.

“We’re discovering that this avalanching process is sensitive to a lot of things,” says Schuck. “If we put these particles in a cell and we’re trying to measure a cellular force gradient, but the cell also happened to change its temperature, that would also affect the brightness of our particles, and we would like to be able to differentiate between those things. We think we know how to do that.”

If the technique could be made to work in a living cell, it could be used to measure tiny forces such as those involved in the extra-cellular matrix that dictate stem cell differentiation.

Andries Meijerink of Utrecht University in the Netherlands believes both teams have done important work that is impressive in different ways: Schuck and colleagues for unveiling a fundamentally new force-sensing technique, and Dionne’s team for demonstrating a remarkable practical application.

However, Meijerink is sceptical that photon avalanching will be useful for sensing in the short term. “It’s a very intricate process,” he says, adding, “There’s a really tricky balance between this first absorption step, which has to be slow and weak, and this resonant absorption”. Nevertheless, he says that researchers are discovering other systems that can avalanche. “I’m convinced that many more systems will be found,” he says.

Both studies are described in Nature. Dionne and colleagues report their results here, and Schuck and colleagues here.

The post Nanocrystals measure tiny forces on tiny length scales appeared first on Physics World.

IOP president Keith Burnett outlines a ‘pivotal’ year ahead for UK physics

Par : No Author
22 janvier 2025 à 15:37

Last year was the year of elections and 2025 is going to be the year of decisions.

After many countries, including the UK, Ireland and the US, went to the polls in 2024, the start of 2025 will see governments at the beginning of new terms, forced to respond swiftly to mounting economic, social, security, environmental and technological challenges.

These issues would be difficult to address at any given time, but today they come amid a turbulent geopolitical context. Governments are often judged against short milestones – the first 100 days or a first budget – but urgency should not come at the cost of thinking long-term, because the decisions over the next few months will shape outcomes for years, perhaps decades, to come. This is no less true for science than it is for health and social care, education or international relations.

In the UK, the first half of the year will be dominated by the government’s spending review. Due in late spring, it could be one of the toughest political tests for UK science, as the implications of the tight spending plans announced in the October budget become clear. Decisions about departmental spending will have important implications for physics funding, from research to infrastructure, facilities and teaching.

One of the UK government’s commitments is to establish 10-year funding cycles for key R&D activities – a policy that could be a welcome improvement. Physics discoveries often take time to realise in full, but their transformational nature is indisputable. From fibre-optic communications to magnetic resonance imaging, physics has been indispensable to many of the world’s most impactful and successful innovations.

Emerging technologies, enabled by physicists’ breakthroughs in fields such as materials science and quantum physics, promise to transform the way we live and work, and create new business opportunities and open up new markets. A clear, comprehensive and long-term vision for R&D would instil confidence among researchers and innovators, and long-term and sustainable R&D funding would enable people and disruptive ideas to flourish and drive tomorrow’s breakthroughs.

Alongside the spending review, we are also expecting the publication of the government’s industrial strategy. The focus of the green paper published last year was an indication of how the strategy will place significance on science and technology in positioning the UK for economic growth.

If we don’t recognise the need to fund more physicists, we will miss so many of the opportunities that lie ahead

Physics-based industries are a foundation stone for the UK economy and are highly productive, as highlighted by research commissioned by the Institute of Physics, which publishes Physics World. Across the UK, the physics sector generates £229bn gross value added, or 11% of total UK gross domestic product. It creates a collective turnover of £643bn, or £1380bn when indirect and induced turnover is included.

Labour productivity in physics-based businesses is also strong at £84 300 per worker, per year. So, if physics is not at the heart of this effort, then the government’s mission of economic revival is in danger of failing to get off the launch pad.

A pivotal year

Another of the new government’s policy priorities is the strategic defence review, which is expected to be published later this year. It could have huge implications for physics given its core role in many of the technologies that contribute to the UK’s defence capabilities. The changing geopolitical landscape, and potential for strained relations between global powers, may well bring research security to the front of the national mind.

Intellectual property, and scientific innovation, are some of the UK’s greatest strengths and it is right to secure them. But physics discoveries in particular can be hampered by overzealous security measures. So much of the important work in our discipline comes from years of collaboration between researchers across the globe. Decisions about research security need to protect, not hamper, the future of UK physics research.

This year could also be pivotal for UK universities, as securing their financial stability and future will be one of the major challenges. Last year, the pressures faced by higher education institutions became apparent, with announcements of course closures, redundancies and restructures as a way of saving money. The rise in tuition fees has far from solved the problem, so we need to be prepared for more turbulence ahead for the higher education sector.

These things matter enormously. We have heard that universities are facing a tough situation, and it’s getting harder for physics departments to exist. But if we don’t recognise the need to fund more physicists, we will miss so many of the opportunities that lie ahead.

As we celebrate the International Year of Quantum Science and Technology that marks the centenary of the initial development of quantum mechanics by Werner Heisenberg, 2025 is a reminder of how the benefits of physics span over decades.

We need to enhance all the vital and exciting developments that are happening in physics departments. The country wants and needs a stronger scientific workforce – just think about all those individuals who studied physics and now work in industries that are defending the country – and that workforce will be strongly dependent on physics skills. So our priority is to make sure that physics departments keep doing world-leading research and preparing the next generation of physicists that they do so well.

The post IOP president Keith Burnett outlines a ‘pivotal’ year ahead for UK physics appeared first on Physics World.

Why telling bigger stories is the only way to counter misinformation

22 janvier 2025 à 12:43

If aliens came to Earth and tried to work out how we earthlings make sense of our world, they’d surely conclude that we take information and slot it into pre-existing stories – some true, some false, some bizarre. Ominously, these aliens would be correct. You don’t need to ask earthling philosophers, just look around.

Many politicians and influencers are convinced that scientific evidence does not capture the reality about, for instance, autism or AIDS, the state of the atmosphere or the legitimacy of elections, or even about aliens. Truth comes to light only when you “know the full story”, which will eventually reveal the scientific data to be deceptive or irrelevant.

To see how this works in practice, suppose you hear someone say that a nearby lab is leaking x picocuries of a radioactive substance, potentially exposing you to y millirems of dose. How do you know if you’re in danger? Well, you’ll instinctively start going through a mental checklist of questions.

Who’s speaking – scientist, politician, reporter or activist? If it’s a scientist, are they from the government, a university, or an environmental or anti-nuclear group? You might then wonder: how trustworthy are the agencies that regulate the substance? Is the lab a good neighbour, or did it cover up past incidents? How much of the substance is truly harmful?

Your answers to all these questions will shape the story you tell yourself. You might conclude: “The lab is a responsible organization and will protect me”. Or perhaps you’ll think: “The lab is a thorn in the side of the community and is probably doing weapons-related work. The leak’s a sign of something far worse.”

Perhaps your story will be: “Those environmentalists are just trying to scare us and the data indicate the leak is harmless”. Or maybe it’ll be: “I knew it! The lab’s sold out, the data are terrifying, and the activists are revealing the real truth”. Such stories determine the meaning of the picocuries and millirems for humans, not the other way around.

Acquiring data

Humans gain a sense of what’s happening in several ways. Three of them, to use philosophical language, are deferential, civic and melodramatic epistemology.

In “deferential epistemology”, citizens habitually take the word of experts and institutions about things like the dangers of picocuries and exposures of millirems. In his 1624 book New Atlantis, the philosopher Francis Bacon famously crafted a fictional portrait of an island society where deferential epistemology rules and people instinctively trust the scientific infrastructure.

We may think this is how people ought to behave. But Bacon, who was also a politician, understood that deference to experts is not automatic and requires constantly curating the public face of the scientific infrastructure. Earthlings haven’t seen deferential epistemology in a while.

“Civic epistemology”, meanwhile, is how people acquire knowledge in the absence of that curation. Such people don’t necessarily reject experts but hear their voices alongside many others claiming to know best how to pursue our interests and values. Civic epistemology is when we negotiate daily life not by first consulting scientists but by pursuing our concerns with a mix of habit, trust, experience and friendly advice.

We sometimes don’t, in fact, take scientific advice when it collides with how we already behave; we may smoke or drink, for instance, despite warnings not to. Or we might seek guidance from non-scientists about things like the harms of radiation.

Finally, what I call “melodramatic epistemology” draws on the word “melodrama”, a genre of theatre involving extreme plots, obvious villains, emotional appeal, sensational language and moral outrage (the 1939 film Gone with the Wind comes to mind).

Melodramas were once considered culturally insignificant, but scholars such as Peter Brooks from Yale University have shown that a melodramatic lens can be a powerful and irresistible way for humans to digest difficult and emotionally charged events. The clarity, certainty and passion provided by a melodramatic read on a situation tend to displace the complexities, uncertainties and dispassion of scientific evaluation and evidence.

One example from physics occurred at the Lawrence Berkeley Laboratory in the late 1990s, when activists fought successfully for the closure of its National Tritium Labeling Facility (NTLF). As I have written before, the NTLF had successfully developed techniques for medical studies while releasing tritium emissions well below federal and state environmental standards.

Activists, however, used melodramatic epistemology to paint the NTLF’s scientists as villains spreading breast cancer throughout the area, and denounced them as making “a terrorist attack on the citizens of Berkeley”. One activist called the scientists “piano players in a nuclear whorehouse.”

The critical point

The aliens studying us would worry most about melodramatic epistemology. Though dangerous, it is nearly impervious to change, because any contrary data, studies or expert judgment are assumed to spring from the villain’s allies and therefore to incite rather than allay fear.

Two US psychologists – William Brady from Northwestern University and Molly Crockett from Princeton University – recently published a study of how and why misinformation spreads (Science 386 991). By analysing data from Facebook and Twitter, and by conducting experiments with participants, they found that sources of misinformation evoke more outrage than trustworthy sources. Worse still, the outrage encourages us to share the misinformation even if we haven’t fully read the original source.

This makes it hard to counter misinformation. As the authors tactfully conclude: “Outrage-evoking misinformation may be difficult to mitigate with interventions that assume users want to share accurate information”.

In my view, the best, and perhaps only, way to challenge melodramatic stories is to write bigger, more encompassing stories that reveal that a different plot is unfolding. Such a story about the NTLF, for instance, would comprise story lines about the benefits of medical techniques, the testing of byproducts, the origin of regulations of toxins, the perils of our natural environment, the nature of fear and its manipulation, and so forth. In such a big story, those who promote melodramatic epistemology show up as an obvious, and dangerous, subplot.

If the aliens see us telling such bigger stories, they might not give up earthlings for lost.

The post Why telling bigger stories is the only way to counter misinformation appeared first on Physics World.

SMART spherical tokamak produces its first plasma

21 January 2025 at 16:25

A novel fusion device based at the University of Seville in Spain has achieved its first plasma. The SMall Aspect Ratio Tokamak (SMART) is a spherical tokamak that can operate with a “negative triangularity” – the first spherical tokamak specifically designed to do so. Work performed on the machine could be useful when designing compact fusion power plants based on spherical tokamak technology.

SMART has been constructed by the University of Seville’s Plasma Science and Fusion Technology Laboratory. With a vessel measuring 1.6 × 1.6 m, SMART has a 30 cm diameter solenoid wrapped around its 12 toroidal field coils, while eight poloidal field coils are used to shape the plasma.

Triangularity refers to the shape of the plasma cross-section relative to the centre of the tokamak. The cross-section of the plasma in a tokamak is typically shaped like a “D”. When the straight part of the D faces the centre of the tokamak, the plasma is said to have positive triangularity. When the curved part faces the centre, however, the plasma has negative triangularity.

It is thought that negative triangularity configurations can better suppress plasma instabilities that expel particles and energy from the plasma, helping to prevent damage to the tokamak wall.
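
To make the geometry concrete, here is a short Python sketch that plots a D-shaped plasma boundary for positive and negative triangularity using a simplified Miller-style parameterization. The numbers (major radius, minor radius, elongation, triangularity) are illustrative placeholders, not SMART’s design values.

```python
# Sketch: plasma cross-sections for positive and negative triangularity,
# using a simplified Miller-style parameterization (illustrative values only).
import numpy as np
import matplotlib.pyplot as plt

def cross_section(R0, a, kappa, delta, n=400):
    """Return (R, Z) points of a D-shaped plasma boundary.

    R0: major radius, a: minor radius, kappa: elongation,
    delta: triangularity (positive or negative).
    """
    theta = np.linspace(0.0, 2.0 * np.pi, n)
    R = R0 + a * np.cos(theta + delta * np.sin(theta))
    Z = kappa * a * np.sin(theta)
    return R, Z

R0, a, kappa = 0.4, 0.25, 1.8   # illustrative machine parameters, not SMART's

fig, ax = plt.subplots()
for delta, label in [(+0.4, "positive triangularity"),
                     (-0.4, "negative triangularity")]:
    R, Z = cross_section(R0, a, kappa, delta)
    ax.plot(R, Z, label=label)

ax.axvline(0.0, color="k", lw=0.5)  # machine axis at R = 0 (centre of the tokamak)
ax.set_xlabel("R (m)")
ax.set_ylabel("Z (m)")
ax.set_aspect("equal")
ax.legend()
plt.show()
```

With positive triangularity the top and bottom of the boundary shift towards the machine axis, giving the familiar D with its flat side facing the centre; flipping the sign mirrors the shape, which is the negative-triangularity configuration described above.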

Last year, researchers at the University of Seville began to prepare the tokamak’s inner walls for a high-pressure plasma by heating argon gas with microwaves. When those tests proved successful, engineers worked towards producing the first plasma.

“This is an important achievement for the entire team as we are now entering the operational phase,” notes SMART principal investigator Manuel García Muñoz. “The SMART approach is a potential game changer with attractive fusion performance and power handling for future compact fusion reactors. We have exciting times ahead.”

The post SMART spherical tokamak produces its first plasma appeared first on Physics World.

When charging quantum batteries, decoherence is a friend, not a foe

21 January 2025 at 13:10

Devices such as lasers and semiconductor-based technologies operate on the principles of quantum mechanics, but they only scratch the surface. To fully exploit quantum phenomena, scientists are developing a new generation of quantum-based devices. These devices are advancing rapidly, fuelling what many call the “second quantum revolution”.

One exciting development in this domain is the rise of next-generation energy storage devices known as quantum batteries (QBs). These devices leverage exotic quantum phenomena such as superposition, coherence, correlation and entanglement to store and release energy in ways that conventional batteries cannot. However, the practical realization of QBs faces its own challenges, such as reliance on fragile quantum states and the difficulty of operating at room temperature.

A recent theoretical study by Rahul Shastri and colleagues from IIT Gandhinagar, India, in collaboration with researchers at China’s Zhejiang University and the China Academy of Engineering Physics takes significant strides towards understanding how QBs can be charged faster and more efficiently, thereby lowering some of the barriers restricting their use.

How does a QB work?

The difference between charging a QB and charging a mobile phone is that with a QB, both the battery and the charger are quantum systems. Shastri and colleagues focused on two such systems: a harmonic oscillator (HO) and a two-level system. While a two-level system can exist in just two energy states, a harmonic oscillator has an evenly spaced range of energy levels. These systems therefore represent two extremes – one with a discrete, bounded energy range and the other with a more complex, unbounded energy spectrum approaching a continuous limit – making them ideal for exploring the versatility of QBs.

In the quantum HO-based setup, a higher-energy HO acts as the charger and a lower-energy one as the battery. When the two are connected, or coupled, energy transfers from the charger to the battery. The two-level system follows the same working principle. Such coupled quantum systems are routinely realized in experiments.
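
As a rough illustration of this charger–battery picture – a toy model under assumed parameters, not the authors’ actual calculation – the Python sketch below couples a two-level “charger”, prepared in its excited state, to a two-level “battery” in its ground state through an excitation-exchange interaction and tracks the energy stored in the battery.

```python
# Toy model (illustrative only): a two-level "charger" coupled to a two-level
# "battery" by an excitation-exchange interaction. The charger starts excited,
# the battery in its ground state, and energy oscillates between them.
import numpy as np
from scipy.linalg import expm

sp = np.array([[0, 1], [0, 0]], dtype=complex)  # raising operator |e><g|
sm = sp.conj().T                                # lowering operator |g><e|
num = sp @ sm                                   # excitation number |e><e|
I2 = np.eye(2, dtype=complex)

omega = 1.0  # level splitting of both systems (resonant), arbitrary units
g = 0.05     # charger-battery coupling strength, arbitrary units

# H = omega*(n_charger + n_battery) + g*(sp_c sm_b + sm_c sp_b)
H = (omega * (np.kron(num, I2) + np.kron(I2, num))
     + g * (np.kron(sp, sm) + np.kron(sm, sp)))

excited = np.array([1, 0], dtype=complex)
ground = np.array([0, 1], dtype=complex)
psi0 = np.kron(excited, ground)   # charger excited, battery empty
n_battery = np.kron(I2, num)      # battery excitation-number operator

times = np.linspace(0, 100, 400)
energy = []
for t in times:
    psi_t = expm(-1j * H * t) @ psi0
    energy.append(omega * np.real(psi_t.conj() @ n_battery @ psi_t))

# The stored energy oscillates between 0 and omega with a Rabi-like period pi/g:
# the battery fully charges and then dumps its energy back into the charger.
print(f"maximum battery energy: {max(energy):.3f} (full charge would be {omega:.1f})")
```

In this closed-system limit the battery never settles: it repeatedly charges and discharges, which is exactly the behaviour that the controlled dephasing discussed below is used to tame.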

Using decoherence as a tool to improve QB performance

The study’s findings, which are published in npj Quantum Information, are both surprising and promising, illustrating how a phenomenon typically seen as a challenge in quantum systems – decoherence – can become a solution.

The term “decoherence” refers to the process where a quantum system loses its unique quantum properties (such as quantum correlation, coherence and entanglement). The key trigger for decoherence is quantum noise caused by interactions between a quantum system and its environment.

Since no real-world physical system is perfectly isolated, such noise is unavoidable, and even minute amounts of environmental noise can lead to decoherence. Maintaining quantum coherence is thus extremely challenging even in controlled laboratory settings, let alone industrial environments producing large-scale practical devices. For this reason, decoherence represents one of the most significant obstacles in advancing quantum technologies towards practical applications.

Shastri and colleagues, however, discovered a way to turn this foe into a friend. “Instead of trying to eliminate these naturally occurring environmental effects, we ask: why not use them to our advantage?” Shastri says.

The method they developed speeds up the charging process using a technique called controlled dephasing. Dephasing is a form of decoherence that usually involves the gradual loss of quantum coherence, but the researchers found that when managed carefully, it can actually boost the battery’s performance.

To understand how this works, it’s important to note that at low levels of dephasing the battery undergoes smooth energy oscillations. Too much dephasing, however, freezes these oscillations through what’s known as the quantum Zeno effect, essentially stalling the energy transfer. With just the right amount of dephasing, though, the oscillations are damped without being frozen: precisely controlling the dephasing rate strikes a balance that significantly improves charging speed while preserving stability, leading to quicker, more robust charging that could overcome challenges posed by environmental factors.
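
To see this behaviour in a concrete (if highly simplified) setting, the sketch below extends the same two-level toy model by adding pure dephasing of adjustable rate through sigma-z collapse operators and solving the resulting Lindblad master equation with QuTiP. The parameter values and the choice of dephasing model are assumptions for illustration, not taken from the paper.

```python
# Illustrative Lindblad sketch (assumed toy parameters, not the paper's model):
# charging a two-level battery from a two-level charger under pure dephasing
# of adjustable strength.
import numpy as np
from qutip import basis, tensor, qeye, sigmap, sigmam, sigmaz, mesolve

omega = 1.0   # level splitting, arbitrary units
g = 0.05      # charger-battery coupling, arbitrary units

sp_c, sm_c = tensor(sigmap(), qeye(2)), tensor(sigmam(), qeye(2))
sp_b, sm_b = tensor(qeye(2), sigmap()), tensor(qeye(2), sigmam())
n_c, n_b = sp_c * sm_c, sp_b * sm_b   # excitation-number operators

H = omega * (n_c + n_b) + g * (sp_c * sm_b + sm_c * sp_b)

psi0 = tensor(basis(2, 0), basis(2, 1))   # charger excited, battery empty
tlist = np.linspace(0, 100, 500)

for gamma in [0.0, 0.02, 0.2, 2.0]:       # dephasing rates to compare
    # Pure dephasing on charger and battery, modelled with sigma_z collapse
    # operators of rate gamma (an assumption of this toy model).
    c_ops = [np.sqrt(gamma) * tensor(sigmaz(), qeye(2)),
             np.sqrt(gamma) * tensor(qeye(2), sigmaz())]
    result = mesolve(H, psi0, tlist, c_ops=c_ops, e_ops=[n_b])
    stored = omega * result.expect[0]     # battery energy versus time
    print(f"gamma = {gamma:4.2f}: battery energy at t = {tlist[-1]:.0f} "
          f"is {stored[-1]:.3f}")
```

Qualitatively, no dephasing gives persistent charge–discharge oscillations, very strong dephasing freezes the transfer (the Zeno regime), and an intermediate rate damps the oscillations so the battery settles quickly at a steady stored energy. The single final-time numbers printed here are only a crude summary; plotting the stored energy against time for each rate makes the three regimes easier to see.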

“Our study shows how dissipative effects, traditionally seen as a hindrance, can be harnessed to enhance performance,” Shastri notes. This opens the door to scalable, robust quantum battery designs, which could be extremely useful for energy management in quantum computing and other quantum-enabled applications.

Implications for scalable quantum technologies

The results of this study are encouraging for the quantum-technology industry. According to Shastri, using dephasing to optimize the charging speed and stability of QBs not only advances fundamental understanding but also addresses practical challenges in quantum energy storage.

“Our proposed method could be tested on existing platforms such as superconducting qubits and NMR systems, where dephasing control is already experimentally feasible,” he says. These platforms offer experimentalists a tangible starting point for verifying the study’s predictions and further refining QB performance.

Experimentalists testing this theory will face challenges. Examples include managing additional decoherence mechanisms like amplitude damping and achieving the ideal balance of controlled dephasing in realistic setups. However, Shastri says that these challenges present valuable opportunities to refine and expand the proposed theoretical model for optimizing QB performance under practical conditions. The second quantum revolution is already underway, and QBs might just be the power source that charges our quantum future.

The post When charging quantum batteries, decoherence is a friend, not a foe appeared first on Physics World.
