
Received today — 28 May 2025 📰 Sciences English

Shengxi Huang: how defects can boost 2D materials as single-photon emitters

28 May 2025 at 17:01
Photo of researchers in a lab at Rice University.
Hidden depths Shengxi Huang (left) with members of her lab at Rice University in the US, where she studies 2D materials as single-photon sources. (Courtesy: Jeff Fitlow)

Everyday life is three dimensional, with even a sheet of paper having a finite thickness. Shengxi Huang from Rice University in the US, however, is attracted by 2D materials, which are usually just one atomic layer thick. Graphene is perhaps the most famous example — a single layer of carbon atoms arranged in a hexagonal lattice. But since it was first created in 2004, all sorts of other 2D materials, notably boron nitride, have been created.

An electrical engineer by training, Huang did a PhD at the Massachusetts Institute of Technology and postdoctoral research at Stanford University before spending five years as an assistant professor at the Pennsylvania State University. Huang has been at Rice since 2022, where she is now an associate professor in the Department of Electrical and Computer Engineering, the Department of Material Science and NanoEngineering, and the Department of Bioengineering.

Her group at Rice currently has 12 people, including eight graduate students and four postdocs. Some are physicists, some are engineers, while others have backgrounds in material science or chemistry. But they all share an interest in understanding the optical and electronic properties of quantum materials and seeing how they can be used, for example, as biochemical sensors. Lab equipment from PicoQuant is vital to that quest, as Huang explains in an interview with Physics World.

Why are you fascinated by 2D materials?

I’m an electrical engineer by training, which is a very broad field. Some electrical engineers focus on things like communication and computing, but others, like myself, are more interested in how we can use fundamental physics to build useful devices, such as semiconductor chips. I’m particularly interested in using 2D materials for optoelectronic devices and as single-photon emitters.

What kinds of 2D materials do you study?

The materials I am particularly interested in are transition metal dichalcogenides, which consist of a layer of transition-metal atoms sandwiched between two layers of chalcogen atoms – sulphur, selenium or tellurium. One of the most common examples is molybdenum disulphide, which in its monolayer form has a layer of sulphur on either side of a layer of molybdenum. In multi-layer molybdenum disulphide, the van der Waals forces between the tri-layers are relatively weak, meaning that the material is widely used as a lubricant – just like graphite, which is a many-layer version of graphene.

Why do you find transition metal dichalcogenides interesting?

Transition metal dichalcogenides have some very useful optoelectronic properties. In particular, they emit light whenever the electron and hole that make up an “exciton” recombine. Now because these dichalcogenides are so thin, most of the light they emit can be used. In a 3D material, in contrast, most light is generated deep in the bulk and never escapes through the surface. Such 2D materials are therefore very efficient and, what’s more, can be easily integrated onto chip-based devices such as waveguides and cavities.

Transition metal dichalcogenide materials also have promising electronic applications, particularly as the active material in transistors. Over the years, we’ve seen silicon-based transistors get smaller and smaller as we’ve followed Moore’s law, but we’re rapidly reaching a limit where we can’t shrink them any further, partly because the electrons in very thin layers of silicon move so slowly. In 2D transition metal dichalcogenides, in contrast, the electron mobility can actually be higher than in silicon of the same thickness, making them a promising material for future transistor applications.

What can such sources of single photons be used for?

Single photons are useful for quantum communication and quantum cryptography. Carrying information as zero and one, they basically function as a qubit, providing a very secure communication channel. Single photons are also interesting for quantum sensing and even quantum computing. But it’s vital that you have a highly pure source of photons. You don’t want them mixed up with “classical photons”, which — like those from the Sun — are emitted in bunches, as otherwise the tasks you’re trying to perform cannot be completed.

What approaches are you taking to improve 2D materials as single-photon emitters?

What we do is introduce atomic defects into a 2D material to give it optical properties that are different to what you’d get in the bulk. There are several ways of doing this. One is to irradiate a sample with ions or electrons, which can knock individual atoms out to generate “vacancy defects”. Another option is to use plasmas, whereby atoms in the sample get replaced by atoms from the plasma.

So how do you study the samples?

We can probe defect emission using a technique called photoluminescence, which basically involves shining a laser beam onto the material. The laser excites electrons from the ground state to an excited state, prompting them to emit light. As the laser beam is about 500–1000 nm in diameter, we can see single-photon emission from an individual defect if the defect density is suitably low.
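As a back-of-the-envelope illustration of why the defect density matters (this arithmetic is not from the interview; only the spot sizes are the ones Huang quotes), a spot of given diameter contains one defect on average only when the areal density is at most one defect per spot area:

```python
import math

def one_defect_density(spot_diameter_nm: float) -> float:
    """Areal defect density (defects per square micron) at which a laser
    spot of the given diameter contains one defect on average."""
    radius_um = spot_diameter_nm / 2000.0  # nm -> um, then halve for radius
    return 1.0 / (math.pi * radius_um ** 2)

for d in (500, 1000):
    print(f"{d} nm spot: about {one_defect_density(d):.1f} defects per square micron")
```

So the engineering target is of order one defect per square micron or fewer, which is why the atomic-level control described below is so demanding.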

Photo of researchers in a lab at Rice University
Beyond the surface Shengxi Huang (second right) uses equipment from PicoQuant to probe 2D materials. (Courtesy: Jeff Fitlow)

What sort of experiments do you do in your lab?

We start by engineering our materials at the atomic level to introduce the correct type of defect. We also try to strain the material, which can increase how many single photons are emitted at a time. Once we’ve confirmed we’ve got the correct defects in the correct location, we check the material is emitting single photons by carrying out optical measurements, such as photoluminescence. Finally, we characterize the purity of our single photons – ideally, they shouldn’t be mixed up with classical photons but in reality, you never have a 100% pure source. As single photons are emitted one at a time, they have different statistical characteristics to classical light. We also check the brightness and lifetime of the source, the efficiency, how stable it is, and if the photons are polarized. In fact, we have a feedback loop: what improvements can we make at the atomic level to get the properties we’re after?

Is it difficult adding defects to a sample?

It’s pretty challenging. You want to add just one defect to an area that might be just one micron square so you have to control the atomic structure very finely. It’s made harder because 2D materials are atomically thin and very fragile. So if you don’t do the engineering correctly, you may accidentally introduce other types of defects that you don’t want, which will alter the defects’ emission.

What techniques do you use to confirm the defects are in the right place?

Because the defect concentration is so low, we cannot use methods that are typically used to characterise materials, such as X-ray photo-emission spectroscopy or scanning electron microscopy. Instead, the best and most practical way is to see if the defects generate the correct type of optical emission predicted by theory. But even that is challenging because our calculations, which we work on with computational groups, might not be completely accurate.

How do your PicoQuant instruments help in that regard?

We have two main pieces of equipment – a MicroTime 100 photoluminescence microscope and a FluoTime 300 spectrometer. These have been customized to form a Hanbury Brown–Twiss interferometer, which measures the purity of a single-photon source. We also use the microscope and spectrometer to characterise the photoluminescence spectrum and lifetime. Essentially, if the material emits light, we can then work out how long it takes before the emission dies down.
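The purity measurement a Hanbury Brown–Twiss interferometer provides is usually expressed through the second-order correlation at zero delay, g²(0): coincidence counts with no delay between the two detectors, normalised to the coincidence rate at long delays, with g²(0) below 0.5 conventionally taken as the signature of a single-photon emitter. A minimal sketch, using hypothetical count numbers:

```python
def g2_zero(coincidences_zero_delay: float, coincidences_long_delay: float) -> float:
    """Second-order correlation at zero delay, normalised to the
    coincidence rate at long delays (the Poissonian reference level)."""
    return coincidences_zero_delay / coincidences_long_delay

# Hypothetical counts: an ideal single photon cannot trigger both detectors
# simultaneously, so coincidences at zero delay should vanish; g2(0) < 0.5
# is the conventional single-emitter threshold.
print(g2_zero(3, 200))  # 0.015 -> strongly antibunched
```

A lower g²(0) means fewer multi-photon events, which is what “purity” quantifies in the figures quoted later in the interview.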

Did you buy the equipment off-the-shelf?

It’s more of a customised instrument with different components – lasers, microscopes, detectors and so on — connected together so we can do multiple types of measurement. I put in a request to PicoQuant, who discussed my requirements with me to work out how to meet my needs. The equipment has been very important for our studies as we can carry out high-throughput measurements over and over again. We’ve tailored it for our own research purposes basically.

So how good are your samples?

The best single-photon source that we currently work with is boron nitride, which has a single-photon purity of 98.5% at room temperature. In other words, for every 200 photons only three are classical. With transition-metal dichalcogenides, we get a purity of 98.3% at cryogenic temperatures.
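A quick sanity check of those figures (simple arithmetic, not experimental data): a purity of 98.5% means 1.5% of detected photons are classical, which indeed works out to three in every 200.

```python
# Quick check of the quoted purity figure (simple arithmetic only).
purity = 0.985   # boron nitride single-photon purity at room temperature
photons = 200
classical = photons * (1 - purity)
print(f"{classical:.0f} classical photons out of {photons}")  # 3, as quoted
```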

What are your next steps?

There’s still lots to explore in terms of making better single-photon emitters and learning how to control them at different wavelengths. We also want to see if these materials can be used as high-quality quantum sensors. In some cases, if we have the right types of atomic defects, we get a high-quality source of single photons, which we can then entangle with their spin. The emitters can therefore monitor the local magnetic environment with better performance than is possible with classical sensing methods.

The post Shengxi Huang: how defects can boost 2D materials as single-photon emitters appeared first on Physics World.

Air Force Research Laboratory Awards Moog Contract to Develop New Multimode Propulsion System to Enhance Dynamic Space Operations

28 May 2025 at 14:30
Moog logo

East Aurora, NY – Moog Inc. (NYSE: MOG.A and MOG.B), a worldwide designer, manufacturer and systems integrator of high-performance precision motion and fluid controls and control systems, announced today that […]

The post Air Force Research Laboratory Awards Moog Contract to Develop New Multimode Propulsion System to Enhance Dynamic Space Operations appeared first on SpaceNews.

Richard Bond and George Efstathiou share the 2025 Shaw Prize in Astronomy

28 May 2025 at 14:00

The 2025 Shaw Prize in Astronomy has been awarded to Richard Bond and George Efstathiou “for their pioneering research in cosmology, in particular for their studies of fluctuations in the cosmic microwave background”. The prize citation continues, “Their predictions have been verified by an armada of ground-, balloon- and space-based instruments, leading to precise determinations of the age, geometry, and mass–energy content of the universe”.

Efstathiou is professor of astrophysics at the University of Cambridge in the UK. Bond is a professor at the Canadian Institute for Theoretical Astrophysics (CITA) and university professor at the University of Toronto in Canada. They share the $1.2m prize money equally.

The annual award is given by the Shaw Prize Foundation, which was founded in 2002 by the Hong Kong-based filmmaker, television executive and philanthropist Run Run Shaw (1907–2014). It will be presented at a ceremony in Hong Kong on 21 October. There are also Shaw Prizes for life sciences and medicine; and mathematical sciences.

Bond studied mathematics and physics at Toronto. In 1979 he completed a PhD in theoretical physics at the California Institute of Technology (Caltech). He directed CITA from 1996 to 2006.

Efstathiou studied physics at Oxford before completing a PhD in astronomy at the UK’s Durham University in 1979. He is currently director of the Institute of Astronomy in Cambridge.

The post Richard Bond and George Efstathiou share the 2025 Shaw Prize in Astronomy appeared first on Physics World.

No laughing matter: a comic book about the climate crisis

28 May 2025 at 12:00
Comic depicting a parachutist whose chute is on fire and their thought process about not using their backup chute
Blunt message Anti-nuclear thinking is mocked in World Without End by Jean-Marc Jancovici and Christophe Blain. (Published by Particular Books. Illustration © DARGAUD — Jancovici & Blain)

Comics are regarded as an artform in France, where they account for a quarter of all book sales. Nevertheless, the graphic novel World Without End: an Illustrated Guide to the Climate Crisis was a surprise French bestseller when it first came out in 2022. Taking the form of a Socratic dialogue between French climate expert Jean-Marc Jancovici and acclaimed comic artist Christophe Blain, it’s serious, scientific stuff.

Now translated into English by Edward Gauvin, the book follows the conventions of French-language comic strips or bandes dessinées. Jancovici is drawn with a small nose – denoting seriousness – while Blain’s larger nose signals humour. The first half explores energy and consumption, with the rest addressing the climate crisis and possible solutions.

Overall, this is a Trojan horse of a book: what appears to be a playful comic is packed with dense, academic content. Though marketed as a graphic novel, it reads more like illustrated notes from a series of sharp, provocative university lectures. It presents a frightening vision of the future and the humour doesn’t always land.

The book spans a vast array of disciplines – not just science and economics but geography and psychology too. In fact, there’s so much to unpack that, had I Blain’s skills, I might have reviewed it in the form of a comic strip myself. The old adage that “a picture is worth a thousand words” has never rung more true.

Absurd yet powerful visual metaphors feature throughout. We see a parachutist with a flaming main chute that represents our dependence on fossil fuels. The falling man jettisons his reserve chute – nuclear power – and tries to knit an alternative using clean energy, mid-fall. The message is blunt: nuclear may not be ideal, but it works.

World Without End is bold, arresting, provocative and at times polemical.

The book is bold, arresting, provocative and at times polemical. Charts and infographics are presented to simplify complex issues, even if the details invite scrutiny. Explanations are generally clear and concise, though the author’s claim that accidents like Chernobyl and Fukushima couldn’t happen in France smacks of hubris.

Jancovici makes plenty of attention-grabbing statements. Some are sound, such as the notion that fossil fuels spared whales from extinction because we no longer needed their oil. Others are dubious – would a 4 °C temperature rise really leave a third of humanity unable to survive outdoors?

But Jancovici is right to say that the use of fossil fuels makes logical sense. Oil can be easily transported and one barrel delivers the equivalent of five years of human labour. A character called Armor Man (a parody of Iron Man) reminds us that fossil fuels are like having 200 mechanical slaves per person, equivalent to an additional 1.5 trillion people on the planet.

Fossil fuels brought prosperity – but now threaten our survival. For Jancovici, the answer is nuclear power, which is perhaps not surprising as it produces 72% of electricity in the author’s homeland. But he cherry picks data, accepting – for example – the United Nations figure that only about 50 people died from the Chernobyl nuclear accident.

While acknowledging that many people had to move following the disaster, the author downplays the fate of those responsible for “cleaning up” the site, the long-term health effects on the wider population and the staggering economic impact – estimated at €200–500bn. He also sidesteps nuclear-waste disposal and the cost and complexity of building new plants.

While conceding that nuclear is “not the whole answer”, Jancovici dismisses hydrogen and views renewables like wind and solar as too intermittent – they require batteries to ensure electricity is supplied on demand – and diffuse. Imagine blanketing the Earth in wind turbines.

Cartoon of a doctor and patient. The patient has increased their alcohol intake but also added in some healthy orange juice
Humorous point A joke from World Without End by Jean-Marc Jancovici and Christophe Blain. (Published by Particular Books. Illustration © DARGAUD — Jancovici & Blain)

Still, his views on renewables seem increasingly out of step. They now supply nearly 30% of global electricity – 13% from wind and solar, ahead of nuclear at 9%. Renewables also attract 70% of all new investment in electricity generation and (unlike nuclear) continue to fall in price. It’s therefore disingenuous of the author to say that relying on renewables would be like returning to pre-industrial life; today’s wind turbines are far more efficient than anything back then.

Beyond his case for nuclear, Jancovici offers few firm solutions. Weirdly, he suggests “educating women” and providing pensions in developing nations – to reduce reliance on large families – to stabilize population growth. He also cites French journalist Sébastien Bohler, who thinks our brains are poorly equipped to deal with long-term threats.

But he says nothing about the need for more investment in nuclear fusion or for “clean” nuclear fission via, say, liquid fluoride thorium reactors (LFTRs), which generate minimal waste, won’t melt down and cannot be weaponized.

Perhaps our survival depends on delaying gratification, resisting the lure of immediate comfort, and adopting a less extravagant but sustainable world. We know what changes are needed – yet we do nothing. The climate crisis is unfolding before our eyes, but we’re paralysed by a global-scale bystander effect, each of us hoping someone else will act first. Jancovici’s call for “energy sobriety” (consuming less) seems idealistic and futile.

Still, World Without End is a remarkable and deeply thought-provoking book that deserves to be widely read. I fear that it will struggle to replicate its success beyond France, though Raymond Briggs’ When the Wind Blows – a Cold War graphic novel about nuclear annihilation – was once a British bestseller. If enough people engaged with the book, it would surely spark discussion and, one day, even lead to meaningful action.

  • 2024 Particular Books £25.00hb 196pp

The post No laughing matter: a comic book about the climate crisis appeared first on Physics World.

The evolution of the metre: How a product of the French Revolution became a mainstay of worldwide scientific collaboration

28 May 2025 at 10:00

The 20th of May is World Metrology Day, and this year it was extra special because it was also the 150th anniversary of the treaty that established the metric system as the preferred international measurement standard. Known as the Metre Convention, the treaty was signed in Paris, France in 1875 by representatives of 17 nations and created the Bureau International des Poids et Mesures (BIPM), making it one of the first truly international agreements. Though nations might come and go, the hope was that this treaty would endure “for all times and all peoples”.

To celebrate the treaty’s first century and a half, the BIPM and the United Nations Educational, Scientific and Cultural Organisation (UNESCO) held a joint symposium at the UNESCO headquarters in Paris. The event focused on the achievements of BIPM as well as the international scientific collaborations the Metre Convention enabled. It included talks from the Nobel prize-winning physicist William Phillips of the US National Institute of Standards and Technology (NIST) and the BIPM director Martin Milton, as well as panel discussions on the future of metrology featuring representatives of other national metrology institutes (NMIs) and metrology professionals from around the globe.

A long and revolutionary tradition

The history of metrology dates back to ancient times. As UNESCO’s Hu Shaofeng noted in his opening remarks, the Egyptians recognized the importance of precision measurements as long ago as the 21st century BCE.  Like other early schemes, the Egyptians’ system of measurement used parts of the human body as references, with units such as the fathom (the length of a pair of outstretched arms) and the foot. This was far from ideal since, as Phillips pointed out in his keynote address, people come in various shapes and sizes. These variations led to a profusion of units. By some estimates, pre-revolutionary France had a whopping 250,000 different measures, with differences arising not only between towns but also between professions.

The French Revolutionaries were determined to put an end to this mess. In 1795, just six years after the Revolution, the law of 18 Germinal Year III (according to the new calendar of the French Republic) created a preliminary version of the world’s first metric system. The new system tied length and mass to natural standards (the metre was originally one forty-millionth of the Paris meridian, while the kilogram was defined as the mass of a cubic decimetre of water), and it became the standard for all of France in 1799. That same year, the system also became more practical, with units becoming linked, for the first time, to physical artefacts: a platinum metre and kilogram deposited in the French National Archives.

When the Metre Convention adopted this standard internationally 80 years later, it kick-started the construction of new length and mass standards. The new International Prototype of the Metre and International Prototype of the Kilogram were manufactured in 1879 and officially adopted as replacements for the Revolutionaries’ metre and kilogram in 1889, though they continued to be calibrated against the old prototypes held in the National Archives.

A short history of the BIPM

The BIPM itself was originally conceived as a means of reconciling France and Germany after the 1870–1871 Franco–Prussian War. At first, its primary roles were to care for the kilogram and metre prototypes and to calibrate the standards of its member states. In the opening decades of the 20th century, however, it extended its activities to cover other kinds of measurements, including those related to electricity, light and radiation. Then, from the 1960s onwards, it became increasingly interested in improving the definition of length, thanks to new interferometer technology that made it possible to measure distance at a precision rivalling that of the physical metre prototype.

Photo of William Phillips on stage at the Metre Convention symposium, backed by a large slide that reads "The Revolutionary Dream: A tous les temps, a tous les peuples, For all times, for all peoples". The slide also contains two large symbolic medallions, one showing a female figure dressed in Classical garments holding out a metre ruler under the logo "A tous les temps, a tous les peuples" and another showing a winged figure measuring the Earth with an instrument.
Metre man: William Phillips giving the keynote address at the Metre Convention’s 150th anniversary symposium. (Courtesy: Isabelle Dumé)

It was around this time that the BIPM decided to replace its expanded metric system with a framework encompassing the entire field of metrology. This new framework consisted of seven basic units – the metre, kilogram, second, ampere, degree Kelvin (later simply the kelvin), candela and mole – plus a set of “derived” units (the newton, hertz, joule and watt among them) built from the basic ones. Thus was born the International System of Units, or SI after the French initials for Système International d’unités.

The next major step – a “brilliant choice”, in Phillips’ words – came in 1983, when the BIPM decided to redefine the metre in terms of the speed of light. The Bureau decreed that, from then on, the metre would officially be the length travelled by light in a vacuum during a time interval of 1/299,792,458 seconds.
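The elegance of that definition is that it is circular by design: fixing the numerical value of the speed of light makes the metre follow automatically. A trivial sketch:

```python
# The second and the speed of light are both fixed by definition,
# so the metre follows by construction.
c = 299_792_458   # speed of light in m/s (exact, by definition)
t = 1 / c         # the defining time interval, in seconds
metre = c * t     # distance light travels in that interval: one metre
print(metre)
```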

This decision set the stage for defining the rest of the seven base units in terms of natural fundamental constants. The most recent unit to join the club was the kilogram, which was defined in terms of the Planck constant, h, in 2019. In fact, the only base unit currently not defined in terms of a fundamental constant is the second, which is instead determined by the transition between the two hyperfine levels of the ground state of caesium-133. The international metrology community is, however, working to remedy this, with meetings being held on the subject in Versailles this month.

Measurement affects every aspect of our daily lives, and as the speakers at last week’s celebrations repeatedly reminded the audience, a unified system of measurement has long acted as a means of building trust across international and disciplinary borders. The Metre Convention’s survival for 150 years is proof that peaceful collaboration can triumph, and it has allowed humankind to advance in ways that would not have been possible without such unity. A lesson indeed for today’s troubled world.

The post The evolution of the metre: How a product of the French Revolution became a mainstay of worldwide scientific collaboration appeared first on Physics World.

Received yesterday — 27 May 2025 📰 Sciences English

SpaceNews Names Kamal Flucker as Vice President of Global Sales to Lead International Growth

27 May 2025 at 21:42

Washington, D.C. – SpaceNews announces the promotion of Kamal Flucker to Vice President of Global Sales, a move that reflects both his near decade-long commitment to the brand and SpaceNews’s […]

The post SpaceNews Names Kamal Flucker as Vice President of Global Sales to Lead International Growth appeared first on SpaceNews.

The Physics Chanteuse: when science hits a high note

27 May 2025 at 17:00

What do pulsars, nuclear politics and hypothetical love particles have in common? They’ve all inspired songs by Lynda Williams – physicist, performer and self-styled “Physics Chanteuse”.

In this month’s Physics World Stories podcast, host Andrew Glester is in conversation with Williams, whose unique approach to science communication blends physics with cabaret and satire. You’ll be treated to a selection of her songs, including a toe-tapping tribute to Jocelyn Bell Burnell, the Northern Irish physicist who discovered pulsars.

Williams discusses her writing process, which includes a full-blooded commitment to getting the science right. She describes how her shows evolve throughout the course of a tour, how she balances life on the road with other life commitments, and how Kip Thorne once arranged for her to perform at a birthday celebration for Stephen Hawking. (Yes, really.)

Her latest show, Atomic Cabaret, dives into the existential risks of the nuclear age, marking 80 years since Hiroshima and Nagasaki. The one-woman musical kicks off in Belfast on 18 June and heads to the Edinburgh Festival in August.

If you like your physics with a side of showbiz and social activism, this episode hits all the right notes. Find out more at Lynda’s website.

The post The Physics Chanteuse: when science hits a high note appeared first on Physics World.

APIO16 Radiation-Hardened 16-Bit I/O Expander Sets a New Benchmark for Resilient System Design

27 May 2025 at 16:55
Apogee Semiconductor logo

Apogee Semiconductor Launches Industry-First Rad-Hard 16-bit I/O Expander for Extreme Environments Apogee Semiconductor proudly announces APIO16, the industry’s first radiation-hardened 16-bit I/O expander, engineered to streamline digital expansion in spaceborne […]

The post APIO16 Radiation-Hardened 16-Bit I/O Expander Sets a New Benchmark for Resilient System Design appeared first on SpaceNews.

The quantum eraser doesn’t rewrite the past – it rewrites observers

27 May 2025 at 15:00

“Welcome to this special issue of Physics World, marking the 200th anniversary of quantum mechanics. In this double-quantum edition, the letters in this text are stored using qubits. As you read, you project the letters into a fixed state, and that information gets copied into your mind as the article that you are reading. This text is actually in a superposition of many different articles, but only one of them gets copied into your memory. We hope you enjoy the one that you are reading.”

That’s how I imagine the opening of the 2125 Physics World quantum special issue, when fully functional quantum computers are commonplace, and we have even figured out how to control individual qubits on display screens. If you are lucky enough to experience reading such a magazine, you might be disappointed that you can read only one of the articles into which the text gets projected. The problem is that by reading the superposition of articles, you made them decohere, because you copied the information about each letter into your memory. Can you figure out a way to read the others too? After all, more Physics World articles is always better.

A possible solution may be if you could restore the coherence of the text by just erasing your memory of the particular article you read. Once you no longer have information identifying which article your magazine was projected into, there is then no fundamental reason for it to remain decohered into a single state. You could then reread it to enjoy a different article.

While this thought experiment may sound fantastical, the concept is closely connected to a mind-bending twist on the famous double-slit experiment, known as the delayed-choice quantum eraser. It is often claimed to exhibit a radical phenomenon: where measurements made in the present alter events that occurred in the past. But is such a paradoxical suggestion real, even in the notoriously strange quantum realm?

A double twist on the double slit

In a standard double-slit experiment, photons are sent one by one through two slits to create an interference pattern on a screen, illustrating the wave-like behaviour of light. But if we add a detector that can spot which of the two slits the photon goes through, the interference disappears and we see only two distinct clumps on the screen, signifying particle-like behaviour. Crucially, gaining information about which path the photon took changes the photon’s quantum state, from the wave-like interference pattern to the particle-like clumps.

The first twist on this thought experiment is attributed to proposals from physicist John Wheeler in 1978, and a later collaboration with Wojciech Zurek in 1983. Wheeler’s idea was to delay the measurement of which slit the photon goes through. Instead of measuring the photon as it passes through the double-slit, the measurement could be delayed until just before the photon hits the screen. Interestingly, the delayed detection of which slit the photon goes through still determines whether or not it displays the wave-like or particle-like behaviour. In other words, even a detection done long after the photon has gone through the slit determines whether or not that photon is measured to have interfered with itself.

If that’s not strange enough, the delayed-choice quantum eraser is a further modification of this idea. First proposed by American physicists Marlan Scully and Kai Drühl in 1982 (Phys. Rev. A 25 2208), it was later experimentally implemented by Yoon-Ho Kim and collaborators using photons in 2000 (Phys. Rev. Lett. 84 1). This variation adds a second twist: if recording which slit the photon passes through causes it to decohere, then what happens if we were to erase that information? Imagine shrinking the detector to a single qubit that becomes entangled with the photon: “left” slit might correlate to the qubit being 0, “right” slit to 1. Instead of measuring whether the qubit is a 0 or 1 (revealing the path), we could measure it in a complementary way, randomising the 0s and 1s (erasing the path information).

1 Delayed detections, path revelations and complementary measurements

Detailed illustration explaining the quantum eraser effect
(Courtesy: Mayank Shreshtha)

This illustration depicts how the quantum eraser restores the wave-like behaviour of photons in a double-slit experiment, using 3D-glasses as an analogy.

The top left box shows the set-up for the standard double-slit experiment. As there are no detectors at the slits measuring which path a photon takes, an interference pattern emerges on the screen. In box 1, detectors are present at each slit; measuring which slit the photon passed through destroys the interference pattern. Boxes 2 and 3 show that by erasing the “which-slit” information, the interference patterns are restored. This is done by separating out the photons using the eraser, represented here by the red and blue filters of a pair of 3D glasses. The final box, 4, shows that the overall pattern with the eraser has no interference, identical to the pattern seen in box 1.

In boxes 2, 3 and 4, a detector qubit measures “which-slit” information, with states |0> for left and |1> for right. These are points on the z-axis of the “Bloch sphere”, an abstract representation of the qubit. Then the eraser measures the detector qubit in a complementary way, along the x-axis of the Bloch sphere. This destroys the “which-slit information”, but reveals the red and blue lens information used to filter the outcomes, as depicted in the image of the 3D glasses.

Strikingly, while the screen still shows particle-like clumps overall, these complementary measurements of the single-qubit detector can actually be used to extract a wave-like interference pattern. This works through a sorting process: the two possible outcomes of the complementary measurements are used to separate out the photon detections on the screen. The separated patterns then each individually show bright and dark fringes.

I like to visualize this using a pair of 3D glasses, with one blue and one red lens. Each colour lens reveals a different individual image, like the two separate interference patterns. Without the 3D glasses, you see only the overall sum of the images. In the quantum eraser experiment, this sum of the images is a fully decohered pattern, with no trace of interference. Having access to the complementary measurements of the detector is like getting access to the 3D glasses: you now get an extra tool to filter out the two separate interference patterns.
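The arithmetic behind this sorting is simple enough to check numerically. The sketch below is my own illustration, not the actual experiment: it models the photon's two slit amplitudes entangled with a single detector qubit. Reading the qubit in the 0/1 basis leaves a fringeless sum of two single-slit patterns, while sorting the screen hits by the outcome of the complementary measurement recovers two high-contrast fringe patterns that add back up to the fringeless total.

```python
import numpy as np

# Toy model of the eraser's sorting step (an illustrative sketch): the
# photon's two slit amplitudes are entangled with one detector qubit,
# |psi> = (psi_L(x)|0> + psi_R(x)|1>) / sqrt(2).

x = np.linspace(-10, 10, 2001)        # position across the screen
dx = x[1] - x[0]
envelope = np.exp(-x**2 / 25)         # common single-slit envelope
phase = 3.0 * x                       # relative phase between the two paths

psi_L = envelope * np.exp(+1j * phase)   # "left slit" amplitude
psi_R = envelope * np.exp(-1j * phase)   # "right slit" amplitude
norm = np.sqrt(np.sum(np.abs(psi_L)**2 + np.abs(psi_R)**2) * dx)
psi_L, psi_R = psi_L / norm, psi_R / norm

# Which-path readout (detector qubit measured in the 0/1 basis): the
# screen shows the two single-slit patterns added together, no fringes.
p_total = np.abs(psi_L)**2 + np.abs(psi_R)**2

# Eraser: measure the detector qubit in the complementary |+>/|-> basis
# and sort the screen hits by the outcome. Each sorted subset has fringes.
p_plus = 0.5 * np.abs(psi_L + psi_R)**2    # fringes
p_minus = 0.5 * np.abs(psi_L - psi_R)**2   # anti-fringes

def visibility(p):
    """Fringe contrast (max - min) / (max + min)."""
    return (p.max() - p.min()) / (p.max() + p.min())

# The sorted patterns add back up to the fringeless which-path pattern:
# nothing on the screen itself changes, only how the hits are grouped.
assert np.allclose(p_plus + p_minus, p_total)
```

Note that `p_plus` has a bright fringe at the centre of the screen exactly where `p_minus` has a dark one; only by sorting with the detector outcomes, the "3D glasses", do the fringes become visible.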

Rewriting the past – or not?

If erasing the information at the detector lets us extract wave-like patterns, it may seem like we’ve restored wave-like behaviour to an already particle-like photon. That seems truly head-scratching. However, Jonte Hance, a quantum physicist at Newcastle University in the UK, highlights a different conclusion, focused on how the individual interference patterns add up to show the usual decohered pattern. “They all feel like they shouldn’t be able to fit together,” Hance explains. “It’s really showing that the correlations you get through entanglement have to be able to fit every possible way you could measure a system.” The results therefore reveal an intriguing aspect of quantum theory – the rich, counterintuitive structure of quantum correlations from entanglement – rather than past influences.

Even Wheeler himself did not believe the thought experiment implies backward-in-time influence, as explained by Lorenzo Catani, a researcher at the International Iberian Nanotechnology Laboratory (INL) in Portugal. Commenting on the history of the thought experiment, Catani notes that “Wheeler concluded that one must abandon a certain type of realism – namely, the idea that the past exists independently of its recording in the present. As far as I know, only a minority of researchers have interpreted the experiment as evidence for retrocausality.”

Eraser vs Bell: a battle of the bizarre

One physicist who is attempting to unpack this problem is Johannes Fankhauser at the University of Innsbruck, Austria. “I’d heard about the quantum eraser, and it had puzzled me a lot because of all these bizarre claims of backwards-in-time influence”, he explains. “I see something that sounds counterintuitive and puzzling and bizarre and then I want to understand it, and by understanding it, it gets a bit demystified.”

Fankhauser realized that the quantum eraser set-up can be translated into a very standard Bell experiment. These experiments are based on entangling a pair of qubits, the idea being to rule out local “hidden-variable” models of quantum theory. This led him to see that there is no need to explain the eraser using backwards-in-time influence, since the related Bell experiments can be understood without it, as explained in his 2017 paper (Quanta 8 44). Fankhauser then further analysed the thought experiment using the de Broglie–Bohm interpretation of quantum theory, which gives a physical model for the quantum wavefunction (as particles are guided by a “pilot” wave). Using this, he showed explicitly that the outcomes of the eraser experiment can be fully explained without requiring backwards-in-time influences.

So does that mean that the eraser doesn’t tell us anything else beyond what Bell experiments already tell us? Not quite. “It turns different knobs than the Bell experiment,” explains Fankhauser. “I would say it asks the question ‘what do measurements signify?’, and ‘when can I talk about the system having a property?’. That’s an interesting question and I would say we don’t have a full answer to this.”

In particular, the eraser demonstrates the importance that the very act of observation has on outcomes, with the detector playing the role of an observer. “You measure some of its properties, you change another property,” says Fankhauser. “So the next time you measure it, the new property was created through the observation. And I’m trying to formalize this now more concretely. I’m trying to come up with a new approach and framework to study these questions.”

Meanwhile, Catani’s research has found an intriguing contrast between Bell experiments and the eraser. “The implications of Bell’s theorem are far more profound,” says Catani. In a 2023 paper (Quantum 7 1119) he co-authored, Catani considers a model of classical physics with one extra condition: a restriction on what can be known about the underlying physical states. Applying this model to the quantum eraser, he finds that its results can be reproduced by such a classical theory. By contrast, the classical model cannot reproduce the statistical violations of a Bell experiment. This shows that incomplete knowledge of the physical state is not, by itself, enough to explain the strange results of the Bell experiment, which therefore demonstrates a more powerful deviation from classical physics than the eraser does. Catani also contrasts the mathematical rigour of the two cases: while Bell experiments are based on explicitly formulated assumptions, claims about backwards-in-time influence in the quantum eraser rely on a particular narrative, one that gives rise to the apparent paradox.

The eraser as a brainteaser

Physicists therefore broadly agree that the mathematics of the quantum eraser thought experiment fits well within standard quantum theory. Even so, Hance argues that formal results alone are not the entire story: “This is something we need to pick apart, not just in terms of mathematical assumptions, but also in terms of building intuitions for us to be able to actually play around with what quantumness is.” Hance has been analysing the physical implications of different assumptions in the thought experiment, with some options discussed in his 2021 preprint (arXiv:2111.09347) with collaborators on the quantum eraser paradox.

The eraser therefore provides a tool for understanding how quantum correlations match up in a way that is not described by classical physics. “It’s a great thinking aid – partly brainteaser, partly demonstration of the nature of this weirdness,” says Hance.

Information, observers and quantum computers

Every quantum physicist takes something different from the quantum eraser, whether it is a spotlight on the open problems surrounding the properties of measured systems; a lesson from history in mathematical rigour; or a counterintuitive puzzle to make sense of. For a minority that deviate from standard approaches to quantum theory, it may even be some form of backwards-in-time influence.

For myself, as explained in my video on YouTube and my 2023 paper (IEEE International Conference on Quantum Computing and Engineering 10.1109/QCE57702.2023.20325) on quantum thought experiments, the most dramatic implication of the quantum eraser is explaining the role of observers in the double-slit experiment. The quantum eraser emphasizes that even a single entanglement between qubits will cause decoherence, whether or not it is measured afterwards – meaning that no mysterious macroscopic observer is required. This also explains why building a quantum computer is so challenging, as unwanted entanglement with even one particle can cause the whole computation to collapse into a random state.
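That point, that entanglement with even a single qubit destroys interference whether or not the ancilla is ever read out, can be seen in a two-qubit toy model (my own construction, not from the cited paper). A qubit standing in for the photon goes through a Hadamard "double slit"; a CNOT copies its path onto an ancilla, and the interference vanishes from the system qubit's reduced state even though the ancilla is never measured.

```python
import numpy as np

# Two-qubit sketch: a Hadamard "double slit" shows interference, which a
# single entangled ancilla destroys even if the ancilla is never measured.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                  # control = qubit 0,
                 [0, 1, 0, 0],                  # target  = qubit 1
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

ket0 = np.array([1.0, 0.0])
plus = H @ ket0                                 # equal "two-path" superposition

# No detector: recombining the two paths gives full interference
p_no_detector = np.abs(H @ plus)**2             # all probability on one outcome

# Detector: a CNOT copies the path onto an ancilla prepared in |0>
state = CNOT @ np.kron(plus, ket0)              # (|00> + |11>) / sqrt(2)
state = np.kron(H, I2) @ state                  # recombine the system qubit only

# Reduced state of the system qubit, tracing out the *unmeasured* ancilla
rho = np.outer(state, state.conj())
rho_sys = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)
p_with_detector = np.real(np.diag(rho_sys))     # 50/50: interference is gone
```

The reduced density matrix `rho_sys` is diagonal: one entangling gate has fully decohered the system qubit, which is exactly why stray entanglement with a single environmental particle is so damaging to a quantum computation.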


Where does this leave the futuristic readers of our 200-year double-quantum special issue of Physics World? Simply erasing their memories is not enough to restore the quantum behaviour of the article. It is too late to change which article was selected. Though, following an eraser-type protocol, our futurists can do one better than those sneaky magazine writers: they can use the outcomes of complementary measurements on their memory, to sort the article into two individual smaller articles, each displaying their own quantum entanglement structure that was otherwise hidden. So even if you can’t use the quantum eraser to rewrite the past, perhaps it can rewrite what you read in the future.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the next 12 months for more coverage of the IYQ.

Find out more on our quantum channel.

The post The quantum eraser doesn’t rewrite the past – it rewrites observers appeared first on Physics World.

Has bismuth been masquerading as a topological material?

27 May 2025, 14:00

Bismuth has puzzled scientists for nearly 20 years. Notably, the question of whether it is topological – that is, whether electrons behave differently on its surface than they do inside it – gets different answers depending on whether you ask a theorist or an experimentalist. Researchers in Japan now say they have found a way to resolve this conflict. A mechanism called surface relaxation, they report, may have masked or “blocked” bismuth’s true topological nature.

The classic way of describing topology is to compare objects that have a hole, such as a doughnut or a coffee mug, with objects that don’t, such as a muffin. Although we usually think of doughnuts as having more in common with muffins than with mugs – you can’t eat a mug – the fact that they have the same number of holes means the mug and doughnut share topological features that the muffin does not.

While no-one has ever wondered whether they can eat an electron, scientists have long been curious about whether materials conduct electricity. As it turns out, topology is one way of answering that question.

“Previously, people classified materials as metallic or insulating,” says Yuki Fuseya, a quantum solid state physicist at Kobe University. Beginning in the 2000s, however, Fuseya says scientists started focusing more on the topology of the electrons’ complex wavefunctions. This enriched our understanding of how materials behave, because wavefunctions with apparently different shapes can share important topological features.

For example, if the topology of certain wavefunctions on a material’s surface corresponds to that of apparently different wavefunctions within its bulk, the material may be insulating in its bulk, yet still able to conduct electricity on its surface. Materials with this property are known as topological insulators, and they have garnered a huge amount of interest due to the possibility of exploiting them in quantum computing, spintronics and magnetic devices.

Topological or not topological

While it’s not possible to measure the topology of wavefunctions directly, it is generally possible to detect whether a material supports certain surface states. This information can then be used to infer something about its bulk using the so-called bulk-edge state correspondence.

In bismuth, theory predicts that the bulk material is topologically trivial, so surface states characteristic of a non-trivial bulk should be absent. However, experiments have delivered conflicting information.

Fuseya was intrigued. “If you look at the history of solid-state physics, many physical phenomena were found firstly in bismuth,” he tells Physics World. Examples include diamagnetism, the Seebeck effect and the Shubnikov-de Haas effect, as well as phenomena related to the giant spin Hall effect and the potential for Turing patterns that Fuseya discovered himself. “That’s one of the reasons why I am so interested in bismuth,” he says.

Fuseya’s interest attracted colleagues with different specialisms. Using density functional theory, Rikako Yaguchi of the University of Electro-Communications in Tokyo calculated that layers of bismuth’s crystal lattice expand, or relax, by 3-6% towards the surface. According to Fuseya, this might not have seemed noteworthy. However, since the team was already looking at bismuth’s topological properties, another colleague, Kazuki Koie, went ahead and calculated how this lattice expansion changed the material’s surface wavefunction.

These calculations showed that the expansion is, in fact, significant. This is because bismuth is close to the topological transition point, where a change in parameters can flip the shape of the wavefunction and give topological properties to a material that was once topologically trivial. The relaxed surface layers cross this transition: although bulk bismuth is topologically trivial, the material is effectively different, and topologically non-trivial, at its surface, which is why experiments fail to observe the surface states expected of a trivial bulk.

Topological blocking

Although “very surprised” at first, Fuseya says that after examining the physics in more detail, they found the result “quite reasonable”. They are now looking for evidence of similar “topological blocking” in other materials near the transition point, such as lead telluride and tin telluride.

“It is remarkable that there are still big puzzles when trying to match data to the theoretical predictions,” says Titus Neupert, a theoretical physicist at the University of Zurich, Switzerland, who was not directly involved in the research. Since “so many compounds that made the headlines in topological physics” contain bismuth, Neupert says it will be interesting to re-evaluate existing experiments and conceive new ones. “In particular, the implication for higher-order topology could be tested,” he says.

Fuseya’s team is already studying how lattice relaxation might affect hinges where two surfaces come together. In doing so, they hope to understand why angle resolved photoemission spectroscopy (ARPES), which probes surfaces, yields results that contradict those from scanning tunnelling microscopy experiments, which probe hinges. “Maybe we can find a way to explain every experiment consistently,” Fuseya says. The insights they gain, he adds, might also be useful for topological engineering: by bending a material, scientists could alter its lattice constants, and thereby tailor its topological properties.

This aspect also interests Zeila Zanolli and Matthieu Verstraete of Utrecht University in the Netherlands. Though not involved in the current study, they had previously shown that free-standing two-dimensional bismuth (bismuthene) can take on several geometric structures in-plane – not all of which are topological – depending on the material’s strain, bonding coordination and directionality. The new work, they say, “opens the way to (computational) design of topological materials, playing with symmetries, strain and the substrate interface”.

The research is published in Physical Review B.

The post Has bismuth been masquerading as a topological material? appeared first on Physics World.

Proton arc therapy eliminates hard-to-treat cancer with minimal side effects

27 May 2025, 09:30

Head-and-neck cancers are difficult to treat with radiation therapy because they are often located close to organs that are vital for patients to maintain a high quality-of-life. Radiation therapy can also alter a person’s shape, through weight loss or swelling, making it essential to monitor such changes throughout the treatment to ensure effective tumour targeting.

Researchers from Corewell Health William Beaumont University Hospital have now used a new proton therapy technique called step-and-shoot proton arc therapy (a spot-scanning proton arc method) to treat head-and-neck cancer in a human patient – the first person in the US to receive this highly accurate treatment.

“We envisioned that this technology could significantly improve the quality of treatment plans for patients and the treatment efficiency compared with the current state-of-the-art technique of intensity-modulated proton therapy (IMPT),” states senior author Xuanfeng Ding.

Progression towards dynamic proton arc therapy

“The first paper on spot-scanning proton arc therapy was published in 2016 and the first prototype for it was built in 2018,” says Ding. However, step-and-shoot proton arc therapy is an interim solution on the way to a more advanced technique known as dynamic proton arc therapy, which delivered its first pre-clinical treatment in 2024. Dynamic proton arc therapy is still under development and awaiting regulatory clearance, so the researchers have chosen to use step-and-shoot proton arc therapy clinically in the meantime.

Other proton therapies are more manual in nature and require a lot of monitoring, but the step-and-shoot technology delivers radiation directly to a tumour in a more continuous and automated fashion, with less lag time between radiation dosages. “Step-and-shoot proton arc therapy uses more beam angles per plan compared to the current clinical practice using IMPT and optimizes the spot and energy layers sparsity level,” explains Ding.

The extra beam angles provide a greater degree of freedom to optimize the treatment plan, delivering better dose conformity, robustness and linear energy transfer (LET, the energy that ionizing radiation deposits per unit path length) through a more automated approach. During treatment delivery, the gantry rotates to each beam angle and stops to deliver the treatment irradiation.

In the dynamic proton arc technique that is also being developed, the gantry rotates continuously while irradiating the proton spot or switching energy layer. The step-and-shoot proton arc therapy therefore acts as an interim stage that is allowing more clinical data to be acquired to help dynamic proton arc therapy become clinically approved. The pinpointing ability of these proton therapies enables tumours to be targeted more precisely without damaging surrounding healthy tissue and organs.

The first clinical treatment

The team trialled the new technique on a patient with adenoid cystic carcinoma in her salivary gland – a rare and highly invasive cancer that’s difficult to treat as it targets the nerves in the body. This tendency to target nerves also means that fighting such tumours typically causes a lot of side effects. Using the new step-and-shoot proton arc therapy, however, the patient experienced minimal side effects and no radiation toxicity to other areas of her body (including the brain) after 33 treatments. Since finishing her treatment in August 2024, she continues to be cancer-free.

Tiffiney Beard and Rohan Deraniyagala
First US patient Tiffiney Beard, who underwent step-and-shoot proton arc therapy to treat her rare head-and-neck cancer, at a follow-up appointment with Rohan Deraniyagala. (Courtesy: Emily Rose Bennett, Corewell Health)

“Radiation to the head-and-neck typically results in dryness of the mouth, pain and difficulty swallowing, abnormal taste, fatigue and difficulty with concentration,” says Rohan Deraniyagala, a Corewell Health radiation oncologist involved with this research. “Our patient had minor skin irritation but did not have any issues with eating or performing at her job during treatment and for the last year since she was diagnosed.”

Describing the therapeutic process, Ding tells Physics World that “we developed an in-house planning optimization algorithm to select spot and energy per beam angle so the treatment irradiation time could be reduced to four minutes. However, because the gantry still needs to stop at each beam angle, the total treatment time is about 16 minutes per fraction.”

On monitoring the progression of the tumour over time and developing treatment plans, Ding confirms that the team “implemented a machine-learning-based synthetic CT platform which allows us to track the daily dosage of radiation using cone-beam computed tomography (CBCT) so that we can schedule an adaptive treatment plan for the patient.”

On the back of this research, Ding says that the next step is to help further develop the dynamic proton arc technique – known as DynamicARC – in collaboration with industry partner IBA.

The research was published in the International Journal of Particle Therapy.

The post Proton arc therapy eliminates hard-to-treat cancer with minimal side effects appeared first on Physics World.
