Heart failure is a serious condition that occurs when a damaged heart loses its ability to pump blood around the body. It affects as many as 100 million people worldwide, and it is a progressive disease: within five years of diagnosis, 50% of patients with heart failure will have died.
The UK-based company Ceryx Medical has created a new bioelectronic device called Cysoni, which is designed to adjust the pace of the heart as a patient breathes in and out. This mimics a normal physiological process called respiratory sinus arrhythmia, which can be absent in people with heart failure. The company has just begun the first trial of Cysoni on human subjects.
This podcast features the biomedical engineer Stuart Plant and the physicist Ashok Chauhan, who are Ceryx Medical’s CEO and senior scientist respectively. In a wide-ranging conversation with Physics World’s Margaret Harris, they talk about how bioelectronics could be used to treat heart failure and other diseases. Chauhan and Plant also chat about the challenges and rewards of developing medical technologies within a small company.
What exactly is ice cream? For most of us, it’s a tasty frozen dessert, but to food scientists like Douglas Goff, it’s also a marvel of physics and chemistry. Ice cream is a complex multiphase material, containing emulsion, foam, crystals, solutes and solvent. Whether made in a domestic kitchen or on a commercial scale, ice cream requires a finely tuned ratio of ingredients and precision control during mixing, churning and freezing.
Goff is a researcher in food science at the University of Guelph in Canada and an expert in the science of ice cream. In addition to his research studying, among other things, structure and ingredient functionality in ice cream, Goff is also the instructor on the University of Guelph’s annual ice-cream course, which, having been taught since 1914, is the longest-running course at the university.
In a conversation with Physics World’s Hamish Johnston, Goff explains the science of ice cream, why it’s so hard to make vegan ice cream and how his team performs electron microscopy experiments without their samples melting.
How would you describe the material properties of ice cream to a physicist?
Ice cream is an incredibly complex multi-phase system. It starts as an emulsion, where fat droplets are dispersed in a sugary water-based solution. Then we whip the emulsion to incorporate an air phase into it – this is called foaming (see “Phases in ice cream”). In a frozen tub of ice cream, about half of the volume is air. That air is present in the form of tiny bubbles that are distributed throughout the product.
Then we partially freeze the aqueous phase, turning at least half of the water into microscopically small ice crystals. The remaining unfrozen phase is what makes the ice cream soft, scoopable and chewable. It remains unfrozen because of all the sugar that’s dissolved in it, which depresses the freezing point.
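The freezing-point depression Goff describes can be estimated with Blagden’s law, ΔT = i·Kf·m, where Kf is the cryoscopic constant of water and m the molality of dissolved sugar. The sketch below is purely illustrative: the sucrose loading (150 g per kg of water) is an assumed example value, not a figure from the interview, and the dilute-solution law becomes approximate as freezing concentrates the unfrozen phase.

```python
# Estimate freezing-point depression of a sugary aqueous phase
# using Blagden's law (valid for dilute, ideal solutions).

KF_WATER = 1.86    # cryoscopic constant of water, K*kg/mol
M_SUCROSE = 342.3  # molar mass of sucrose, g/mol

def freezing_point_depression(grams_solute, kg_water, molar_mass, i=1.0):
    """Return the freezing-point depression in kelvin.

    i is the van't Hoff factor (1 for non-dissociating solutes
    such as sucrose).
    """
    molality = (grams_solute / molar_mass) / kg_water  # mol per kg water
    return i * KF_WATER * molality

# Assumed example: 150 g sucrose dissolved in 1 kg of water.
dT = freezing_point_depression(150, 1.0, M_SUCROSE)
print(f"Freezing point lowered by about {dT:.2f} K")
```

As water freezes out, the remaining solution grows more concentrated, the molality rises and the freezing point keeps dropping, which is why part of the phase stays liquid well below 0 °C.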
So you end up with fat droplets in the form of an emulsion, air bubbles in the form of a foam, a partially crystalline solvent in the form of ice crystals, and a concentrated sugar solution.
Phases in ice cream
Emulsion: Some liquids, such as oil and water, will not mix if a droplet of one is added to the other – they are said to be immiscible. If many droplets of one liquid can be stabilized in another without coalescing, the resulting mixture is called an emulsion (left image).
Foam: A foam, like an emulsion, consists of two phases where one is dispersed in the other. In the case of foam, many tiny gas bubbles are trapped in a liquid or solid (right image).
Glass: When a liquid is cooled below a certain temperature, it generally undergoes a first-order phase transition to a solid crystal. However, if a liquid can be cooled below its freezing point without crystallizing (supercooling) – for example, if it is cooled very quickly – it may form a glass: an amorphous solid with a disordered, liquid-like structure but solid-like mechanical properties. The temperature at which the glass forms, marked by a rapid increase in the material’s viscosity, is called the glass transition temperature.
What are the length scales of the different phases in the ice cream?
We’ve done a lot of electron microscopy research studying this in my lab. In fact, our research was some of the very first to use electron microscopy techniques to study the structure of ice cream. The fat droplets are about one micron in diameter and the air bubbles, depending on the equipment that’s used, would be about 20 to 30 microns in diameter. The ice crystals are in the 10 to 20 micron size range.
It really is a beautiful thing to look at under an electron microscope, depending on the technique that you use (see image).
What are the big differences between ice cream that’s made in a commercial setting versus a domestic kitchen?
The freezing and whipping happen at the same time whether it’s an ice cream maker in the kitchen or a commercial operation. The biggest difference between what you do in the kitchen and what they’re going to do in the factory is the structure of the ice cream. Homemade ice cream is fine for maybe a day or two, but it starts to get icy pretty quickly, whereas we want a shelf life of months to a year when ice cream is made commercially.
This is because of the way the ice phase evolves over time – a process called recrystallization. If ice cream warms up it starts to melt. When the temperature is lowered again, water is frozen back into the ice phase, but it doesn’t create new ice crystals, it just grows onto the existing ice crystals.
This means that if ice cream is subject to lots of temperature fluctuation during storage, it’s going to degrade and become icy much quicker than if it was stored at a constant temperature. The warmer the temperature, the faster the rate of recrystallization. Commercial freezing equipment will give you much smaller ice crystal size than homemade ice cream machines. Low and constant temperature storage is what everybody strives for, and so the lower the temperature and the more constant it is, and the smaller the ice crystals are to begin with, the longer your shelf life before changes start occurring.
There’s also another structural element that is important for the long-term storage of ice cream. When that unfrozen sugary solvent phase gets concentrated enough, it can undergo a glass transition (see “Phases in ice cream”). Glass is an amorphous solid, so if this happens, there will be no movement of water or solute within the system and it can remain unchanged for years. For ice cream, the glass transition temperature is around –28 to –32 °C, so if you want long-term storage, you have to get down below that glass transition temperature.
The third thing is the addition of stabilisers. Those are things like locust bean gum, guar gum or cellulose gum and there are some novel ones as well. What those do is increase the viscosity in the unfrozen phase. This slows down the rate of ice recrystallization because it slows down the diffusion of water and the growth of ice.
There are also some other novel agents that can prevent ice from recrystallizing into large crystals. One of these, propylene glycol monostearate, adsorbs onto the surface of an ice crystal and prevents it from growing as the temperature fluctuates. This is also something we see in nature. Some insect, fish and plant species that live in cold environments have proteins that control the growth of ice in their blood and tissues. A lot of fish, for example, swim around with minute ice crystals in their body, but the proteins prevent the crystals from getting big enough to cause harm.
How does adding flavourings to ice cream change the manufacturing process?
When you think about ice cream around the world, there are hundreds of different flavours. The important question is whether the flavouring will impact the solution or emulsion.
For example, a chocolate chip will be inert – it’s not going to interact at all with the rest of the matrix. Strawberries, on the other hand, really impact the system because of the high sugar content in the fruit preparation. We need to add sugar to the fruit to make sure it is softer than the ice cream itself – you don’t want to bite into ice cream and find a hard, frozen berry. The problem is that some of that sugar will diffuse into the unfrozen phase and lower its freezing point. This means that if you don’t do anything to the formulation, strawberry ice cream will be softer than something like vanilla because of the added sugar.
Another example would be alcohol-based flavours, anything from rum to Baileys Irish Cream or Frangelico, or even wine and beer. They’re very popular but the alcohol depresses the freezing point, so if you add enough to give you the flavour intensity that you want, your product won’t freeze. In that case, you might need to add less of the alcohol and a little bit more of a de-alcoholized flavouring.
You can try to make ice cream with just about any flavour, but you certainly have to look at what that flavouring is going to do to the structure and things like shelf life and so on.
Nowadays one can also buy vegan ice creams. How do the preparation and ingredients differ compared to dairy products?
A lot of it will be similar. We’re going to have an emulsified fat source, typically something like coconut oil or palm kernel oil, and then there’s the sugar, stabilisers and so on that you would have in a dairy ice cream.
The difference is the protein. Milk protein is both a very good foaming agent and a very good emulsifying agent. [Emulsifying and foaming agents are molecules that stabilize foams and emulsions. The molecules attach to the surface of the liquid droplets or air bubbles and stop them from coalescing with each other.] Plant proteins aren’t very good at either. If you look at cashew, almond or soy-based products, you’ll find additional ingredients to deliver the functionality that we would otherwise get from the milk protein.
What techniques do you use to study ice cream? And how do you stop the ice cream from melting during an experiment?
The workhorses of instrumentation for research are particle size analysis, electron microscopy and rheology (see “Experimental techniques”).
So first there’s laser light scattering which tells us everything we need to know about the fat globules and fat structure (see “Experimental techniques”). Then we use a lot of optical microscopy. You either need to put the microscope in a freezer or cold box or have a cold stage where you have the ice cream on a slide inside a chamber that’s cooled with liquid nitrogen. On the electron microscopy side (see “Experimental techniques”), we’ve done a lot of cryo-scanning electron microscopy (SEM), with a low-temperature unit.
We’ve also done a lot of transmission electron microscopy (TEM), which generally uses a different approach. Instead of performing the experiment in cold conditions, we use a chemical that “fixes” the structure in place and then we dry it, typically using a technique called “critical point drying” (see “Experimental techniques”). It’s then sliced into thin samples and studied with the TEM.
Experimental techniques
Rheology: Rheology is the study of the flow and deformation of materials. A rheometer is an apparatus used to measure the response of different materials to applied forces.
Dynamic light scattering (DLS): A laser-based technique used to measure the size distribution of dispersed particles. Dispersed particles such as fat globules in ice cream exhibit Brownian motion, with small particles moving faster than larger particles. The interference of laser light scattered from the particles is used to calculate the characteristic timescale of the Brownian motion and the particle size distribution.
Electron microscopy: Imaging techniques that use a beam of electrons, rather than photons, to image a sample. Scanning electron microscopy (SEM) and transmission electron microscopy (TEM) are two common examples. SEM uses reflected electrons to study the sample surface, whereas TEM uses electrons travelling through a sample to understand its internal structure.
Critical point drying: When a sample is dried in preparation for microscopy experiments, the effects of surface tension between the water in the sample and the surrounding air can cause damage. At the critical point, the liquid and gas phases are indistinguishable. If the water in the sample is at its critical point during dehydration, there is no boundary between the water and vapour, and this protects the structure of the sample.
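The particle-sizing step in DLS comes down to the Stokes–Einstein relation, d = kBT/(3πηD), which converts a measured diffusion coefficient D into a hydrodynamic diameter. A minimal sketch, using an assumed illustrative diffusion coefficient and the viscosity of water at around 20 °C (not values from Goff’s lab):

```python
# Stokes-Einstein sizing as used in dynamic light scattering:
# faster Brownian motion (larger D) means smaller particles.
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def hydrodynamic_diameter(D, temperature_K, viscosity_Pa_s):
    """Return the hydrodynamic diameter d = kB*T / (3*pi*eta*D), in metres."""
    return K_B * temperature_K / (3 * math.pi * viscosity_Pa_s * D)

# Assumed example: D = 4.3e-13 m^2/s in a water-like medium at 20 degC
# (viscosity ~1.0e-3 Pa*s) corresponds to a roughly one-micron droplet,
# comparable to the fat-globule sizes mentioned in the interview.
d = hydrodynamic_diameter(4.3e-13, 293.15, 1.0e-3)
print(f"hydrodynamic diameter ~ {d * 1e6:.2f} microns")
```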
After decades of studying ice cream, do you still get excited about it?
Oh, absolutely. I’ve been fortunate enough to have travelled to many, many interesting countries and I always see what the ice cream market looks like when I’m there. It’s not just a professional thing. I also like to know what’s going on around the world so I can share that with people. But of course, how can you go wrong with ice cream? It’s such a fun product to be associated with.
Listen to the full interview with Douglas Goff on the Physics World Weekly podcast.
This episode of the Physics World Weekly podcast explores how the concept of humanitarian engineering can be used to provide high-quality cancer care to people in low- and middle-income countries (LMICs). This is an important challenge because today only 5% of global radiotherapy resources are located in LMICs, which are home to the majority of the world’s population.
Our guest in this episode of the Physics World Weekly podcast is the Turkish quantum physicist Mete Atatüre, who heads up the Cavendish Laboratory at the UK’s University of Cambridge.
In a conversation with Physics World’s Katherine Skipper, Atatüre talks about hosting Quantour, the quantum light source that is IYQ’s version of the Olympic torch. He also talks about his group’s research on quantum sensors and quantum networks.
This year marked the 70th anniversary of the European Council for Nuclear Research, which is known universally as CERN. To celebrate, we have published a bumper crop of articles on particle and nuclear physics in 2024. Many focus on people and my favourite articles have definitely skewed in that direction. So let’s start with the remarkable life of accelerator pioneer Bruno Touschek.
Born in Vienna in 1921 to a Jewish mother, Bruno Touschek’s life changed when Nazi Germany annexed Austria in 1938. After suffering antisemitism in his hometown and then in Rome, he inexplicably turned down an offer to study in the UK and settled in Germany. There he worked on a “death ray” for the military but was eventually imprisoned by the German secret police. He was then left for dead during a forced march to a concentration camp in 1945. When the war ended a few weeks later, Touschek’s expertise came to the attention of the British, who occupied north-western Germany. He went on to become a leading accelerator physicist and you can read much more about the extraordinary life of Touschek in this article by the physicist and biographer Giulia Pancheri.
Today, the best atomic clocks would only be off by about 10 ms after running for the current age of the universe. But, could these timekeepers soon be upstaged by clocks that use a nuclear, rather than an atomic transition? Such nuclear clocks could rival their atomic cousins when it comes to precision and accuracy. They also promise to be fully solid-state, which means that they could be used in a wide range of commercial applications. This year saw physicists make new measurements and develop new technologies that could soon make nuclear clocks a reality. Click on the headline above to discover how physicists in the US have fabricated all of the components needed to create a nuclear clock made from thorium-229. Also, earlier this year physicists in Germany and Austria showed that they can put nuclei of the isotope into a low-lying metastable state that could be used in a nuclear clock. You can find out more here: “Excitation of thorium-229 brings a working nuclear clock closer”.
In 2024 we launched our Physics World Live series of panel discussions. In September, we explored the future of particle physics with Tara Shears of the UK’s University of Liverpool, Phil Burrows at the University of Oxford in the UK and Tulika Bose at the University of Wisconsin–Madison in the US. Moderated by Physics World’s Michael Banks, the discussion focussed on next-generation particle colliders and how they could unravel the mysteries of the Higgs boson and probe beyond the Standard Model of particle physics. You can watch a video of the event by clicking on the above headline (free registration) or read an article based on the discussion here: “How a next-generation particle collider could unravel the mysteries of the Higgs boson”.
Neutrinos do not fit in nicely with the Standard Model of particle physics because of their non-zero masses. As a result some physicists believe that they offer a unique opportunity to do experiments that could reveal new physics. In a wide-ranging interview, the particle physicist Juan Pedro Ochoa-Ricoux explains why he has devoted much of his career to the study of these elusive subatomic particles. He also looks forward to two big future experiments – JUNO and DUNE – which could change our understanding of the universe.
“Children decide quite early in their life, as early as primary school, if science is for them or not,” explains Çiğdem İşsever, who leads the particle physics group at DESY in Hamburg and the experimental high-energy physics group at the Humboldt University of Berlin. İşsever has joined forces with physicists Steven Worm and Becky Parker to create ATLAScraft, which recreates CERN’s ATLAS detector in the hugely popular computer game Minecraft. In this profile, the science writer Rob Lea talks to İşsever about her passion for outreach and how she dispels gender stereotypes in science by talking to school children as young as five about her career in physics. İşsever also looks forward to the future of particle physics and what could eventually replace the Large Hadron Collider as the world’s premier particle-physics experiment.
This year marked the 70th anniversary of the world’s most famous physics laboratory, so the last two items in my list celebrate that iconic facility nestled between the Alps and the Jura mountains. Formed in the aftermath of the Second World War, which devastated much of Europe, CERN came into being on 29 September 1954. That year also saw the start of construction of the Geneva-based lab’s proton synchrotron, which fired up in 1959 with an energy of 24 GeV, becoming the world’s highest-energy particle accelerator. The original CERN had 12 member states and that has since doubled to 24, with an additional 10 associate members. The lab has been associated with a number of Nobel laureates and is a shining example of how science can bring nations together after the trauma of war. Read more about the anniversary here.
When former physicist James Gillies sat down for dinner in 2009 with actors Tom Hanks and Ayelet Zurer, joined by legendary director Ron Howard, he could scarcely believe the turn of events. Gillies was the head of communications at CERN, and the Hollywood trio were in town for the launch of Angels & Demons. The blockbuster film is partly set at CERN with antimatter central to its plot, and is based on the Dan Brown novel. In this Physics World Stories podcast, Gillies looks back on those heady days. Gillies has also written a feature article for us about his Hollywood experience: “Angels & Demons, Tom Hanks and Peter Higgs: how CERN sold its story to the world”.
December might be dark and chilly here in the northern hemisphere, but it’s summer south of the equator – and for many people that means eating ice cream.
It turns out that the physics of ice cream is rather remarkable – as I discovered when I travelled to Canada’s University of Guelph to interview the food scientist Douglas Goff. He is a leading expert on the science of frozen desserts and in this podcast he talks about the unique material properties of ice cream, the analytical tools he uses to study it, and why ice cream goes off when it is left in the freezer for too long.
Errors caused by interactions with the environment – noise – are the Achilles heel of every quantum computer, and correcting them has been called a “defining challenge” for the technology. These two teams, working with very different quantum systems, took significant steps towards overcoming this challenge. In doing so, they made it far more likely that quantum computers will become practical problem-solving machines, not just noisy, intermediate-scale tools for scientific research.
Quantum error correction works by distributing one quantum bit of information – called a logical qubit – across several different physical qubits such as superconducting circuits or trapped atoms. While each physical qubit is noisy, they work together to preserve the quantum state of the logical qubit – at least for long enough to do a computation.
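The redundancy idea behind this can be illustrated with a classical repetition code. This is only a toy stand-in – the codes used by the winning teams are genuinely quantum, and qubits cannot simply be copied – but the principle that many noisy physical bits can protect one logical bit is the same:

```python
# Toy illustration of error correction: encode one logical bit in
# several noisy physical bits and decode by majority vote. More
# physical bits per logical bit means a lower logical error rate
# (when the physical error rate is small enough).
import random

def noisy_copy(bit, p_flip):
    """A physical bit that flips with probability p_flip."""
    return bit ^ (random.random() < p_flip)

def logical_readout(bit, n_physical, p_flip):
    """Encode one logical bit in n_physical bits, decode by majority vote."""
    physical = [noisy_copy(bit, p_flip) for _ in range(n_physical)]
    return int(sum(physical) > n_physical / 2)

random.seed(1)
trials = 10_000
p = 0.1  # each physical bit is wrong 10% of the time
for n in (1, 3, 7):
    errors = sum(logical_readout(0, n, p) for _ in range(trials))
    print(f"{n} physical bits -> logical error rate ~ {errors / trials:.3f}")
```

With three bits the logical error rate drops from about 10% to about 3%, and with seven bits to well under 1% – redundancy suppresses errors, provided each physical bit is reliable enough.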
Formidable task
Error correction should become more effective as the number of physical qubits in a logical qubit increases. However, integrating large numbers of physical qubits to create a processor with multiple logical qubits is a formidable task. Furthermore, adding more physical qubits to a logical qubit also adds more noise – and it is not clear whether making logical qubits bigger would make them significantly better. This year’s winners of our Breakthrough of the Year have made significant progress in addressing these issues.
The team led by Lukin and Bluvstein created a quantum processor with 48 logical qubits that can execute algorithms while correcting errors in real time. At the heart of their processor are arrays of neutral atoms. These are grids of ultracold rubidium atoms trapped by optical tweezers. These atoms can be put into highly excited Rydberg states, which enables the atoms to act as physical qubits that can exchange quantum information.
What is more, the atoms can be moved about within an array to entangle them with other atoms. According to Bluvstein, moving groups of atoms around the processor was critical for their success at addressing a major challenge in using logical qubits: how to get logical qubits to interact with each other to perform quantum operations. He describes the system as a “living organism that changes during a computation”.
Their processor used about 300 physical qubits to create up to 48 logical qubits, which were used to perform logical operations. In contrast, similar attempts using superconducting or trapped-ion qubits have only managed to perform logical operations using 1–3 logical qubits.
Willow quantum processor
Meanwhile, the team led by Hartmut Neven made a significant advance in how physical qubits can be combined to create a logical qubit. Using Google’s new Willow quantum processor – which offers up to 105 superconducting physical qubits – they showed that the noise in their logical qubit remained below a maximum threshold as they increased the number of qubits. This means that the logical error rate is suppressed exponentially as the number of physical qubits per logical qubit is increased.
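In surface-code theory this below-threshold behaviour is often summarized as a logical error rate that falls roughly as (p/p_th)^((d+1)/2), where p is the physical error rate, p_th the threshold and d the code distance (which grows with the number of physical qubits). The sketch below uses assumed illustrative numbers, not Google’s measured figures:

```python
# Below-threshold scaling sketch: each increase of the code distance
# by 2 multiplies the logical error rate by roughly (p / p_th),
# i.e. exponential suppression when p < p_th.

def logical_error_rate(p, p_th, distance, A=0.1):
    """Rough surface-code scaling law (A is an assumed prefactor)."""
    return A * (p / p_th) ** ((distance + 1) / 2)

p, p_th = 0.003, 0.01  # assumed physical rate, safely below threshold
for d in (3, 5, 7):
    print(f"distance {d}: logical error rate ~ {logical_error_rate(p, p_th, d):.1e}")
```

With these numbers the logical error rate shrinks by a factor of about three each time the distance is increased by two; above threshold (p > p_th) the same formula shows the error rate growing instead, which is why crossing the threshold matters.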
Neven told Physics World that the Google system is “the most convincing prototype of a logical qubit built today”. He said that Google is on track to develop a quantum processor with 100 or even 1000 logical qubits by 2030. He says that a 1000 logical qubit device could do useful calculations for the development of new drugs or new materials for batteries.
Bluvstein, Lukin and colleagues are already exploring how their processor could be used to study an effect called quantum scrambling. This could shed light on properties of black holes and even provide important clues about the nature of quantum gravity.
You can listen to Neven talk about his team’s research in this podcast. Bluvstein and Lukin talk about their group’s work in this podcast.
The Breakthrough of the Year was chosen by the Physics World editorial team. We looked back at all the scientific discoveries we have reported on since 1 January and picked the most important. In addition to being reported in Physics World in 2024, the breakthrough must meet the following criteria:
Significant advance in knowledge or understanding
Importance of work for scientific progress and/or development of real-world applications
Of general interest to Physics World readers
Before we picked our winners, we released the Physics World Top 10 Breakthroughs for 2024, which served as our shortlist. The other nine breakthroughs are listed below in no particular order.
To a team of researchers at Stanford University in the US for developing a method to make the skin of live mice temporarily transparent. One of the challenges of imaging biological tissue using optical techniques is that tissue scatters light, which makes it opaque. The team, led by Zihao Ou (now at The University of Texas at Dallas), Mark Brongersma and Guosong Hong, found that the common yellow food dye tartrazine strongly absorbs near-ultraviolet and blue light and can help make biological tissue transparent. Applying the dye onto the abdomen, scalp and hindlimbs of live mice enabled the researchers to see internal organs, such as the liver, small intestine and bladder, through the skin without requiring any surgery. They could also visualize blood flow in the rodents’ brains and the fine structure of muscle sarcomere fibres in their hind limbs. The effect can be reversed by simply rinsing off the dye. This “optical clearing” technique has so far only been conducted on animals. But if extended to humans, it could help make some types of invasive biopsies a thing of the past.
To the AEgIS collaboration at CERN, and Kosuke Yoshioka and colleagues at the University of Tokyo, for independently demonstrating laser cooling of positronium. Positronium, an atom-like bound state of an electron and a positron, is created in the lab to allow physicists to study antimatter. Currently, it is created in “warm” clouds in which the atoms have a large distribution of velocities, making precision spectroscopy difficult. Cooling positronium to low temperatures could open up novel ways to study the properties of antimatter. It also enables researchers to produce one to two orders of magnitude more antihydrogen – an antiatom comprising a positron and an antiproton that’s of great interest to physicists. The research also paves the way to use positronium to test current aspects of the Standard Model of particle physics, such as quantum electrodynamics, which predicts specific spectral lines, and to probe the effects of gravity on antimatter.
To Roman Bauer at the University of Surrey, UK, Marco Durante from the GSI Helmholtz Centre for Heavy Ion Research, Germany, and Nicolò Cogno from GSI and Massachusetts General Hospital/Harvard Medical School, US, for creating a computational model that could improve radiotherapy outcomes for patients with lung cancer. Radiotherapy is an effective treatment for lung cancer but can harm healthy tissue. To minimize radiation damage and help personalize treatment, the team combined a model of lung tissue with a Monte Carlo simulator to simulate irradiation of alveoli (the tiny air sacs within the lungs) at microscopic and nanoscopic scales. Based on the radiation dose delivered to each cell and its distribution, the model predicts whether each cell will live or die, and determines the severity of radiation damage hours, days, months or even years after treatment. Importantly, the researchers found that their model delivered results that matched experimental observations from various labs and hospitals, suggesting that it could, in principle, be used within a clinical setting.
To Walter de Heer, Lei Ma and colleagues at Tianjin University and the Georgia Institute of Technology, and independently to Marcelo Lozada-Hidalgo of the University of Manchester and a multinational team of colleagues, for creating a functional semiconductor made from graphene, and for using graphene to make a switch that supports both memory and logic functions, respectively. The Manchester-led team’s achievement was to harness graphene’s ability to conduct both protons and electrons in a device that performs logic operations with a proton current while simultaneously encoding a bit of memory with an electron current. These functions are normally performed by separate circuit elements, which increases data transfer times and power consumption. Conversely, de Heer, Ma and colleagues engineered a form of graphene that does not conduct as easily. Their new “epigraphene” has a bandgap that, like silicon, could allow it to be made into a transistor, but with favourable properties that silicon lacks, such as high thermal conductivity.
To David Moore, Jiaxiang Wang and colleagues at Yale University, US, for detecting the nuclear decay of individual helium nuclei by embedding radioactive lead-212 atoms in a micron-sized silica sphere and measuring the sphere’s recoil as nuclei escape from it. Their technique relies on the conservation of momentum, and it can gauge forces as small as 10⁻²⁰ N and accelerations as tiny as 10⁻⁷ g, where g is the local acceleration due to the Earth’s gravitational pull. The researchers hope that a similar technique may one day be used to detect neutrinos, which are much less massive than helium nuclei but are likewise emitted as decay products in certain nuclear reactions.
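The momentum-conservation principle behind this measurement lends itself to a quick order-of-magnitude estimate. All inputs below (alpha-particle energy, sphere size and density) are assumed round numbers for illustration, not values from the Yale experiment:

```python
# Order-of-magnitude estimate of the recoil of a micron-sized silica
# sphere when a helium nucleus (alpha particle) escapes from it.
import math

M_ALPHA = 6.64e-27           # mass of a helium nucleus, kg
E_ALPHA = 6e6 * 1.602e-19    # assumed ~6 MeV decay energy, in joules

# Assumed micron-scale silica sphere: radius 0.5 um, density ~2200 kg/m^3
radius = 0.5e-6
M_SPHERE = 2200 * (4 / 3) * math.pi * radius**3

p_alpha = math.sqrt(2 * M_ALPHA * E_ALPHA)  # non-relativistic momentum
v_recoil = p_alpha / M_SPHERE               # momentum conservation

print(f"sphere recoil speed ~ {v_recoil * 1e3:.2f} mm/s")
```

The estimate gives a recoil of roughly a tenth of a millimetre per second – tiny, but within reach of the optical techniques used to monitor levitated spheres.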
To Andrew Denniston at the Massachusetts Institute of Technology in the US, Tomáš Ježo at Germany’s University of Münster and an international team for being the first to unify two distinct descriptions of atomic nuclei. They have combined the particle physics perspective – where nuclei comprise quarks and gluons – with the traditional nuclear physics view that treats nuclei as collections of interacting nucleons (protons and neutrons). The team has provided fresh insights into short-range correlated nucleon pairs – which are fleeting interactions where two nucleons come exceptionally close and engage in strong interactions for mere femtoseconds. The model was tested and refined using experimental data from scattering experiments involving 19 different nuclei with very different masses (from helium-3 to lead-208). The work represents a major step forward in our understanding of nuclear structure and strong interactions.
To Jelena Vučković, Joshua Yang, Kasper Van Gasse, Daniil Lukin, and colleagues at Stanford University in the US for developing a compact, integrated titanium:sapphire laser that needs only a simple green LED as a pump source. They have reduced the cost and footprint of a titanium:sapphire laser by three orders of magnitude and the power consumption by two. Traditional titanium:sapphire lasers have to be pumped with high-powered lasers – and therefore cost in excess of $100,000. In contrast, the team was able to pump its device using a $37 green laser diode. The researchers also achieved two things that had not been possible before with a titanium:sapphire laser. They were able to adjust the wavelength of the laser light and they were able to create a titanium:sapphire laser amplifier. Their device represents a key step towards the democratization of a laser type that plays important roles in scientific research and industry.
To two related teams for their clever use of entangled photons in imaging. Both groups include Chloé Vernière and Hugo Defienne of Sorbonne University in France, who as a duo used quantum entanglement to encode an image into a beam of light. The impressive thing is that the image is only visible to an observer using a single-photon sensitive camera – otherwise the image is hidden from view. The technique could be used to create optical systems with reduced sensitivity to scattering. This could be useful for imaging biological tissues and long-range optical communications. In separate work, Vernière and Defienne teamed up with Patrick Cameron at the UK’s University of Glasgow and others to use entangled photons to enhance adaptive optical imaging. The team showed that the technique can be used to produce higher-resolution images than conventional bright-field microscopy. Looking to the future, this adaptive optics technique could play a major role in the development of quantum microscopes.
To the China National Space Administration for the first-ever retrieval of material from the Moon’s far side, confirming China as one of the world’s leading space nations. Landing on the lunar far side – which always faces away from Earth – is difficult due to its distance and terrain of giant craters with few flat surfaces. At the same time, scientists are interested in the unexplored far side and why it looks so different from the near side. The Chang’e-6 mission was launched on 3 May consisting of four parts: an ascender, lander, returner and orbiter. The ascender and lander successfully touched down on 1 June in the Apollo basin, which lies in the north-eastern side of the South Pole-Aitken Basin. The lander used its robotic scoop and drill to obtain about 1.9 kg of materials within 48 h. The ascender then lifted off from the top of the lander and docked with the returner-orbiter before the returner headed back to Earth, landing in Inner Mongolia on 25 June. In November, scientists released the first results from the mission finding that fragments of basalt – a type of volcanic rock – date back to 2.8 billion years ago, indicating that the lunar far side was volcanically active at that time. Further scientific discoveries can be expected in the coming months and years ahead as scientists analyze more fragments.
Physics World‘s coverage of the Breakthrough of the Year is supported by Reports on Progress in Physics, which offers unparalleled visibility for your ground-breaking research.
In this episode of the Physics World Weekly podcast, Bluvstein and Lukin explain the crucial role that error correction is playing in the development of practical quantum computers. They also describe how atoms are moved around their quantum processor and why this coordinated motion allowed them to create logical qubits and use those qubits to perform quantum computations.
In this episode of the Physics World Weekly podcast, Neven talks about Google’s new Willow quantum processor, which integrates 105 superconducting physical qubits. He explains how his team used these qubits to create logical qubits with error rates that dropped exponentially with the number of physical qubits used, and he outlines Google’s ambitious plan to create a processor with 100, or even 1000, logical qubits by 2030.
Physics World is delighted to announce its Top 10 Breakthroughs of the Year for 2024, which includes research in nuclear and medical physics, quantum computing, lasers, antimatter and more. The Top Ten is the shortlist for the Physics World Breakthrough of the Year, which will be revealed on Thursday 19 December.
Our editorial team has looked back at all the scientific discoveries we have reported on since 1 January and has picked 10 that we think are the most important. In addition to being reported in Physics World in 2024, the breakthroughs must meet the following criteria:
Significant advance in knowledge or understanding
Importance of work for scientific progress and/or development of real-world applications
Of general interest to Physics World readers
Here, then, are the Physics World Top 10 Breakthroughs for 2024, listed in no particular order. You can listen to Physics World editors make the case for each of our nominees in the Physics World Weekly podcast. And, come back next week to discover who has bagged the 2024 Breakthrough of the Year.
To a team of researchers at Stanford University in the US for developing a method to make the skin of live mice temporarily transparent. One of the challenges of imaging biological tissue using optical techniques is that tissue scatters light, which makes it opaque. The team, led by Zihao Ou (now at The University of Texas at Dallas), Mark Brongersma and Guosong Hong, found that the common yellow food dye tartrazine strongly absorbs near-ultraviolet and blue light and can help make biological tissue transparent. Applying the dye onto the abdomen, scalp and hindlimbs of live mice enabled the researchers to see internal organs, such as the liver, small intestine and bladder, through the skin without requiring any surgery. They could also visualize blood flow in the rodents’ brains and the fine structure of muscle sarcomere fibres in their hind limbs. The effect can be reversed by simply rinsing off the dye. This “optical clearing” technique has so far only been conducted on animals. But if extended to humans, it could help make some types of invasive biopsies a thing of the past.
To the AEgIS collaboration at CERN, and Kosuke Yoshioka and colleagues at the University of Tokyo, for independently demonstrating laser cooling of positronium. Positronium, an atom-like bound state of an electron and a positron, is created in the lab to allow physicists to study antimatter. Currently, it is created in “warm” clouds in which the atoms have a large distribution of velocities, making precision spectroscopy difficult. Cooling positronium to low temperatures could open up novel ways to study the properties of antimatter. It also enables researchers to produce one to two orders of magnitude more antihydrogen – an antiatom comprising a positron and an antiproton that’s of great interest to physicists. The research also paves the way to use positronium to test aspects of the Standard Model of particle physics, such as quantum electrodynamics, which predicts specific spectral lines, and to probe the effects of gravity on antimatter.
To Roman Bauer at the University of Surrey, UK, Marco Durante from the GSI Helmholtz Centre for Heavy Ion Research, Germany, and Nicolò Cogno from GSI and Massachusetts General Hospital/Harvard Medical School, US, for creating a computational model that could improve radiotherapy outcomes for patients with lung cancer. Radiotherapy is an effective treatment for lung cancer but can harm healthy tissue. To minimize radiation damage and help personalize treatment, the team combined a model of lung tissue with a Monte Carlo simulator to simulate irradiation of alveoli (the tiny air sacs within the lungs) at microscopic and nanoscopic scales. Based on the radiation dose delivered to each cell and its distribution, the model predicts whether each cell will live or die, and determines the severity of radiation damage hours, days, months or even years after treatment. Importantly, the researchers found that their model delivered results that matched experimental observations from various labs and hospitals, suggesting that it could, in principle, be used within a clinical setting.
To Walter de Heer, Lei Ma and colleagues at Tianjin University and the Georgia Institute of Technology, and independently to Marcelo Lozada-Hidalgo of the University of Manchester and a multinational team of colleagues, for creating a functional semiconductor made from graphene, and for using graphene to make a switch that supports both memory and logic functions, respectively. The Manchester-led team’s achievement was to harness graphene’s ability to conduct both protons and electrons in a device that performs logic operations with a proton current while simultaneously encoding a bit of memory with an electron current. These functions are normally performed by separate circuit elements, which increases data transfer times and power consumption. Conversely, de Heer, Ma and colleagues engineered a form of graphene that does not conduct as easily. Their new “epigraphene” has a bandgap that, like silicon, could allow it to be made into a transistor, but with favourable properties that silicon lacks, such as high thermal conductivity.
To David Moore, Jiaxiang Wang and colleagues at Yale University, US, for detecting the nuclear decay of individual helium nuclei by embedding radioactive lead-212 atoms in a micron-sized silica sphere and measuring the sphere’s recoil as nuclei escape from it. Their technique relies on the conservation of momentum, and it can gauge forces as small as 10⁻²⁰ N and accelerations as tiny as 10⁻⁷ g, where g is the local acceleration due to the Earth’s gravitational pull. The researchers hope that a similar technique may one day be used to detect neutrinos, which are much less massive than helium nuclei but are likewise emitted as decay products in certain nuclear reactions.
To Andrew Denniston at the Massachusetts Institute of Technology in the US, Tomáš Ježo at Germany’s University of Münster and an international team for being the first to unify two distinct descriptions of atomic nuclei. They have combined the particle physics perspective – where nuclei comprise quarks and gluons – with the traditional nuclear physics view that treats nuclei as collections of interacting nucleons (protons and neutrons). The team has provided fresh insights into short-range correlated nucleon pairs – which are fleeting interactions where two nucleons come exceptionally close and engage in strong interactions for mere femtoseconds. The model was tested and refined using experimental data from scattering experiments involving 19 different nuclei with very different masses (from helium-3 to lead-208). The work represents a major step forward in our understanding of nuclear structure and strong interactions.
To Jelena Vučković, Joshua Yang, Kasper Van Gasse, Daniil Lukin, and colleagues at Stanford University in the US for developing a compact, integrated titanium:sapphire laser that needs only a simple green laser diode as a pump source. They have reduced the cost and footprint of a titanium:sapphire laser by three orders of magnitude and the power consumption by two. Traditional titanium:sapphire lasers have to be pumped with high-powered lasers – and therefore cost in excess of $100,000. In contrast, the team was able to pump its device using a $37 green laser diode. The researchers also achieved two things that had not been possible before with a titanium:sapphire laser. They were able to adjust the wavelength of the laser light and they were able to create a titanium:sapphire laser amplifier. Their device represents a key step towards the democratization of a laser type that plays important roles in scientific research and industry.
To two related teams for their clever use of entangled photons in imaging. Both groups include Chloé Vernière and Hugo Defienne of Sorbonne University in France, who as a duo used quantum entanglement to encode an image into a beam of light. The impressive thing is that the image is only visible to an observer using a single-photon sensitive camera – otherwise the image is hidden from view. The technique could be used to create optical systems with reduced sensitivity to scattering. This could be useful for imaging biological tissues and long-range optical communications. In separate work, Vernière and Defienne teamed up with Patrick Cameron at the UK’s University of Glasgow and others to use entangled photons to enhance adaptive optical imaging. The team showed that the technique can be used to produce higher-resolution images than conventional bright-field microscopy. Looking to the future, this adaptive optics technique could play a major role in the development of quantum microscopes.
To the China National Space Administration for the first-ever retrieval of material from the Moon’s far side, confirming China as one of the world’s leading space nations. Landing on the lunar far side – which always faces away from Earth – is difficult due to its distance and terrain of giant craters with few flat surfaces. At the same time, scientists are interested in the unexplored far side and why it looks so different from the near side. The Chang’e-6 mission was launched on 3 May and consisted of four parts: an ascender, lander, returner and orbiter. The ascender and lander successfully touched down on 1 June in the Apollo basin, which lies in the north-eastern side of the South Pole-Aitken Basin. The lander used its robotic scoop and drill to obtain about 1.9 kg of materials within 48 h. The ascender then lifted off from the top of the lander and docked with the returner-orbiter before the returner headed back to Earth, landing in Inner Mongolia on 25 June. In November, scientists released the first results from the mission, finding that fragments of basalt – a type of volcanic rock – date back to 2.8 billion years ago, indicating that the lunar far side was volcanically active at that time. Further scientific discoveries can be expected in the months and years ahead as scientists analyze more fragments.
This episode of the Physics World Weekly podcast features a lively discussion about our Top 10 Breakthroughs of 2024, which include important research in nuclear physics, quantum computing, medical physics, lasers and more. Physics World editors explain why we have made our selections and look at the broader implications of this impressive body of research.
The top 10 serves as the shortlist for the Physics World Breakthrough of the Year award, the winner of which will be announced on 19 December.
Links to all the nominees, more about their research and the selection criteria can be found here.
This episode of the Physics World Weekly podcast explores the science and commercial applications of metamaterials with Claire Dancer of the University of Warwick and Alastair Hibbins of the University of Exeter.
They lead the UK Metamaterials Network, which brings together people in academia, industry and governmental agencies to support and expand metamaterial R&D; nurture talent and skills; promote the adoption of metamaterials in the wider economy; and much more.
According to the network, “A metamaterial is a 3D structure with a response or function due to the collective effect of meta-atom elements that is not possible to achieve conventionally with any individual constituent material”.
In a wide-ranging conversation with Physics World’s Matin Durrani, Hibbins and Dancer talk about exciting commercial applications of metamaterials including soundproof materials and lenses for mobile phones – and how they look forward to welcoming the thousandth member of the network sometime in 2025.
Climate science and astronomy have much in common, and this has inspired the astrophysicist Travis Rector to call on astronomers to educate themselves, their students and the wider public about climate change. In this episode of the Physics World Weekly podcast, Rector explains why astronomers should listen to the concerns of the public when engaging about the science of global warming. And, he says the positive outlook of some of his students at the University of Alaska Anchorage makes him believe that a climate solution is possible.
Rector says that some astronomers are reluctant to talk to the public about climate change because they have not mastered the intricacies of the science. Indeed, one aspect of atmospheric physics that has challenged scientists is the role that clouds play in global warming. My second guest this week is the science journalist Michael Allen, who has written a feature article for Physics World called “Cloudy with a chance of warming: how physicists are studying the dynamical impact of clouds on climate change”. He talks about climate feedback mechanisms that involve clouds and how aerosols affect clouds and the climate.
In this episode of the Physics World Weekly podcast I am in conversation with Joanne O’Meara, who has bagged a King Charles III Coronation Medal for her outstanding achievements in science education and outreach. Based at Canada’s University of Guelph, the medical physicist talks about her passion for science communication and her plans for a new science centre.
This episode also features a wide-ranging interview with Burcu Saner Okan, who is principal investigator at Sabanci University’s Sustainable Advanced Materials Research Group in Istanbul, Turkey. She explains how graphene is manufactured today and how the process can be made more sustainable – by using recycled materials as feedstocks, for example. Saner Okan also talks about her commercial endeavours including Euronova.
UHV suitcases address an important challenge facing people who use ultrahigh vacuum (UHV) systems: it can be extremely difficult to move samples from one UHV system to another without the risk of contamination. While some UHV experiments are self-contained, it is often the case that research benefits from using cutting-edge analytical techniques that are only available at large facilities such as synchrotrons, free-electron lasers and neutron sources.
Normally, fabricating a UHV sample in one place and studying it in another involves breaking the vacuum and then removing and transporting the sample. This is unsatisfactory for two reasons. First, no matter how clean a handling system is, exposing a sample to air will change or even destroy its material properties – often irrevocably. The second problem is that an opened UHV chamber must be baked out before it can be used again – and a bakeout can take several days out of a busy research schedule.
These problems can be avoided by connecting a portable UHV system (called a UHV suitcase) to the main vacuum chamber and then transferring the sample between the two. This UHV suitcase can then be used to move the sample across a university campus – or indeed, halfway around the world – where it can be transferred to another UHV system.
Ultralight aluminium UHV suitcases
While commercial designs have improved significantly over the past two decades, today’s UHV suitcases can still be heavy, unwieldy and expensive. To address these shortcomings, US-based VolkVac Instruments has developed the ULSC ultralight aluminium suitcase, which weighs less than 10 kg, and an even lighter version – the ULSC-R – which weighs in at less than 7 kg.
Key to the success of VolkVac’s UHV suitcases is the use of lightweight aluminium to create the portable vacuum chamber. The metal is used instead of stainless steel, a more conventional material for UHV chambers. As well as being lighter, aluminium is also much easier to machine. This means that VolkVac’s UHV suitcases can be efficiently machined from a single piece of aluminium. The lightweight material is also non-magnetic. This is an important feature for VolkVac because it means the suitcases can be used to transport samples with delicate magnetic properties.
Based in Escondido, California, VolkVac was founded in 2020 by the PhD physicist Igor Pinchuk. He says that the idea of a UHV suitcase is not new – pointing out that researchers have been creating their own bespoke solutions for decades. The earliest were simply standard vacuum chambers that were disconnected from one UHV system and then quickly wheeled to another – without being pumped.
This has changed in recent years with the arrival of new materials, vacuum pumps, pump controllers and batteries. It is now possible to create a lightweight, portable UHV chamber with a combination of passive and battery-powered pumps. Pinchuk explains that having an integrated pump is crucial because it is the only way to maintain a true UHV environment during transport.
Including pumps, controllers and batteries means that the material used to create the chamber of a UHV suitcase must be as light as possible to keep the overall weight to a minimum.
Aluminium is the ideal material
While aluminium is the ideal material for making UHV suitcases, it has one shortcoming – it is a relatively soft metal. Access to UHV chambers is provided by ConFlat flanges, which have sharp circular edges that are driven into a copper-ring gasket to create an exceptionally airtight seal. The problem is that aluminium is too soft to provide durable, long-lasting knife edges on flanges.
This is why VolkVac has looked to Atlas Technologies for its expertise in bi-metal fabrication. Atlas fabricates aluminium flanges with titanium or stainless steel knife-edges. Because VolkVac requires non-magnetic materials for its UHV suitcases, Atlas developed titanium–aluminium flanges for the company.
Atlas Technologies’ Jimmy Stewart coordinates the company’s collaboration with VolkVac. He says that the first components for Pinchuk’s newest UHV suitcase, a custom iteration of VolkVac’s ULSC, have already been machined. He explains that VolkVac continues to work very closely with Atlas’s lead machinist and lead engineer to bring Pinchuk’s vision to life in aluminium and titanium.
Close relationship between Atlas and VolkVac
Stewart explains that this close relationship is necessary because bi-metal materials have very special requirements when it comes to things like welding and stress relief.
Stewart adds that Atlas often works like this with its customers to produce equipment that is used across a wide range of sectors including semiconductor fabrication, quantum computing and space exploration.
Because of the historical use of stainless steel in UHV systems, Stewart says that some customers have not yet used bi-metal components. “They may have heard about the benefits of bi-metal,” says Stewart, “but they don’t have the expertise. And that’s why they come to us – for our 30 years of experience and in-depth knowledge of bi-metal and aluminium vacuum.” He adds, “Atlas invented the market and pioneered the use of bi-metal components.”
Pinchuk agrees, saying that he knows stainless steel UHV technology forwards and backwards, but now he is benefitting from Atlas’s expertise in aluminium and bi-metal technology for his product development.
Three-plus decades of bi-metal expertise
Atlas Technologies was founded in 1993 by father and son Richard and Jed Bothell. Based in Port Townsend, Washington, the company specializes in creating aluminium vacuum chambers with bi-metal flanges. Atlas also designs and manufactures standard and custom bi-metal fittings for use outside of UHV applications.
Binding metals to aluminium to create vacuum components is a tricky business. The weld must be UHV compatible in terms of maintaining low pressure and not being prone to structural failure during the heating and cooling cycles of bakeout – or when components are cooled to cryogenic temperatures.
Jed Bothell points out that Japanese companies had pioneered the development of aluminium vacuum chambers but had struggled to create good-quality flanges. In the early 1990s, he was selling explosion-welded couplings and had no vacuum experience. His father, however, was familiar with the vacuum industry and realized that there was a business opportunity in creating bi-metal components for vacuum systems and other uses.
Explosion welding is a solid-phase technique whereby two plates of different metals are placed on top of each other. The top plate is then covered with an explosive material that is detonated starting at an edge. The force of the explosion pushes the plates together, plasticizing both metals and causing them to stick together. The interface between the two materials is wavy, which increases the bonded surface area and strengthens the bond.
Strong bi-metal bond
What is more, the air at the interface between the two metals is ionized, creating a plasma that travels along the interface ahead of the weld, driving out impurities before the weld is made – which further strengthens the bond. The resulting bi-metal material is then machined to create UHV flanges and other components.
As well as bonding aluminium to stainless steel, explosive welding can be used to create bi-metal structures of titanium and aluminium – avoiding the poor UHV properties of stainless steel.
“Stainless steel is a bad material for vacuum in a lot of ways,” Bothell explains. He describes the hydrogen outgassing problem as a “serious headwind” against using stainless steel for UHV (see box “UHV and XHV: science and industry benefit from bi-metal fabrication”). That is why Atlas developed bi-metal technologies that allow aluminium to be used in UHV components – and Bothell adds that it also shows promise for extreme high vacuum (XHV).
UHV and XHV: science and industry benefit from bi-metal fabrication
Modern experiments in condensed matter physics, materials science and chemistry often involve the fabrication and characterization of atomic-scale structures on surfaces. Usually, such experiments cannot be done at atmospheric pressure because samples would be immediately contaminated by gas molecules. Instead, these studies must be done in either UHV or XHV chambers – which both operate in the near absence of air. UHV and XHV also have important industrial applications including the fabrication of semiconductor chips.
UHV systems operate at pressures in the range 10⁻⁶–10⁻⁹ Pa and XHV systems work at pressures of 10⁻¹⁰ Pa and lower. In comparison, atmospheric pressure is about 10⁵ Pa.
At UHV pressures, it takes several days for a single layer (monolayer) of contaminant gases to build up on a surface – whereas surfaces in XHV will remain pristine for hundreds of days. These low pressures also allow beams of charged particles such as electrons, protons and ions to travel unperturbed by collisions with gas molecules.
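These timescales follow from kinetic theory: the flux of molecules striking a surface is P/√(2πmkT), and the monolayer formation time is roughly the density of surface sites divided by that flux. The short Python sketch below illustrates the estimate, assuming nitrogen at room temperature, a typical site density of 10¹⁹ per m² and a sticking probability of one – illustrative assumptions, not figures from this article:

```python
import math

def monolayer_time(pressure_pa, temperature_k=300.0,
                   molar_mass_kg=0.028, sites_per_m2=1e19):
    """Rough time (in seconds) to build up one monolayer of gas,
    assuming every impinging molecule sticks to the surface."""
    k_b = 1.380649e-23        # Boltzmann constant, J/K
    n_a = 6.02214076e23       # Avogadro constant, 1/mol
    m = molar_mass_kg / n_a   # mass of one molecule, kg
    # Impingement flux from kinetic theory, molecules per m^2 per s
    flux = pressure_pa / math.sqrt(2 * math.pi * m * k_b * temperature_k)
    return sites_per_m2 / flux

print(monolayer_time(1e-9) / 86400)   # UHV (10⁻⁹ Pa): a few days
print(monolayer_time(1e-11) / 86400)  # XHV (10⁻¹¹ Pa): hundreds of days
```

With these inputs, 10⁻⁹ Pa gives a monolayer time of roughly four days, consistent with the “several days” quoted above.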
Crucial roles in science and industry
As a result, UHV and XHV vacuum technologies play crucial roles in particle accelerators and support powerful analytical techniques including angle-resolved photoemission spectroscopy (ARPES), Auger electron spectroscopy (AES), secondary ion mass spectrometry (SIMS) and X-ray photoelectron spectroscopy (XPS).
UHV and XHV also allow exciting new materials to be created by depositing atoms or molecules on surfaces with atomic-layer precision – using techniques such as molecular beam epitaxy. This is very important in the fabrication of advanced semiconductors and other materials.
Traditionally, UHV components are made from stainless steel, whereas XHV systems are increasingly made from titanium. The latter is expensive and a much more difficult material to machine than stainless steel. As a result, titanium tends to be reserved for more specialized applications such as the X-ray lithography of semiconductor devices, particle-physics experiments and cryogenic systems. Unlike stainless steel, titanium is non-magnetic so it is also used in experiments that must be done in very low magnetic fields.
An important shortcoming of stainless steel is that the process used to create the material leaves it full of hydrogen, which finds its way into UHV chambers via a process called outgassing. Much of this hydrogen can be driven out by heating the stainless steel while the chamber is being pumped down to UHV pressures – a process called bakeout. But some hydrogen will be reabsorbed when the chamber is opened to the atmosphere, and therefore time-consuming bakeouts must be repeated every time a chamber is opened.
Less hydrogen and hydrocarbon contamination
Aluminium contains about ten million times less hydrogen than stainless steel and it absorbs much less gas from the atmosphere when a UHV chamber is opened. And because aluminium contains a low amount of carbon, it results in less hydrocarbon-based contamination of the vacuum.
Good thermal properties are crucial for UHV materials and aluminium conducts heat ten times better than stainless steel. This means that the chamber can be heated and cooled down much more quickly – without the undesirable hot and cold spots that affect stainless steel. As a bonus, aluminium bakeout can be done at 150 °C, whereas stainless steel must be heated to 250 °C. Furthermore, aluminium vacuum chambers retain most of the gains from previous bakeouts, making them ideal for industrial applications where process up-time is highly valued.
Magnetic fields can have detrimental effects on experiments done at UHV, so aluminium’s low magnetic permeability is ideal. The material also has low residual radioactivity and greater resistance to corrosion than stainless steel – making it favourable for use in high neutron-flux environments. Aluminium is also better at damping vibrations than stainless steel – making delicate measurements possible.
When it comes to designing and fabricating components, aluminium is much easier to machine than stainless steel. This means that a greater variety of component shapes can be quickly made at a lower cost.
Aluminium is not as strong as stainless steel, which means more material is required. But thanks to its low density, about one third that of stainless steel, aluminium components still weigh less than their stainless steel equivalents.
All of these properties make aluminium an ideal material for vacuum components – and Atlas Technologies’ ability to create bi-metal flanges for aluminium vacuum systems means that both researchers and industrial users can gain from the UHV and XHV benefits of aluminium.
If you love science and are near London, the Royal Society runs a wonderful series of public events that are free of charge. This week, I had the pleasure of attending the Royal Society Milner Prize Lecture, which was given by the quantum cryptography pioneer Artur Ekert. The prize is described as “the premier European award for outstanding achievement in computer science” and his lecture was called “Privacy for the paranoid ones: the ultimate limits of secrecy“. I travelled up from Bristol to see the lecture and I enjoyed it very much.
Ekert has academic appointments at the University of Oxford, the National University of Singapore and the Okinawa Institute of Technology. He bagged this year’s prize, “For his pioneering contributions to quantum communication and computation, which transformed the field of quantum information science from a niche academic activity into a vibrant interdisciplinary field of industrial relevance”.
Ekert is perhaps most famous for his invention in 1991 of entanglement-based quantum cryptography. However, his lecture kicked off several millennia earlier with an example of a permutation cypher called a scytale. Used by the ancient Greeks, the cypher conceals a message in a series of letters written on a strip of paper. When the paper is wound around a cylinder of the correct radius, the message appears – so it is not that difficult to decipher if you have a set of cylinders of different radii.
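The scytale is a columnar transposition, so “trying cylinders of different radii” amounts to trying different numbers of letters per turn. A toy Python sketch (the example message is my own, not one from the lecture):

```python
def scytale_decrypt(ciphertext, turns):
    """Undo a scytale: taking every `turns`-th letter rebuilds the rows
    that were originally written along the rod."""
    return "".join(ciphertext[i::turns] for i in range(turns))

# The strip of letters as read after unwinding the paper
cipher = "WDVEIEASRRCEEOD"

# Try "cylinders" of different sizes; only the correct one makes sense
for turns in range(2, 6):
    print(turns, scytale_decrypt(cipher, turns))
# turns = 3 recovers the readable message "WEAREDISCOVERED"
```

This also shows why the scheme is weak: the key space (the set of plausible cylinder sizes) is tiny enough to search by hand.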
Several hundred years later things had improved somewhat, with the Romans using substitution cyphers whereby letters are substituted for each other according to a secret key that is shared by sender and receiver. The problem with this, explained Ekert, is that if the same key is used to encrypt multiple messages, patterns will emerge in the secret messages. For example, “e” is the most common letter in English, and if it is substituted by “p”, then that letter will be the most common letter in the encrypted messages.
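This statistical weakness is easy to demonstrate: because a substitution cypher preserves letter frequencies, counting letters in the encrypted text exposes the key. A toy illustration in Python (the sample text is mine, not anything shown in the lecture):

```python
from collections import Counter

def letter_frequencies(text):
    """Return the relative frequency of each letter, ignoring case
    and non-letter characters."""
    letters = [c for c in text.lower() if c.isalpha()]
    total = len(letters)
    return {c: n / total for c, n in Counter(letters).most_common()}

# In English prose "e" dominates; a substitution cypher merely relabels
# it, so the most common ciphertext letter points back to "e".
sample = "meet me near the tree by the green river at three"
freqs = letter_frequencies(sample)
print(max(freqs, key=freqs.get))
```

Running the same count on the ciphertext instead of the plaintext is exactly Al-Kindi-style frequency analysis.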
Maths and codebreaking
Ekert said that this statistical codebreaking technique was developed in the 9th century by the Arab polymath Al-Kindi. This appears to be the start of the centuries-long relationship between mathematicians and code makers and breakers that thrives today at places like the UK’s Government Communications Headquarters (GCHQ).
Substitution cyphers can be improved by constantly changing the key, but then the problem becomes how to distribute keys in a secure way – and that’s where quantum physics comes in. While classical key distribution protocols like RSA are very difficult to crack, quantum protocols can be proven to be unbreakable – assuming that they are implemented properly.
Ekert’s entanglement-based protocol is called E91, and he explained how it has its roots in the Einstein–Podolsky–Rosen (EPR) paradox. This is a thought experiment that was devised in 1935 by Albert Einstein and colleagues to show that quantum mechanics was “incomplete” in how it described reality. They argued that classical physics with extra “hidden variables” could explain correlations that arise when measurements are made on two particles that are in what we now call a quantum-entangled state.
Ekert then fast-forwarded nearly three decades to 1964, when the Northern Irish physicist John Bell came up with a mathematical framework to test whether an entangled quantum state can indeed be described using classical physics and hidden variables. Starting in the 1970s, physicists did a series of experiments called Bell tests that have established that correlations observed in quantum systems cannot be explained by classical physics and hidden variables. This work led to John Clauser, Alain Aspect and Anton Zeilinger sharing the 2022 Nobel Prize for Physics.
Test for eavesdropping
In 1991, Ekert realised that a Bell test could be used to reveal whether a secret communication using entangled photons had been intercepted by an eavesdropper. The idea is that the eavesdropper’s act of measurement would destroy entanglement and leave the photon pairs with classical, rather than quantum, correlations.
That year, Ekert, along with John Rarity and Paul Tapster, demonstrated E91 at the UK’s Defence Research Agency in Malvern. In the intervening decades E91 and other quantum key distribution (QKD) protocols have been implemented in a number of different scenarios – including satellite communications – and some QKD protocols are commercially available.
However, Ekert points out that quantum solutions are not available for all cryptographic applications – they tend to work best for the exchange of messages rather than, for example, the password protection of documents. He also said that developers and users must ensure that QKD protocols are implemented properly, using equipment that works as expected. Indeed, he noted that the current interest in identifying and closing “Bell loopholes” is related to QKD. Loopholes are situations where classical phenomena could inadvertently affect a Bell test, making a classical system appear quantum.
So, there is much more work for Ekert and his colleagues to do in quantum cryptography. And if the enthusiasm of his talk is any indication, Ekert is up for the challenge.
We are entering a second golden age of space travel – with human missions to the Moon and Mars planned for the near future. In this episode of the Physics World Weekly podcast we explore two very different challenges facing the next generation of cosmic explorers.
First up, the radiation oncologist James Welsh chats with Physics World’s Tami Freeman about his new ebook about the biological effects of space radiation on astronauts. They talk about the types and origins of space radiation and how they impact human health. Despite the real dangers, Welsh explains that the human body appears to be more resilient to radiation than the microelectronics used on spacecraft. Based at Loyola Medicine in the US, Welsh explains why damage to computers, rather than the health of astronauts, could be the limiting factor for space exploration.
Later in the episode I am in conversation with two physicists who have written a paper about how we could implement a universal time standard for the Moon. Based at the US’s National Institute of Standards and Technology (NIST), Biju Patla and Neil Ashby explain how atomic clocks could be used to create a time system that would make coordinating lunar activities easier – and could operate as a GPS-like system to facilitate navigation. They also say that such a lunar system could be a prototype for a more ambitious system on Mars.
They are the chemist Robert Hoye; the physicists Nakita Noel and Pascal Kaienburg; and the materials scientist Sebastian Bonilla. We define what sustainability means in the context of photovoltaics and we look at the challenges and opportunities for making sustainable solar cells using silicon, perovskites, organic semiconductors and other materials.
This podcast is supported by Pfeiffer Vacuum+Fab Solutions.
Pfeiffer is part of the Busch Group, one of the world’s largest manufacturers of vacuum pumps, vacuum systems, blowers, compressors and gas abatement systems. Explore its products at the Pfeiffer website.
Physicists and others with STEM backgrounds are sought after in industry for their analytical skills. However, traditional training in STEM subjects is often lacking when it comes to nurturing the soft skills that are needed to succeed in managerial and leadership positions.
Our guest in this podcast is Peter Hirst, who is Senior Associate Dean, Executive Education at the MIT Sloan School of Management. He explains how MIT Sloan works with executives to ensure that they efficiently and effectively acquire the skills and knowledge needed to be effective leaders.
This podcast is sponsored by the MIT Sloan School of Management.
This episode of the Physics World Weekly podcast features the physicist and engineer Julia Sutcliffe, who is chief scientific adviser to the UK government’s Department for Business and Trade.
In a wide-ranging conversation with Physics World’s Matin Durrani, Sutcliffe explains how she began her career as a PhD physicist before working in systems engineering at British Aerospace – where she worked on cutting-edge technologies including robotics, artificial intelligence, and autonomous systems. They also chat about Sutcliffe’s current role advising the UK government to ensure that policymaking is underpinned by the best evidence.