Learn how the way ice crystallizes on the surface of Europa, a moon of Jupiter, may hold clues to signs of extraterrestrial life within its subsurface ocean.
Quantum science is enjoying a renaissance as nascent quantum computers emerge from the lab and quantum sensors are being used for practical applications.
As the technologies we use become more quantum in nature, it follows that everyone should have a basic understanding of quantum physics. To explore how quantum physics can be taught to the masses, I am joined by Arjan Dhawan, Aleks Kissinger and Bob Coecke – who are all based in the UK.
Coecke is chief scientist at Quantinuum – which develops quantum computing hardware and software. Kissinger is associate professor of quantum computing at the University of Oxford; and Dhawan is studying mathematics at the University of Durham.
Kissinger and Coecke have developed a way of teaching quantum physics using diagrams. In 2023, Oxford and Quantinuum joined forces to use the method in a pilot summer programme for 15 to 17 year-olds. Dhawan was one of their students.
In this week's episode of Space Minds Ashley Johnson, President and CFO of Planet, explains the company's ambitious goal to make global change visible, accessible and actionable.
Adaptive radiotherapy, an advanced cancer treatment in which each fraction is tailored to the patient’s daily anatomy, offers the potential to maximize target conformality and minimize dose to surrounding healthy tissue. Based on daily scans – MR images recorded by an MR-Linac, for example – treatment plans are adjusted each day to account for anatomical changes in the tumour and surrounding healthy tissue.
Creating a new plan for every treatment fraction, however, increases the potential for errors, making fast and effective quality assurance (QA) procedures more important than ever. To meet this need, the physics team at Hospital Almater in Mexicali, Mexico, is using Elekta ONE | QA, powered by ThinkQA Secondary Dose Check* (ThinkQA SDC) software to ensure that each adaptive plan is safe and accurate before it is delivered to the patient.
Radiotherapy requires a series of QA checks prior to treatment delivery, starting with patient-specific QA, where the dose calculated by the treatment planning system is delivered to a phantom. This procedure ensures that the delivered dose distribution matches the prescribed plan. Alongside this, secondary dose checks can be performed, in which an independent algorithm verifies that the calculated dose distribution corresponds with that delivered to the actual patient anatomy.
“The secondary dose check is an independent dose calculation that uses a different algorithm to the one in the treatment planning system,” explains Alexis Cabrera Santiago, a medical physicist at Hospital Almater. “ThinkQA SDC software calculates the dose based on the patient anatomy, which is actually more realistic than using a rigid phantom, so we can compare both results and catch any differences before treatment.”
Pre-treatment verification ThinkQA SDC’s unique dose calculation method has been specifically designed for Elekta Unity. (Courtesy: Elekta)
For adaptive radiotherapy in particular, this second check is invaluable. Performing phantom-based QA following each daily imaging session is often impractical; in many cases, ThinkQA SDC can be used instead.
“Secondary dose calculation is necessary in adaptive treatments, for example using the MR-Linac, because you are changing the treatment plan for each session,” says José Alejandro Rojas‑López, who commissioned and validated ThinkQA SDC at Hospital Almater. “You are not able to shift the patient to realise patient-specific QA, so this secondary dose check is needed to analyse each treatment session.”
ThinkQA SDC’s ability to achieve patient-specific QA without shifting the patient is extremely valuable, allowing time savings while upholding the highest level of QA safety. “The AAPM TG 219 report recognises secondary dose verification as a validated alternative to patient-specific QA, especially when there is no time for traditional phantom checks in adaptive fractions,” adds Cabrera Santiago.
The optimal choice
At Hospital Almater, all external-beam radiation treatments are performed using an Elekta Unity MR-Linac (with brachytherapy employed for gynaecological cancers). This enables the hospital to offer adaptive radiotherapy for all cases, including head-and-neck, breast, prostate, rectal and lung cancers.
To ensure efficient workflow and high-quality treatments, the team turned to the ThinkQA SDC software. ThinkQA SDC received FDA 510(k) clearance in early 2024 for use with both the Unity MR-Linac and conventional Elekta linacs.
Rojas‑López (who now works at Hospital Angeles Puebla) says that the team chose ThinkQA SDC because of its user-friendly interface, ease of integration into the clinical workflow and common integrated QA platform for both CT and MR-Linac systems. The software also offers the ability to perform 3D evaluation of the entire planning target volume (PTV) and the organs-at-risk, making the gamma evaluation more robust.
Physics team Alexis Cabrera Santiago and José Alejandro Rojas‑López. (Courtesy: José Alejandro Rojas‑López/Hospital Almater)
Commissioning of ThinkQA SDC was fast and straightforward, Rojas‑López notes, requiring minimal data input into the software. For absolute dose calibration, the only data needed are the cryostat dose attenuation response, the output dose geometry and the CT calibration.
“This makes a difference compared with other commercial solutions where you have to introduce more information, such as MLC [multileaf collimator] leakage and MLC dosimetric leaf gap, for example,” he explains. “If you have to introduce more data for commissioning, this delays the clinical introduction of the software.”
Cabrera Santiago is now using ThinkQA SDC to provide secondary dose calculations for all radiotherapy treatments at Hospital Almater. The team has established a protocol with a 3%/2 mm gamma criterion, a tolerance limit of 95% and an action limit of 90%. He emphasizes that the software has proved robust and flexible, and provides confidence in the delivered treatment.
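The decision logic behind such a protocol can be sketched in a few lines. This is a minimal, hypothetical illustration of how a gamma pass rate maps onto the quoted tolerance and action limits; the function name and category labels are assumptions, not part of the Elekta software:

```python
# Illustrative sketch of a gamma pass-rate decision, using the protocol
# values quoted in the article: 3%/2 mm criterion, 95% tolerance limit,
# 90% action limit. Names and categories are hypothetical.

TOLERANCE_LIMIT = 95.0  # % of points passing the 3%/2 mm gamma test
ACTION_LIMIT = 90.0

def classify_gamma_result(pass_rate: float) -> str:
    """Map a gamma pass rate (%) to a clinical decision category."""
    if pass_rate >= TOLERANCE_LIMIT:
        return "pass"    # plan approved for delivery
    if pass_rate >= ACTION_LIMIT:
        return "review"  # between action and tolerance limits: investigate
    return "action"      # below the action limit: do not deliver as-is

print(classify_gamma_result(97.2))  # → pass
```

In this scheme, a result between the action and tolerance limits flags the plan for investigation rather than outright rejection, which is broadly in line with how the AAPM reports cited above treat the two thresholds.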
“ThinkQA SDC lets us work with more confidence, reduces risk and saves time without losing control over the patient’s safety,” he says. “It checks that the plan is correct, catches issues before treatment and helps us find any problems like set-up errors, contouring mistakes and planning issues.”
The software integrates smoothly into the Elekta ONE adaptive workflow, providing reliable results without slowing down the clinical workflow. “In our institution, we set up ThinkQA SDC so that it automatically receives the new plan, runs the check, compares it with the original plan and creates a report – all in around two minutes,” says Cabrera Santiago. “This saves us a lot of time and removes the need to do everything manually.”
A case in point
As an example of ThinkQA SDC’s power to ease the treatment workflow, Rojas‑López describes a paediatric brain tumour case at Hospital Almater. The young patient needed sedation during their treatment, requiring the physics team to optimize the treatment time for the entire adaptive radiotherapy workflow. “ThinkQA SDC served to analyse, in a fast mode, the treatment plan QA for each session. The measurements were reliable, enabling us to deliver all of the treatment sessions without any delay,” he explains.
Indeed, the ability to use secondary dose checks for each treatment fraction provides time advantages for the entire clinical workflow over phantom-based pre-treatment QA. “Time in the bunker is very expensive,” Rojas‑López points out. “If you reduce the time required for QA, you can use the bunker for patient treatments instead and treat more patients during the clinical time. Secondary dose check can optimize the workflow in the entire department.”
The team at Hospital Almater agree that ThinkQA SDC provides a reliable tool for evaluating radiation treatments, including the first fraction and all of the adaptive sessions. “You can use it for all anatomical sites, with reliable and confident results,” says Rojas‑López. “And you can reduce the need for measurements using another patient-specific QA tool.”
“I think that any centre doing adaptive radiotherapy should seriously consider using a tool like ThinkQA SDC,” adds Cabrera Santiago.
*ThinkQA is manufactured by DOSIsoft S.A. and distributed by Elekta.
Chinese rocket maker Sepoch has carried out a first vertical liftoff and splashdown landing ahead of a potential orbital launch attempt later this year.
Off-world agriculture has long seemed experimental, but that could soon change thanks to a collaboration between design firm Heatherwick Studio and the space architecture nonprofit Aurelia Institute.
Anna’s hummingbirds have evolved to have longer, larger beaks to access backyard feeders in urban areas. It could be a step toward becoming a “commensal” species that lives alongside humans, like pigeons.
The first high-resolution images of Bolivia’s Uturuncu volcano have yielded unprecedented insights into whether this volcanic “zombie” is likely to erupt in the near future. The images were taken using a technique that combines seismology, rock physics and petrological analyses, and the scientists who developed it say it could apply to other volcanoes, too.
Volcanic eruptions occur when bubbles of gases such as SO2 and CO2 rise to the Earth’s surface through dikes and sills in the planet’s crust, bringing hot, molten rock known as magma with them. To evaluate the chances of this happening, researchers need to understand how much gas and melted rock have accumulated in the volcano’s shallow upper crust, or crater. This is not easy, however, as the structures that convey gas and magma to the surface are complex and mapping them is challenging with current technologies.
A zombie volcano
In the new work, a team led by Mike Kendall of the University of Oxford, UK, and Haijiang Zhang from the University of Science and Technology of China (USTC) employed a combination of seismological and petrophysical analyses to create such a map for Uturuncu. Located in the Central Andes, this volcano formed in the Pleistocene epoch (around 2.58 million to 11,700 years ago) as the oceanic Nazca plate was forced beneath the South American continental plate. It is made up of around 50 km3 of homogeneous, porphyritic dacite lava flows that are between 62% and 67% silicon dioxide (SiO2) by weight, and it sits atop the Altiplano–Puna magma body, which is the world’s largest body of partially melted silicic rock.
Although Uturuncu has not erupted for nearly 250,000 years, it is not extinct. It regularly emits plumes of gas, and earthquakes are a frequent occurrence in the shallow crust beneath and around it. Previous geodetic studies also detected a 150-km-wide deformed region of rock centred around 3 km southwest of its summit. These signs of activity, coupled with Uturuncu’s lack of a geologically recent eruption, have led some scientists to describe it as a “zombie”.
Movement of liquid and gas explains Uturuncu’s unrest
To tease out the reasons for Uturuncu’s semi-alive behaviour, the team turned to seismic tomography – a technique Kendall compares to medical imaging of a human body. The idea is to detect the seismic waves produced by earthquakes travelling through the Earth’s crust, analyse their arrival times, and use this information to create three-dimensional images of what lies beneath the surface of the structure being studied.
Writing in PNAS, Kendall and colleagues explain that they used seismic tomography to analyse signals from more than 1700 earthquakes in the region around Uturuncu. They performed this analysis in two ways. First, they assumed that seismic waves travel through the crust at the same speed regardless of their direction of propagation. This isotropic form of tomography gave them a first image of the region’s structure. In their second analysis, they took the directional dependence of the seismic waves’ speed into account. This anisotropic tomography gave them complementary information about the structure.
The researchers then combined their tomographic measurements with previous geophysical imaging results to construct rock physics models. These models contain information about the paths that hot fluids and gases take as they migrate to the surface. In Uturuncu’s case, the models showed fluids and gases accumulating in shallow magma reservoirs directly below the volcano’s crater and down to a depth of around 5 km. This movement of liquid and gas explains Uturuncu’s unrest, the team say, but the good news is that it has a low probability of producing eruptions any time soon.
According to Kendall, the team’s methods should be applicable to more than 1400 other potentially active volcanoes around the world. “It could also be applied to identifying potential geothermal energy sites and for critical metal recovery in volcanic fluids,” he tells Physics World.
NASA has decided to switch to a backup propellant line on its Psyche asteroid mission to allow the spacecraft to resume use of its electric propulsion system.
Learn how a possible ninth planet, the theoretical Planet X, could have acquired its wide orbit in the outer Solar System — a process that’s also applicable to other planets.
This simple, at-home method to check hormone levels, using floss, could pave the way for a new generation of monitoring tools for a range of health issues.
China launched its second planetary exploration mission Wednesday, sending Tianwen-2 to sample a near-Earth asteroid and later survey a main-belt comet.
Hidden depths Shengxi Huang (left) with members of her lab at Rice University in the US, where she studies 2D materials as single-photon sources. (Courtesy: Jeff Fitlow)
Everyday life is three dimensional, with even a sheet of paper having a finite thickness. Shengxi Huang from Rice University in the US, however, is attracted by 2D materials, which are usually just one atomic layer thick. Graphene is perhaps the most famous example — a single layer of carbon atoms arranged in a hexagonal lattice. But since graphene was first isolated in 2004, all sorts of other 2D materials, notably boron nitride, have been created.
Her group at Rice currently has 12 people, including eight graduate students and four postdocs. Some are physicists, some are engineers, while others have backgrounds in materials science or chemistry. But they all share an interest in understanding the optical and electronic properties of quantum materials and seeing how they can be used, for example, as biochemical sensors. Lab equipment from PicoQuant is vital in helping in that quest, as Huang explains in an interview with Physics World.
Why are you fascinated by 2D materials?
I’m an electrical engineer by training, which is a very broad field. Some electrical engineers focus on things like communication and computing, but others, like myself, are more interested in how we can use fundamental physics to build useful devices, such as semiconductor chips. I’m particularly interested in using 2D materials for optoelectronic devices and as single-photon emitters.
What kinds of 2D materials do you study?
The materials I am particularly interested in are transition metal dichalcogenides, which consist of a layer of transition-metal atoms sandwiched between two layers of chalcogen atoms – sulphur, selenium or tellurium. One of the most common examples is molybdenum disulphide, which in its monolayer form has a layer of sulphur on either side of a layer of molybdenum. In multi-layer molybdenum disulphide, the van der Waals forces between the tri-layers are relatively weak, meaning that the material is widely used as a lubricant – just like graphite, which is a many-layer version of graphene.
Why do you find transition metal dichalcogenides interesting?
Transition metal dichalcogenides have some very useful optoelectronic properties. In particular, they emit light whenever the electron and hole that make up an “exciton” recombine. Now because these dichalcogenides are so thin, most of the light they emit can be used. In a 3D material, in contrast, most light is generated deep in the bulk of the material and doesn’t penetrate beyond the surface. Such 2D materials are therefore very efficient and, what’s more, can be easily integrated onto chip-based devices such as waveguides and cavities.
Transition metal dichalcogenide materials also have promising electronic applications, particularly as the active material in transistors. Over the years, we’ve seen silicon-based transistors get smaller and smaller as we’ve followed Moore’s law, but we’re rapidly reaching a limit where we can’t shrink them any further, partly because the electrons in very thin layers of silicon move so slowly. In 2D transition metal dichalcogenides, in contrast, the electron mobility can actually be higher than in silicon of the same thickness, making them a promising material for future transistor applications.
What can such sources of single photons be used for?
Single photons are useful for quantum communication and quantum cryptography. Carrying information as zero and one, they basically function as a qubit, providing a very secure communication channel. Single photons are also interesting for quantum sensing and even quantum computing. But it’s vital that you have a highly pure source of photons. You don’t want them mixed up with “classical” photons, which (like those from the Sun) are emitted in bunches; otherwise the tasks you’re trying to perform cannot be completed.
What approaches are you taking to improve 2D materials as single-photon emitters?
What we do is introduce atomic defects into a 2D material to give it optical properties that are different to what you’d get in the bulk. There are several ways of doing this. One is to irradiate a sample with ions or electrons, which can knock individual atoms out to generate “vacancy defects”. Another option is to use plasmas, whereby atoms in the sample get replaced by atoms from the plasma.
So how do you study the samples?
We can probe defect emission using a technique called photoluminescence, which basically involves shining a laser beam onto the material. The laser excites electrons from the ground state to an excited state, prompting them to emit light. As the laser beam is about 500-1000 nm in diameter, we can see single photon emission from an individual defect if the defect density is suitable.
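To see why the spot size matters, here is a rough back-of-envelope calculation (an illustrative sketch, not from the interview): for an individual defect to dominate the emission, you want on average roughly one defect per laser spot, which fixes the order of magnitude of the usable defect density.

```python
import math

# Hypothetical estimate: defect density needed for ~1 defect per laser spot.
spot_diameter_um = 1.0  # the interview quotes a 500-1000 nm beam diameter
spot_area_um2 = math.pi * (spot_diameter_um / 2) ** 2  # ~0.785 um^2

# Roughly one defect per spot area on average
target_density_per_um2 = 1.0 / spot_area_um2
print(f"{target_density_per_um2:.2f} defects per square micron")  # → 1.27
```

A much higher density would put several defects in the spot at once, so their emission could no longer be resolved individually; a much lower one would make defects hard to find at all.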
Beyond the surface Shengxi Huang (second right) uses equipment from PicoQuant to probe 2D materials. (Courtesy: Jeff Fitlow)
What sort of experiments do you do in your lab?
We start by engineering our materials at the atomic level to introduce the correct type of defect. We also try to strain the material, which can increase how many single photons are emitted at a time. Once we’ve confirmed we’ve got the correct defects in the correct location, we check the material is emitting single photons by carrying out optical measurements, such as photoluminescence. Finally, we characterize the purity of our single photons – ideally, they shouldn’t be mixed up with classical photons but in reality, you never have a 100% pure source. As single photons are emitted one at a time, they have different statistical characteristics to classical light. We also check the brightness and lifetime of the source, the efficiency, how stable it is, and if the photons are polarized. In fact, we have a feedback loop: what improvements can we make at the atomic level to get the properties we’re after?
Is it difficult adding defects to a sample?
It’s pretty challenging. You want to add just one defect to an area that might be just one micron square so you have to control the atomic structure very finely. It’s made harder because 2D materials are atomically thin and very fragile. So if you don’t do the engineering correctly, you may accidentally introduce other types of defects that you don’t want, which will alter the defects’ emission.
What techniques do you use to confirm the defects are in the right place?
Because the defect concentration is so low, we cannot use methods that are typically used to characterise materials, such as X-ray photo-emission spectroscopy or scanning electron microscopy. Instead, the best and most practical way is to see if the defects generate the correct type of optical emission predicted by theory. But even that is challenging because our calculations, which we work on with computational groups, might not be completely accurate.
How do your PicoQuant instruments help in that regard?
We have two main pieces of equipment – a MicroTime 100 photoluminescence microscope and a FluoTime 300 spectrometer. These have been customized to form a Hanbury Brown–Twiss interferometer, which measures the purity of a single-photon source. We also use the microscope and spectrometer to characterise the photoluminescence spectrum and lifetime. Essentially, if the material emits light, we can then work out how long it takes before the emission dies down.
Did you buy the equipment off-the-shelf?
It’s more of a customised instrument with different components – lasers, microscopes, detectors and so on — connected together so we can do multiple types of measurement. I put in a request to PicoQuant, who discussed my requirements with me to work out how to meet my needs. The equipment has been very important for our studies as we can carry out high-throughput measurements over and over again. We’ve tailored it for our own research purposes basically.
So how good are your samples?
The best single-photon source that we currently work with is boron nitride, which has a single-photon purity of 98.5% at room temperature. In other words, for every 200 photons only three are classical. With transition-metal dichalcogenides, we get a purity of 98.3% at cryogenic temperatures.
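The quoted figures are easy to sanity-check. Treating purity as the fraction of emitted photons that are genuine single photons (an illustrative reading, not a formal definition from the interview):

```python
# Quick check of the numbers quoted above: at 98.5% single-photon purity,
# how many of 200 emitted photons are classical?
total_photons = 200
purity = 0.985
classical_photons = total_photons * (1 - purity)
print(classical_photons)  # → 3.0
```

which matches the “three in 200” figure given in the interview.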
What are your next steps?
There’s still lots to explore in terms of making better single-photon emitters and learning how to control them at different wavelengths. We also want to see if these materials can be used as high-quality quantum sensors. In some cases, if we have the right types of atomic defects, we get a high-quality source of single photons, which we can then entangle with their spin. The emitters can therefore monitor the local magnetic environment with better performance than is possible with classical sensing methods.
In the evolving landscape of space technology, a pivotal transformation is quietly taking shape: the development of spacecraft autonomy. While launch capabilities often dominate headlines, the real innovation frontier lies […]