Cancer centres streamline radiotherapy workflow with SunCHECK QA platform

2 July 2025 at 16:00

As the number of cancer cases continues to grow, radiation oncology departments are under increasing pressure to treat more and more patients. And as clinical facilities expand to manage this ongoing growth, and technology developments increase the complexity of radiotherapy delivery, there’s an urgent need to optimize the treatment workflow without ramping up time or staffing requirements.

To enable this level of optimization, radiation therapy departments will require an efficient quality management system that can handle both machine and patient quality assurance (QA), work seamlessly with treatment devices from multiple vendors, and provide the time savings required to ease staff workload.

Driven by growth

A case in point is the Moffitt Cancer Center in Florida, which in 2018 shifted all of its QA to SunCHECK, a quality management platform from Sun Nuclear that combines hardware and software to streamline treatment and delivery system QA into one centralized platform. Speaking at a recent Sun Nuclear webinar, clinical physicist Daniel Opp explained that the primary driver for this switch was growth.

Daniel Opp: “Having one system means that we’re able to do tests in the same way across all our linacs.” (Courtesy: D Opp)

“In 2018, our physicians were shifting to perform a lot more SBRT [stereotactic body radiation therapy]. Our leadership had plans in motion to add online adaptive planning as well as expand with opening more radiation oncology centres,” he explained.

At that time, the centre was using multiple software platforms and many different imaging phantoms to run its QA, with physicists still relying on manual measurements and qualitative visual assessments. Now, the team performs all machine QA using SunCHECK Machine and almost all patient-specific QA (PSQA) using SunCHECK Patient.

“Our QA software and data were fractured and all over the place,” said Opp. “The move to SunCHECK made sense as it gave us the ability to integrate all measurements, software and databases into a one-stop shop, providing significant time savings and far cleaner record keeping.”

SunCHECK also simplifies QA procedures by consolidating tests. Opp explained that back in 2018, photon tests on the centre’s linacs required five setups, 12 measurements and manually entering values 22 times; SunCHECK reduced this to one setup, four measurements and no manual entries. “This alone gives you an overview of the significant time savings,” he said.

Another benefit is the ability to automate tests and ensure standardization. “If you tell our large group of physicists to do a picket fence test, we’ll all do it a little differently,” Opp explained. “Having one system on which we’re all running the same tests means that we’re able to do the test in the same way across all our linacs.”

Opp noted that SunCHECK displays all required information on an easy-to-read screen, with the patient QA worklist on one side and the machine QA worklist on the other. “You see a snapshot of the clinic and can figure out if there’s anything you need to take care of. It’s very efficient in letting you know when something needs your attention,” he said.

A unified platform

Medical physicist Patricia Sansourekidou of the University of New Mexico (UNM) Comprehensive Cancer Center in Albuquerque also implemented SunCHECK to improve the efficiency of the site’s quality management programmes.

Sansourekidou initiated the switch to SunCHECK after joining UNM in 2020 as its new director of medical physics. At that time the cancer centre was treating about 1000 patients per year. But high patient numbers led to a long waiting list – with roughly three months between referral and the start of treatment – and a clear need for the facility to expand.

Patricia Sansourekidou: “We saw huge time savings for both monthly and daily QA.” (Courtesy: P Sansourekidou)

Assessing the centre’s QA procedures in 2020 revealed that the team was using a wide variety of QA software, making routine checks time consuming. Monthly linac QA, for example, required roughly 32 files and took about 14 hours to perform. In addition, Sansourekidou noted, physicists were spending hours every month adjusting the machines. “One day it was the energy that was off and then the output was off; I soon realised that, in the absence of appropriate software, we were making adjustments back and forth,” she said. “More importantly, we had no way to track these trends.”

Sansourekidou concluded that the centre needed an improved QA solution based on one unified platform. “So we went on a physics hunt,” she said. “We met with every vendor out there and Sun Nuclear won the request for proposal. So we implemented SunCHECK Machine and SunCHECK Patient.”

Switching to SunCHECK reduced monthly QA to just 4–5 hours per linac. “We’re saving about nine hours per linac per month; that’s 324 hours per year when we could be doing something else for our patients,” said Sansourekidou. Importantly, the new software enables the team to visualize trends and assess whether a genuine problem is present.
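As a quick check of those figures (assuming the roughly nine-hour monthly saving applies across three linacs – a number implied by the arithmetic rather than stated explicitly here):

$9\ \text{h/linac/month} \times 12\ \text{months} \times 3\ \text{linacs} = 324\ \text{h/year}$

which matches the quoted annual saving.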

For daily QA, which previously required numerous spreadsheets and systems, SunCHECK’s daily QA template provides time savings of about 60%. “At six in the morning, that’s important,” Sansourekidou pointed out. Annual QA saw roughly 33% time savings, while for the 70% of patients requiring PSQA, time savings were about 25%.

Another “unexpected side effect” of deploying SunCHECK, said Sansourekidou, is that the IT department was happy to maintain one platform. “Every time we have a new physicist, it’s much easier for our IT department to set them up. That has been a huge benefit for us,” she said. “Additionally, our service engineers are happy because we are not spending hours of their time adjusting the machine back and forth.”

“Overall, I thought there were great improvements that really helped us justify the initial investment – not just monetary, but also time investment from our physics team,” she said.

Efficiency savings: QA times before and after implementing SunCHECK at the UNM Comprehensive Cancer Center. (Courtesy: Patricia Sansourekidou)

Phantom-free QA

For Opp, one of the biggest features enabled by SunCHECK was the move to phantom-free PSQA, which saves a lot of time and eliminates errors that can be inherent to phantom-based QA. In the last year, the Moffitt team also switched to using DoseCHECK – SunCHECK’s secondary 3D dose calculation algorithm – as the foundation of its quality checks. Alongside this, a RayStation script checks plan deliverability to ensure that no problems arise once the patient is on the table.

“We don’t do our pre-treatment QA anymore. We rely on those two to get confidence into the final work and then we run our logs off the first patient fraction,” Opp explained. “We have a large physics group and there was natural apprehension, but everybody got on board and agreed that this was a shift we needed to make. We leveraged DoseCHECK to create a better QA system for ourselves.”

Since 2018, both patient workload and staff numbers at the Moffitt Cancer Center have doubled. By the end of 2025, it will also have almost doubled its number of treatment units. The centre has over 100 SunCHECK users – including therapists, dosimetrists and physicists – and Opp emphasized that the system is robust enough to handle all these users doing different tasks at different times without any issues.

As patient numbers increase, the time savings conferred by SunCHECK help reduce staff workload and improve quality of life for users. The centre currently performs about 100 PSQA procedures per week, which would have taken about 37 hours using previous QA processes – a workload that Opp notes could not have been managed well. SunCHECK has reduced the weekly average to around seven hours.
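Expressed per plan (a rough average, assuming the weekly totals are spread evenly over the roughly 100 procedures):

$37\ \text{h} / 100\ \text{plans} \approx 22\ \text{min/plan} \quad\longrightarrow\quad 7\ \text{h} / 100\ \text{plans} \approx 4\ \text{min/plan}$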

Similarly, linac QA previously required two or three late nights per month (or one full day on the weekend). “After the switch to SunCHECK, everybody’s pretty much able to get it done in one late night per month,” said Opp. He added that the Moffitt Cancer Center’s continuing growth has required the onboarding of many new physicists – and that it’s significantly easier to train these new staff with all of the QA software in one centralized platform.

Enabling accreditation

Finally, accreditation is essential for radiation oncology departments to demonstrate the ability to deliver safe, high-quality care. The UNM Comprehensive Cancer Center’s previous American College of Radiology (ACR) accreditation had expired before Sansourekidou’s arrival, and she was keen to rectify this situation. In March 2024 the centre achieved ASTRO’s APEx accreditation.

“SunCHECK helped with that,” she said. “It wasn’t the only reason, there were other things that we had to improve, but we did come across as having a strong physics programme.”

Achieving accreditation also helps justify the purchase of a totally new QA platform, Sansourekidou explained. “The most important thing to explain to your administration is that if we don’t do things the way that our regulatory bodies advise, then not only will we lose our accreditation, but we will fall behind,” she said.

Sansourekidou emphasized that the efficiency gains conferred by SunCHECK were invaluable for the physics team, particularly for out-of-hours working. “We saw huge time savings for both monthly and daily QA,” she said. “It is a large investment, but improving efficiency through investment in software will really help the department in the long term.”


New open-access journal AI for Science aims to revolutionize scientific discovery

2 July 2025 at 10:25
Intelligent read: the new diamond open-access journal AI for Science will meet the need for high-quality journals dedicated to artificial intelligence. (Courtesy: IOP Publishing)

Are you in the field of AI for science? Now you have a new place to share your latest work with the world. IOP Publishing has partnered with the Songshan Lake Materials Laboratory in China to launch a new “diamond” open-access journal to showcase how artificial intelligence (AI) is driving scientific innovation. AI for Science (AI4S) will publish high-impact original research, reviews and perspectives that highlight the transformative applications and impact of AI.

The launch of the interdisciplinary journal AI4S comes as AI technologies become increasingly integral to scientific research, from drug discovery to quantum computing and materials science.

AI is one of the most dynamic and rapidly expanding areas of research – so much so that in the last five years the topic has expanded at nearly ten times the rate of general scientific output.

Gian-Marco Rignanese from École Polytechnique de Louvain (EPL) in Belgium, who is the editor-in-chief of AI4S, says he is “very excited” by AI’s transformative potential for science. “It is really disrupting the way research is being performed. AI excels at processing and analyzing large volumes of data quickly and accurately,” he says. “This capability enables researchers to gain insights – or identify patterns – that were previously difficult or impossible to obtain.”

Rignanese adds that AI is also accelerating simulations, making them “closer to the real world”, while large language models and natural language processing are changing the way researchers engage with the existing literature. “Generative AI holds a lot of promises,” he says.

Rignanese, whose research focuses on investigating and designing advanced materials for electronics, energy storage and energy production using first-principles simulations and machine learning, says that AI4S “not only targets high standards in terms of quality of the published research” but also recognizes the importance of sharing data and software.

The journal recognizes the rapid and multifaceted growth of AI. Notably, in 2024 both the chemistry and physics Nobel prizes went to the science of AI. Research funding is also increasing, with both the US Department of Energy (DOE) and the National Science Foundation (NSF) allocating more resources to this field in 2025 than ever before.

In China, AI is emerging as a major priority, with the country’s science community poised to become a driving force in the field’s global development. Reflecting this, AI4S is co-led by editor-in-chief Weihua Wang of the Songshan Lake Materials Laboratory, a new and leading institute for advanced materials research and innovation that is preparing to focus intensively on AI in the near future.

“Our primary goal with AI for Science is to provide a global forum where scientists can share their cutting-edge research, innovative methodologies and transformative perspectives,” says Wang. “The field of AI in scientific research is not only expanding but also evolving at an unprecedented pace, making it vital for professionals to connect and collaborate.”

Wang expresses an optimistic vision for the future of AI in scientific research. “We want AI for Science to be instrumental in creating a more connected and collaborative global community of researchers,” he adds. “Together, we can harness the transformative power of AI to address some of the world’s most pressing scientific challenges and make the field even more impactful.”

Wang notes that the inspiration behind the journal is the potential impact of AI on scientific discovery. “We believe that AI has the power to revolutionize the way research is conducted,” he says. “By providing a space for open dialogue and collaboration, we hope to enable scientists to leverage AI technologies more effectively, ultimately accelerating innovation and improving outcomes across various fields.”

The scope of AI4S is broad yet focused, catering to a wide array of interests within the scientific community. Wang explains that the journal covers various topics. These include: AI algorithms adapted for scientific applications; AI software and toolkits designed specifically for researchers; the importance of AI-ready datasets; and the development of embodied AI systems. These topics aim to bridge the gap between AI technology and its applications across disciplines like materials science, biology, and chemistry.

AI4S is also setting new standards for author experience. Submissions are reviewed by an international editorial board with the support of a 22-member advisory board composed of leading scientists and engineers. The journal also promises a rapid turnaround: once accepted, articles are published within 24 hours and assigned a citable digital object identifier (DOI). In addition, from 2025 to 2027, all article publication charges are fully waived, covered by the Songshan Lake Materials Laboratory.

AI4S joins a growing number of journals focused on machine learning and AI, including IOP Publishing’s Machine Learning series: Machine Learning: Science and Technology; Machine Learning: Engineering; Machine Learning: Earth; and Machine Learning: Health.

“AI is a new approach to science which is really exciting and holds a lot of promises,” adds Rignanese, “so I am convinced that there is room for a journal accompanying this new paradigm.”

For more information or to submit your manuscript, click here.


Nanostructured plastics deliver energy innovation

27 June 2025 at 14:30
Power engineering: Multilayered films developed by Peak Nano can improve the performance and resilience of high-voltage capacitors that manage the flow of electricity around power grids. (Courtesy: shutterstock/jakit17)

Grid operators around the world are under intense pressure to expand and modernize their power networks. The International Energy Agency predicts that demand for electricity will rise by 30% in this decade alone, fuelled by global economic growth and the ongoing drive towards net zero. At the same time, electrical transmission systems must be adapted to handle the intermittent nature of renewable energy sources, as well as the extreme and unpredictable weather conditions that are being triggered by climate change.

High-voltage capacitors play a crucial role in these power networks, balancing the electrical load and managing the flow of current around the grid. For more than 40 years, the standard dielectric for storing energy in these capacitors has been a thin film of a polymer material called biaxially oriented polypropylene (BOPP). However, as network operators upgrade their analogue-based infrastructure with digital technologies such as solid-state transformers and high-frequency switches, BOPP struggles to provide the thermal resilience and reliability that are needed to ensure the stability, scalability and security of the grid.

“We’re trying to bring innovation to an area that hasn’t seen it for a very long time,” says Dr Mike Ponting, Chief Scientific Officer of Peak Nano, a US firm specializing in advanced polymer materials. “Grid operators have been using polypropylene materials for a generation, with no improvement in capability or performance. It’s time to realize we can do better.”

Peak Nano has created a new capacitor film technology that addresses the needs of the digital power grid, as well as other demanding energy storage applications such as managing the power supply to data centres, charging solutions for electric cars, and next-generation fusion energy technology. The company’s Peak NanoPlex™ materials are fabricated from multiple thin layers of different polymer materials, and can be engineered to deliver enhanced performance for both electrical and optical applications. The capacitor films typically contain polymer layers anywhere between 32 and 156 nm thick, while the optical materials are fabricated with as many as 4000 layers in films thinner than 300 µm.
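A simple division puts those optical films in context (an upper bound on the average layer thickness, assuming all 4000 layers fit within a 300 µm film):

$300\ \mu\text{m} / 4000\ \text{layers} = 75\ \text{nm per layer}$

comparable to the 32–156 nm layers quoted for the capacitor films.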

“When they are combined together in an ordered, layered structure, the long polymer molecules behave and interact with each other in different ways,” explains Ponting. “By putting the right materials together, and controlling the precise arrangement of the molecules within the layers, we can engineer the film properties to achieve the performance characteristics needed for each application.”

In the case of capacitor films, this process enhances BOPP’s properties by interleaving it with another polymer. Such layered films can be optimized to store four times the energy of conventional BOPP while achieving extremely fast charge/discharge rates. Alternatively, they can be engineered to deliver longer lifetimes at operating temperatures some 50–60°C higher than existing materials. Such improved thermal resilience is useful for applications that experience more heat, such as mining and aerospace, and is also becoming an important priority for grid operators as they introduce new transmission technologies that generate more heat.
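For context, the energy stored per unit volume in a linear dielectric follows the textbook relation (a generic expression, not a Peak Nano formula):

$u = \tfrac{1}{2}\,\varepsilon_0 \varepsilon_r E^2$

so gains in the relative permittivity εr and, especially, in the usable field strength E (set by the breakdown limit) compound quickly into the quoted four-fold increase in stored energy.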

On a roll: NanoPlex films are made from ultrathin layers of polymer materials. (Courtesy: Peak Nano)

“We talked to the users of the components to find out what they needed, and then adjusted our formulations to meet those needs,” says Ponting. “Some people wanted smaller capacitors that store a lot of energy and can be cycled really fast, while others wanted an upgraded version of BOPP that is more reliable at higher temperatures.”

The multilayered materials now being produced by Peak Nano emerged from research Ponting was involved in as a graduate student at Case Western Reserve University in the 2000s. Plastics containing just a few layers had originally been developed for everyday applications like gift wrap and food packaging, but scientists were starting to explore the novel optical and electronic properties that emerge when the thickness of the polymer layers is reduced to the nanoscale regime.

Small samples of these polymer nanocomposites produced in the lab demonstrated their superior performance, and Peak Nano was formed in 2016 to commercialize the technology and scale up the fabrication process. “There was a lot of iteration and improvement to produce large quantities of the material while still maintaining the precision and repeatability of the nanostructured layers,” says Ponting, who has been developing these multilayered polymer materials and the required processing technology for more than 20 years. “The film properties we want to achieve require the polymer molecules to be well ordered, and it took us a long time to get it right.”

As part of this development process, Peak Nano worked with capacitor manufacturers to create a plug-and-play replacement technology for BOPP that can be used on the same manufacturing systems and capacitor designs as BOPP today. By integrating its specialist layering technology into these existing systems, Peak Nano has been able to leverage established supply chains for materials and equipment rather than needing to develop a bespoke manufacturing process. “That has helped to keep costs down, which means that our layered material is only slightly more expensive than BOPP,” says Ponting.

Ponting also points out that, in the long term, NanoPlex is a more cost-effective option. With improved reliability and resilience, NanoPlex can double or even quadruple the lifetime of a component. “The capacitors don’t need to be replaced as often, which reduces the need for downtime and offsets the slightly higher cost,” he says.

For component manufacturers, meanwhile, the multilayered films can be used in exactly the same way as conventional materials. “Our material can be wound into capacitors using the same process as for polypropylene,” says Ponting. “Our customers don’t need to change their process; they just need to design for higher performance.”

Initial interest in the improved capabilities of NanoPlex came from the defence sector, with Peak Nano benefiting from investment and collaborative research with the US Defense Advanced Research Projects Agency (DARPA) and the Naval Research Laboratory. Optical films produced by the company have been used to fabricate lenses with a graduated refractive index, reducing the size and weight of head-mounted visual equipment while also sharpening the view. Dielectric films with a high breakdown voltage are also a common requirement within the defence community.


New optical cryostat combines high cooling capacity, low vibrations and large sample area

25 June 2025 at 17:00

The development of advanced quantum materials and devices often involves making measurements at very low temperatures. This is crucial when developing single-photon emitters and detectors for quantum technologies. And even if a device or material will not be used at cryogenic temperatures, researchers will sometimes make measurements at low temperatures in order to reduce thermal noise.

This R&D will often involve optical techniques such as spectroscopy and imaging, which use lasers and free-space optics. These optical systems must remain in alignment to ensure the quality and repeatability of the measurements. Furthermore, the vibration of optical components must be kept to an absolute minimum because motion will degrade the performance of instrumentation.

Minimizing vibration is usually achieved by doing experiments on optical tables, which are very large, heavy and rigid in order to dampen motion. Therefore, when a cryogenic cooler (cryocooler) is deployed on an optical table it is crucial that it does not introduce unwanted vibrations.

Closed-cycle cryocoolers offer an efficient way to cool samples to temperatures as low as ~2 K to 4 K (−272 °C to −269 °C). Much like a domestic refrigerator or air conditioner, these cryocoolers involve the cyclic compression and expansion of a gas – which is helium in cryogenic systems.

Montana Instruments founder Luke Mauritsen, a mechanical engineer and entrepreneur, recognized that the future development of quantum materials and devices would rely on optical cryostats that allow researchers to make optical measurements at very low temperatures and with very low levels of vibration. To make that possible, he founded Montana Instruments, which launched its first low-vibration cryostats in 2010. Based in Bozeman, Montana, the company was acquired by Sweden’s Atlas Copco in 2022 and it continues to develop cryogenic technologies for cutting-edge quantum science and other demanding applications.

Until recently, all of Montana’s low-vibration optical cryostats used Gifford–McMahon (GM) cryocoolers. While these systems provide low temperatures and low vibrations, they are limited in terms of the cooling power that they can deliver. This is because operating GM cryocoolers at higher powers results in greater vibrations.

To create a low-vibration cryostat with more cooling power, Montana has developed the Cryostation 200 PT, which is the first Montana system to use a pulse-tube cryocooler. Pulse tubes offer similar cooling powers to GM cryocoolers but at much lower vibration levels. As a result, the Cryostation 200 PT delivers much higher cooling power, while maintaining very low vibrations on par with Montana’s other cryostats.

Montana’s R&D manager Josh Doherty explains, “One major reason that a pulse tube has lower vibrations is that its valve motor can be ‘remote’, located a short distance from the coldhead of the cryostat. This allows us to position the valve motor, which generates vibrations, on a cart next to the optical table so its energy can be shunted to the ground, away from the experimental space on the optical table.”

However, isolating the coldhead from the valve motor is not enough to achieve the new cryostat’s very low levels of vibration. During operation, helium gas moves back and forth in the pulse tube and this causes tiny vibrations that are very difficult to mitigate. Using its extensive experience, Montana has minimized the vibrations at the sample/device mount and has also reduced the vibrational energy transferred from the pulse tube to the optical table. Doherty explains that this was done using the company’s patented technologies that minimize the transfer of vibrational energy, while at the same time maximizing thermal conductance between the pulse tube’s first stage and second stage flanges and the sample/device mounting surface(s). This includes the use of flexible, high-thermal-conductivity links and flexible vacuum bellows connections between the coldhead and the sample/device.

200 mm breadboard: The Cryostation 200 PT offers a large working area that can be accessed via multiple feedthrough options that support free-space optics, RF and DC electrical connections, optical fibres and a vacuum connection. (Courtesy: Montana Instruments)

Doherty adds, “we intentionally design the supporting structure to de-tune it from the pulse tube vibration source”. This was done by first measuring the pulse-tube vibrations in the lab to determine the vibrational frequencies at which energy is transferred to the optical table. Doherty and colleagues then used the ANSYS engineering/multiphysics software to simulate designs of the pulse tube support and the sample mount supporting structures.

“We optimized the supporting structure design, through material choices, assembly methods and geometry to mismatch the simulated natural frequencies of the support structure from the dominant vibrations of the source,” he explains.
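The underlying idea can be sketched with the textbook expression for the natural frequency of a mass–spring-like support (an illustration of the de-tuning principle, not the ANSYS model itself):

$f_n = \frac{1}{2\pi}\sqrt{k/m}$

By choosing materials, geometry and assembly methods that shift each structural resonance fn (via the stiffness k and mass m) away from the dominant pulse-tube vibration frequencies and their harmonics, the support never sits on a resonance of the source.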

As a result, the Cryostation 200 PT delivers more than 250 mW of cooling power at 4.2 K, with a peak-to-peak vibrational amplitude of less than 30 nm. This is more than three times the cooling power delivered by Montana’s Cryostation s200, which offers a similarly sized sample/device area and comparable vibrational performance.

The control unit has a touchscreen user interface, which displays the cryostat temperature, temperature stability and vacuum pressure.

The cryostat has multiple feedthrough options that support free-space optics, RF and DC electrical connections, optical fibres and a vacuum connection. The Cryostation 200 PT supports Montana’s Cryo-Optic microscope objective and nanopositioner, which can be integrated within the cryostat. Also available is a low working distance window, which supports the use of an external microscope.

According to Montana Instruments senior product manager Patrick Gale, the higher cooling power of the Cryostation 200 PT means that it can support larger experimental payloads – meaning that a much wider range of experiments can be done within the cryostat. For example, more electrical connections can be made with the outside world than had been possible before.

“Every wire that you bring into the cryostat increases that heat load a little bit,” explains Gale, adding, “By using a 1 W pulse tube, we can cool the system down faster than any of our other systems”. While Montana’s other systems have typical cooling times of about 10 h, this has been reduced to about 6 h in the Cryostation 200 PT. “This is particularly important for commercial users who are testing multiple samples in a week,” says Gale. “Saving that four hours per measurement allows a user to do two tests per day, versus just one per day.”

According to Gale, applications of the Cryostation 200 PT include developing ion traps for use in quantum computing, quantum sensing and atomic clocks. Other applications related to quantum technologies include the development of photonic devices; spin-based devices, including those based on nitrogen–vacancy centres in diamond; quantum dots; and superconducting circuits.


Laser World of Photonics showcases cutting-edge optical innovation

19 June 2025 at 10:45

Laser World of Photonics, the leading trade show for the laser and photonics industry, takes place in Munich from 24 to 27 June. Attracting visitors and exhibitors from around the world, the event features 11 exhibition areas covering the entire spectrum of photonic technologies – including illumination and energy, biophotonics, data transmission, integrated photonics, laser systems, optoelectronics, sensors and much more.

Running parallel and co-located with Laser World of Photonics is World of Quantum, the world’s largest trade fair for quantum technologies. Showcasing all aspects of quantum technologies – from quantum sensors and quantum computers to quantum communications and cryptography – the event provides a platform to present innovative quantum-based products and discuss potential applications.

Finally, the World of Photonics Congress (running from 22 to 27 June) features seven specialist conferences, over 3000 lectures and around 6700 experts from scientific and industrial research.

The event is expected to attract around 40,000 visitors from 70 countries, with the trade shows incorporating 1300 exhibitors from 40 countries. Here are some of the companies and product innovations to look out for on the show floor.

HOLOEYE unveils compact 4K resolution spatial light modulator

HOLOEYE Photonics AG, a leading provider of spatial light modulator (SLM) devices, announces the release of the GAEA-C spatial light modulator, a compact version of the company’s high-resolution SLM series. The GAEA-C will be officially launched at Laser World of Photonics, showcasing its advanced capabilities and cost-effective design.

Compact and cost-effective: The GAEA-C spatial light modulator is ideal for a variety of applications requiring precise light modulation. (Courtesy: HOLOEYE)

The GAEA-C is a phase-only SLM with a 4K resolution of 4094 x 2400 pixels, with an exceptionally small pixel pitch of 3.74 µm. This compact model is equipped with a newly developed driver solution that not only reduces costs but also enhances phase stability, making it ideal for a variety of applications requiring precise light modulation.

The GAEA-C SLM features a reflective liquid crystal on silicon (LCOS) display (phase only). Other parameters include a fill factor of 90%, an input frame rate of 30 Hz and a maximum spatial resolution of 133.5 lp/mm.
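The quoted maximum spatial resolution is consistent with the Nyquist limit set by the pixel pitch p (a consistency check rather than a vendor specification):

$f_{\max} \approx \frac{1}{2p} = \frac{1}{2 \times 3.74\ \mu\text{m}} \approx 134\ \text{lp/mm}$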

The GAEA-C is available in three versions, each optimized for a different wavelength range: a VIS version (420–650 nm), a NIR version (650–1100 nm) and a version tailored for the telecommunications waveband around 1550 nm. This versatility ensures that the GAEA-C can meet the diverse needs of industries ranging from telecoms to scientific research.

HOLOEYE continues to lead the market with its innovative SLM solutions, providing unparalleled resolution and performance. The introduction of the GAEA-C underscores HOLOEYE’s commitment to delivering cutting-edge technology that meets the evolving demands of its customers.

  • For more information about the GAEA-C and other SLM products, visit HOLOEYE at booth #225 in Hall A2.

Avantes launches NIR Enhanced spectrometers

At this year’s Laser World of Photonics, Avantes unveils its newest generation of spectrometers: the NEXOS NIR Enhanced and VARIUS NIR Enhanced. Both instruments mark a significant leap in near-infrared (NIR) spectroscopy, offering up to 2x improved sensitivity and unprecedented data quality for integration into both research and industry applications.

Solving spectroscopy challenges: Visit Avantes at booth 218, Hall A3, for hands-on demonstrations of its newest generation of spectrometers. (Courtesy: Avantes)

Compact, robust and highly modular, the NEXOS NIR Enhanced spectrometer redefines performance in a small form factor. It features enhanced NIR quantum efficiency in the 700–1100 nm range, with up to 2x increased sensitivity, fast data transfer and improved signal-to-noise ratio. The USB-powered spectrometer is designed with a minimal footprint of just 105 x 80 x 20 mm and built using AvaMation production for top-tier reproducibility and scalability. It also offers seamless integration with third-party software platforms.

The NEXOS NIR Enhanced is ideal for food sorting, Raman applications and VCSEL/laser system integration, providing research-grade performance in a compact housing. See the NEXOS NIR Enhanced product page for further information.

Designed for flexibility and demanding industrial environments, the VARIUS NIR Enhanced spectrometer introduces a patented optical bench for supreme accuracy, with replaceable slits for versatile configurations. The spectrometer offers a dual interface – USB 3.0 and Gigabit Ethernet – plus superior stray light suppression, high dynamic range and enhanced NIR sensitivity in the 700–1100 nm region.

With its rugged form factor (183 x 130 x 45.2 mm) and semi-automated production process, the VARIUS NIR is optimized for real-time applications, ensuring fast data throughput and exceptional reliability across industries. For further information, see the VARIUS NIR Enhanced product page.

Avantes invites visitors to experience both systems live at Laser World of Photonics 2025. Meet the team for hands-on demonstrations, product insights and expert consultations. Avantes offers free feasibility studies and tailored advice to help you identify the optimal solution for your spectroscopy challenges.

  • For more information, visit www.avantes.com or meet Avantes at booth #218 in Hall A3.

HydraHarp 500: a new era in time-correlated single-photon counting

Laser World of Photonics sees PicoQuant introduce its newest generation of event timer and time-correlated single-photon counting (TCSPC) unit – the HydraHarp 500. Setting a new standard in speed, precision and flexibility, the TCSPC unit is freely scalable with up to 16 independent channels and a common sync channel, which can also serve as an additional detection channel if no sync is required.

Redefining what’s possible: PicoQuant presents HydraHarp 500, a next-generation TCSPC unit that maximizes precision, flexibility and efficiency. (Courtesy: PicoQuant)

At the core of the HydraHarp 500 is its outstanding timing precision and accuracy, enabling precise photon timing measurements at exceptionally high data rates, even in demanding applications.

In addition to the scalable channel configuration, the HydraHarp 500 offers flexible trigger options to support a wide range of detectors, from single-photon avalanche diodes to superconducting nanowire single-photon detectors. Seamless integration is ensured through versatile interfaces such as USB 3.0 or an external FPGA interface for data transfer, while White Rabbit synchronization allows precise cross-device coordination for distributed setups.

The HydraHarp 500 is engineered for high-throughput applications, making it ideal for rapid, large-volume data acquisition. It offers 16+1 fully independent channels for true simultaneous multi-channel data recording and efficient data transfer via USB or the dedicated FPGA interface. Additionally, the HydraHarp 500 boasts industry-leading, extremely low dead-time per channel and no dead-time across channels, ensuring comprehensive datasets for precise statistical analysis.

The HydraHarp 500 is fully compatible with UniHarp, a sleek, powerful and intuitive graphical user interface. UniHarp revolutionizes the interaction with PicoQuant’s TCSPC and time tagging electronics, offering seamless access to advanced measurement modes like time trace, histogram, unfold, raw and correlation (including FCS and g²).

Step into the future of photonics and quantum research with the HydraHarp 500. Whether it’s achieving precise photon correlation measurements, ensuring reproducible results or integrating advanced setups, the HydraHarp 500 redefines what’s possible – offering precision, flexibility and efficiency combined with reliability and seamless integration to achieve breakthrough results.

For more information, visit www.picoquant.com or contact us at info@picoquant.com.

  • Meet PicoQuant at booth #216 in Hall B2.

SmarAct showcases integrated, high-precision technologies

With a strong focus on turnkey, application-specific solutions, SmarAct offers nanometre-precise motion systems, measurement equipment and scalable micro-assembly platforms for photonics, quantum technologies, semiconductor manufacturing and materials research – whether in research laboratories or high-throughput production environments.

State-of-the-art solutions: The SmarAct Group returns to Laser World of Photonics in 2025 with a comprehensive showcase of integrated, high-precision technologies. (Courtesy: SmarAct)

At Laser World of Photonics, SmarAct presents a new modular multi-axis positioning system for quantum computing applications and photonic integrated circuit (PIC) testing. The compact system is made entirely from titanium and features a central XY stage with integrated rotation, flanked by two XYZ modules – one equipped with a tip-tilt goniometer.

For cryogenic applications, the system can be equipped with cold plates and copper braids to provide a highly stable temperature environment, even at millikelvin levels. Thanks to its modularity, the platform can be reconfigured for tasks such as low-temperature scanning or NV centre characterization. When combined with SmarAct’s interferometric sensors, the system delivers unmatched accuracy and long-term stability under extreme conditions.

Also debuting is the SGF series of flexure-based goniometers – compact, zero-backlash rotation stages developed in collaboration with the University of Twente. Constructed entirely from non-ferromagnetic materials, the goniometers are ideal for quantum optics, electron and ion beam systems. Their precision has been validated in a research paper presented at EUSPEN 2023.

Targeting the evolving semiconductor and photonics markets, SmarAct’s optical assembly platforms enable nanometre-accurate alignment and integration of optical components. At their core is a modular high-performance toolkit for application-specific configurations, with the new SmarAct robot control software serving as the digital backbone. Key components include SMARPOD parallel kinematic platforms, long-travel SMARSHIFT electromagnetic linear stages and ultraprecise microgrippers – all seamlessly integrated to perform complex optical alignment tasks with maximum efficiency.

Highlights at Laser World of Photonics include a gantry-based assembly system developed for the active alignment of beam splitters and ferrules, and a compact, fully automated fibre array assembly system designed for multicore and polarization-maintaining fibres. Also on display are modular probing systems for fast, accurate and reliable alignment of fibres and optical elements – providing the positioning precision required for chip- and wafer-level testing of PICs prior to packaging. Finally, the microassembly platform P50 from SmarAct Automation offers a turnkey solution for automating critical micro-assembly tasks such as handling, alignment and joining of tiny components.

Whether you’re working on photonic chip packaging, quantum instrumentation, miniaturized medical systems or advanced semiconductor metrology, SmarAct invites researchers, engineers and decision-makers to experience next-generation positioning, automation and metrology solutions live in Munich.

  • Visit SmarAct at booth #107 in Hall B2.

Development and application of a 3-electrode setup for the operando detection of side reactions in Li-Ion batteries

9 June 2025 at 11:13

Join us to learn about the development and application of a 3-Electrode setup for the operando detection of side reactions in Li-Ion batteries.

Detecting parasitic side reactions originating both from the cathode active materials (CAMs) and the electrolyte is paramount for developing more stable cell chemistries for Li-ion batteries. This talk will present a method for the qualitative analysis of electrolyte oxidation, as well as the quantification of lattice oxygen and transition metal ions (TM ions) released from the CAM. It is based on a 3-electrode cell design employing a Vulcan carbon-based sense electrode (SE) that is held at a controlled voltage against a partially delithiated lithium iron phosphate (LFP) counter electrode (CE). At this SE, reductive currents can be measured while polarizing a CAM or carbon working electrode (WE) against the same LFP CE. In voltammetric scans, we show how the SE potential can be selected to specifically detect a given side reaction during CAM charge/discharge, making it possible, for example, to discriminate between lattice oxygen, protons and dissolved TMs. Furthermore, it is shown via on-line electrochemical mass spectrometry (OEMS) that O2 reduction in the LP47 electrolyte used here consumes ~2.3 electrons per O2. Using this value, the lattice oxygen release deduced from the 3-electrode setup upon charging of the NCA WE is in good agreement with OEMS measurements up to NCA potentials >4.65 V vs Li/Li+. At higher potentials, the contributions from the reduction of TM ions can be quantified by comparing the integrated SE current with the O2 evolution from OEMS.
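As a rough sketch of how that ~2.3 electrons-per-O2 value is used (generic Faraday-law bookkeeping, with symbols chosen here for illustration rather than taken from the talk), the amount of O2 reduced at the sense electrode follows from the integrated reductive charge Q_SE:

$n_{\text{O}_2} = \frac{Q_{\text{SE}}}{zF}, \qquad z \approx 2.3, \quad F = 96\,485\ \text{C mol}^{-1}$

so any excess SE charge beyond what the OEMS-measured O2 can account for can be attributed to the reduction of dissolved transition-metal ions.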

Lennart Reuter

Lennart Reuter is a PhD student in the group of Prof Hubert A Gasteiger at the Chair of Technical Electrochemistry at TUM. His research focused on the interfacial processes in lithium-ion batteries that govern calendar life, cycle stability, and rate capability. He advanced the on-line electrochemical mass spectrometry (OEMS) technique to investigate gas evolution mechanisms from interfacial side reactions at the cathode and anode. His work also explored how SEI formation and graphite structural changes affect Li⁺ transport, using impedance spectroscopy and complementary analysis techniques.


Leonhard J Reinschluessel

Leonhard J Reinschluessel is currently a PhD candidate at the Chair of Technical Electrochemistry in the Gasteiger research group at the Technical University of Munich (TUM). His current work encompasses an in-depth understanding of the complex interplay of cathode and electrolyte degradation mechanisms in lithium-ion batteries using operando lab-based and synchrotron techniques. He received his MSc in chemistry from TUM, where he investigated the mitigation of aging of FeNC-based cathode catalyst layers in PEMFCs in his thesis with the Gasteiger group.


Richard Bond and George Efstathiou: meet the astrophysicists who are shaping our understanding of the early universe

5 June 2025 at 17:00

This episode of the Physics World Weekly podcast features George Efstathiou and Richard Bond, who share the 2025 Shaw Prize in Astronomy, “for their pioneering research in cosmology, in particular for their studies of fluctuations in the cosmic microwave background (CMB). Their predictions have been verified by an armada of ground-, balloon- and space-based instruments, leading to precise determinations of the age, geometry, and mass-energy content of the universe.”

Bond and Efstathiou talk about how the CMB emerged when the universe was just 380,000 years old and explain how the CMB is observed today. They explain why studying fluctuations in today’s CMB provides a window into the nature of the universe as it existed long ago, and how future studies could help physicists understand the nature of dark matter – which is one of the greatest mysteries in physics.

Efstathiou is emeritus professor of astrophysics at the University of Cambridge in the UK – and Richard Bond is a professor at the Canadian Institute for Theoretical Astrophysics (CITA) and university professor at the University of Toronto in Canada. Bond and Efstathiou share the 2025 Shaw Prize in Astronomy and its $1.2m prize money equally.

This podcast is sponsored by The Shaw Prize Foundation.


Superconducting innovation: SQMS shapes up for scalable success in quantum computing

5 June 2025 at 16:00

Developing quantum computing systems with high operational fidelity, enhanced processing capabilities plus inherent (and rapid) scalability is high on the list of fundamental problems preoccupying researchers within the quantum science community. One promising R&D pathway in this regard is being pursued by the Superconducting Quantum Materials and Systems (SQMS) National Quantum Information Science Research Center at the US Department of Energy’s Fermi National Accelerator Laboratory, the pre-eminent US particle physics facility on the outskirts of Chicago, Illinois.

The SQMS approach involves placing a superconducting qubit chip (held at temperatures as low as 10–20 mK) inside a three-dimensional superconducting radiofrequency (3D SRF) cavity – a workhorse technology for particle accelerators employed in high-energy physics (HEP), nuclear physics and materials science. In this set-up, it becomes possible to preserve and manipulate quantum states by encoding them in microwave photons (modes) stored within the SRF cavity (which is also cooled to the millikelvin regime).

Put another way: by pairing superconducting circuits and SRF cavities at cryogenic temperatures, SQMS researchers create environments where microwave photons can have long lifetimes and be protected from external perturbations – conditions that, in turn, make it possible to generate quantum states, manipulate them and read them out. The endgame is clear: reproducible and scalable realization of such highly coherent superconducting qubits opens the way to more complex and scalable quantum computing operations – capabilities that, over time, will be used within Fermilab’s core research programme in particle physics and fundamental physics more generally.

Fermilab is in a unique position to turn this quantum technology vision into reality, given its decadal expertise in developing high-coherence SRF cavities. In 2020, for example, Fermilab researchers demonstrated record coherence lifetimes (of up to two seconds) for quantum states stored in an SRF cavity.

“It’s no accident that Fermilab is a pioneer of SRF cavity technology for accelerator science,” explains Sir Peter Knight, senior research investigator in physics at Imperial College London and an SQMS advisory board member. “The laboratory is home to a world-leading team of RF engineers whose niobium superconducting cavities routinely achieve very high quality factors (Q) from 10¹⁰ to above 10¹¹ – figures of merit that can lead to dramatic increases in coherence time.”
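The connection between quality factor and coherence is direct: the photon lifetime in a cavity is τ = Q/ω. Taking a TESLA-type SRF cavity frequency of roughly 1.3 GHz as a representative value (an assumption for illustration, not a figure quoted by Knight):

$\tau = \frac{Q}{2\pi f} \approx \frac{10^{10}}{2\pi \times 1.3 \times 10^{9}\ \text{Hz}} \approx 1.2\ \text{s}$

of the same order as the two-second coherence lifetimes mentioned above, and rising further as Q climbs towards and beyond 10¹¹.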

Moreover, Fermilab offers plenty of intriguing HEP use-cases where quantum computing platforms could yield significant research dividends. In theoretical studies, for example, the main opportunities relate to the evolution of quantum states, lattice-gauge theory, neutrino oscillations and quantum field theories in general. On the experimental side, quantum computing efforts are being lined up for jet and track reconstruction during high-energy particle collisions; also for the extraction of rare signals and for exploring exotic physics beyond the Standard Model.

Collaborate to accumulate: SQMS associate scientists Yao Lu (left) and Tanay Roy (right) worked with PhD student Taeyoon Kim (centre) to develop a two-qudit superconducting QPU with a record coherence lifetime (>20 ms). (Courtesy: Hannah Brumbaugh, Fermilab)

Cavities and qubits

SQMS has already notched up some notable breakthroughs on its quantum computing roadmap, not least the demonstration of chip-based transmon qubits (a type of charge qubit circuit exhibiting decreased sensitivity to noise) showing systematic and reproducible improvements in coherence, record-breaking lifetimes of over a millisecond, and reductions in performance variation.

Key to success here is an extensive collaborative effort in materials science and the development of novel chip fabrication processes, with the resulting transmon qubit ancillas shaping up as the “nerve centre” of the 3D SRF cavity-based quantum computing platform championed by SQMS. What’s in the works is essentially a unique quantum analogue of a classical computing architecture: the transmon chip providing a central logic-capable quantum information processor and microwave photons (modes) in the 3D SRF cavity acting as the random-access quantum memory.

As for the underlying physics, the coupling between the transmon qubit and discrete photon modes in the SRF cavity allows for the exchange of coherent quantum information, as well as enabling quantum entanglement between the two. “The pay-off is scalability,” says Alexander Romanenko, a senior scientist at Fermilab who leads the SQMS quantum technology thrust. “A single logic-capable processor qubit, such as the transmon, can couple to many cavity modes acting as memory qubits.”

In principle, a single transmon chip could manipulate more than 10 qubits encoded inside a single-cell SRF cavity, substantially streamlining the number of microwave channels required for system control and manipulation as the number of qubits increases. “What’s more,” adds Romanenko, “instead of using quantum states in the transmon [coherence times just crossed into milliseconds], we can use quantum states in the SRF cavities, which have higher quality factors and longer coherence times [up to two seconds].”

In terms of next steps, continuous improvement of the ancilla transmon coherence times will be critical to ensure high-fidelity operation of the combined system – with materials breakthroughs likely to be a key rate-determining step. “One of the unique differentiators of the SQMS programme is this ‘all-in’ effort to understand and get to grips with the fundamental materials properties that lead to losses and noise in superconducting qubits,” notes Knight. “There are no short-cuts: wide-ranging experimental and theoretical investigations of materials physics – per the programme implemented by SQMS – are mandatory for scaling superconducting qubits into industrial and scientifically useful quantum computing architectures.”

Laying down a marker, SQMS researchers recently achieved a major milestone in superconducting quantum technology by developing the longest-lived multimode superconducting quantum processor unit (QPU) ever built (coherence lifetime >20 ms). Their processor is based on a two-cell SRF cavity and leverages its exceptionally high quality factor (~10¹⁰) to preserve quantum information far longer than conventional superconducting platforms (typically 1 or 2 ms for rival best-in-class implementations).

Coupled with a superconducting transmon, the two-cell SRF module enables precise manipulation of cavity quantum states (photons) using ultrafast control/readout schemes (allowing for approximately 10⁴ high-fidelity operations within the qubit lifetime). “This represents a significant achievement for SQMS,” claims Yao Lu, an associate scientist at Fermilab and co-lead for QPU connectivity and transduction in SQMS. “We have demonstrated the creation of high-fidelity [>95%] quantum states with large photon numbers [20 photons] and achieved ultra-high-fidelity single-photon entangling operations between modes [>99.9%]. It’s work that will ultimately pave the way to scalable, error-resilient quantum computing.”
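That figure of roughly 10⁴ operations is essentially the ratio of the coherence time to the duration of a single control operation; for example, assuming gate-level operations on the microsecond scale (an illustrative number, not one quoted by SQMS):

$N_{\text{ops}} \sim \frac{T_{\text{coh}}}{t_{\text{gate}}} \approx \frac{20\ \text{ms}}{2\ \mu\text{s}} = 10^{4}$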

Scalable thinking: The SQMS multiqudit QPU prototype (above) exploits 3D SRF cavities held at millikelvin temperatures. (Courtesy: Ryan Postel, Fermilab)

Fast scaling with qudits

There’s no shortage of momentum either, with these latest breakthroughs laying the foundations for SQMS “qudit-based” quantum computing and communication architectures. A qudit is a multilevel quantum unit that can occupy more than two states and, in turn, hold a larger information density – i.e. instead of working with a large number of qubits to scale information processing capability, it may be more efficient to maintain a smaller number of qudits (with each holding a greater range of values for optimized computations).
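In information terms, a register of n qudits of dimension d spans the same state space as n·log₂d qubits (a generic counting argument, not an SQMS specification):

$d^{\,n} = 2^{\,n \log_2 d} \quad\Rightarrow\quad n\ \text{qudits of dimension}\ d \;\equiv\; n \log_2 d\ \text{qubits}$

so, for instance, ten d = 4 qudits carry the same state space as twenty qubits.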

Scale-up to a multiqudit QPU system is already underway at SQMS via several parallel routes (and all with a modular computing architecture in mind). In one approach, coupler elements and low-loss interconnects integrate a nine-cell multimode SRF cavity (the memory) to a two-cell SRF cavity quantum processor. Another iteration uses only two-cell modules, while yet another option exploits custom-designed multimodal cavities (10+ modes) as building blocks.

One thing is clear: with the first QPU prototypes now being tested, verified and optimized, SQMS will soon move to a phase in which many of these modules will be assembled and put together in operation. By extension, the SQMS effort also encompasses crucial developments in control systems and microwave equipment, where many devices must be synchronized optimally to encode and analyse quantum information in the QPUs.

Along a related coordinate, complex algorithms can benefit from fewer required gates and reduced circuit depth. What’s more, for many simulation problems in HEP and other fields, it’s evident that multilevel systems (qudits) – rather than qubits – provide a more natural representation of the physics in play, making simulation tasks significantly more accessible. The work of encoding several such problems into qudits – including lattice-gauge-theory calculations and others – is similarly ongoing within SQMS.

Taken together, this massive R&D undertaking – spanning quantum hardware and quantum algorithms – can only succeed with a “co-design” approach across strategy and implementation: from identifying applications of interest to the wider HEP community to full deployment of QPU prototypes. Co-design is especially suited to these efforts as it demands sustained alignment of scientific goals with technological implementation to drive innovation and societal impact.

In addition to their quantum computing promise, these cavity-based quantum systems will play a central role in serving both as the “adapters” and low-loss channels at elevated temperatures for interconnecting chip or cavity-based QPUs hosted in different refrigerators. These interconnects will provide an essential building block for the efficient scale-up of superconducting quantum processors into larger quantum data centres.

Researchers in the control room of the SQMS Quantum Garage facility
Quantum insights Researchers in the control room of the SQMS Quantum Garage facility, developing architectures and gates for SQMS hardware tailored toward HEP quantum simulations. From left to right: Nick Bornman, Hank Lamm, Doga Kurkcuoglu, Silvia Zorzetti, Julian Delgado, Hans Johnson (Courtesy: Hannah Brumbaugh)

 “The SQMS collaboration is ploughing its own furrow – in a way that nobody else in the quantum sector really is,” says Knight. “Crucially, the SQMS partners can build stuff at scale by tapping into the phenomenal engineering strengths of the National Laboratory system. Designing, commissioning and implementing big machines has been part of the ‘day job’ at Fermilab for decades. In contrast, many quantum computing start-ups must scale their R&D infrastructure and engineering capability from a far-less-developed baseline.”

The last word, however, goes to Romanenko. “Watch this space,” he concludes, “because SQMS is on a roll. We don’t know which quantum computing architecture will ultimately win out, but we will ensure that our cavity-based quantum systems will play an enabling role.”

Scaling up: from qubits to qudits

Conceptual illustration of the SQMS Center’s superconducting TESLA cavity coupled to a transmon ancilla qubit
Left: conceptual illustration of the SQMS Center’s superconducting TESLA cavity coupled to a transmon ancilla qubit (AI-generated). Right: an ancilla qubit with two energy levels – ground ∣g⟩ and excited ∣e⟩ – is used to control a high-coherence (d+1) dimensional qudit encoded in a cavity resonator. The ancilla enables state preparation, control and measurement of the qudit. (Courtesy: Fermilab)

The post Superconducting innovation: SQMS shapes up for scalable success in quantum computing appeared first on Physics World.

Secondary dose checks ensure safe and accurate delivery of adaptive radiotherapy

29 May 2025 at 13:00

Adaptive radiotherapy, an advanced cancer treatment in which each fraction is tailored to the patient’s daily anatomy, offers the potential to maximize target conformality and minimize dose to surrounding healthy tissue. Based on daily scans – such as MR images recorded by an MR-Linac, for example – treatment plans are adjusted each day to account for anatomical changes in the tumour and surrounding healthy tissue.

Creating a new plan for every treatment fraction, however, increases the potential for errors, making fast and effective quality assurance (QA) procedures more important than ever. To meet this need, the physics team at Hospital Almater in Mexicali, Mexico, is using Elekta ONE | QA, powered by ThinkQA Secondary Dose Check* (ThinkQA SDC) software, to ensure that each adaptive plan is safe and accurate before it is delivered to the patient.

Radiotherapy requires a series of QA checks prior to treatment delivery, starting with patient-specific QA, where the dose calculated by the treatment planning system is delivered to a phantom. This procedure ensures that the delivered dose distribution matches the prescribed plan. Alongside this, secondary dose checks can be performed, in which an independent algorithm recalculates the dose on the actual patient anatomy and verifies that it corresponds with the planned dose distribution.

“The secondary dose check is an independent dose calculation that uses a different algorithm to the one in the treatment planning system,” explains Alexis Cabrera Santiago, a medical physicist at Hospital Almater. “ThinkQA SDC software calculates the dose based on the patient anatomy, which is actually more realistic than using a rigid phantom, so we can compare both results and catch any differences before treatment.”

ThinkQA SDC
Pre-treatment verification ThinkQA SDC’s unique dose calculation method has been specifically designed for Elekta Unity. (Courtesy: Elekta)

For adaptive radiotherapy in particular, this second check is invaluable. Performing phantom-based QA after each daily imaging session is often impractical; in many cases, a secondary dose check with ThinkQA SDC can be used instead.

“Secondary dose calculation is necessary in adaptive treatments, for example using the MR-Linac, because you are changing the treatment plan for each session,” says José Alejandro Rojas‑López, who commissioned and validated ThinkQA SDC at Hospital Almater. “You are not able to shift the patient to perform patient-specific QA, so this secondary dose check is needed to analyse each treatment session.”

ThinkQA SDC’s ability to achieve patient-specific QA without shifting the patient is extremely valuable, allowing time savings while upholding the highest level of QA safety. “The AAPM TG 219 report recognises secondary dose verification as a validated alternative to patient-specific QA, especially when there is no time for traditional phantom checks in adaptive fractions,” adds Cabrera Santiago.

The optimal choice

At Hospital Almater, all external-beam radiation treatments are performed using an Elekta Unity MR-Linac (with brachytherapy employed for gynaecological cancers). This enables the hospital to offer adaptive radiotherapy for all cases, including head-and-neck, breast, prostate, rectal and lung cancers.

To ensure efficient workflow and high-quality treatments, the team turned to the ThinkQA SDC software. ThinkQA SDC received FDA 510(k) clearance in early 2024 for use with both the Unity MR-Linac and conventional Elekta linacs.

Rojas‑López (who now works at Hospital Angeles Puebla) says that the team chose ThinkQA SDC because of its user-friendly interface, ease of integration into the clinical workflow and common integrated QA platform for both CT and MR-Linac systems. The software also offers the ability to perform 3D evaluation of the entire planning target volume (PTV) and the organs-at-risk, making the gamma evaluation more robust.

Alexis Cabrera Santiago and José Alejandro Rojas‑López
Physics team Alexis Cabrera Santiago and José Alejandro Rojas‑López. (Courtesy: José Alejandro Rojas‑López/Hospital Almater)

Commissioning of ThinkQA SDC was fast and straightforward, Rojas‑López notes, requiring minimal data input into the software. For absolute dose calibration, the only data needed are the cryostat dose attenuation response, the output dose geometry and the CT calibration.

“This makes a difference compared with other commercial solutions where you have to introduce more information, such as MLC [multileaf collimator] leakage and MLC dosimetric leaf gap, for example,” he explains. “If you have to introduce more data for commissioning, this delays the clinical introduction of the software.”

Cabrera Santiago is now using ThinkQA SDC to provide secondary dose calculations for all radiotherapy treatments at Hospital Almater. The team has established a protocol with a 3%/2 mm gamma criterion, a tolerance limit of 95% and an action limit of 90%. He emphasizes that the software has proved robust and flexible, and provides confidence in the delivered treatment.
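For readers unfamiliar with gamma analysis, the sketch below shows a generic, simplified one-dimensional gamma-index calculation using the 3%/2 mm criterion, together with the 95% tolerance and 90% action limits quoted above. It is a textbook-style illustration with made-up dose profiles, not the algorithm implemented in ThinkQA SDC.

```python
import numpy as np

# Simplified 1D global gamma analysis illustrating a 3%/2 mm criterion and the
# tolerance/action limits described above (illustrative only).

def gamma_1d(ref_dose, eval_dose, positions, dose_tol=0.03, dist_tol_mm=2.0):
    """Return the gamma value at each reference point (global normalization)."""
    d_norm = dose_tol * ref_dose.max()            # 3% of the global maximum dose
    gammas = np.empty_like(ref_dose)
    for i, (x_r, d_r) in enumerate(zip(positions, ref_dose)):
        dist2 = ((positions - x_r) / dist_tol_mm) ** 2
        dose2 = ((eval_dose - d_r) / d_norm) ** 2
        gammas[i] = np.sqrt((dist2 + dose2).min())
    return gammas

# Toy dose profiles on a 1 mm grid (hypothetical numbers)
x = np.arange(0.0, 100.0, 1.0)                       # positions in mm
reference = np.exp(-((x - 50) / 20) ** 2)            # "planned" profile
evaluated = 1.01 * np.exp(-((x - 50.5) / 20) ** 2)   # slightly shifted and scaled

gamma = gamma_1d(reference, evaluated, x)
pass_rate = 100.0 * np.mean(gamma <= 1.0)
print(f"Gamma passing rate: {pass_rate:.1f}%")
if pass_rate >= 95.0:
    print("Within tolerance (>= 95%)")
elif pass_rate >= 90.0:
    print("Between the action (90%) and tolerance (95%) limits: investigate")
else:
    print("Below the action limit (< 90%): review before treating")
```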

“ThinkQA SDC lets us work with more confidence, reduces risk and saves time without losing control over the patient’s safety,” he says. “It checks that the plan is correct, catches issues before treatment and helps us find any problems like set-up errors, contouring mistakes and planning issues.”

The software integrates smoothly into the Elekta ONE adaptive workflow, providing reliable results without slowing down the clinical workflow. “In our institution, we set up ThinkQA SDC so that it automatically receives the new plan, runs the check, compares it with the original plan and creates a report – all in around two minutes,” says Cabrera Santiago. “This saves us a lot of time and removes the need to do everything manually.”

A case in point

As an example of ThinkQA SDC’s power to ease the treatment workflow, Rojas‑López describes a paediatric brain tumour case at Hospital Almater. The young patient needed sedation during their treatment, requiring the physics team to optimize the treatment time for the entire adaptive radiotherapy workflow. “ThinkQA SDC allowed us to analyse the treatment plan QA for each session rapidly. The measurements were reliable, enabling us to deliver all of the treatment sessions without any delay,” he explains.

Indeed, the ability to use secondary dose checks for each treatment fraction provides time advantages for the entire clinical workflow over phantom-based pre-treatment QA. “Time in the bunker is very expensive,” Rojas‑López points out. “If you reduce the time required for QA, you can use the bunker for patient treatments instead and treat more patients during the clinical time. Secondary dose check can optimize the workflow in the entire department.”

Importantly, in a recent study comparing patient-specific QA measurements using Sun Nuclear’s ArcCheck with ThinkQA SDC calculations, Rojas‑López and colleagues confirmed that the two techniques provided comparable results, with very similar gamma passing rates. As such, they are working to reduce phantom measurements and, in most cases, replace them with a secondary dose check using ThinkQA SDC.

According to Rojas‑López, the team at Hospital Almater concurs that ThinkQA SDC provides a reliable tool for evaluating radiation treatments, including the first fraction and all of the adaptive sessions. “You can use it for all anatomical sites, with reliable and confident results,” he notes. “And you can reduce the need for measurements using another patient-specific QA tool.”

“I think that any centre doing adaptive radiotherapy should seriously consider using a tool like ThinkQA SDC,” adds Cabrera Santiago.

*ThinkQA is manufactured by DOSIsoft S.A. and distributed by Elekta.

The post Secondary dose checks ensure safe and accurate delivery of adaptive radiotherapy appeared first on Physics World.

Shengxi Huang: how defects can boost 2D materials as single-photon emitters

28 May 2025 at 17:01
Photo of researchers in a lab at Rice University.
Hidden depths Shengxi Huang (left) with members of her lab at Rice University in the US, where she studies 2D materials as single-photon sources. (Courtesy: Jeff Fitlow)

Everyday life is three dimensional, with even a sheet of paper having a finite thickness. Shengxi Huang from Rice University in the US, however, is attracted by 2D materials, which are usually just one atomic layer thick. Graphene is perhaps the most famous example — a single layer of carbon atoms arranged in a hexagonal lattice. But since it was first isolated in 2004, all sorts of other 2D materials, notably boron nitride, have been created.

An electrical engineer by training, Huang did a PhD at the Massachusetts Institute of Technology and postdoctoral research at Stanford University before spending five years as an assistant professor at the Pennsylvania State University. Huang has been at Rice since 2022, where she is now an associate professor in the Department of Electrical and Computer Engineering, the Department of Materials Science and NanoEngineering, and the Department of Bioengineering.

Her group at Rice currently has 12 people, including eight graduate students and four postdocs. Some are physicists, some are engineers, while others have backgrounds in materials science or chemistry. But they all share an interest in understanding the optical and electronic properties of quantum materials and seeing how they can be used, for example, as biochemical sensors. Lab equipment from PicoQuant is vital in that quest, as Huang explains in an interview with Physics World.

Why are you fascinated by 2D materials?

I’m an electrical engineer by training, which is a very broad field. Some electrical engineers focus on things like communication and computing, but others, like myself, are more interested in how we can use fundamental physics to build useful devices, such as semiconductor chips. I’m particularly interested in using 2D materials for optoelectronic devices and as single-photon emitters.

What kinds of 2D materials do you study?

The materials I am particularly interested in are transition metal dichalcogenides, which consist of a layer of transition-metal atoms sandwiched between two layers of chalcogen atoms – sulphur, selenium or tellurium. One of the most common examples is molybdenum disulphide, which in its monolayer form has a layer of sulphur on either side of a layer of molybdenum. In multi-layer molybdenum disulphide, the van der Waals forces between the tri-layers are relatively weak, meaning that the material is widely used as a lubricant – just like graphite, which is a many-layer version of graphene.

Why do you find transition metal dichalcogenides interesting?

Transition metal dichalcogenides have some very useful optoelectronic properties. In particular, they emit light whenever the electron and hole that make up an “exciton” recombine. Now because these dichalcogenides are so thin, most of the light they emit can be used. In a 3D material, in contrast, most light is generated deep in the bulk of the material and doesn’t penetrate beyond the surface. Such 2D materials are therefore very efficient and, what’s more, can be easily integrated onto chip-based devices such as waveguides and cavities.

Transition metal dichalcogenide materials also have promising electronic applications, particularly as the active material in transistors. Over the years, we’ve seen silicon-based transistors get smaller and smaller as we’ve followed Moore’s law, but we’re rapidly reaching a limit where we can’t shrink them any further, partly because the electrons in very thin layers of silicon move so slowly. In 2D transition metal dichalcogenides, in contrast, the electron mobility can actually be higher than in silicon of the same thickness, making them a promising material for future transistor applications.

What can such sources of single photons be used for?

Single photons are useful for quantum communication and quantum cryptography. Carrying information as zero and one, they basically function as a qubit, providing a very secure communication channel. Single photons are also interesting for quantum sensing and even quantum computing. But it’s vital that you have a highly pure source of photons. You don’t want them mixed up with “classical photons”, which — like those from the Sun — are emitted in bunches; otherwise, the tasks you’re trying to perform cannot be completed.

What approaches are you taking to improve 2D materials as single-photon emitters?

What we do is introduce atomic defects into a 2D material to give it optical properties that are different to what you’d get in the bulk. There are several ways of doing this. One is to irradiate a sample with ions or electrons, which can knock individual atoms out to generate “vacancy defects”. Another option is to use plasmas, whereby atoms in the sample get replaced by atoms from the plasma.

So how do you study the samples?

We can probe defect emission using a technique called photoluminescence, which basically involves shining a laser beam onto the material. The laser excites electrons from the ground state to an excited state, prompting them to emit light. As the laser beam is about 500-1000 nm in diameter, we can see single photon emission from an individual defect if the defect density is suitable.

Photo of researchers in a lab at Rice University
Beyond the surface Shengxi Huang (second right) uses equipment from PicoQuant to probe 2D materials. (Courtesy: Jeff Fitlow)

What sort of experiments do you do in your lab?

We start by engineering our materials at the atomic level to introduce the correct type of defect. We also try to strain the material, which can increase how many single photons are emitted at a time. Once we’ve confirmed we’ve got the correct defects in the correct location, we check the material is emitting single photons by carrying out optical measurements, such as photoluminescence. Finally, we characterize the purity of our single photons – ideally, they shouldn’t be mixed up with classical photons but in reality, you never have a 100% pure source. As single photons are emitted one at a time, they have different statistical characteristics to classical light. We also check the brightness and lifetime of the source, the efficiency, how stable it is, and if the photons are polarized. In fact, we have a feedback loop: what improvements can we do at the atomic level to get the properties we’re after?

Is it difficult adding defects to a sample?

It’s pretty challenging. You want to add just one defect to an area that might be just one micron square so you have to control the atomic structure very finely. It’s made harder because 2D materials are atomically thin and very fragile. So if you don’t do the engineering correctly, you may accidentally introduce other types of defects that you don’t want, which will alter the defects’ emission.

What techniques do you use to confirm the defects are in the right place?

Because the defect concentration is so low, we cannot use methods that are typically used to characterise materials, such as X-ray photo-emission spectroscopy or scanning electron microscopy. Instead, the best and most practical way is to see if the defects generate the correct type of optical emission predicted by theory. But even that is challenging because our calculations, which we work on with computational groups, might not be completely accurate.

How do your PicoQuant instruments help in that regard?

We have two main pieces of equipment – a MicroTime 100 photoluminescence microscope and a FluoTime 300 spectrometer. These have been customized to form a Hanbury Brown–Twiss interferometer, which measures the purity of a single-photon source. We also use the microscope and spectrometer to characterise the photoluminescence spectrum and lifetime. Essentially, if the material emits light, we can then work out how long it takes before the emission dies down.
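To illustrate in generic terms what such a purity measurement involves (this is not PicoQuant’s analysis software), the sketch below estimates the second-order correlation g²(0) from two lists of detector timestamps, as recorded in a Hanbury Brown–Twiss experiment. The timestamps here are synthetic, uncorrelated events, so the result should come out close to 1; a good single-photon source would instead show a pronounced dip at zero delay.

```python
import numpy as np

# Generic estimate of g2(0) from a Hanbury Brown-Twiss measurement: histogram
# the arrival-time differences between the two detectors and compare the
# zero-delay bin with the far-from-zero baseline. g2(0) well below 1 signals a
# single-photon source. Synthetic Poissonian (classical-like) events are used
# here, so the estimate should be ~1.

rng = np.random.default_rng(0)

def g2_zero(t1, t2, bin_ns=1.0, window_ns=100.0):
    """Estimate g2(0) from two sorted timestamp arrays (in nanoseconds)."""
    lo = np.searchsorted(t2, t1 - window_ns)
    hi = np.searchsorted(t2, t1 + window_ns)
    diffs = np.concatenate([t2[a:b] - t for t, a, b in zip(t1, lo, hi)])
    edges = np.arange(-window_ns, window_ns + bin_ns, bin_ns)
    hist, _ = np.histogram(diffs, bins=edges)
    centres = edges[:-1] + bin_ns / 2
    zero_bin = hist[np.abs(centres).argmin()]
    baseline = hist[np.abs(centres) > 20].mean()    # bins far from zero delay
    return zero_bin / baseline

# 100,000 counts per detector over 0.1 s (~1 Mcps), uniformly random arrival times
t1 = np.sort(rng.uniform(0, 1e8, 100_000))
t2 = np.sort(rng.uniform(0, 1e8, 100_000))
print(f"g2(0) for uncorrelated light: {g2_zero(t1, t2):.2f}")   # ~1
# An antibunched single-photon emitter would give g2(0) well below 0.5.
```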

Did you buy the equipment off-the-shelf?

It’s more of a customised instrument with different components – lasers, microscopes, detectors and so on — connected together so we can do multiple types of measurement. I put in a request to PicoQuant, who discussed my requirements with me to work out how to meet my needs. The equipment has been very important for our studies as we can carry out high-throughput measurements over and over again. We’ve basically tailored it for our own research purposes.

So how good are your samples?

The best single-photon source that we currently work with is boron nitride, which has a single-photon purity of 98.5% at room temperature. In other words, for every 200 photons only three are classical. With transition-metal dichalcogenides, we get a purity of 98.3% at cryogenic temperatures.

What are your next steps?

There’s still lots to explore in terms of making better single-photon emitters and learning how to control them at different wavelengths. We also want to see if these materials can be used as high-quality quantum sensors. In some cases, if we have the right types of atomic defects, we get a high-quality source of single photons, which we can then entangle with their spin. The emitters can therefore monitor the local magnetic environment with better performance than is possible with classical sensing methods.

The post Shengxi Huang: how defects can boost 2D materials as single-photon emitters appeared first on Physics World.

What is meant by neuromorphic computing – a webinar debate

23 May 2025 at 10:08
AI circuit board
(Courtesy: Shutterstock/metamorworks)

There are two main approaches to what we consider neuromorphic computing. The first involves emulating biological neural processing systems using the physics of computational substrates that have similar properties and constraints to real neural systems, with the potential for denser structures and advantages in energy cost. The other simulates neural processing systems on scalable architectures that allow the simulation of large neural networks, with a higher degree of abstraction, arbitrary precision, high resolution, and no constraints imposed by the physics of the computing medium.

Both may be required to advance the field, but is either approach ‘better’? Hosted by Neuromorphic Computing and Engineering, this webinar will see teams of leading experts in the field of neuromorphic computing argue the case for either approach, overseen by an impartial moderator.

Speakers image. Left to right: Elisa Donati, Jennifer Hasler, Catherine (Katie) Schuman, Emre Neftci, Giulia D’Angelo
Left to right: Elisa Donati, Jennifer Hasler, Catherine (Katie) Schuman, Emre Neftci, Giulia D’Angelo

Team emulation:
Elisa Donati. Elisa’s research interests aim at designing neuromorphic circuits that are ideally suited for interfacing with the nervous system and show how they can be used to build closed-loop hybrid artificial and biological neural processing systems.  She is also involved in the development of neuromorphic hardware and software systems able to mimic the functions of biological brains to apply for medical and robotics applications.

Jennifer Hasler received her BSE and MS degrees in electrical engineering from Arizona State University in August 1991. She received her PhD in computation and neural systems from California Institute of Technology in February 1997. Jennifer is a professor at the Georgia Institute of Technology in the School of Electrical and Computer Engineering; Atlanta is the coldest climate in which she has lived. Jennifer founded the Integrated Computational Electronics (ICE) laboratory at Georgia Tech, a laboratory affiliated with the Laboratories for Neural Engineering. She is a member of Tau Beta Pi, Eta Kappa Nu, and the IEEE.

Team simulation:
Catherine (Katie) Schuman is an assistant professor in the Department of Electrical Engineering and Computer Science at the University of Tennessee (UT). She received her PhD in computer science from UT in 2015, where she completed her dissertation on the use of evolutionary algorithms to train spiking neural networks for neuromorphic systems. Katie previously served as a research scientist at Oak Ridge National Laboratory, where her research focused on algorithms and applications of neuromorphic systems. Katie co-leads the TENNLab Neuromorphic Computing Research Group at UT. She has authored more than 70 publications and holds seven patents in the field of neuromorphic computing. She received the Department of Energy Early Career Award in 2019. Katie is a senior member of the Association for Computing Machinery and the IEEE.

Emre Neftci received his MSc degree in physics from EPFL in Switzerland, and his PhD in 2010 at the Institute of Neuroinformatics at the University of Zurich and ETH Zurich. He is currently an institute director at the Jülich Research Centre and professor at RWTH Aachen. His current research explores the bridges between neuroscience and machine learning, with a focus on the theoretical and computational modelling of learning algorithms that are best suited to neuromorphic hardware and non-von Neumann computing architectures.

Discussion chair:
Giulia D’Angelo is currently a Marie Skłodowska-Curie postdoctoral fellow at the Czech Technical University in Prague, where she focuses on neuromorphic algorithms for active vision. She obtained a bachelor’s degree in biomedical engineering from the University of Genoa and a master’s degree in neuroengineering with honours. During her master’s, she developed a neuromorphic system for the egocentric representation of peripersonal visual space at King’s College London. She earned her PhD in neuromorphic algorithms at the University of Manchester, receiving the President’s Doctoral Scholar Award, in collaboration with the Event-Driven Perception for Robotics Laboratory at the Italian Institute of Technology. There, she proposed a biologically plausible model for event-driven, saliency-based visual attention. She was recently awarded the Marie Skłodowska-Curie Fellowship to explore sensorimotor contingency theories in the context of neuromorphic active vision algorithms.

About this journal
Neuromorphic Computing and Engineering journal cover

Neuromorphic Computing and Engineering is a multidisciplinary, open access journal publishing cutting-edge research on the design, development and application of artificial neural networks and systems from both a hardware and computational perspective.

Editor-in-chief: Giacomo Indiveri, University of Zurich, Switzerland

 

The post What is meant by neuromorphic computing – a webinar debate appeared first on Physics World.

Electrolysis workstation incorporates mass spectrometry to accelerate carbon-dioxide reduction research

13 May 2025 at 15:59

The electrochemical reduction of carbon dioxide is used to produce a range of chemical and energy feedstocks including syngas (hydrogen and carbon monoxide), formic acid, methane and ethylene. As well as being an important industrial process, the large-scale reduction of carbon dioxide by electrolysis offers a practical way to capture and utilize carbon dioxide.

As a result, developing new and improved electrochemical processes for carbon-dioxide reduction is an important R&D activity. This work involves identifying which catalyst and electrolyte materials are optimal for efficient production. And when a promising electrochemical system is identified in the lab, the work is not over because the design must be then scaled up to create an efficient and practical industrial process.

Such R&D activities must overcome several challenges in operating and characterizing potential electrochemical systems. These include maintaining the correct humidification of carbon-dioxide gas during the electrolysis process and minimizing the production of carbonates – which can clog membranes and disrupt electrolysis.

While these challenges can be daunting, they can be overcome using the 670 Electrolysis Workstation from US-based Scribner. This is a general-purpose electrolysis system designed to test the materials used in the conversion of electrical energy to fuels and chemical feedstocks – and it is ideal for developing systems for carbon-dioxide reduction.

Turn-key and customizable

The workstation is a flexible system that is both turn-key and customizable. Liquid and gas reactants can be used on one or both of the workstation’s electrodes. Scribner has equipped the 670 Electrolysis Workstation with cells that feature gas diffusion electrodes and membranes from US-based Dioxide Materials. The company specializes in the development of technologies for converting carbon dioxide into fuels and chemicals, and it was chosen by Scribner because Dioxide Materials’ products are well documented in the scientific literature.

The gas diffusion electrodes are porous graphite cathodes through which carbon-dioxide gas flows between input and output ports. The gas can migrate from the graphite into a layer containing a metal catalyst. Membranes are used in electrolysis cells to ensure that only the desired ions are able to migrate across the cell, while blocking the movement of gases.

Two men in a lab
Fully integrated Scribner’s Jarrett Mansergh (left) and Luke Levin-Pompetzki of Hiden Analytical in Scribner’s lab after integrating the electrolysis and mass-spectrometry systems. (Courtesy: Scribner)

The system employs a multi-range ±20 A and 5 V potentiostat for high-accuracy operation over a wide range of reaction rates and cell sizes. The workstation is controlled by Scribner’s FlowCell™ software, which provides full control and monitoring of test cells and comes pre-loaded with a wide range of experimental protocols. This includes electrochemical impedance spectroscopy (EIS) capabilities up to 20 kHz and cyclic voltammetry protocols – both of which are used to characterize the health and performance of electrochemical systems. FlowCell™ also allows users to set up long-duration experiments while providing safety monitoring with alarm settings for the purging of gases.
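As a generic illustration of what an EIS sweep measures (not Scribner’s FlowCell™ implementation), the sketch below evaluates the impedance of the simplest textbook equivalent circuit, a Randles cell, across the 0.1 Hz–20 kHz range; the component values are hypothetical.

```python
import numpy as np

# Impedance of a simple Randles cell: series resistance R_s plus a
# charge-transfer resistance R_ct in parallel with a double-layer capacitance
# C_dl. Component values are made up; in practice such a model is fitted to
# measured EIS data to track the health of an electrochemical cell.

R_s, R_ct, C_dl = 2.0, 15.0, 200e-6                  # ohm, ohm, farad (hypothetical)

freqs = np.logspace(-1, np.log10(20e3), 61)          # 0.1 Hz up to 20 kHz
omega = 2 * np.pi * freqs
Z = R_s + R_ct / (1 + 1j * omega * R_ct * C_dl)      # complex impedance

for f, z in zip(freqs[::15], Z[::15]):               # print a few sample points
    print(f"{f:10.2f} Hz: |Z| = {abs(z):6.2f} ohm, phase = {np.degrees(np.angle(z)):6.1f} deg")
# At high frequency the capacitor shorts out R_ct and |Z| tends to R_s; at low
# frequency |Z| tends to R_s + R_ct. The transition between these limits is
# what an EIS sweep reveals about the electrochemical interface.
```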

Humidified gas

The 670 Electrolysis Workstation features a gas handling unit that can supply humidified gas to test cells. Adding water vapour to the carbon-dioxide reactant is crucial because the water provides the protons that are needed to convert carbon dioxide to products such as methane and syngas. Humidifying gas is very difficult and getting it wrong leads to unwanted condensation in the system. The 670 Electrolysis Workstation uses temperature control to minimize condensation. The same degree of control can be difficult to achieve in homemade systems, leading to failure.

The workstation offers electrochemical cells with 5 cm² and 25 cm² active areas. These can be used to build carbon-dioxide reduction cells using a range of materials, catalysts and membranes – allowing the performance of these prototype cells to be thoroughly evaluated. By studying cells at these two different sizes, researchers can scale up their electrochemical systems from a preliminary experiment to something that is closer in size to an industrial system. This makes the 670 Electrolysis Workstation ideal for use across university labs, start-up companies and corporate R&D labs.

The workstation can handle acids, bases and organic solutions. For carbon-dioxide reduction, the cell is operated with a liquid electrolyte on the positive electrode (anode) and gaseous carbon dioxide at the negative electrode (cathode). An electric potential is applied across the electrodes and the product gas comes off the cathode side.

The specific product is largely dependent on the catalyst used at the cathode. If a silver catalyst is used, for example, the cell is likely to produce syngas. If a tin catalyst is used, the product is more likely to be formic acid.

Mass spectrometry

The best way to ensure that the desired products are being made in the cell is to connect the gas output to a mass spectrometer. As a result, Scribner has joined forces with Hiden Analytical to integrate the UK-based company’s HPR-20 mass spectrometer for gas analysis. The Hiden system is specifically configured to perform continuous analysis of evolved gases and vapours from the 670 Electrolysis Workstation.

CO2 reduction cell feature
The Scribner CO2 Reduction Cell Fixture (Courtesy: Scribner)

If a cell is designed to create syngas, for example, the mass spectrometer will determine exactly how much carbon monoxide is being produced and how much hydrogen is being produced. At the same time, researchers can monitor the electrochemical properties of the cell. This allows researchers to study relationships between a system’s electrical performance and the chemical species that it produces.

Monitoring gas output is crucial for optimizing electrochemical processes that minimize negative effects such as the production of carbonates, which is a significant problem when doing carbon dioxide reduction.

In electrochemical cells, carbon dioxide is dissolved in a basic solution. This results in the precipitation of carbonate salts that clog up the membranes in cells, greatly reducing performance. This is a significant problem when scaling up cell designs for industrial use because commercial cells must be very long-lived.

Pulsed-mode operation

One strategy for dealing with carbonates is to operate electrochemical cells in pulsed mode, rather than in a steady state. The off time allows the carbonates to migrate away from electrodes, which minimizes clogging. The 670 Electrolysis Workstation allows users to explore the use of short, second-scale pulses. Another option that researchers can explore is the use of pulses of fresh water to flush carbonates away from the cathode area. These and other options are available in a set of pre-programmed experiments that allow users to explore the mitigation of salt formation in their electrochemical cells.
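The sketch below illustrates the basic bookkeeping of such a pulsed protocol – duty cycle and total charge passed over a run – with hypothetical pulse lengths and current, not recommended settings for the 670 Electrolysis Workstation.

```python
from dataclasses import dataclass

# Illustrative bookkeeping for a pulsed carbonate-mitigation protocol:
# second-scale current pulses separated by off periods during which carbonates
# can migrate away from the cathode. All numbers are hypothetical.

@dataclass
class PulseProtocol:
    current_a: float     # applied current during the "on" phase (A)
    t_on_s: float        # pulse duration (s)
    t_off_s: float       # rest duration (s)

    @property
    def duty_cycle(self) -> float:
        return self.t_on_s / (self.t_on_s + self.t_off_s)

    def charge_passed_c(self, total_time_s: float) -> float:
        """Total charge delivered over an experiment of the given length."""
        return self.current_a * self.duty_cycle * total_time_s

steady = PulseProtocol(current_a=1.0, t_on_s=1.0, t_off_s=0.0)   # continuous operation
pulsed = PulseProtocol(current_a=1.0, t_on_s=2.0, t_off_s=1.0)   # 2 s on / 1 s off

eight_hours_s = 8 * 3600.0
for label, p in (("steady-state", steady), ("pulsed", pulsed)):
    print(f"{label:>12}: duty cycle {p.duty_cycle:.0%}, "
          f"charge over 8 h = {p.charge_passed_c(eight_hours_s):.0f} C")
# The off time sacrifices some throughput (lower total charge) in exchange for
# less carbonate build-up at the cathode.
```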

The gaseous products of these carbonate-mitigation modes can be monitored in real time using Hiden’s mass spectrometer. This allows researchers to identify any changes in cell performance that are related to pulsed operation. Currently, electrochemical and product characteristics can be observed on time scales as short as 100 ms. This allows researchers to fine-tune how pulses are applied to minimize carbonate production and maximize the production of desired gases.

Real-time monitoring of product gases is also important when using EIS to observe the degradation of the electrochemical performance of a cell over time. This provides researchers with a fuller picture of what is happening in a cell as it ages.

The integration of Hiden’s mass spectrometer to the 670 Electrolysis Workstation is the latest innovation from Scribner. Now, the company is working on improving the time resolution of the system so that even shorter pulse durations can be studied by users. The company is also working on boosting the maximum current of the 670 to 100 A.

The post Electrolysis workstation incorporates mass spectrometry to accelerate carbon-dioxide reduction research appeared first on Physics World.

Loop quantum cosmology may explain smoothness of cosmic microwave background

First light The cosmic microwave background, as imaged by the European Space Agency’s Planck mission. (Courtesy: ESA and the Planck Collaboration)

In classical physics, gravity is universally attractive. At the quantum level, however, this may not always be the case. If vast quantities of matter are present within an infinitesimally small volume – at the centre of a black hole, for example, or during the very earliest moments of the universe – space–time becomes curved at scales that approach the Planck length. This is the fundamental quantum unit of distance, and is around 10²⁰ times smaller than a proton.

In these extremely curved regions, the classical theory of gravity – Einstein’s general theory of relativity – breaks down. However, research on loop quantum cosmology offers a possible solution. It suggests that gravity, in effect, becomes repulsive. Consequently, loop quantum cosmology predicts that our present universe began in a so-called “cosmic bounce”, rather than the Big Bang singularity predicted by general relativity.

In a recent paper published in EPL, Edward Wilson-Ewing, a mathematical physicist at the University of New Brunswick, Canada, explores the interplay between loop quantum cosmology and a phenomenon sometimes described as “the echo of the Big Bang”: the cosmic microwave background (CMB). This background radiation pervades the entire visible universe, and it stems from the moment the universe became cool enough for neutral atoms to form. At this point, light was suddenly able to travel through space without being continually scattered by the plasma of electrons and light nuclei that existed before. It is this freshly liberated light that makes up the CMB, so studying it offers clues to what the early universe was like.

Edward Wilson-Ewing
Cosmologist Edward Wilson-Ewing uses loop quantum gravity to study quantum effects in the very early universe. (Courtesy: University of New Brunswick)

What was the motivation for your research?

Observations of the CMB show that the early universe (that is, the universe as it was when the CMB formed) was extremely homogeneous, with relative anisotropies of the order of one part in 10⁴. Classical general relativity has trouble explaining this homogeneity on its own, because a purely attractive version of gravity tends to drive things in the opposite direction. This is because if a region has a higher density than the surrounding area, then according to general relativity, that region will become even denser; there is more mass in that region and therefore particles surrounding it will be attracted to it. Indeed, this is how the small inhomogeneities we do see in the CMB grew over time to form stars and galaxies today.

The main way this gets resolved in classical general relativity is to suggest that the universe experienced an episode of super-rapid growth in its earliest moments. This super-rapid growth is known as inflation, and it can suffice to generate homogeneous regions. However, in general, this requires a very large amount of inflation (much more than is typically considered in most models).

Alternately, if for some reason there happens to be a region that is moderately homogeneous when inflation starts, this region will increase exponentially in size while also becoming further homogenized. This second possibility requires a little more than a minimal amount of inflation, but not much more.

My goal in this work was to explore whether, if gravity becomes repulsive in the deep quantum regime (as is the case in loop quantum cosmology), this will tend to dilute regions of higher density, leading to inhomogeneities being smoothed out. In other words, one of the main objectives of this work was to find out whether quantum gravity could be the source of the high degree of homogeneity observed in the CMB.

What did you do in the paper?

In this paper, I studied spherically symmetric space–times coupled to dust (a simple model for matter) in loop quantum cosmology.  These space–times are known as Lemaître–Tolman–Bondi space–times, and they allow arbitrarily large inhomogeneities in the radial direction. They therefore provide an ideal arena to explore whether homogenization can occur: they are simple enough to be mathematically tractable, while still allowing for large inhomogeneities (which, in general, are very hard to handle).

Loop quantum cosmology predicts several leading-order quantum effects. One of these effects is that space–time, at the quantum level, is discrete: there are quanta of geometry just as there are quanta of matter.  This has implications for the equations of motion, which relate the geometry of space–time to the matter in it: if we take into account the discrete nature of quantum geometry, we have to modify the equations of motion.

These modifications are captured by so-called effective equations, and in the paper I solved these equations numerically for a wide range of initial conditions. From this, I found that while homogenization doesn’t occur everywhere, it always occurs in some regions. These homogenized regions can then be blown up to cosmological scales by inflation (and inflation will further homogenize them).  Therefore, this quantum gravity homogenization process could indeed explain the homogeneity observed in the CMB.
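For a feel of what an “effective equation” looks like in practice, the sketch below integrates the simplest homogeneous example – the LQC-modified Friedmann dynamics for dust, in which the energy density saturates at the critical density and the universe bounces from contraction to expansion. This toy model is purely illustrative; it is not the inhomogeneous Lemaître–Tolman–Bondi system solved in the paper.

```python
import numpy as np

# Minimal illustration of an LQC effective equation: the flat, homogeneous
# modified Friedmann dynamics for dust,
#   H^2 = (8*pi*G/3) * rho * (1 - rho/rho_c),
# integrated in units where 8*pi*G/3 = 1 and rho_c = 1. Classically the density
# would diverge at a singularity; with the quantum correction it peaks at rho_c
# and the contracting branch bounces into an expanding one.

def rhs(state):
    h, rho = state
    h_dot = -1.5 * rho * (1.0 - 2.0 * rho)   # modified Raychaudhuri equation (dust)
    rho_dot = -3.0 * h * rho                 # continuity equation for dust
    return np.array([h_dot, rho_dot])

def rk4_step(state, dt):
    k1 = rhs(state)
    k2 = rhs(state + 0.5 * dt * k1)
    k3 = rhs(state + 0.5 * dt * k2)
    k4 = rhs(state + dt * k3)
    return state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

rho0 = 0.01                                            # start in a contracting phase
state = np.array([-np.sqrt(rho0 * (1 - rho0)), rho0])  # H fixed by the constraint
dt, steps = 1e-3, 40_000
history = np.empty((steps, 2))
for i in range(steps):
    history[i] = state
    state = rk4_step(state, dt)

print(f"Maximum density reached: {history[:, 1].max():.4f} (rho_c = 1, no singularity)")
print(f"Hubble rate at start / end: {history[0, 0]:+.3f} / {history[-1, 0]:+.3f}")
# The sign change of H shows contraction turning into expansion: a cosmic bounce.
```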

What do you plan to do next?

It is important to extend this work in several directions to check the robustness of the homogenization effect in loop quantum cosmology.  The restriction to spherical symmetry should be relaxed, although this will be challenging from a mathematical perspective. It will also be important to go beyond dust as a description of matter. The simplicity of dust makes calculations easier, but it is not particularly realistic.

Other relevant forms of matter include radiation and the so-called inflaton field, which is a type of matter that can cause inflation to occur. That said, in cosmology, the physics is to some extent independent of the universe’s matter content, at least at a qualitative level. This is because while different types of matter content may dilute more rapidly than others in an expanding universe, and the universe may expand at different rates depending on its matter content, the main properties of the cosmological dynamics (for example, the expanding universe, the occurrence of an initial singularity and so on) within general relativity are independent of the specific matter being considered.

I therefore think it is reasonable to expect that the quantitative predictions will depend on the matter content, but the qualitative features (in particular, that small regions are homogenized by quantum gravity) will remain the same. Still, further research is needed to test this expectation.

The post Loop quantum cosmology may explain smoothness of cosmic microwave background appeared first on Physics World.

MR QA from radiotherapy perspective

IBA webinar image

During this webinar, the key steps of integrating an MRI scanner and an MR-Linac into a radiotherapy department will be presented, especially focusing on the quality assurance required for the use of the MRI images. Furthermore, the use of phantoms, and their synergy with each other across a multi-vendor facility, will be discussed.

Akos Gulyban
Akos Gulyban

Akos Gulyban is a medical physicist with a PhD in Physics (in Medicine), renowned for his expertise in MRI-guided radiotherapy (MRgRT). Currently based at Institut Jules Bordet in Brussels, he plays a pivotal role in advancing MRgRT technologies, particularly through the integration of the Elekta Unity MR-Linac system alongside the implementation of dedicated MRI simulation for radiotherapy.

In addition to his clinical research, Gulyban has been involved in developing quality assurance protocols for MRI-linear accelerator (MR-Linac) systems, contributing to guidelines that ensure safe and effective implementation of MRI-guided radiotherapy.

Gulyban is playing a pivotal role in integrating advanced imaging technologies into radiotherapy, striving to enhance treatment outcomes for cancer patients.

The post MR QA from radiotherapy perspective appeared first on Physics World.

FLIR MIX – a breakthrough in infrared and visible imaging

23 April 2025 at 16:25

flir mix champagne cork

Until now, researchers have had to choose between thermal and visible imaging: One reveals heat signatures while the other provides structural detail. Recording both and trying to align them manually — or harder still, synchronizing them temporally — can be inconsistent and time-consuming. The result is data that is close but never quite complete. The new FLIR MIX is a game changer, capturing and synchronizing high-speed thermal and visible imagery at up to 1000 fps. Visible and high-performance infrared cameras with FLIR Research Studio software work together to deliver one data set with perfect spatial and temporal alignment — no missed details or second guessing, just a complete picture of fast-moving events.

Jerry Beeney
Jerry Beeney

Jerry Beeney is a seasoned global business development leader with a proven track record of driving product growth and sales performance in the Teledyne FLIR Science and Automation verticals. With more than 20 years at Teledyne FLIR, he has played a pivotal role in launching new thermal imaging solutions, working closely with technical experts, product managers, and customers to align products with market demands and customer needs. Before assuming his current role, Beeney held a variety of technical and sales positions, including senior scientific segment engineer. In these roles, he managed strategic accounts and delivered training and product demonstrations for clients across diverse R&D and scientific research fields. Beeney’s dedication to achieving meaningful results and cultivating lasting client relationships remains a cornerstone of his professional approach.

The post FLIR MIX – a breakthrough in infrared and visible imaging appeared first on Physics World.

Radiosurgery made easy: the role of the Gamma Knife in modern radiotherapy

17 April 2025 at 15:16

This podcast features Alonso Gutierrez, who is chief of medical physics at the Miami Cancer Institute in the US. In a wide-ranging conversation with Physics World’s Tami Freeman, Gutierrez talks about his experience using Elekta’s Leksell Gamma Knife for radiosurgery in a busy radiotherapy department.

This podcast is sponsored by Elekta.

The post Radiosurgery made easy: the role of the Gamma Knife in modern radiotherapy appeared first on Physics World.

On the path towards a quantum economy

16 April 2025 at 16:15
The high-street bank HSBC has worked with the NQCC, hardware provider Rigetti and the Quantum Software Lab to investigate the advantages that quantum computing could offer for detecting the signs of fraud in transactional data. (Courtesy: Shutterstock/Westend61 on Offset)

Rapid technical innovation in quantum computing is expected to yield an array of hardware platforms that can run increasingly sophisticated algorithms. In the real world, however, such technical advances will remain little more than a curiosity if they are not adopted by businesses and the public sector to drive positive change. As a result, one key priority for the UK’s National Quantum Computing Centre (NQCC) has been to help companies and other organizations to gain an early understanding of the value that quantum computing can offer for improving performance and enhancing outcomes.

To meet that objective the NQCC has supported several feasibility studies that enable commercial organizations in the UK to work alongside quantum specialists to investigate specific use cases where quantum computing could have a significant impact within their industry. One prime example is a project involving the high-street bank HSBC, which has been exploring the potential of quantum technologies for spotting the signs of fraud in financial transactions. Such fraudulent activity, which affects millions of people every year, now accounts for about 40% of all criminal offences in the UK and in 2023 generated total losses of more than £2.3 bn across all sectors of the economy.

Banks like HSBC currently exploit classical machine learning to detect fraudulent transactions, but these techniques require a large computational overhead to train the models and deliver accurate results. Quantum specialists at the bank have therefore been working with the NQCC, along with hardware provider Rigetti and the Quantum Software Lab at the University of Edinburgh, to investigate the capabilities of quantum machine learning (QML) for identifying the tell-tale indicators of fraud.

“HSBC’s involvement in this project has brought transactional fraud detection into the realm of cutting-edge technology, demonstrating our commitment to pushing the boundaries of quantum-inspired solutions for near-term benefit,” comments Philip Intallura, Group Head of Quantum Technologies at HSBC. “Our philosophy is to innovate today while preparing for the quantum advantage of tomorrow.”

Another study focused on a key problem in the aviation industry that has a direct impact on fuel consumption and the amount of carbon emissions produced during a flight. In this logistical challenge, the aim was to find the optimal way to load cargo containers onto a commercial aircraft. One motivation was to maximize the amount of cargo that can be carried, the other was to balance the weight of the cargo to reduce drag and improve fuel efficiency.

“Even a small shift in the centre of gravity can have a big effect,” explains Salvatore Sinno of technology solutions company Unisys, who worked on the project along with applications engineers at the NQCC and mathematicians at the University of Newcastle. “On a Boeing 747 a displacement of just 75 cm can increase the carbon emissions on a flight of 10,000 miles by four tonnes, and also increases the fuel costs for the airline company.”

aeroplane being loaded with cargo
A hybrid quantum–classical solution has been used to optimize the configuration of air freight, which can improve fuel efficiency and lower carbon emissions. (Courtesy: Shutterstock/supakitswn)

With such a large number of possible loading combinations, classical computers cannot produce an exact solution for the optimal arrangement of cargo containers. In their project the team improved the precision of the solution by combining quantum annealing with high-performance computing, a hybrid approach that Unisys believes can offer immediate value for complex optimization problems. “We have reached the limit of what we can achieve with classical computing, and with this work we have shown the benefit of incorporating an element of quantum processing into our solution,” explains Sinno.
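As a highly simplified illustration of how such a loading problem can be phrased for a quantum annealer (this is not the Unisys formulation), the sketch below casts a toy version – balancing container weight between two holds – as a QUBO and solves it by brute force in place of an annealer; the container weights are made up.

```python
import itertools

# Toy QUBO for the cargo-balancing idea described above: assign each container
# to the forward (x=0) or aft (x=1) hold so the weight split is as even as
# possible. Real formulations also handle positions along the fuselage, volume
# limits and centre-of-gravity envelopes; here brute force stands in for the
# quantum annealer. Weights (in tonnes) are hypothetical.

weights = [3.2, 1.8, 2.5, 4.1, 0.9, 2.2, 3.7, 1.1]
total = sum(weights)

# Objective: minimise (sum_i w_i*x_i - total/2)^2, a standard number-partitioning
# QUBO. Expanding the square (and using x_i^2 = x_i) gives the coefficients
# Q[i,i] = w_i^2 - total*w_i and Q[i,j] = 2*w_i*w_j for i < j; the constant
# total^2/4 is dropped.
n = len(weights)
Q = {}
for i in range(n):
    Q[(i, i)] = weights[i] ** 2 - total * weights[i]
    for j in range(i + 1, n):
        Q[(i, j)] = 2 * weights[i] * weights[j]

def qubo_energy(x):
    return sum(q * x[i] * x[j] for (i, j), q in Q.items())

best = min(itertools.product((0, 1), repeat=n), key=qubo_energy)
aft = sum(w for w, x in zip(weights, best) if x)
print(f"Aft hold: {aft:.1f} t, forward hold: {total - aft:.1f} t (total {total:.1f} t)")
```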

The HSBC project team also found that a hybrid quantum–classical solution could provide an immediate performance boost for detecting anomalous transactions. In this case, a quantum simulator running on a classical computer was used to run quantum algorithms for machine learning. “These simulators allow us to execute simple QML programmes, even though they can’t be run to the same level of complexity as we could achieve with a physical quantum processor,” explains Marco Paini, the project lead for Rigetti. “These simulations show the potential of these low-depth QML programmes for fraud detection in the near term.”
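The sketch below gives a deliberately tiny example of this simulator-based QML idea: a two-qubit quantum kernel evaluated exactly with numpy (in effect, a statevector simulator) and fed to a classical support-vector machine on synthetic data. It is purely illustrative and reflects neither HSBC’s fraud models nor Rigetti’s software stack.

```python
import numpy as np
from sklearn.svm import SVC

# Tiny quantum-kernel example: encode two features in a 2-qubit state, compute
# kernel entries as state overlaps (exactly, with numpy), and train a classical
# SVM on the precomputed kernel. Data are synthetic stand-ins for transactions.

def feature_map(x):
    """Encode two features as a 2-qubit state: RY rotations followed by a CNOT."""
    a, b = x
    q0 = np.array([np.cos(a / 2), np.sin(a / 2)])   # RY(a)|0>
    q1 = np.array([np.cos(b / 2), np.sin(b / 2)])   # RY(b)|0>
    psi = np.kron(q0, q1)                           # product state on |q0 q1>
    psi[[2, 3]] = psi[[3, 2]]                       # CNOT (control q0, target q1)
    return psi

def quantum_kernel(X1, X2):
    """Kernel matrix K[i, j] = |<psi(x1_i)|psi(x2_j)>|^2."""
    S1 = np.array([feature_map(x) for x in X1])
    S2 = np.array([feature_map(x) for x in X2])
    return np.abs(S1 @ S2.T) ** 2

rng = np.random.default_rng(1)
normal = rng.normal([0.5, 0.5], 0.3, size=(80, 2))      # "legitimate" transactions
anomalous = rng.normal([2.2, 2.2], 0.3, size=(20, 2))   # "suspicious" transactions
X = np.vstack([normal, anomalous])
y = np.array([0] * 80 + [1] * 20)

clf = SVC(kernel="precomputed").fit(quantum_kernel(X, X), y)
X_new = np.array([[0.4, 0.6], [2.3, 2.1]])
print(clf.predict(quantum_kernel(X_new, X)))            # expected: [0 1]
```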

The team also simulated more complex QML approaches using a similar but smaller-scale problem, demonstrating a further improvement in performance. This outcome suggests that running deeper QML algorithms on a physical quantum processor could deliver an advantage for detecting anomalies in larger datasets, even though the hardware does not yet provide the performance needed to achieve reliable results. “This initiative not only showcases the near-term applicability of advanced fraud models, but it also equips us with the expertise to leverage QML methods as quantum computing scales,” comments Intallura.

Indeed, the results obtained so far have enabled the project partners to develop a roadmap that will guide their ongoing development work as the hardware matures. One key insight, for example, is that even a fault-tolerant quantum computer would struggle to process the huge financial datasets produced by a bank like HSBC, since a finite amount of time is needed to run the quantum calculation for each data point. “From the simulations we found that the hybrid quantum–classical solution produces more false positives than classical methods,” says Paini. “One approach we can explore would be to use the simulations to flag suspicious transactions and then run the deeper algorithms on a quantum processor to analyse the filtered results.”

This particular project also highlighted the need for agreed protocols to navigate the strict rules on data security within the banking sector. For this project the HSBC team was able to run the QML simulations on its existing computing infrastructure, avoiding the need to share sensitive financial data with external partners. In the longer term, however, banks will need reassurance that their customer information can be protected when processed using a quantum computer. Anticipating this need, the NQCC has already started to work with regulators such as the Financial Conduct Authority, which is exploring some of the key considerations around privacy and data security, with that initial work feeding into international initiatives that are starting to consider the regulatory frameworks for using quantum computing within the financial sector.

For the cargo-loading project, meanwhile, Sinno says that an important learning point has been the need to formulate the problem in a way that can be tackled by the current generation of quantum computers. In practical terms that means defining constraints that reduce the complexity of the problem, but that still reflect the requirements of the real-world scenario. “Working with the applications engineers at the NQCC has helped us to understand what is possible with today’s quantum hardware, and how to make the quantum algorithms more viable for our particular problem,” he says. “Participating in these studies is a great way to learn and has allowed us to start using these emerging quantum technologies without taking a huge risk.”

Indeed, one key feature of these feasibility studies is the opportunity they offer for different project partners to learn from each other. Each project includes an end-user organization with a deep knowledge of the problem, quantum specialists who understand the capabilities and limitations of present-day solutions, and academic experts who offer an insight into emerging theoretical approaches as well as methodologies for benchmarking the results. The domain knowledge provided by the end users is particularly important, says Paini, to guide ongoing development work within the quantum sector. “If we only focused on the hardware for the next few years, we might come up with a better technical solution but it might not address the right problem,” he says. “We need to know where quantum computing will be useful, and to find that convergence we need to develop the applications alongside the algorithms and the hardware.”

Another major outcome from these projects has been the ability to make new connections and identify opportunities for future collaborations. As a national facility NQCC has played an important role in providing networking opportunities that bring diverse stakeholders together, creating a community of end users and technology providers, and supporting project partners with an expert and independent view of emerging quantum technologies. The NQCC has also helped the project teams to share their results more widely, generating positive feedback from the wider community that has already sparked new ideas and interactions.

“We have been able to network with start-up companies and larger enterprise firms, and with the NQCC we are already working with them to develop some proof-of-concept projects,” says Sinno. “Having access to that wider network will be really important as we continue to develop our expertise and capability in quantum computing.”

The post On the path towards a quantum economy appeared first on Physics World.

Very high-energy electrons could prove optimal for FLASH radiotherapy

15 April 2025 at 13:00

Electron therapy has long played an important role in cancer treatments. Electrons with energies of up to 20 MeV can treat superficial tumours while minimizing delivered dose to underlying tissues; they are also ideal for performing total skin therapy and intraoperative radiotherapy. The limited penetration depth of such low-energy electrons, however, limits the range of tumour sites that they can treat. And as photon-based radiotherapy technology continues to progress, electron therapy has somewhat fallen out of fashion.

That could all be about to change with the introduction of radiation treatments based on very high-energy electrons (VHEEs). Once realised in the clinic, VHEEs – with energies from 50 up to 400 MeV – will deliver highly penetrating, easily steerable, conformal treatment beams with the potential to enable emerging techniques such as FLASH radiotherapy. French medical technology company THERYQ is working to make this opportunity a reality.

Therapeutic electron beams are produced using radio frequency (RF) energy to accelerate electrons within a vacuum cavity. An accelerator of just over 1 m in length can boost electrons to energies of about 25 MeV – corresponding to a tissue penetration depth of a few centimetres. It’s possible to create higher energy beams by simply daisy chaining additional vacuum chambers. But such systems soon become too large and impractical for clinical use.

THERYQ is focusing on a totally different approach to generating VHEE beams. “In an ideal case, these accelerators allow you to reach energy transfers of around 100 MeV/m,” explains THERYQ’s Sébastien Curtoni. “The challenge is to create a system that’s as compact as possible, closer to the footprint and cost of current radiotherapy machines.”

Working in collaboration with CERN, THERYQ is aiming to modify CERN’s Compact Linear Collider technology for clinical applications. “We are adapting the CERN technology, which was initially produced for particle physics experiments, to radiotherapy,” says Curtoni. “There are definitely things in this design that are very useful for us and other things that are difficult. At the moment, this is still in the design and conception phase; we are not there yet.”

VHEE advantages

The higher energy of VHEE beams provides sufficient penetration to treat deep tumours, with the dose peak region extending up to 20–30 cm in depth for parallel (non-divergent) beams using energy levels of 100–150 MeV (for field sizes of 10 × 10 cm or above). And in contrast to low-energy electrons, which have significant lateral spread, VHEE beams have extremely narrow penumbra with sharp beam edges that help to create highly conformal dose distributions.

“Electrons are extremely light particles and propagate through matter in very straight lines at very high energies,” Curtoni explains. “If you control the initial direction of the beam, you know that the patient will receive a very steep and well defined dose distribution and that, even for depths above 20 cm, the beam will remain sharp and not spread laterally.”

Electrons are also relatively insensitive to tissue inhomogeneities, such as those encountered as the treatment beam passes through different layers of muscle, bone, fat or air. “VHEEs have greater robustness against density variations and anatomical changes,” adds THERYQ’s Costanza Panaino. “This is a big advantage for treatments in locations where there is movement, such as the lung and pelvic areas.”

It’s also possible to manipulate VHEEs via electromagnetic scanning. Electrons have a charge-to-mass ratio roughly 1800 times higher than that of protons, meaning that they can be steered with a much weaker magnetic field than required for protons. “As a result, the technology that you are building has a smaller footprint and the possibility of costing less,” Panaino explains. “This is extremely important because the cost of building a proton therapy facility is prohibitive for some countries.”
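To illustrate why lighter particles are easier to steer, the short sketch below compares the magnetic rigidity (Bρ = p/q, the field-times-bend-radius product needed to deflect a particle) of a VHEE electron and a therapy-range proton. The 150 MeV electron and 200 MeV proton kinetic energies are illustrative choices for this example, not figures from the article.

```python
import math

# Magnetic rigidity B*rho = p/q sets how strong a field (T) is needed for a
# given bending radius (m): smaller rigidity means weaker magnets for the same bend.
# Kinetic energies below are illustrative therapy-range choices.

M_ELECTRON = 0.511     # electron rest energy, MeV
M_PROTON = 938.272     # proton rest energy, MeV

def rigidity_tesla_metres(kinetic_mev: float, rest_mev: float) -> float:
    """Relativistic momentum from kinetic energy, converted to B*rho in T*m."""
    total_energy = kinetic_mev + rest_mev
    pc = math.sqrt(total_energy**2 - rest_mev**2)   # momentum times c, in MeV
    return pc / 299.792458                          # B*rho [T*m] = pc [MeV] / 299.792458

electron_rigidity = rigidity_tesla_metres(150.0, M_ELECTRON)   # VHEE electron
proton_rigidity = rigidity_tesla_metres(200.0, M_PROTON)       # therapy-range proton

print(f"150 MeV electron: B*rho ~ {electron_rigidity:.2f} T m")
print(f"200 MeV proton:   B*rho ~ {proton_rigidity:.2f} T m")
```

In this example the electron beam needs a field roughly four times weaker than the proton beam for the same bending radius, which is part of why VHEE beamlines and scanning magnets can be made smaller and cheaper.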

Enabling FLASH

In addition to expanding the range of clinical indications that can be treated with electrons, VHEE beams can also provide a tool to enable the emerging – and potentially game changing – technique known as FLASH radiotherapy. By delivering therapeutic radiation at ultrahigh dose rates (higher than 100 Gy/s), FLASH vastly reduces normal tissue toxicity while maintaining anti-tumour activity, potentially minimizing harmful side-effects.

The recent interest in the FLASH effect began back in 2014 with the report of a differential response between normal and tumour tissue in mice exposed to high dose-rate, low-energy electrons. Since then, most preclinical FLASH studies have used electron beams, as did the first patient treatment in 2019 – a skin cancer treatment at Lausanne University Hospital (CHUV) in Switzerland, performed with the Oriatron eRT6 prototype from PMB-Alcen, the French company from which THERYQ originated.

FLASH radiotherapy is currently being used in clinical trials with proton beams, as well as with low-energy electrons, where it remains intrinsically limited to superficial treatments. Treating deep-seated tumours with FLASH requires more highly penetrating beams. And while the most obvious option would be to use photons, it’s extremely difficult to produce an X-ray beam with a high enough dose rate to induce the FLASH effect without excessive heat generation destroying the conversion target.

“It’s easier to produce a high dose-rate electron beam for FLASH than trying to [perform FLASH] with X-rays, as you use the electron beam directly to treat the patient,” Curtoni explains. “The possibility to treat deep-seated tumours with high-energy electron beams compensates for the fact that you can’t use X-rays.”

Panaino points out that in addition to high dose rates, FLASH radiotherapy also relies on various interdependent parameters. “Ideally, to induce the FLASH effect, the beam should be pulsed at a frequency of about 100 Hz, the dose-per-pulse should be 1 Gy or above, and the dose rate within the pulse should be higher than 10⁶ Gy/s,” she explains.
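These parameters are tightly coupled, and a quick calculation shows how. The sketch below takes the nominal values quoted above and derives the mean dose rate, pulse duration and duty cycle that follow from them; it is a consistency check, not a description of any particular machine.

```python
# How the nominal FLASH beam parameters quoted above fit together.
PULSE_FREQUENCY = 100.0       # Hz, pulse repetition rate
DOSE_PER_PULSE = 1.0          # Gy, dose delivered in each pulse
INTRAPULSE_DOSE_RATE = 1e6    # Gy/s, instantaneous dose rate within a pulse

mean_dose_rate = DOSE_PER_PULSE * PULSE_FREQUENCY        # Gy/s, averaged over time
pulse_duration = DOSE_PER_PULSE / INTRAPULSE_DOSE_RATE   # s, length of each pulse
duty_cycle = pulse_duration * PULSE_FREQUENCY            # fraction of time the beam is on

print(f"Mean dose rate: {mean_dose_rate:.0f} Gy/s")      # 100 Gy/s, the FLASH regime quoted earlier
print(f"Pulse duration: {pulse_duration * 1e6:.1f} microseconds")
print(f"Duty cycle: {duty_cycle:.1e}")
```

With these nominal numbers, each pulse lasts about a microsecond, the time-averaged dose rate is 100 Gy/s, and a 10 Gy fraction would be delivered in ten pulses spanning roughly a tenth of a second.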

[VHEE infographic]

Into the clinic

THERYQ is using its VHEE expertise to develop a clinical FLASH radiotherapy system called FLASHDEEP, which will use electrons at energies of 100 to 200 MeV to treat tumours at depths of up to 20 cm. The first FLASHDEEP systems will be installed at CHUV (which is part of a consortium with CERN and THERYQ) and at the Gustave Roussy cancer centre in France.

“We are trying to introduce FLASH into the clinic, so we have a prototype FLASHKNiFE machine that allows us to perform low-energy, 6 and 9 MeV, electron therapy,” says Charlotte Robert, head of the medical physics department research group at Gustave Roussy. “The first clinical trials using low-energy electrons are all on skin tumours, aiming to show that we can safely decrease the number of treatment sessions.”

While these initial studies are limited to skin lesions, clinical implementation of the FLASHDEEP system will extend the benefits of FLASH to many more tumour sites. Robert predicts that VHEE-based FLASH will prove most valuable for treating radioresistant cancers that cannot currently be cured. The rationale is that FLASH’s ability to spare normal tissue will allow delivery of higher target doses without increasing toxicity.

“You will not use this technology for diseases that can already be cured, at least initially,” she explains. “The first clinical trial, I’m quite sure, will be either glioblastoma or pancreatic cancers that are not effectively controlled today. If we can show that VHEE FLASH can spare normal tissue more than conventional radiotherapy can, we hope this will have a positive impact on lesion response.”

“There are a lot of technological challenges around this technology and we are trying to tackle them all,” Curtoni concludes. “The ultimate goal is to produce a VHEE accelerator with a very compact beamline that makes this technology and FLASH a reality for a clinical environment.”

The post Very high-energy electrons could prove optimal for FLASH radiotherapy appeared first on Physics World.

Designer van der Waals materials for quantum optical emission

Join us for an insightful webinar highlighting cutting-edge research in 2D transition-metal dichalcogenides (TMDs) and their applications in quantum optics.

This session will showcase multimodal imaging techniques, including reflection and time-resolved photoluminescence (TRPL), performed with our high-performance MicroTime 100 microscope. Complementary spectroscopic insights are provided through photoluminescence emission measurements using the FluoTime 300 spectrometer, highlighting the unique characteristics of these advanced materials and their potential in next-generation photonic devices.

Whether you’re a researcher, engineer, or enthusiast in nanophotonics and quantum materials, this webinar will offer valuable insights into the characterization and design of van der Waals materials for quantum optical applications. Don’t miss this opportunity to explore the forefront of 2D material spectroscopy and imaging with a leading expert in the field.

Shengxi Huang

Shengxi Huang is an associate professor in the Department of Electrical and Computer Engineering at Rice University. Huang earned her PhD in electrical engineering and computer science at MIT in 2017, under the supervision of Professors Mildred Dresselhaus and Jing Kong. Following that, she did postdoctoral research at Stanford University with Professors Tony Heinz and Jonathan Fan. She obtained her bachelor’s degree with the highest honors at Tsinghua University, China. Before joining Rice, she was an assistant professor in the Department of Electrical Engineering, Department of Biomedical Engineering, and Materials Research Institute at The Pennsylvania State University.

Huang’s research interests involve light-matter interactions of quantum materials and nanostructures, and the development of new quantum optical platforms and biochemical sensing technologies. In particular, her research focuses on (1) understanding optical and electronic properties of new materials such as 2D materials and Weyl semimetals, (2) developing new biochemical sensing techniques with applications in medical diagnosis, and (3) exploring new quantum optical effects and quantum sensing. She is leading the SCOPE (Sensing, Characterization, and OPtoElectronics) Laboratory.

The post Designer van der Waals materials for quantum optical emission appeared first on Physics World.
