
Handheld device captures airborne signs of disease

16 June 2025, 16:00

A sensitive new portable device can detect gas molecules associated with certain diseases by condensing dilute airborne biomarkers into concentrated liquid droplets. According to its developers at the University of Chicago in the US, the device could be used to detect airborne viruses or bacteria in hospitals and other public places, improve neonatal care, and even allow diabetic patients to read glucose levels in their breath, to list just three examples.

Many disease biomarkers are only found in breath or ambient air at levels of a few parts per trillion. This makes them very difficult to detect compared with biomarkers in biofluids such as blood, saliva or mucus, where they are much more concentrated. Traditionally, reaching a high enough sensitivity required bulky and expensive equipment such as mass spectrometers, which are impractical for everyday environments.

Rapid and sensitive identification

Researchers led by biophysicist and materials chemist Bozhi Tian have now developed a highly portable alternative. Their new Airborne Biomarker Localization Engine (ABLE) can detect both non-volatile and volatile molecules in air in around 15 minutes.

This handheld device comprises a cooled condenser surface, an air pump and microfluidic enrichment modules, and it works in the following way. First, air that (potentially) contains biomarkers flows into a cooled chamber. Within this chamber, Tian explains, the supersaturated moisture condenses onto nanostructured superhydrophobic surfaces and forms droplets. Any particles in the air thus become suspended inside the droplets, which means they can be analysed using conventional liquid-phase biosensors such as colorimetric test strips or electrochemical probes. This allows them to be identified rapidly with high sensitivity.
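To get a feel for why condensing airborne molecules into a small volume of liquid makes them so much easier to detect, consider the rough estimate below. The flow rate, capture efficiency and condensate volume are assumptions chosen purely for the arithmetic, not figures from the ABLE study.

```python
# Illustrative enrichment estimate for condensation-based capture.
# All numerical values are assumptions made for this sketch, not figures
# reported by the ABLE developers.

AVOGADRO = 6.022e23          # molecules per mole
MOLAR_VOLUME_L = 24.5        # litres of gas per mole at ~25 °C and 1 atm

air_sampled_L = 15.0         # e.g. 1 L/min of air for the ~15 min run time
mixing_ratio = 1e-12         # biomarker present at 1 part per trillion (molar)
capture_efficiency = 0.5     # assumed fraction of molecules trapped in droplets
condensate_volume_L = 50e-6  # assumed 50 microlitres of collected liquid

moles_captured = (air_sampled_L / MOLAR_VOLUME_L) * mixing_ratio * capture_efficiency
conc_in_air = mixing_ratio / MOLAR_VOLUME_L            # mol of biomarker per litre of air
conc_in_liquid = moles_captured / condensate_volume_L  # mol per litre of condensate

print(f"molecules captured: {moles_captured * AVOGADRO:.2e}")
print(f"enrichment factor:  {conc_in_liquid / conc_in_air:.0f}x")
```

Even with these cautious assumptions, the concentration in the droplets ends up several orders of magnitude higher than in the sampled air, which is what brings parts-per-trillion biomarkers within reach of conventional liquid-phase biosensors.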

Tiny babies and a big idea

Tian says the inspiration for this study, which is detailed in Nature Chemical Engineering, came from a visit he made to a neonatal intensive care unit (NICU) in 2021. “Here, I observed the vulnerability and fragility of preterm infants and realized how important non-invasive monitoring is for them,” Tian explains.

“My colleagues and I envisioned a contact-free system capable of detecting disease-related molecules in air. Our biggest challenge was sensitivity and initial trials failed to detect key chemicals,” he remembers. “We overcame this problem by developing a new enrichment strategy using nanostructured condensation and molecular sieves while also exploiting evaporation physics to stabilize and concentrate the captured biomarkers.”

The technology opens new avenues for non-contact, point-of-care diagnostics, he tells Physics World. Possible near-term applications include the early detection of ailments such as inflammatory bowel disease (IBD), which can lead to markers of inflammation appearing in patients’ breath. Respiratory disorders and neurodevelopmental conditions in babies could be detected in a similar way. Tian suggests the device could even be used for mental health monitoring via volatile stress biomarkers (again found in breath) and for monitoring air quality in public spaces such as schools and hospitals.

“Thanks to its high sensitivity and low cost (of around $200), ABLE could democratize biomarker sensing, moving diagnostics beyond the laboratory and into homes, clinics and underserved areas, allowing for a new paradigm in preventative and personalized medicine,” he says.

Widespread applications driven by novel physics

The University of Chicago scientists’ next goal is to further miniaturize and optimize the ABLE device. They are especially interested in enhancing its sensitivity and energy efficiency, as well as exploring the possibility of real-time feedback through closed-loop integration with wearable sensors. “We also plan to extend its applications to infectious disease surveillance and food spoilage detection,” Tian reveals.

The researchers are currently collaborating with health professionals to test ABLE in real-world settings such as NICUs and outpatient clinics. In the future, though, they also hope to explore novel physical processes that might improve the efficiency at which devices like these can capture hydrophobic or nonpolar airborne molecules.

According to Tian, the work has unveiled “unexpected evaporation physics” in dilute droplets with multiple components. Notably, they have seen evidence that such droplets defy the limit set by Henry’s law, which states that at constant temperature, the amount of a gas that dissolves in a liquid of a given type and volume is directly proportional to the partial pressure of the gas in equilibrium with the liquid. “This opens a new physical framework for such condensation-driven sensing and lays the foundation for widespread applications in the non-contact diagnostics, environmental monitoring and public health applications mentioned,” Tian says.
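For reference, the Henry’s law relation quoted above can be written compactly as

$$c = k_{\mathrm{H}}\,p,$$

where c is the equilibrium concentration of the gas dissolved in the liquid, p is its partial pressure above the liquid and k_H is a temperature-dependent constant characteristic of the particular gas and solvent. The behaviour the team reports is that its dilute, multicomponent droplets hold on to more of the airborne molecules than this linear relation would allow.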


Wireless e-tattoos help manage mental workload

3 June 2025, 10:00

Managing one’s mental workload is a tricky balancing act that can affect cognitive performance and decision-making abilities. Too little engagement with an ongoing task can lead to boredom and mistakes; too much can leave a person overwhelmed.

For those performing safety-critical tasks, such as air traffic controllers or truck drivers, monitoring how hard their brain is working is even more important – lapses in focus could have serious consequences. But how can a person’s mental workload be assessed? A team at the University of Texas at Austin proposes the use of temporary face tattoos that can track when a person’s brain is working too hard.

“Technology is developing faster than human evolution. Our brain capacity cannot keep up and can easily get overloaded,” says lead author Nanshu Lu in a press statement. “There is an optimal mental workload for optimal performance, which differs from person to person.”

The traditional approach for monitoring mental workload is electroencephalography (EEG), which analyses the brain’s electrical activity. But EEG devices are wired, bulky and uncomfortable, making them impractical for real-world situations. Measurements of eye movements using electrooculography (EOG) are another option for assessing mental workload.

Lu and colleagues have developed an ultrathin wireless e-tattoo that records high-fidelity EEG and EOG signals from the forehead. The e-tattoo combines a disposable sticker-like electrode layer and a reusable battery-powered flexible printed circuit (FPC) for data acquisition and wireless transmission.

The serpentine-shaped electrodes and interconnects are made from low-cost, conductive graphite-deposited polyurethane, coated with an adhesive polymer composite to reduce contact impedance and improve skin attachment. The e-tattoo stretches and conforms to the skin, providing reliable signal acquisition, even during dynamic activities such as walking and running.

To assess the e-tattoo’s ability to record basic neural activity, the team used it to measure alpha brainwaves as a volunteer opened and closed their eyes. The e-tattoo captured the same neural spectra as a commercial gel electrode-based EEG system, with comparable signal fidelity.

The researchers next tested the e-tattoo on six participants while they performed a visuospatial memory task that gradually increased in difficulty. They analysed the signals collected by the e-tattoo during the tasks, extracting EEG band powers for delta, theta, alpha, beta and gamma brainwaves, plus various EOG features.

As the task got more difficult, the participants showed higher activity in the theta and delta bands, a feature associated with increased cognitive demand. Meanwhile, activity in the alpha and beta bands decreased, indicating mental fatigue.

The researchers built a machine learning model to predict the level of mental workload experienced during the tasks, training it on forehead EEG and EOG features recorded by the e-tattoo. The model could reliably estimate mental workload in each of the six subjects, demonstrating the feasibility of real-time cognitive state decoding.
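As a rough illustration of the band-power feature extraction described above, the sketch below computes the average power in the canonical EEG bands with a Welch periodogram and feeds the features to a simple classifier. The sampling rate, band edges, epoch length, placeholder data and model choice are all assumptions for the sketch; the actual pipeline, features and machine learning model used in the study differ.

```python
# Minimal sketch of EEG band-power features plus a simple workload classifier.
# Sampling rate, band edges and model are illustrative assumptions.
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression

FS = 250  # assumed sampling rate in Hz
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_powers(eeg_segment):
    """Return the average power in each canonical EEG band for one epoch."""
    freqs, psd = welch(eeg_segment, fs=FS, nperseg=FS * 2)
    return np.array([psd[(freqs >= lo) & (freqs < hi)].mean()
                     for lo, hi in BANDS.values()])

# X: one feature vector per task epoch; y: low (0) vs high (1) workload labels
segments = np.random.randn(40, FS * 10)     # placeholder data, 10 s epochs
X = np.vstack([band_powers(s) for s in segments])
y = np.random.randint(0, 2, size=len(X))    # placeholder labels

model = LogisticRegression().fit(X, y)
print("training accuracy:", model.score(X, y))
```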

“Our key innovation lies in the successful decoding of mental workload using a wireless, low-power, low-noise and ultrathin EEG/EOG e-tattoo device,” the researchers write. “It addresses the unique challenges of monitoring forehead EEG and EOG, where wearability, non-obstructiveness and signal stability are critical to assessing mental workload in the real world.”

They suggest that future applications could include real-time cognitive load monitoring in pilots, operators and healthcare professionals. “We’ve long monitored workers’ physical health, tracking injuries and muscle strain,” says co-author Luis Sentis. “Now we have the ability to monitor mental strain, which hasn’t been tracked. This could fundamentally change how organizations ensure the overall well-being of their workforce.”

The e-tattoo is described in Device.


New contact lenses allow wearers to see in the near-infrared

30 May 2025, 13:00

A new contact lens enables humans to see near-infrared light without night vision goggles or other bulky equipment. The lens, which incorporates metallic nanoparticles that “upconvert” normally-invisible wavelengths into visible ones, could have applications for rescue workers and others who would benefit from enhanced vision in conditions with poor visibility.

The infrared (IR) part of the electromagnetic spectrum encompasses light with wavelengths between 700 nm and 1 mm. Human eyes cannot normally detect these wavelengths because opsins, the light-sensitive protein molecules that allow us to see, do not have the required thermodynamic properties. This means we see only a small fraction of the electromagnetic spectrum, typically between 400 and 700 nm.

While devices such as night vision goggles and infrared-visible converters can extend this range, they require external power sources. They also cannot distinguish between different wavelengths of IR light.

Photoreceptor-binding nanoparticles

In previous work, researchers led by neuroscientist Tian Xue of the University of Science and Technology of China (USTC) injected photoreceptor-binding nanoparticles into the retinas of mice. While this technique was effective, it is too invasive and risky for human volunteers. In the new study, therefore, Xue and colleagues integrated the nanoparticles into biocompatible polymeric materials similar to those used in standard soft contact lenses.

The nanoparticles in the lenses are made from Au/NaGdF₄:Yb³⁺,Er³⁺ and have a diameter of approximately 45 nm each. They work by capturing photons with lower energies (longer wavelengths) and re-emitting them as photons with higher energies (shorter wavelengths). This process is known as upconversion and the emitted light is said to be anti-Stokes shifted.
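A quick energy check shows why more than one infrared photon must contribute to each visible photon emitted. Using the photon energy relation

$$E = \frac{hc}{\lambda} \approx \frac{1240\ \mathrm{eV\,nm}}{\lambda},$$

a near-infrared photon at 980 nm carries about 1.27 eV while a green photon at around 540 nm carries about 2.30 eV, so the nanoparticles must pool the energy of at least two absorbed NIR photons per emitted visible photon. (The specific wavelengths and the multi-photon picture are standard illustrative values for Yb³⁺/Er³⁺ upconversion, not details taken from this study.)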

When the researchers tested the new upconverting contact lenses (UCLs) on mice, the rodents’ behaviour suggested they could sense IR wavelengths. For example, when given a choice between a dark box and an IR-illuminated one, the lens-wearing mice scurried into the dark box. In contrast, a control group of mice not wearing lenses showed no preference for one box over the other. The pupils of the lens-wearing mice also constricted when exposed to IR light, and brain imaging revealed that processing centres in their visual cortex were activated.

Flickering seen even with eyes closed

The team then moved on to human volunteers. “In humans, the near-infrared UCLs enabled participants to accurately detect flashing Morse code-like signals and perceive the incoming direction of near-infrared (NIR) light,” Xue says, referring to light at wavelengths between 800 and 1600 nm. Counterintuitively, the flashing images appeared even clearer when the volunteers closed their eyes – probably because IR light is better than visible light at penetrating biological tissue such as eyelids. Importantly, Xue notes that wearing the lenses did not affect participants’ normal vision.

The team also developed a wearable system with built-in flat UCLs. This system allowed volunteers to distinguish between patterns such as horizontal and vertical lines; S and O shapes; and triangles and squares.

But Xue and colleagues did not stop there. By replacing the upconverting nanoparticles with trichromatic orthogonal ones, they succeeded in converting NIR light into three different spectral bands. For example, they converted infrared wavelengths of 808, 980 and 1532 nm into 540, 450 and 650 nm, respectively – wavelengths that humans perceive as green, blue and red.

“As well as allowing wearers to garner more detail within the infrared spectrum, this technology could also help colour-blind individuals see wavelengths they would otherwise be unable to detect by appropriately adjusting the absorption spectrum,” Xue tells Physics World.

According to the USTC researchers, who report their work in Cell, the devices could have several other applications. Apart from providing humans with night vision and offering an adaptation for colour blindness, the lenses could also give wearers better vision in foggy or dusty conditions.

At present, the devices only work with relatively bright IR emissions (the study used LEDs). However, the researchers hope to increase the photosensitivity of the nanoparticles so that lower levels of light can trigger the upconversion process.


Ultrasound-activated structures clear biofilms from medical implants

22 May 2025, 16:20

When implanted medical devices like urinary stents and catheters get clogged with biofilms, the usual solution is to take them out and replace them with new ones. Now, however, researchers at the University of Bern and ETH Zurich in Switzerland have developed an alternative. By incorporating ultrasound-activated moving structures into their prototype “stent-on-a-chip” device, they showed it is possible to remove biofilms without removing the device itself. If translated into clinical practice, the technology could increase the safe lifespan of implants, saving money and avoiding operations that are uncomfortable and sometimes hazardous for patients.

Biofilms are communities of bacterial cells that adhere to natural surfaces in the body as well as artificial structures such as catheters, stents and other implants. Because they are encapsulated by a protective, self-produced extracellular matrix made from polymeric substances, they are mechanically robust and resistant to standard antibacterial measures. If not removed, they can cause infections, obstructions and other complications.

Intense, steady flows push away impurities

The new technology, which was co-developed by Cornel Dillinger, Pedro Amado and other members of Francesco Clavica and Daniel Ahmed’s research teams, takes advantage of recent advances in the fields of robotics and microfluidics. Its main feature is a coating made from microscopic hair-like structures known as cilia. Under the influence of an acoustic field, which is applied externally via a piezoelectric transducer, these cilia begin to move. This movement produces intense, steady fluid flows with velocities of up to 10 mm/s – enough to break apart encrusted deposits (made from calcium carbonate, for example) and flush away biofilms from the inner and outer surfaces of implanted urological devices.

All fouled up: Typical examples of crystals known as encrustations that develop on the surfaces of urinary stents and catheters. (Courtesy: Pedro Amado and Shaokai Zheng)

“This is a major advance compared to existing stents and catheters, which require regular replacements to avoid obstruction and infections,” Clavica says.

The technology is also an improvement on previous efforts to clear implants by mechanical means, Ahmed adds. “Our polymeric cilia in fact amplify the effects of ultrasound by allowing for an effect known as acoustic streaming at frequencies of 20 to 100 kHz,” he explains. “This frequency is lower than that possible with previous microresonator devices developed to work in a similar way that had to operate in the MHz-frequency range.”

The lower frequency achieves the desired therapeutic effects while prioritizing patient safety and minimizing the risk of tissue damage, he adds.

Wider applications

In creating their technology, the researchers were inspired by biological cilia, which are a natural feature of physiological systems such as the reproductive and respiratory tracts and the central nervous system. Future versions, they say, could apply the ultrasound probe directly to a patient’s skin, much as handheld probes of ultrasound scanners are currently used for imaging. “This technology has potential applications beyond urology, including fields like visceral surgery and veterinary medicine, where keeping implanted medical devices clean is also essential,” Clavica says.

The researchers now plan to test new coatings that would reduce contact reactions (such as inflammation) in the body. They will also explore ways of improving the device’s responsiveness to ultrasound – for example by depositing thin metal layers. “These modifications could not only improve acoustic streaming performance but could also provide additional antibacterial benefits,” Clavica tells Physics World.

In the longer term, the team hope to translate their technology into clinical applications. Initial tests that used a custom-built ultrasonic probe coupled to artificial tissue have already demonstrated promising results in generating cilia-induced acoustic streaming, Clavica notes. “In vivo animal studies will then be critical to validate safety and efficacy prior to clinical adoption,” he says.

The present study is detailed in PNAS.


Visual assistance system helps blind people navigate

21 May 2025, 10:00
Visual assistance system The wearable system uses intuitive multimodal feedback to assist visually impaired people with daily life tasks. (Courtesy: J Tang et al. Nature Machine Intelligence 10.1038/s42256-025-01018-6, 2025, Springer Nature)

Researchers from four universities in Shanghai, China, are developing a practical visual assistance system to help blind and partially sighted people navigate. The prototype system combines lightweight camera headgear, rapid-response AI-facilitated software and artificial “skins” worn on the wrist and finger that provide physiological sensing. Functionality testing suggests that integrating visual, audio and haptic senses can create a wearable navigation system that overcomes the adoption and usability concerns of current designs.

Worldwide, 43 million people are blind, according to 2021 estimates by the International Agency for the Prevention of Blindness. Millions more are so severely visually impaired that they require the use of a cane to navigate.

Visual assistance systems offer huge potential as navigation tools, but current designs have many drawbacks and challenges for potential users. These include limited functionality with respect to the size and weight of headgear, battery life and charging issues, slow real-time processing speeds, audio command overload, high system latency that can create safety concerns, and extensive and sometimes complex learning requirements.

Innovations in miniaturized computer hardware, battery charge longevity, AI-trained software to decrease latency in auditory commands, and the addition of lightweight wearable sensory augmentation material providing near-real-time haptic feedback are expected to make visual navigation assistance viable.

The team’s prototype visual assistance system, described in Nature Machine Intelligence, incorporates an RGB-D (red, green, blue, depth) camera mounted on a 3D-printed glasses frame, ultrathin artificial skins, a commercial lithium-ion battery, a wireless bone-conducting earphone and a virtual reality training platform interfaced via triboelectric smart insoles. The camera is connected to a microcontroller via USB, enabling all computations to be performed locally without the need for a remote server.

When a user sets a target using a voice command, AI algorithms process the RGB-D data to estimate the target’s orientation and determine an obstacle-free direction in real time. As the user begins to walk to the target, bone conduction earphones deliver spatialized cues to guide them, and the system updates the 3D scene in real time.

The system’s real-time visual recognition incorporates changes in distance and perspective, and can compensate for low ambient light and motion blur. To provide robust obstacle avoidance, it combines a global threshold method with a ground interval approach to accurately detect overhead hanging, ground-level and sunken obstacles, as well as sloping or irregular ground surfaces.
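The published detection pipeline is considerably more involved, but the sketch below gives a rough idea of how a simple depth-threshold check on an RGB-D frame can flag nearby obstacles. The frame size, ground-line row and distance threshold are assumptions made for the illustration, not parameters from the paper.

```python
# Rough illustration of threshold-based obstacle detection on an RGB-D depth
# frame. Frame size, thresholds and the ground-line split are assumptions;
# the published system combines a global threshold with a ground interval
# method and handles many more cases (hanging, sunken and sloping surfaces).
import numpy as np

def detect_obstacle(depth_m, near_limit=1.0, ground_row=300):
    """Flag an obstacle if anything above the assumed ground line is closer
    than near_limit metres. depth_m is an HxW array of depths in metres."""
    upper = depth_m[:ground_row, :]   # region above the assumed ground line
    valid = upper[upper > 0]          # ignore pixels with no depth reading
    return valid.size > 0 and valid.min() < near_limit

frame = np.full((480, 640), 3.0)      # placeholder: everything 3 m away
frame[200:250, 300:340] = 0.7         # a mock object 0.7 m ahead
print("obstacle ahead:", detect_obstacle(frame))
```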

First author Jian Tang of Shanghai Jiao Tong University and colleagues tested three audio feedback approaches: spatialized cues, 3D sounds and verbal instructions. They determined that spatialized cues were the quickest to convey and understand, and provided precise perception of direction.

Real-world testing A visually impaired person navigates through a cluttered conference room. (Courtesy: Tang et al. Nature Machine Intelligence)

To complement the audio feedback, the researchers developed stretchable artificial skin – an integrated sensory-motor device that provides near-distance alerting. The core component is a compact time-of-flight sensor that vibrates to stimulate the skin when the distance to an obstacle or object is smaller than a predefined threshold. The actuator is designed as a slim, lightweight polyethylene terephthalate cantilever. A gap between the driving circuit and the skin promotes air circulation to improve skin comfort, breathability and long-term wearability, as well as facilitating actuator vibration.

Users wear the sensor on the back of an index or middle finger, while the actuator and driving circuit are worn on the wrist. When the artificial skin detects a lateral obstacle, it provides haptic feedback in just 18 ms.
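The logic of the haptic alert itself is simple; a minimal sketch is shown below, with a hypothetical actuator interface standing in for the real driving circuit and the distance threshold chosen arbitrarily.

```python
# Minimal sketch of the near-distance haptic alert: pulse the wrist-worn
# actuator whenever the finger-mounted time-of-flight sensor reports a range
# below a preset threshold. The Actuator class and threshold are hypothetical.
ALERT_DISTANCE_M = 0.5  # assumed alert threshold in metres

class Actuator:
    def vibrate(self):
        print("vibration on")
    def stop(self):
        print("vibration off")

def update(tof_range_m: float, actuator: Actuator) -> None:
    if tof_range_m < ALERT_DISTANCE_M:
        actuator.vibrate()  # obstacle within threshold: haptic pulse
    else:
        actuator.stop()     # clear: stop vibrating

update(0.3, Actuator())  # example reading: obstacle 0.3 m away
```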

The researchers tested the trained system in virtual and real-world environments, with both humanoid robots and 20 visually impaired individuals who had no prior experience of using visual assistance systems. Testing scenarios included walking to a target while avoiding a variety of obstacles and navigating through a maze. Participants’ navigation speed increased with training and proved comparable to walking with a cane. Users were also able to turn more smoothly and were more efficient at pathfinding when using the navigation system than when using a cane.

“The proficient completion of tasks mirroring real-world challenges underscores the system’s effectiveness in meeting real-life challenges,” the researchers write. “Overall, the system stands as a promising research prototype, setting the stage for the future advancement of wearable visual assistance.”


Light-activated pacemaker is smaller than a grain of rice

24 April 2025, 17:50

The world’s smallest pacemaker to date is smaller than a single grain of rice, optically controlled and designed to dissolve once it is no longer needed. According to researchers involved in the work, the pacemaker could work in human hearts of all sizes that need temporary pacing, including those of newborn babies with congenital heart defects.

“Our major motivation was children,” says Igor Efimov, a professor of medicine and biomedical engineering, in a press release from Northwestern University. Efimov co-led the research with Northwestern bioelectronics pioneer John Rogers.

“About 1% of children are born with congenital heart defects – regardless of whether they live in a low-resource or high-resource country,” Efimov explains. “Now, we can place this tiny pacemaker on a child’s heart and stimulate it with a soft, gentle, wearable device. And no additional surgery is necessary to remove it.”

The current clinical standard-of-care involves sewing pacemaker electrodes directly onto a patient’s heart muscle during surgery. Wires from the electrodes protrude from the patient’s chest and connect to an external pacing box. Placing the pacemakers – and removing them later – does not come without risk. Complications include infection, dislodgment, torn or damaged tissues, bleeding and blood clots.

To minimize these risks, the researchers sought to develop a dissolvable pacemaker, which they introduced in Nature Biotechnology in 2021. By varying the composition and thickness of materials in the devices, Rogers’ lab can control how long the pacemaker functions before dissolving. The dissolvable device also eliminates the need for bulky batteries and wires.

“The heart requires a tiny amount of electrical stimulation,” says Rogers in the Northwestern release. “By minimizing the size, we dramatically simplify the implantation procedures, we reduce trauma and risk to the patient, and, with the dissolvable nature of the device, we eliminate any need for secondary surgical extraction procedures.”

Light-controlled pacing When the wearable device (left) detects an irregular heartbeat, it emits light to activate the pacemaker. (Courtesy: John A Rogers/Northwestern University)

The latest iteration of the device – reported in Nature – advances the technology further. The pacemaker is paired with a small, soft, flexible, wireless device that is mounted onto the patient’s chest. The skin-interfaced device continuously captures electrocardiogram (ECG) data. When it detects an irregular heartbeat, it automatically shines a pulse of infrared light to activate the pacemaker and control the pacing.

“The new device is self-powered and optically controlled – totally different than our previous devices in those two essential aspects of engineering design,” says Rogers. “We moved away from wireless power transfer to enable operation, and we replaced RF wireless control strategies – both to eliminate the need for an antenna (the size-limiting component of the system) and to avoid the need for external RF power supply.”

Measurements demonstrated that the pacemaker – which is 1.8 mm wide, 3.5 mm long and 1 mm thick – delivers as much stimulation as a full-sized pacemaker. Initial studies in animals and in the human hearts of organ donors suggest that the device could work in human infants and adults. The devices are also versatile, the researchers say, and could be used across different regions of the heart or the body. They could also be integrated with other implantable devices for applications in nerve and bone healing, treating wounds and blocking pain.

The next steps for the research (supported by the Querrey Simpson Institute for Bioelectronics, the Leducq Foundation and the National Institutes of Health) include further engineering improvements to the device. “From the translational standpoint, we have put together a very early-stage startup company to work individually and/or in partnerships with larger companies to begin the process of designing the device for regulatory approval,” Rogers says.


Retinal stimulation reveals colour never before seen by the human eye

22 April 2025, 10:45

A new retinal stimulation technique called Oz enabled volunteers to see colours that lie beyond the natural range of human vision. Developed by researchers at UC Berkeley, Oz works by stimulating individual cone cells in the retina with targeted microdoses of laser light, while compensating for the eye’s motion.

Colour vision is enabled by cone cells in the retina. Most humans have three types of cone cells, known as L, M and S (long, medium and short), which respond to different wavelengths of visible light. During natural human vision, the spectral distribution of light reaching these cone cells determines the colours that we see.

Spectral sensitivity curves The response function of M cone cells overlaps completely with those of L and S cones. (Courtesy: Ben Rudiak-Gould)

Some colours, however, simply cannot be seen. The spectral sensitivity curves of the three cone types overlap – in particular, there is no wavelength of light that stimulates only the M cone cells without stimulating nearby L (and sometimes also S) cones as well.
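In a standard linear model of photoreceptor excitation, this limitation can be stated compactly. The response of cone type i to a light spectrum E(λ) is

$$R_i = \int S_i(\lambda)\,E(\lambda)\,\mathrm{d}\lambda,\qquad i \in \{L, M, S\},$$

where S_i(λ) is that cone type’s spectral sensitivity. Because S_M(λ) is never appreciable at a wavelength where S_L(λ) (or S_S(λ)) vanishes, no physically realizable spectrum E(λ) ≥ 0 can drive R_M while leaving the other cone responses at zero.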

The Oz approach, however, is fundamentally different. Rather than being based on spectral distribution, colour perception is controlled by shaping the spatial distribution of light on the retina.

Describing the technique in Science Advances, Ren Ng and colleagues showed that targeting individual cone cells with a 543 nm laser enabled subjects to see a range of colours in both images and videos. Intriguingly, stimulating only the M cone cells sent a colour signal to the brain that never occurs in natural vision.

The Oz laser system uses a technique called adaptive optics scanning light ophthalmoscopy (AOSLO) to simultaneously image and stimulate the retina with a raster scan of laser light. The device images the retina with infrared light to track eye motion in real time and targets pulses of visible laser light at individual cone cells, at a rate of 10⁵ per second.

In a proof-of-principle experiment, the researchers tested a prototype Oz system on five volunteers. In a preparatory step, they used adaptive optics-based optical coherence tomography (AO-OCT) to classify the LMS spectral type of 1000 to 2000 cone cells in a region of each subject’s retina.

When exclusively targeting M cone cells in these retinal regions, subjects reported seeing a new blue–green colour of unprecedented saturation – which the researchers named “olo”. They could also clearly perceive Oz hues in image and video form, reliably detecting the orientation of a red line and the motion direction of a rotating red dot on olo backgrounds. In colour matching experiments, subjects could only match olo with the closest monochromatic light by desaturating it with white light – demonstrating that olo lies beyond the range of natural vision.

The team also performed control experiments in which the Oz microdoses were intentionally “jittered” by a few microns. With the target locations no longer delivered accurately, the subjects instead perceived the natural colour of the stimulating laser. In the image and video recognition experiments, jittering the microdose target locations reduced the task accuracy to guessing rate.

Ng and colleagues conclude that “Oz represents a new class of experimental platform for vision science and neuroscience [that] will enable diverse new experiments”. They also suggest that the technique could one day help to elicit full colour vision in people with colour blindness.

