Nanocrystals measure tiny forces on tiny length scales

Two independent teams in the US have demonstrated the potential of using the optical properties of nanocrystals to create remote sensors that measure tiny forces on tiny length scales. One team is based at Stanford University and used nanocrystals to measure the micronewton-scale forces exerted by a worm as it chewed bacteria. The other team is based at several institutes and used the photon avalanche effect in nanocrystals to measure sub-nanonewton to micronewton forces. The latter technique could potentially be used to study forces involved in processes such as stem cell differentiation.

Remote sensing of forces at small scales is challenging, especially inside living organisms. Optical tweezers cannot make remote measurements inside the body, while fluorophores – molecules that absorb and re-emit light – can measure forces in organisms, but have limited range, problematic stability or, in the case of quantum dots, toxicity. Nanocrystals with optical properties that change when subjected to external forces offer a way forward.

At Stanford, materials scientist Jennifer Dionne led a team that used nanocrystals doped with ytterbium and erbium. When two ytterbium atoms absorb near-infrared photons, they can then transfer energy to a nearby erbium atom. In this excited state, the erbium can either decay directly to its lowest energy state by emitting red light, or become excited to an even higher-energy state that decays by emitting green light. These processes are called upconversion.

Colour change

The ratio of green to red emission depends on the separation between the ytterbium and erbium atoms, and on the separation between the erbium atoms themselves, explains Dionne’s PhD student Jason Casar, lead author of a paper describing the Stanford research. Forces on the nanocrystal change these separations and therefore shift that ratio.

The researchers encased their nanocrystals in polystyrene vessels approximately the size of an E. coli bacterium. They then mixed the encased nanoparticles with E. coli bacteria, which were fed to tiny nematode worms. To extract the nutrients, the worm’s pharynx needs to break open the bacterial cell wall. “The biological question we set out to answer is how much force is the bacterium generating to achieve that breakage?” explains Stanford’s Miriam Goodman.

The researchers shone near-infrared light on the worms, allowing them to monitor the flow of the nanocrystals. By measuring the colour of the emitted light when the particles reached the pharynx, they determined the force it exerted with micronewton-scale precision.
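The ratiometric read-out can be sketched in a few lines of code. Note that the calibration numbers below are invented placeholders, not values from the Stanford paper; in practice the curve relating colour ratio to force would be measured by applying known forces to the nanocrystals.

```python
import numpy as np

# Hypothetical calibration: green/red emission ratio versus applied force.
# These pairs are illustrative placeholders, not data from the paper.
calib_force_uN = np.array([0.0, 5.0, 10.0, 15.0, 20.0])   # force, micronewtons
calib_ratio    = np.array([1.00, 0.92, 0.85, 0.79, 0.74])  # green/red ratio

def force_from_ratio(green_counts, red_counts):
    """Estimate the force on a nanocrystal from its measured emission
    intensities by interpolating the (decreasing) calibration curve."""
    ratio = green_counts / red_counts
    # np.interp needs increasing x values, so reverse both arrays
    return float(np.interp(ratio, calib_ratio[::-1], calib_force_uN[::-1]))

print(force_from_ratio(850.0, 1000.0))  # ratio 0.85 -> 10 uN on this curve
```

Because the measurement is a ratio of two intensities from the same particle, it is insensitive to how many nanocrystals happen to be in view or how bright the excitation is, which is what makes remote read-out inside a living animal feasible.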

Meanwhile, a collaboration of scientists at Columbia University, Lawrence Berkeley National Laboratory and elsewhere has shown that a process called photon avalanche can be used to measure even smaller forces on nanocrystals. The team’s avalanching nanoparticles (ANPs) are sodium yttrium fluoride nanocrystals doped with thulium – and were discovered by the team in 2021.

The fun starts here

The sensing process uses a laser tuned off-resonance from any transition from the ground state of the ANP. “We’re bathing our particles in 1064 nm light,” explains James Schuck of Columbia University, whose group led the research. “If the intensity is low, that all just blows by. But if, for some reason, you do eventually get some absorption – maybe a non-resonant absorption in which you give up a few phonons…then the fun starts. Our laser is resonant with an excited state transition, so you can absorb another photon.”

This creates a doubly excited state that can decay radiatively straight to the ground state, producing an upconverted photon. Alternatively, its energy can be transferred to a nearby thulium atom, which then becomes resonant with the excited-state transition and can in turn bring more thulium atoms into resonance with the laser. “That’s the avalanche,” says Schuck. “We find on average you get 30 or 40 of these events – it’s analogous to a chain reaction in nuclear fission.”
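Schuck’s chain-reaction analogy maps naturally onto a branching-process model. The sketch below is purely illustrative: it assumes each excitation triggers a Poisson-distributed number of further excitations with a mean just below one, chosen so that the average avalanche contains roughly the 30–40 events he quotes.

```python
import numpy as np

rng = np.random.default_rng(0)

def avalanche_size(mean_offspring=0.97, max_events=100_000):
    """Total number of excitation events in one avalanche, modelled as a
    subcritical branching process: each excited ion promotes a
    Poisson(mean_offspring) number of neighbours into resonance with the
    laser. The expected total is 1 / (1 - mean_offspring)."""
    total = active = 1
    while active and total < max_events:
        active = int(rng.poisson(mean_offspring, size=active).sum())
        total += active
    return total

sizes = [avalanche_size() for _ in range(5000)]
print(np.mean(sizes))  # hovers around 1/(1 - 0.97), i.e. roughly 33
```

The same model shows why the output is such a sensitive force probe: anything that nudges the mean number of offspring per event (such as compression making non-radiative decay more likely) changes the average avalanche size dramatically, because the gain diverges as the process approaches criticality.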

Now, Schuck and colleagues have shown that the exact number of photons produced in each avalanche decreases when the nanoparticle experiences compressive force. One reason is that the phonon frequencies are raised as the lattice is compressed, making non-radiative decay energetically more favourable.

The thulium-doped nanoparticles decay by emitting either red or near-infrared photons. As the force increases, the red emission dims more quickly, changing the colour of the emitted light. These effects allowed the researchers to measure forces from the sub-nanonewton to the micronewton range – at which point the light output from the nanoparticles became too low to detect.

Not just for forces

Schuck and colleagues are now seeking practical applications of their discovery, and not just for measuring forces.

“We’re discovering that this avalanching process is sensitive to a lot of things,” says Schuck. “If we put these particles in a cell and we’re trying to measure a cellular force gradient, but the cell also happened to change its temperature, that would also affect the brightness of our particles, and we would like to be able to differentiate between those things. We think we know how to do that.”

If the technique could be made to work in a living cell, it could be used to measure tiny forces such as those involved in the extra-cellular matrix that dictate stem cell differentiation.

Andries Meijerink of Utrecht University in the Netherlands believes both teams have done important work that is impressive in different ways: Schuck and colleagues for unveiling a fundamentally new force-sensing technique, and Dionne’s team for demonstrating a remarkable practical application.

However, Meijerink is sceptical that photon avalanching will be useful for sensing in the short term. “It’s a very intricate process,” he says, adding, “There’s a really tricky balance between this first absorption step, which has to be slow and weak, and this resonant absorption”. Nevertheless, he says that researchers are discovering other systems that can avalanche. “I’m convinced that many more systems will be found,” he says.

Both studies are described in Nature.

The post Nanocrystals measure tiny forces on tiny length scales appeared first on Physics World.

NMR technology shows promise in landmine clearance field trials

Novel landmine detectors based on nuclear magnetic resonance (NMR) have passed their first field-trial tests. Built by the Sydney-based company mRead, the devices could speed up the removal of explosives in former war zones. The company tested its prototype detectors in Angola late last year, finding that they could reliably sense explosives buried up to 15 cm underground — the typical depth of a deployed landmine.

Landmines are a problem in many countries recovering from armed conflict. According to NATO, some 110 million landmines are located in 70 countries worldwide, including Cambodia and Bosnia, despite conflict ending in both nations decades ago. Ukraine is currently the world’s most mine-infested country, with contamination making vast swathes of its agricultural land potentially unusable for decades.

Such landmines also continue to kill innocent civilians. According to the Landmine and Cluster Munition Monitor, nearly 2000 people died in landmine incidents in 2023 – double the 2022 figure – and a further 3660 were injured. Over 80% of the casualties were civilians, with children accounting for 37% of deaths.

Humanitarian “deminers”, who are trying to remove these explosives, currently inspect suspected minefields with hand-held metal detectors. These devices use magnetic induction coils that respond to the metal components present in landmines. Unfortunately, they react to every random piece of metal and shrapnel in the soil, leading to high rates of false positives.

“It’s not unreasonable with a metal detector to see 100 false alarms for every mine that you clear,” says Matthew Abercrombie, research and development officer at the HALO Trust, a de-mining charity. “Each of these false alarms, you still have to investigate as if it were a mine.” For every mine excavated, about 50 hours are wasted excavating false positives, meaning that clearing a single minefield can take months or years.

“Landmines make time stand still,” adds HALO Trust research officer Ronan Shenhav. “They can lie silent and invisible in the ground for decades. Once disturbed they kill and maim civilians, as well as valuable livestock, preventing access to schools, roads, and prime agricultural land.”

Hope for the future

One alternative landmine-detecting technology is NMR, which is already widely used to look for underground mineral resources and to scan for drugs at airports. In NMR, atomic nuclei exposed to a strong static magnetic field and a weak oscillating field emit a weak electromagnetic signal. Because the frequency of this signal depends on the molecule’s structure, every chemical compound has a specific electromagnetic fingerprint.
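The origin of the fingerprint can be made concrete with the resonance condition f = γB₀/2π, where γ is the gyromagnetic ratio of the nucleus and B₀ the static field. The sketch below uses standard textbook γ/2π values for the nuclei the article mentions; the field strength is chosen only for illustration, and the exact frequencies a fielded detector listens for depend on its design.

```python
# Larmor resonance frequencies: f = gamma * B0 / (2*pi).
# gamma/2pi values below are standard textbook constants, in MHz per tesla.
GAMMA_OVER_2PI_MHZ_PER_T = {
    "1H": 42.577,
    "23Na": 11.262,
    "35Cl": 4.176,
    "39K": 1.989,
}

def larmor_mhz(nucleus, b0_tesla):
    """Resonance frequency in MHz for a given nucleus in field B0 (tesla)."""
    return GAMMA_OVER_2PI_MHZ_PER_T[nucleus] * b0_tesla

# Each species responds at its own distinct frequency in the same field,
# which is what gives every compound a characteristic spectral signature.
for nucleus in GAMMA_OVER_2PI_MHZ_PER_T:
    print(f"{nucleus}: {larmor_mhz(nucleus, 0.1):.3f} MHz at 0.1 T")
```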

The problem with using it to sniff out landmines is pervasive environmental radio noise, with the electromagnetic signal emitted by the excited molecules being 16 orders of magnitude weaker than that used to trigger the effect. Digital radio transmission, electricity generators and industrial infrastructure all produce noise at the same frequency as the one the detectors are listening for. Even thunderstorms produce a radio hum that can spread across vast distances.

mRead scanner
The handheld detectors developed by mRead emit radio pulses at frequencies between 0.5 and 5 MHz (Courtesy: mRead)

“It’s easier to listen to the Big Bang at the edge of the Universe,” says Nick Cutmore, chief technology officer at mRead. “Because the signal is so small, every interference stops you. That stopped a lot of practical applications of this technique in the past.” Cutmore is part of a team that has been trying to cut the effects of noise since the early 2000s, eventually finding a way to filter out this persistent crackle through a proprietary sensor design.

mRead’s handheld detectors emit radio pulses at frequencies between 0.5 and 5 MHz, much higher than the kilohertz-range frequencies used by conventional metal detectors. The signal elicits a magnetic resonance response in atoms of sodium, potassium and chlorine, which are commonly found in explosives. A sensor inside the detector “listens out” for the particular fingerprint signal, locating a forgotten mine more precisely than is possible with conventional metal detectors.

Given that the detected signal is so small, it has to be amplified, but amplification adds noise of its own. The company says it has found a way to ensure the electronics in the detector do not exacerbate the problem. “Our current handheld system only consumes 40 to 50 W when operating,” says Cutmore. “Previous systems have sometimes operated at a few kilowatts, making them power-hungry and bulky.”

Having tested the prototype detectors in a simulated minefield in Australia in August 2024, mRead engineers have now deployed them in minefields in Angola in cooperation with the HALO Trust. Because the detectors respond directly to the explosive substance, they almost completely eliminated false positives, allowing deminers to double-check locations flagged by metal detectors before time-consuming digging took place.

During the three-week trial, the researchers also detected mines with a low metal content, which are difficult to spot with metal detectors. “Instead of doing 1000 metal detections and finding one mine, we can isolate those detections very quickly before people start digging,” says Cutmore.

Researchers at mRead plan to return to Angola later this year for further tests. They also want to fine-tune their prototypes and begin working on devices that could be produced commercially. “I am tremendously excited by the results of these trials,” says James Cowan, chief executive officer of the HALO Trust. “With over two million landmines laid in Ukraine since 2022, landmine clearance needs to be faster, safer, and smarter.”

Vacuum expertise enables physics research

Whether creating a contaminant-free environment for depositing material or minimizing unwanted collisions in spectrometers and accelerators, vacuum environments are a crucial element of many scientific endeavours. Creating and maintaining very low pressures requires a holistic approach to system design that includes material selection, preparation, and optimization of the vacuum chamber and connection volumes. Measurement strategies also need to be considered across the full range of vacuum to ensure consistent performance and deliver the expected outcomes from the experiment or process.

Developing a vacuum system that achieves the optimal low-pressure conditions for each application, while also controlling the cost and footprint of the system, is a complex balancing act that benefits from specialized expertise in vacuum science and engineering. A committed technology partner with extensive experience of working with customers to design vacuum systems, including those for physics research, can help to define the optimum technologies that will produce the best solution for each application.

Over many years, the technology experts at Agilent have assisted countless customers with configuring and enhancing their vacuum processes. “Our best successes come from collaborations where we take the time to understand the customer’s needs, offer them guidance, and work together to create innovative solutions,” comments John Screech, senior applications engineer at Agilent. “We strive to be a trusted partner rather than just a commercial vendor, ensuring our customers not only have the right tools for their needs, but also the information they need to achieve their goals.”

In his role Screech works with customers from the initial design phase all the way through to installation and troubleshooting. “Many of our customers know they need vacuum, but they don’t have the time or resources to really understand the individual components and how they should be put together,” he says. “We are available to provide full support to help customers create a complete system that performs reliably and meets the requirements of their application.”

In one instance, Screech was able to assist a customer who had been using an older technology to create an ultrahigh-vacuum environment. “Their system was able to produce the vacuum they needed, but it was unreliable and difficult to operate,” he remembers. By identifying the problem and supporting the migration to a modern, simpler technology, Screech helped his customer to achieve the required vacuum conditions, improve uptime and increase throughput.

Agilent collaborates with various systems integrators to create custom vacuum solutions for scientific instruments and processes. Such customized designs must be compact enough to be integrated within the system, while also delivering the required vacuum performance at a cost-effective price point. “Customers trust us to find a practical and reliable solution, and realize that we will be a committed partner over the long term,” says Screech.

Expert partnership yields success

The company also partners with leading space agencies and particle physics laboratories to create customized vacuum solutions for the most demanding applications. For many years, Agilent has supplied high-performance vacuum pumps to CERN, which created the world’s largest vacuum system to prevent unwanted collisions between accelerated particles and residual gas molecules in the Large Hadron Collider.

particle collider
Physics focus: the Large Hadron Collider (Courtesy: Shutterstock/Ralf Juergen Kraft)

When engineering a vacuum solution that meets the exact specifications of the facility, one key consideration is the physical footprint of the equipment. Another is ensuring that the required pumping performance is achieved without introducing any unwanted effects – such as stray magnetic fields – into the highly controlled environment. Agilent vacuum experts have the experience and knowledge to engineer innovative solutions that meet such a complex set of criteria. “These large organizations already have highly skilled vacuum engineers who understand the unique parameters of their system, but even they can benefit from our expertise to transform their requirements into a workable solution,” says Screech.

Agilent also shares its knowledge and experience through various educational opportunities in vacuum technologies, including online webinars and dedicated training courses. The practical aspects of vacuum can be challenging to learn online, so in-person classes emphasize a hands-on approach that allows participants to assemble and characterize rough- and high-vacuum systems. “In our live sessions everyone has the opportunity to bolt a system together, test which configuration will pump down faster, and gain insights into leak detection,” says Screech. “We have students from industry and academia in the classes, and they are always able to share tips and techniques with one another.” Additionally, the company maintains a vacuum community as an online resource, where questions can be posed to experts, and collaboration among users is encouraged.

Agilent recognizes that vacuum is an enabler for scientific research and that creating the ideal vacuum system can be challenging. “Customers can trust Agilent as a technology partner,” says Screech. “We can share our experience and help them create the optimal vacuum system for their needs.”

Wafer mask alignment: Queensgate focuses on the move to 300 mm

Electronic chips are made using photolithography, which involves shining ultraviolet light through a patterned mask and onto a semiconductor wafer. The light activates a photoresist on the surface, which allows the etching of a pattern on the wafer. Through successive iterations of photolithography and the deposition of metals, devices with features as small as a few dozen nanometres are created.

Crucial to this complex manufacturing process is aligning the wafer with successive masks. This must be done in a rapid and repeatable manner, while maintaining nanometre precision throughout the manufacturing process. That’s where Queensgate – part of precision optical and mechanical instrumentation manufacturer Prior Scientific – comes into the picture.

For 45 years, UK-based Queensgate has led the way in the development of nanopositioning technologies. The firm spun out of Imperial College London in 1979 as a supplier of precision instrumentation for astronomy. Its global reputation was sealed when NASA chose Queensgate technology for use on the Space Shuttle and the International Space Station. The company has worked for over two decades with the hard-disk drive-maker Seagate to develop technologies for the rapid inspection of read/write heads during manufacture. Queensgate is also involved in a longstanding collaboration with the UK’s National Physical Laboratory (NPL) to develop nanopositioning technologies that are being used to define international standards of measurement.

Move to larger wafers

The semiconductor industry is in the process of moving from 200 mm to 300 mm wafers – which doubles the number of chips that can be produced from a wafer. Processing the larger and heavier wafers requires a new generation of equipment that can position wafers at nanometre precision.

Queensgate already works with original equipment manufacturers (OEMs) to make optical wafer-inspection systems that are used to identify defects during the processing of 300 mm wafers. Now the company has set its sights on wafer alignment systems. The move to 300 mm wafers offers the company an opportunity to contribute to the development of next-generation alignment systems, says Queensgate product manager Craig Goodman.

Craig Goodman
Craig Goodman (Courtesy: Queensgate)

“The wafers are getting bigger, which puts a bigger strain on the positioning requirements, and we’re here to help solve the problems that’s causing,” explains Goodman. “We are getting lots of inquiries from OEMs about how our technology can be used in the precision positioning of wafers used to produce next-generation high-performance semiconductor devices.”

The move to 300 mm means that fabs need to align wafers that are both larger in area and much heavier. What is more, a much heavier chuck is required to hold a 300 mm wafer during production. This leads to conflicting requirements for a positioning system. It must be accurate over shorter distances as feature sizes shrink, but also be capable of moving a much larger and much heavier wafer and chuck. Today, Queensgate’s wafer stage can handle wafers weighing up to 14 kg while achieving a spatial resolution of 1.5 nm.

Goodman explains that Queensgate’s technology is not used to make large adjustments in the relative alignment of wafer and mask – which is done by longer travel stages using technologies such as air-bearings. Instead, the firm’s nanopositioning systems are used in the final stage of alignment, moving the wafer by less than 1 mm at nanometre precision.

Eliminating noise

Achieving this precision was a huge challenge that Queensgate has overcome by focusing on the sources of noise in its nanopositioning systems. Goodman says that there are two main types of noise that must be minimized. One is external vibration, which can come from a range of environmental sources – even human voices. The other is noise in the electronics that control the nanopositioning system’s piezoelectric actuators.

Goodman explains that noise reduction is achieved through the clever design of the mechanical and electronic systems used for nanopositioning. The positioning stage, for example, must be stiff to reject vibrational noise and notch filters are used to minimize the effect of electronic noise to the sub-nanometre level.
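The electronic side of that noise-rejection strategy can be illustrated with a textbook second-order notch filter, the generic form of the filters mentioned above. The sample rate, notch frequency and pole radius below are illustrative values for rejecting a mains-frequency line, not Queensgate design parameters.

```python
import math

def notch_coeffs(f_notch, fs, r=0.99):
    """Second-order IIR notch filter: transfer-function zeros sit on the
    unit circle at +/- f_notch (complete rejection there), with poles just
    inside at radius r. The closer r is to 1, the narrower the notch."""
    w0 = 2.0 * math.pi * f_notch / fs
    b = [1.0, -2.0 * math.cos(w0), 1.0]
    a = [1.0, -2.0 * r * math.cos(w0), r * r]
    gain = sum(a) / sum(b)          # normalise for unity gain at DC
    return [bi * gain for bi in b], a

def iir_filter(b, a, x):
    """Apply the biquad in direct form II transposed."""
    z1 = z2 = 0.0
    out = []
    for xn in x:
        yn = b[0] * xn + z1
        z1 = b[1] * xn - a[1] * yn + z2
        z2 = b[2] * xn - a[2] * yn
        out.append(yn)
    return out

fs = 2000.0                          # sample rate, Hz
b, a = notch_coeffs(50.0, fs)        # reject a 50 Hz interference line

def rms_after_settling(freq):
    x = [math.sin(2 * math.pi * freq * n / fs) for n in range(4000)]
    y = iir_filter(b, a, x)[2000:]   # discard the filter's transient
    return math.sqrt(sum(v * v for v in y) / len(y))

print(rms_after_settling(5.0))   # slow "motion" signal passes almost untouched
print(rms_after_settling(50.0))  # the interference line is strongly suppressed
```

The design trade-off visible here mirrors the one in the text: a narrower notch (r closer to 1) removes less of the useful signal near the interference frequency, but takes longer to settle, which matters in a positioning loop that must be both quiet and fast.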

Queensgate provides its nanopositioning technology to OEMs, who integrate it within their products – which are then sold to chipmakers. Goodman says that Queensgate works closely with its OEM customers to ensure that the desired specifications are achieved. “A stage or a positioner for 300 mm wafers is a highly customized application of our technologies,” he explains.

While the resulting nanopositioning systems are state of the art, Goodman points out that they will be used in huge facilities that process tens of thousands of wafers per month. “It is our aim and our customer’s aim that Queensgate nanopositioning technologies will be used in the mass manufacture of chips,” says Goodman. This means that the system must be very fast to achieve high throughput. “That is why we are using piezoelectric actuators for the final micron of positioning – they are very fast and very precise.”

Today most chip manufacturing is done in Asia, but there are ongoing efforts to boost production in the US and Europe to ensure secure supplies in the future. Goodman says this trend to semiconductor independence is an important opportunity for Queensgate. “It’s a highly competitive, growing and interesting market to be a part of,” he says.

Setting the scale: the life and work of Anders Celsius

On Christmas Day in 1741, when Swedish scientist Anders Celsius first noted down the temperature in his Uppsala observatory using his own 100-point – or “Centi-grade” – scale, he would have had no idea that this was to be his greatest legacy.

A newly published, engrossing biography – Celsius: a Life and Death by Degrees  – by Ian Hembrow, tells the life story of the man whose name is so well known. The book reveals the broader scope of Celsius’ scientific contributions beyond the famous centigrade scale, as well as highlighting the collaborative nature of scientific endeavours, and drawing parallels to modern scientific challenges such as climate change.

That winter, Celsius, who was at the time in his early 40s, was making repeated measurements of the period of a pendulum – the time it takes for one complete swing back and forth. He could use that to calculate a precise value for the acceleration caused by gravity, and he was expecting to find that value to be very slightly greater in Sweden than at more southern latitudes. That would provide further evidence for the flattening of the Earth at the poles, something that Celsius had already helped establish. But it required great precision in the experimental work, and Celsius was worried that the length (and therefore the period) of the pendulum would vary slightly with temperature. He had started these measurements that summer and now it was winter, which meant he had lit a fire to hopefully match the summer temperatures. But would that suffice?

Throughout his career, Celsius had been a champion of precise measurement, and he knew that temperature readings were often far from precise. He was using a thermometer sent to him by the French astronomer Joseph-Nicolas Delisle, with a design based on the expansion of mercury. That method was promising, but Delisle used a scale that took the boiling point of water and the temperature in the basement of his home in Paris as its two reference points. Celsius was unconvinced by the latter. So he made adaptations (which are still there to be seen in an Uppsala museum), twisting wire around the glass tube at the boiling and freezing points of water, and dividing the length between the two into 100 even steps.

Anders Celsius
Man of scale Anders Celsius, painted by Magnus Bratt. Copy of Olof Arenius’s portrait of Celsius, which is at the Astronomical Observatory in Uppsala, Sweden. (Courtesy: Museum Gustavianum, photograph by Mikael Wallerstedt)

The centigrade scale, later renamed in his honour, was born. In his first recorded readings he found the temperature in the pleasantly heated room to be a little over 80 degrees! Following Delisle’s system – perhaps noting that this would mean he had to do less work with negative numbers – he placed the boiling point at zero on his scale, and the freezing point at 100. It was some years later, after his death, that a scientific consensus flipped the scale on its head to create the version we know so well today.

Hembrow does a great job at placing this moment in the context of the time, and within the context of Celsius’ life. He spends considerable time recounting the scientist’s many other achievements and the milestones of his fascinating life.

The expedition that had established the flattening of the Earth at the poles was the culmination of a four-year grand tour that Celsius had undertaken in his early 30s. Already a professor at Uppsala University, in the town where he had grown up in an academic family, he travelled to Germany, Italy, France and London. There he saw at first hand the great observatories that he had heard of and established links with the people who had built and maintained them.

On his extended travels he became a respected figure in the world of science and so it was no surprise when he was selected to join a French expedition to the Arctic in 1736, led by mathematician Pierre Louis Maupertuis, to measure a degree of latitude. Isaac Newton had died just a few years before and his ideas relating to gravitation were not yet universally accepted. If it could be shown that the distance between two lines of latitude was greater near the poles than on the equator, that would prove Newton right about the shape of the Earth, a key prediction of his theory of gravitation.

After a spell in London equipping themselves with precision instruments, the team started the arduous journey to the Far North. Once there they had to survey the land – a task made challenging by the thick forest and hilly territory. They selected nine mountains to climb with their heavy equipment, felling dozens of trees on each and then creating a sturdy wooden marker on each peak. This allowed them to create a network of triangles stretching north, with each point visible from the two next to it. But they also needed one straight line of known length to complete their calculations. With his local knowledge, Celsius knew that this could only be achieved on the frozen surface of the Torne river – and that it would involve several weeks of living on the ice, working largely in the dark and the intense cold, and sleeping in tents.

After months of hardship, the calculations were complete and showed that the length of one degree of latitude in the Arctic was almost 1.5 km longer than the equivalent value in France. The spheroid shape of the Earth had been established.

Of course, not everybody accepted the result. Politics and personalities got in the way. Hembrow uses this as the starting point for a polemic about aspects of modern science and climate change with which he ends his fine book. He argues that the painstaking work carried out by an international team, willing to share ideas and learn from each other, provides us with a template by which modern problems must be addressed.

Considering how often we use his name, most of us know little about Celsius. This book helps to address that deficit. It is a very enjoyable and accessible read and would appeal, I think, to anybody with an interest in the history of science.

  • 2024 History Press 304pp £25hb

Millions of smartphones monitor Earth’s ever-changing ionosphere

A plan to use millions of smartphones to map out real-time variations in Earth’s ionosphere has been tested by researchers in the US. Developed by Brian Williams and colleagues at Google Research in California, the system could improve the accuracy of global navigation satellite systems (GNSSs) such as GPS and provide new insights into the ionosphere.

A GNSS uses a network of satellites to broadcast radio signals to ground-based receivers. Each receiver calculates its position based on the arrival times of signals from several satellites. These signals first pass through Earth’s ionosphere, which is a layer of weakly-ionized plasma about 50–1500 km above Earth’s surface. As a GNSS signal travels through the ionosphere, it interacts with free electrons and this slows down the signals slightly – an effect that depends on the frequency of the signal.

The problem is that the free electron density is not constant in either time or space. It can spike dramatically during solar storms and it can also be affected by geographical factors such as distance from the equator. The upshot is that variations in free electron density can lead to significant location errors if not accounted for properly.

To deal with this problem, navigation satellites send out two separate signals at different frequencies. These are received by dedicated monitoring stations on Earth’s surface, and the difference between the arrival times of the two frequencies is used to create real-time maps of the free electron density of the ionosphere. Such maps can then be used to correct location errors. However, these monitoring stations are expensive to install and tend to be concentrated in wealthier regions of the world, leaving large gaps in ionosphere maps.
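The dual-frequency trick can be made concrete. To first order, the ionospheric group delay is 40.3·TEC/f² metres (with TEC, the total electron content, in electrons per square metre), so differencing the ranges measured at two frequencies isolates the TEC along the signal path. The sketch below uses the GPS L1 and L5 carrier frequencies; the pseudorange values are synthetic round-trip test data, not real measurements.

```python
# Standard first-order dual-frequency ionospheric combination.
F1 = 1575.42e6  # GPS L1 carrier frequency, Hz
F2 = 1176.45e6  # GPS L5 carrier frequency, Hz
K = 40.3        # ionospheric constant, SI units

def iono_delay_m(tec, freq_hz):
    """First-order ionospheric group delay (metres) at a given frequency."""
    return K * tec / freq_hz**2

def tec_from_pseudoranges(p1_m, p2_m):
    """TEC (electrons/m^2) from pseudoranges measured at F1 and F2, in
    metres. The lower frequency is delayed more, so p2 > p1."""
    return (p2_m - p1_m) * (F1**2 * F2**2) / (K * (F1**2 - F2**2))

# Round-trip check with a typical daytime TEC of 30 TECU
# (1 TECU = 1e16 electrons/m^2); 20,000 km is a nominal satellite range.
tec_true = 30e16
p1 = 20_000_000.0 + iono_delay_m(tec_true, F1)
p2 = 20_000_000.0 + iono_delay_m(tec_true, F2)
print(tec_from_pseudoranges(p1, p2) / 1e16)  # recovers roughly 30 TECU
```

Because the geometric range cancels in the difference p2 − p1, the receiver's exact position does not need to be known to extract the TEC, which is what lets the same measurement serve double duty as an ionosphere probe.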

Dual-frequency sensors

In their study, Williams’ team took advantage of the fact that many modern mobile phones have sensors that detect GNSS signals at two different frequencies. “Instead of thinking of the ionosphere as interfering with GPS positioning, we can flip this on its head and think of the GPS receiver as an instrument to measure the ionosphere,” Williams explains. “By combining the sensor measurements from millions of phones, we create a detailed view of the ionosphere that wouldn’t otherwise be possible.”

This is not a simple task, however, because individual smartphones are not designed for mapping the ionosphere. Their antennas are much less efficient than those of dedicated monitoring stations and the signals that smartphones receive are often distorted by surrounding buildings – and even users’ bodies. Also, these measurements are affected by the design of the phone and its GNSS hardware.

The big benefit of using smartphones is that their ownership is ubiquitous across the globe – including in developing regions such as India, Africa, and Southeast Asia. “In these parts of the world, there are still very few dedicated scientific monitoring stations that are being used by scientists to generate ionosphere maps,” says Williams. “Phone measurements provide a view of parts of the ionosphere that isn’t otherwise possible.”

The team’s proposal involves creating a worldwide network comprising millions of smartphones that will each carry out error correction measurements using the dual-frequency signals from GNSS satellites. Although each individual measurement will be relatively poor, the large number of measurements can be used to improve the overall accuracy of the map.

Simultaneous calibration

“By combining measurements from many phones, we can simultaneously calibrate the individual sensors and produce a map of ionosphere conditions, leading to improved location accuracy, and a better understanding of this important part of the Earth’s atmosphere,” Williams explains.

In their initial tests of the system, the researchers aggregated ionosphere measurements from millions of Android devices around the world. Crucially, there was no need to identify individual devices contributing to the study – ensuring the privacy and security of users.

Williams’ team was able to map a diverse array of variations in Earth’s ionosphere. These included plasma bubbles over India and South America; the effects of a small solar storm over North America; and a depletion in free electron density over Europe. These observations doubled the coverage area of existing maps and improved resolution compared with maps made using data from monitoring stations.

If such a smartphone-based network is rolled out, ionosphere-related location errors could be reduced by several metres – which would be a significant advantage to smartphone users.

“For example, devices could differentiate between a highway and a parallel rugged frontage road,” Williams predicts. “This could ensure that dispatchers send the appropriate first responders to the correct place and provide help more quickly.”

The research is described in Nature.

The post Millions of smartphones monitor Earth’s ever-changing ionosphere appeared first on Physics World.

On the proper use of a Warburg impedance

By: No Author

Recent battery papers commonly employ interpretation models for which diffusion impedances are in series with interfacial impedance. The models are fundamentally flawed because the diffusion impedance should be part of the interfacial impedance. A general approach is presented that shows how the charge-transfer resistance and diffusion resistance are functions of the concentration of reacting species at the electrode surface. The resulting impedance model incorporates diffusion impedances as part of the interfacial impedance.
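The distinction can be made concrete with a minimal numerical sketch for a Randles-type cell, using hypothetical parameter values (not taken from the talk): in the flawed model the semi-infinite Warburg impedance is added in series after the interfacial loop, whereas in the proper model it sits inside the faradaic branch that the double-layer capacitance shunts.

```python
import numpy as np

# Hypothetical illustrative parameters
R_e = 10.0      # electrolyte (ohmic) resistance, ohm
R_ct = 50.0     # charge-transfer resistance, ohm
C_dl = 20e-6    # double-layer capacitance, F
sigma = 30.0    # Warburg coefficient, ohm s^-1/2

def z_warburg(omega):
    """Semi-infinite Warburg impedance Z_W = sigma*(1 - 1j)/sqrt(omega)."""
    return sigma * (1 - 1j) / np.sqrt(omega)

def z_series(omega):
    """Flawed model: diffusion impedance in series with the interfacial loop."""
    z_int = 1 / (1j * omega * C_dl + 1 / R_ct)
    return R_e + z_int + z_warburg(omega)

def z_nested(omega):
    """Diffusion impedance as part of the interfacial impedance: the
    double-layer capacitance shunts the faradaic branch R_ct + Z_W."""
    z_far = R_ct + z_warburg(omega)
    return R_e + 1 / (1j * omega * C_dl + 1 / z_far)

# Both models converge to R_e + R_ct + Z_W at low frequency and to R_e at
# high frequency, but differ in the intermediate range where the
# double-layer and diffusion time constants overlap.
omega = np.logspace(-2, 5, 400) * 2 * np.pi
```

Plotting both over a Nyquist plane would show the mid-frequency discrepancy that makes fitted parameters from the series model unreliable.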

A Q&A session follows the presentation.

Mark Orazem

Mark Orazem obtained his BS and MS degrees from Kansas State University and his PhD in 1983 from the University of California, Berkeley. In 1983, he began his career as assistant professor at the University of Virginia, and in 1988 joined the faculty of the University of Florida, where he is Distinguished Professor of Chemical Engineering and Associate Chair for Graduate Studies. Mark is a fellow of The Electrochemical Society, International Society of Electrochemistry, and American Association for the Advancement of Science. He served as President of the International Society of Electrochemistry and co-authored, with Bernard Tribollet of the Centre national de la recherche scientifique (CNRS), the textbook entitled Electrochemical Impedance Spectroscopy, now in its second edition. Mark received the ECS Henry B. Linford Award, ECS Corrosion Division H. H. Uhlig Award, and with co-author Bernard Tribollet, the 2019 Claude Gabrielli Award for contributions to electrochemical impedance spectroscopy. In addition to writing books, he has taught short courses on impedance spectroscopy for The Electrochemical Society since 2000.


The Electrochemical Society

The post On the proper use of a Warburg impedance appeared first on Physics World.
