The White House is withdrawing the nomination of Jared Isaacman to be administrator of NASA, throwing an agency already reeling from proposed massive budget cuts into further disarray.
The post White House to withdraw Isaacman nomination to lead NASA appeared first on SpaceNews.
Blue Origin sent six people to space on a suborbital spaceflight May 31 that the company’s chief executive says is both a good business and a way to test technology.
The post Blue Origin performs 12th crewed New Shepard suborbital flight appeared first on SpaceNews.
Firm evidence of Majorana bound states in quantum dots has been reported by researchers in the Netherlands. Majorana modes appeared at both edges of a quantum-dot chain while an energy gap suppressed them in the centre, and the experiment could allow researchers to investigate the unique properties of these states in unprecedented detail. This could bring topologically protected quantum bits (qubits) for quantum computing one step closer.
Majorana fermions were first proposed in 1937 by the Italian physicist Ettore Majorana. They were imagined as elementary particles that would be their own antiparticles. However, such elementary particles have never been definitively observed. Instead, physicists have worked to create Majorana quasiparticles (particle-like collective excitations) in condensed matter systems.
In 2001, the theoretical physicist Alexei Kitaev, then at Microsoft Research, proposed that “Majorana bound states” could be produced in nanowires made of topological superconductors. The Majorana quasiparticle would exist as a single nonlocal mode at either end of a wire, with zero weight in the centre. Both ends would be constrained by the laws of physics to remain identical despite being spatially separated. This phenomenon could produce “topological qubits” robust to local disturbance.
Microsoft and others continue to research Majorana modes using this platform to this day. Multiple groups claim to have observed them, but this remains controversial. “It’s still a matter of debate in these extended 1D systems: have people seen them? Have they not seen them?” says Srijit Goswami of QuTech in Delft.
In 2012, theoretical physicists Jay Sau, then of Harvard University, and Sankar Das Sarma of the University of Maryland proposed looking for Majorana bound states in quantum dots. “We looked at [the nanowires] and thought ‘OK, this is going to be a while given the amount of disorder that system has – what are the ways this disorder could be controlled?’ and this is exactly one of the ways we thought it could work,” explains Sau. The research was not taken seriously at the time, however, Sau says, partly because people underestimated the problem of disorder.
Goswami and others have previously observed “poor man’s Majoranas” (PMMs) in two quantum dots. While they share some properties with Majorana modes, PMMs lack topological protection. Last year the group coupled two spin-polarized quantum dots connected by a semiconductor–superconductor hybrid material. At specific points, the researchers found zero-bias conductance peaks.
“Kitaev says that if you tune things exactly right you have one Majorana on one dot and another Majorana on another dot,” says Sau. “But if you’re slightly off then they’re talking to each other. So it’s an uncomfortable notion that they’re spatially separated if you just have two dots next to each other.”
Recently, a group that included Goswamiâs colleagues at QuTech found that the introduction of a third quantum dot stabilized the Majorana modes. However, they were unable to measure the energy levels in the quantum dots.
In new work, Goswami’s team used systems of three electrostatically gated, spin-polarized quantum dots in a 2D electron gas joined by hybrid semiconductor–superconductor regions. The quantum dots had to be tuned to zero energy. The dots exchanged charge in two ways: by standard electron hopping through the semiconductor and by Cooper-pair-mediated coupling through the superconductor.
“You have to change the energy level of the superconductor–semiconductor hybrid region so that these two processes have equal probability,” explains Goswami. “Once you satisfy these conditions, then you get Majoranas at the ends.”
Besides improved topological protection, the addition of a third quantum dot provided the team with crucial physical insight. “Topology is actually a property of a bulk system,” Goswami explains. “Something special happens in the bulk which gives rise to things happening at the edges. Majoranas are something that emerge on the edges because of something happening in the bulk.” With three quantum dots, there is a well-defined bulk and edge that can be probed separately: “We see that when you have what is called a gap in the bulk your Majoranas are protected, but if you don’t have that gap your Majoranas are not protected,” Goswami says.
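The tuning condition Goswami describes is the “sweet spot” of Kitaev’s model, where electron hopping (amplitude t) and superconducting pairing (amplitude Δ) balance while the dots sit at zero energy. A minimal numerical sketch (a toy Bogoliubov–de Gennes calculation with illustrative parameters, not the experiment’s actual Hamiltonian) shows zero-energy modes appearing in a three-site chain at the sweet spot and splitting once the dots are detuned:

```python
import numpy as np

def kitaev_bdg(n_sites, t, delta, mu):
    """Bogoliubov-de Gennes matrix of an n-site Kitaev chain."""
    h = -mu * np.eye(n_sites)          # on-site (dot) energies
    d = np.zeros((n_sites, n_sites))   # p-wave pairing, antisymmetric
    for i in range(n_sites - 1):
        h[i, i + 1] = h[i + 1, i] = -t
        d[i, i + 1], d[i + 1, i] = delta, -delta
    # Basis (c_1..c_n, c_1^dag..c_n^dag); h real symmetric, d real antisymmetric
    return np.block([[h, d], [-d, -h]])

# "Sweet spot": hopping and pairing equally probable, dots at zero energy
E_tuned = np.linalg.eigvalsh(kitaev_bdg(3, t=1.0, delta=1.0, mu=0.0))
print(np.round(E_tuned, 6))   # two exact zero modes sit inside the gap

# Detune the dot energies: the zero modes split and protection is lost
E_detuned = np.linalg.eigvalsh(kitaev_bdg(3, t=1.0, delta=1.0, mu=0.5))
print(min(abs(E_detuned)))    # no longer zero
```

The splitting of the detuned modes grows rapidly with detuning, which is the numerical counterpart of the “two dots next to each other talking to each other” problem Sau describes.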
To produce a qubit will require more work to achieve the controllable coupling of four Majorana bound states and the integration of a readout circuit to detect this coupling. In the near-term, the researchers are investigating other phenomena, such as the potential to swap Majorana bound states.
Sau is now at the University of Maryland and says that an important benefit of the experimental platform is that it can be determined unambiguously whether or not Majorana bound states have been observed. “You can literally put a theory simulation next to the experiment and they look very similar.”
The research is published in Nature.
The post Majorana bound states spotted in system of three quantum dots appeared first on Physics World.
NASA released more details about its proposed fiscal year 2026 budget May 30, canceling dozens of science missions and cutting thousands of jobs.
The post NASA budget would cancel dozens of science missions, lay off thousands appeared first on SpaceNews.
GPS III SV-08, built by Lockheed Martin, is the eighth of 10 GPS III spacecraft.
The post SpaceX launches latest GPS III satellite for U.S. military appeared first on SpaceNews.
The Leinweber Foundation has awarded five US institutions $90m to create their own theoretical research institutes. The investment, which the foundation says is the largest ever for theoretical physics research, will be used to fund graduate students and postdocs at each institute as well as several Leinweber Physics Fellows.
The Leinweber Foundation was founded in 2015 by the software entrepreneur Larry Leinweber. In 1982 Leinweber founded the software company New World Systems Corporation, which provided software to the emergency services. In 2015 he sold the company to Tyler Technologies for $670m.
Based in Michigan, the Leinweber Foundation supports research, education and community endeavours, and has provided Leinweber Software Scholarships to undergraduates at Michigan’s universities.
A Leinweber Institute for Theoretical Physics (LITP) will now be created at the universities of California, Berkeley, Chicago and Michigan as well as at the Massachusetts Institute of Technology (MIT) and at Princeton’s Institute for Advanced Study (IAS), where the institute will instead be named the Leinweber Forum for Theoretical and Quantum Physics.
The MIT LITP, initially led by Washington Taylor before physicist Tracy Slatyer takes over later this year, will receive $20m from the foundation and will provide support for six postdocs and six graduate students, as well as visitors, seminars and “other scholarly activities”.
“This landmark endowment from the Leinweber Foundation will enable us to support the best graduate students and postdoctoral researchers to develop their own independent research programmes and to connect with other researchers in the Leinweber Institute network,” says Taylor.
UC Berkeley, meanwhile, will receive $14.4m from the foundation, under which the existing Berkeley Center for Theoretical Physics (BCTP) will be renamed the LITP at Berkeley, led by physicist Yasunori Nomura.
The money will be used for four postdoc positions to join the existing 15 at the BCTP, as well as to support graduate students and visitors. “This is transformative,” notes Nomura. “The gift will really have a huge impact on a wide range of research at Berkeley, including particle physics, quantum gravity, quantum information, condensed matter physics and cosmology.”
Chicago will receive $18.4m, with the existing Kadanoff Center for Theoretical Physics merging into a new LITP at the University of Chicago led by physicist Dam Thanh Son.
The remaining $37.2m will be split between the Leinweber Forum for Theoretical and Quantum Physics at the IAS and Michigan, where the existing Leinweber Center for Theoretical Physics will expand and become an institute.
“Theoretical physics may seem abstract to many, but it is the tip of the spear for innovation. It fuels our understanding of how the world works and opens the door to new technologies that can shape society for generations,” says Leinweber in a statement. “As someone who has had a lifelong fascination with theoretical physics, I hope this investment not only strengthens U.S. leadership in basic science, but also inspires curiosity, creativity, and groundbreaking discoveries for generations to come.”
The post Leinweber Foundation ploughs $90m into US theoretical physics appeared first on Physics World.
Despite a surge of interest in Europe in establishing autonomy in space systems, skepticism remains about one of the biggest efforts along those lines.
The post Skepticism lingers about cost and business case for IRIS² appeared first on SpaceNews.
China has launched its first mission to retrieve samples from an asteroid. The Tianwen-2 mission lifted off at 1:31 a.m. local time on 28 May from the Xichang Satellite Launch Center in southwest China aboard a Long March 3B rocket.
Tianwen-2’s target is a small near-Earth asteroid called 469219 Kamoʻoalewa, which is between 15 and 39 million km away and is known as a “quasi-satellite” of Earth.
The mission is set to reach the body, which is between 40 and 100 m wide, in July 2026, where it will first study it up close using a suite of 11 instruments including cameras, spectrometers and radar, before aiming to collect about 100 g of material.
This will be achieved via one of three possible methods. One is hovering close to the asteroid; another uses a robotic arm to collect samples from the body; a third, dubbed “touch and go”, involves gently landing on the asteroid and using drills at the end of each leg to retrieve material.
The collected samples will then be stored in a module that is released and returned to Earth in November 2027. If successful, it will make China the third nation to retrieve asteroid material behind the US and Japan.
The second part of the 10-year mission involves using Earth for a gravitational swing-by before spending six years travelling to another target: 311P/PanSTARRS. The body lies in the main asteroid belt between Mars and Jupiter and at its closest is about 140 million km from Earth.
The 480 m-wide object, which was discovered in 2013, has six dust tails and has characteristics of both asteroids and comets. Tianwen-2 will not land on 311P/PanSTARRS but instead use its instruments to study the “active asteroid” from a distance.
Tianwen-2’s predecessor, Tianwen-1, was China’s first mission to Mars; following a six-month journey it successfully landed on Utopia Planitia, a largely flat impact basin that is scientifically interesting for the potential water-ice underneath.
China’s third interplanetary mission, Tianwen-3, will aim to retrieve samples from Mars and could launch as soon as 2028. If successful, it would make China the first country to achieve the feat.
The post China launches Tianwen-2 asteroid sample-return mission appeared first on Physics World.
Researchers in China have adapted the interlocking structure of mortise-and-tenon joints – used by woodworkers around the world since ancient times – to the design of nanoscale devices known as memristors. The new devices are far more uniform than previous such structures, and the researchers say they could be ideal for scientific computing applications.
The memory-resistor, or “memristor”, was described theoretically at the beginning of the 1970s, but the first practical version was not built until 2008. Unlike that of a standard resistor, the resistance of a memristor changes depending on the current previously applied to it – hence the “memory” in its name. This means that a desired resistance can be programmed into the device and subsequently stored. Importantly, the remembered resistive state persists even when the power is switched off.
Thanks to numerous technical advances since 2008, memristors can now be integrated onto chips in large numbers. They are also capable of processing large amounts of data in parallel, meaning they could be ideal for emerging “in-memory” computing technologies that require calculations known as large-scale matrix-vector multiplications (MVMs). Many such calculations involve solving partial differential equations (PDEs), which are used to model complex behaviour in fields such as weather forecasting, fluid dynamics and astrophysics, to name but a few.
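The connection between memristor arrays and MVMs is direct: in a crossbar, each device is programmed to a conductance G_ij, and by Ohm’s and Kirchhoff’s laws a voltage vector V applied to the columns produces row currents I_i = Σ_j G_ij V_j – the full matrix–vector product in a single analogue step. A sketch with idealized devices (all conductance and voltage values here are illustrative, not from the paper):

```python
import numpy as np

# Conductances programmed into a 3x3 memristor crossbar (siemens; illustrative)
G = np.array([[1.0, 0.5, 0.2],
              [0.3, 1.2, 0.7],
              [0.9, 0.1, 0.4]]) * 1e-3

V = np.array([0.2, 0.1, 0.3])   # read voltages applied to the columns (volts)

# Kirchhoff's current law: each row wire sums I_i = sum_j G_ij * V_j,
# so the crossbar physically computes the matrix-vector product in one step
I = G @ V
print(I)                         # row currents in amperes

# Device-to-device variation corrupts the programmed matrix -- the
# uniformity problem the mortise-and-tenon design targets (spread illustrative)
rng = np.random.default_rng(0)
G_noisy = G * rng.normal(1.0, 0.3, G.shape)
print(np.linalg.norm(G_noisy @ V - I) / np.linalg.norm(I))  # relative error
```

The last two lines show why uniformity matters: any spread in programmed conductances translates directly into error in the analogue product.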
One remaining hurdle, however, is that it is hard to make memristors with uniform characteristics. The electronic properties of devices containing multiple memristors can therefore vary considerably, which adversely affects the computational accuracy of large-scale arrays.
Physicists co-led by Shi-Jun Liang and Feng Miao of Nanjing University’s School of Physics say they have now overcome this problem by designing a memristor that uses a mortise-tenon-shaped (MTS) architecture. Humans have been using these strong and stable structures in wooden furniture for thousands of years, with one of the earliest examples dating back to the Hemudu culture in China 7000 years ago.
Liang, Miao and colleagues created the mortise part of their structure by using plasma etching to create a hole within a nanosized layer of hexagonal boron nitride (h-BN). They then constructed a tenon in a top electrode made of tantalum (Ta) that precisely matches the mortise. This ensures that the electrode directly contacts the device’s switching layer (which is made from HfO2) only in the designated region. A bottom electrode completes the device.
The new architecture ensures highly uniform switching within the designated mortise-and-tenon region, resulting in a localized path for electronic conduction. “The result is a memristor with exceptional fundamental properties across three key metrics,” Miao tells Physics World. “These are: high endurance (more than 10⁹ cycles), long-term stable memory retention (over 10⁴ s) and a fast switching speed of around 4.2 ns.”
The cycle-to-cycle variation of the low-resistance state (LRS) was also reduced, from 30.3% for a traditional memristor to 2.5% for the MTS architecture, while that of the high-resistance state (HRS) fell from 62.4% to 27.2%.
To test their device, the researchers built a PDE solver with it. They found that their new MTS memristor could solve the Poisson equation five times faster than a conventional memristor based on HfO2 without h-BN.
The new technique, which is detailed in Science Advances, is a promising strategy for developing high-uniformity memristors, and could pave the way for high-accuracy, energy-efficient scientific computing platforms, Liang claims. âWe are now looking to develop large-scale integration of our MTS device and make a prototype system,â he says.
The post Ancient woodworking technique inspires improved memristor appeared first on Physics World.
On April 28, Spain experienced one of the most extensive power outages in recent memory. Millions of citizens and businesses were suddenly cut off, revealing how unprepared even developed nations […]
The post When Earth fails, space responds appeared first on SpaceNews.
A new contact lens enables humans to see near-infrared light without night vision goggles or other bulky equipment. The lens, which incorporates metallic nanoparticles that “upconvert” normally invisible wavelengths into visible ones, could have applications for rescue workers and others who would benefit from enhanced vision in conditions with poor visibility.
The infrared (IR) part of the electromagnetic spectrum encompasses light with wavelengths between 700 nm and 1 mm. Human eyes cannot normally detect these wavelengths because opsins, the light-sensitive protein molecules that allow us to see, do not have the required thermodynamic properties. This means we see only a small fraction of the electromagnetic spectrum, typically between 400 and 700 nm.
While devices such as night vision goggles and infrared-visible converters can extend this range, they require external power sources. They also cannot distinguish between different wavelengths of IR light.
In previous work, researchers led by neuroscientist Tian Xue of the University of Science and Technology of China (USTC) injected photoreceptor-binding nanoparticles into the retinas of mice. While this technique was effective, it is too invasive and risky for human volunteers. In the new study, therefore, Xue and colleagues integrated the nanoparticles into biocompatible polymeric materials similar to those used in standard soft contact lenses.
The nanoparticles in the lenses are made from Au/NaGdF4: Yb3+, Er3+ and have a diameter of approximately 45 nm each. They work by capturing photons with lower energies (longer wavelengths) and re-emitting them as photons with higher energies (shorter wavelengths). This process is known as upconversion and the emitted light is said to be anti-Stokes shifted.
When the researchers tested the new upconverting contact lenses (UCLs) on mice, the rodents’ behaviour suggested they could sense IR wavelengths. For example, when given a choice between a dark box and an IR-illuminated one, the lens-wearing mice scurried into the dark box. In contrast, a control group of mice not wearing lenses showed no preference for one box over the other. The pupils of the lens-wearing mice also constricted when exposed to IR light, and brain imaging revealed that processing centres in their visual cortex were activated.
The team then moved on to human volunteers. “In humans, the near-infrared UCLs enabled participants to accurately detect flashing Morse code-like signals and perceive the incoming direction of near-infrared (NIR) light,” Xue says, referring to light at wavelengths between 800 and 1600 nm. Counterintuitively, the flashing images appeared even clearer when the volunteers closed their eyes – probably because IR light is better than visible light at penetrating biological tissue such as eyelids. Importantly, Xue notes that wearing the lenses did not affect participants’ normal vision.
The team also developed a wearable system with built-in flat UCLs. This system allowed volunteers to distinguish between patterns such as horizontal and vertical lines; S and O shapes; and triangles and squares.
But Xue and colleagues did not stop there. By replacing the upconverting nanoparticles with trichromatic orthogonal ones, they succeeded in converting NIR light into three different spectral bands. For example, they converted infrared wavelengths of 808, 980 and 1532 nm into 540, 450 and 650 nm respectively – wavelengths that humans perceive as green, blue and red.
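Simple energy bookkeeping shows why upconversion must pool photons: each photon carries E = hc/λ ≈ 1240 eV·nm / λ, so producing one output photon at a shorter wavelength requires at least ⌈λ_in/λ_out⌉ absorbed input photons. A quick check of the three conversions above (a lower bound from energy conservation only – the real lanthanide upconversion pathways proceed through specific intermediate levels):

```python
from math import ceil

HC_EV_NM = 1239.84  # Planck constant x speed of light, in eV*nm

def photon_energy_ev(wavelength_nm):
    """Photon energy in electronvolts for a wavelength in nanometres."""
    return HC_EV_NM / wavelength_nm

# NIR input -> visible output pairs reported for the trichromatic nanoparticles
for lam_in, lam_out in [(808, 540), (980, 450), (1532, 650)]:
    # Energy conservation: n absorbed photons must carry >= one output photon
    n_min = ceil(photon_energy_ev(lam_out) / photon_energy_ev(lam_in))
    print(f"{lam_in} nm -> {lam_out} nm: "
          f"{photon_energy_ev(lam_in):.2f} eV -> {photon_energy_ev(lam_out):.2f} eV, "
          f"needs at least {n_min} absorbed photons")
```

This also explains why the lenses currently need relatively bright IR sources: several NIR photons must arrive at the same nanoparticle to yield one visible photon.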
“As well as allowing wearers to garner more detail within the infrared spectrum, this technology could also help colour-blind individuals see wavelengths they would otherwise be unable to detect by appropriately adjusting the absorption spectrum,” Xue tells Physics World.
According to the USTC researchers, who report their work in Cell, the devices could have several other applications. Apart from providing humans with night vision and offering an adaptation for colour blindness, the lenses could also give wearers better vision in foggy or dusty conditions.
At present, the devices only work with relatively bright IR emissions (the study used LEDs). However, the researchers hope to increase the photosensitivity of the nanoparticles so that lower levels of light can trigger the upconversion process.
The post New contact lenses allow wearers to see in the near-infrared appeared first on Physics World.
The University of Colorado, Boulder, is preparing to announce the establishment of the Colorado Space Policy Center (CSPC).
The post University of Colorado, Boulder to announce new space policy center appeared first on SpaceNews.
Astronstone, one of Chinaâs newest commercial launch startups, has raised early-stage funding for a stainless steel, reusable launch vehicle modeled on SpaceXâs Starship system.
The post Chinaâs Astronstone raises early funding for stainless steel rocket with âchopstickâ recovery appeared first on SpaceNews.
Powerful flares on highly magnetic neutron stars called magnetars could produce up to 10% of the universe’s gold, silver and platinum, according to a new study. What is more, astronomers may have already observed this cosmic alchemy in action.
Gold, silver, platinum and a host of other rare heavy nuclei are known as rapid neutron-capture process (r-process) elements, because astronomers believe they are produced by the rapid capture of neutrons by lighter nuclei. Neutrons can only exist outside an atomic nucleus for about 15 min before decaying (except in the most extreme environments). This means that the r-process must be fast and take place in environments rich in free neutrons.
In August 2017, an explosion resulting from the merger of two neutron stars was witnessed by telescopes operating across the electromagnetic spectrum and by gravitational-wave detectors. Dubbed a kilonova, the explosion produced approximately 16,000 Earth-masses worth of r-process elements, including about ten Earth masses of gold and platinum.
While the observations seem to answer the question of where precious metals came from, there remains a suspicion that neutron-star mergers cannot explain the entire abundance of r-process elements in the universe.
Now researchers led by Anirudh Patel, who is a PhD student at New York’s Columbia University, have created a model that describes how flares on the surface of magnetars can create r-process elements.
Patel tells Physics World that “the rate of giant flares is significantly greater than mergers”. However, given that one merger “produces roughly 10,000 times more r-process mass than a single magnetar flare”, neutron-star mergers are still the dominant factory of rare heavy elements.
A magnetar is an extreme type of neutron star with a magnetic field strength of up to a thousand trillion gauss. This makes magnetars the most magnetic objects in the universe. Indeed, if a magnetar were as close to Earth as the Moon, its magnetic field would wipe the data from your credit card.
Astrophysicists believe that when a magnetar’s powerful magnetic fields are pulled taut, the built-up magnetic tension will inevitably be released in a snap. This would result in a flare: an energetic ejection of neutron-rich material from the magnetar’s surface.
However, the physics isn’t entirely understood, according to Jakub Cehula of Charles University in the Czech Republic, who is a member of Patel’s team. “While the source of energy for a magnetar’s giant flares is generally agreed to be the magnetic field, the exact mechanism by which this energy is released is not fully understood,” he explains.
One possible mechanism is magnetic reconnection, which creates flares on the Sun. Flares could also be produced by energy released during starquakes following a build-up of magnetic stress. However, neither satisfactorily explains the giant flares, of which only nine have thus far been detected.
In 2024 Cehula led research that attempted to explain the flares by combining starquakes with magnetic reconnection. “We assumed that giant flares are powered by a sudden and total dissipation of the magnetic field right above a magnetar’s surface,” says Cehula.
This sudden release of energy drives a shockwave into the magnetar’s neutron-rich crust, blasting a portion of it into space at velocities greater than a tenth of the speed of light, where in theory heavy elements are formed via the r-process.
Remarkably, astronomers may have already witnessed this in 2004, when a giant magnetar flare was spotted as a half-second gamma-ray burst that released more energy than the Sun does in a million years. What happened next remained unexplained until now. Ten minutes after the initial burst, the European Space Agency’s INTEGRAL satellite detected a second, weaker signal that was not understood.
Now, Patel and colleagues have shown that the r-process in this flare created unstable isotopes that quickly decayed into stable heavy elements, creating the gamma-ray signal.
Patel calculates that the 2004 flare resulted in the creation of two million billion billion kilograms of r-process elements, equivalent to about the mass of Mars.
Extrapolating, Patel calculates that giant flares on magnetars contribute between 1 and 10% of all the r-process elements in the universe.
“This estimate accounts for the fact that these giant flares are rare,” he says. “But it’s also important to note that magnetars have lifetimes of 1000 to 10,000 years, so while there may only be a couple of dozen magnetars known to us today, there have been many more magnetars that have lived and died over the course of the 13 billion-year history of our galaxy.”
Magnetars would have been produced early in the universe by the supernovae of massive stars, whereas it can take a billion years or longer for two neutron stars to merge. Hence, magnetars would have been a more dominant source of r-process elements in the early universe. However, they may not have been the only source.
“If I had to bet, I would say there are other environments in which r-process elements can be produced, for example in certain rare types of core-collapse supernovae,” says Patel.
Either way, it means that some of the gold and silver in your jewellery was forged in the violence of immense magnetic fields snapping on a dead star.
The research is described in Astrophysical Journal Letters.
The post How magnetar flares give birth to gold and platinum appeared first on Physics World.
“We’re not done,” Rocket Lab CEO Peter Beck told SpaceNews.
The post With strategic acquisitions, Rocket Lab pursues prime defense contractor status appeared first on SpaceNews.
China carried out its ninth launch of the month early Thursday, sending the secretive Shijian-26 spacecraft into orbit.
The post China launches classified Shijian-26 satellite with Long March 4B rocket appeared first on SpaceNews.
Northrop Grumman is investing $50 million into Firefly Aerospace to further development of a medium-lift launch vehicle with a new name.
The post Northrop invests $50 million into Firefly for launch vehicle development appeared first on SpaceNews.
At the core of Beijing's campaign are advanced data centers, on Earth and in orbit
The post New report details Chinaâs push to dominate artificial intelligence appeared first on SpaceNews.
Join us on May 6 for a timely discussion on those challenging Starlink and the push for multi-orbit and multi-operator solutions.
The post Webinar: Geospatial Intelligence â New Data to Solutions appeared first on SpaceNews.
Quantum science is enjoying a renaissance as nascent quantum computers emerge from the lab and quantum sensors are being used for practical applications.
As the technologies we use become more quantum in nature, it follows that everyone should have a basic understanding of quantum physics. To explore how quantum physics can be taught to the masses, I am joined by Arjan Dhawan, Aleks Kissinger and Bob Coecke – who are all based in the UK.
Coecke is chief scientist at Quantinuum, which develops quantum computing hardware and software. Kissinger is associate professor of quantum computing at the University of Oxford; and Dhawan is studying mathematics at the University of Durham.
Kissinger and Coecke have developed a way of teaching quantum physics using diagrams. In 2023, Oxford and Quantinuum joined forces to use the method in a pilot summer programme for 15–17-year-olds. Dhawan was one of their students.
Physics World is brought to you by IOP Publishing, which also publishes scholarly journals, conference proceedings and ebooks.
You can download the book The Ringed Planet: Second Edition free of charge for a limited time only. Written by Joshua Colwell, it is a must-read on Saturn and the Cassini mission. An updated and expanded third edition is also hot off the press.
Browse all ebooks here and remember that you can always read the first chapters of all IOPP ebooks for free.
This article forms part of Physics World’s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.
Stay tuned to Physics World and our international partners throughout the next 12 months for more coverage of the IYQ.
Find out more on our quantum channel.
The post Teaching quantum physics to everyone: pictures offer a new way of understanding appeared first on Physics World.
In this week's episode of Space Minds, Ashley Johnson, President and CFO of Planet, explains the company's ambitious goal to make global change visible, accessible and actionable.
The post The power of daily Earth imaging appeared first on SpaceNews.
Todd Surdey will head Maxarâs business enterprise segment
The post Maxar Intelligence taps tech veteran to lead commercial business appeared first on SpaceNews.
Adaptive radiotherapy, an advanced cancer treatment in which each fraction is tailored to the patient’s daily anatomy, offers the potential to maximize target conformality and minimize dose to surrounding healthy tissue. Based on daily scans, such as MR images recorded by an MR-Linac, treatment plans are adjusted each day to account for anatomical changes in the tumour and surrounding healthy tissue.
Creating a new plan for every treatment fraction, however, increases the potential for errors, making fast and effective quality assurance (QA) procedures more important than ever. To meet this need, the physics team at Hospital Almater in Mexicali, Mexico, is using Elekta ONE | QA, powered by ThinkQA Secondary Dose Check (ThinkQA SDC) software, to ensure that each adaptive plan is safe and accurate before it is delivered to the patient.
Radiotherapy requires a series of QA checks prior to treatment delivery, starting with patient-specific QA, in which the dose calculated by the treatment planning system is delivered to a phantom. This procedure ensures that the delivered dose distribution matches the prescribed plan. Alongside this, secondary dose checks can be performed, in which an independent algorithm verifies that the calculated dose distribution corresponds with that delivered to the actual patient anatomy.
“The secondary dose check is an independent dose calculation that uses a different algorithm to the one in the treatment planning system,” explains Alexis Cabrera Santiago, a medical physicist at Hospital Almater. “ThinkQA SDC software calculates the dose based on the patient anatomy, which is actually more realistic than using a rigid phantom, so we can compare both results and catch any differences before treatment.”
For adaptive radiotherapy in particular, this second check is invaluable. Performing phantom-based QA after each daily imaging session is often impractical; in many cases, it’s possible to use ThinkQA SDC instead.
“Secondary dose calculation is necessary in adaptive treatments, for example using the MR-Linac, because you are changing the treatment plan for each session,” says José Alejandro Rojas-López, who commissioned and validated ThinkQA SDC at Hospital Almater. “You are not able to shift the patient to perform patient-specific QA, so this secondary dose check is needed to analyse each treatment session.”
ThinkQA SDC's ability to achieve patient-specific QA without shifting the patient is extremely valuable, allowing time savings while upholding the highest level of QA safety. "The AAPM TG 219 report recognises secondary dose verification as a validated alternative to patient-specific QA, especially when there is no time for traditional phantom checks in adaptive fractions," adds Cabrera Santiago.
At Hospital Almater, all external-beam radiation treatments are performed using an Elekta Unity MR-Linac (with brachytherapy employed for gynaecological cancers). This enables the hospital to offer adaptive radiotherapy for all cases, including head-and-neck, breast, prostate, rectal and lung cancers.
To ensure efficient workflow and high-quality treatments, the team turned to the ThinkQA SDC software. ThinkQA SDC received FDA 510(k) clearance in early 2024 for use with both the Unity MR-Linac and conventional Elekta linacs.
Rojas-López (who now works at Hospital Angeles Puebla) says that the team chose ThinkQA SDC because of its user-friendly interface, ease of integration into the clinical workflow and common integrated QA platform for both CT and MR-Linac systems. The software also offers the ability to perform 3D evaluation of the entire planning target volume (PTV) and the organs-at-risk, making the gamma evaluation more robust.
Commissioning of ThinkQA SDC was fast and straightforward, Rojas-López notes, requiring minimal data input into the software. For absolute dose calibration, the only data needed are the cryostat dose attenuation response, the output dose geometry and the CT calibration.
"This makes a difference compared with other commercial solutions where you have to introduce more information, such as MLC [multileaf collimator] leakage and MLC dosimetric leaf gap, for example," he explains. "If you have to introduce more data for commissioning, this delays the clinical introduction of the software."
Cabrera Santiago is now using ThinkQA SDC to provide secondary dose calculations for all radiotherapy treatments at Hospital Almater. The team has established a protocol with a 3%/2 mm gamma criterion, a tolerance limit of 95% and an action limit of 90%. He emphasizes that the software has proved robust and flexible, and provides confidence in the delivered treatment.
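The 3%/2 mm criterion combines a dose-difference tolerance (here taken as 3% of the maximum dose, a global analysis) with a 2 mm distance-to-agreement (DTA) search, and the passing rate is the fraction of evaluated points with a gamma index of at most 1. As a rough illustration of the concept – not the algorithm implemented in ThinkQA SDC – here is a minimal global gamma analysis on a 1D dose profile; the function name and test profiles are invented for this sketch:

```python
import numpy as np

def gamma_pass_rate(ref_dose, eval_dose, positions,
                    dose_tol=0.03, dta=2.0, threshold=0.1):
    """Simplified global 1D gamma analysis.

    ref_dose, eval_dose: dose values on the same 1D grid (Gy)
    positions: spatial coordinates of the grid points (mm)
    dose_tol: dose-difference criterion, as a fraction of the max reference dose
    dta: distance-to-agreement criterion (mm)
    threshold: ignore points below this fraction of the max reference dose
    """
    d_max = ref_dose.max()
    dd = dose_tol * d_max                  # absolute dose-difference criterion
    mask = ref_dose >= threshold * d_max   # low-dose region excluded

    gammas = []
    for r_pos, r_dose in zip(positions[mask], ref_dose[mask]):
        # gamma is the minimum combined dose/distance deviation
        # over all evaluated points
        dist2 = ((positions - r_pos) / dta) ** 2
        dose2 = ((eval_dose - r_dose) / dd) ** 2
        gammas.append(np.sqrt(np.min(dist2 + dose2)))

    return 100.0 * np.mean(np.array(gammas) <= 1.0)
```

Under a protocol like the one described above, a plan within the tolerance limit would return a passing rate of at least 95%, and one below the 90% action limit would trigger investigation.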
"ThinkQA SDC lets us work with more confidence, reduces risk and saves time without losing control over the patient's safety," he says. "It checks that the plan is correct, catches issues before treatment and helps us find any problems like set-up errors, contouring mistakes and planning issues."
The software integrates smoothly into the Elekta ONE adaptive workflow, providing reliable results without slowing down the clinical workflow. "In our institution, we set up ThinkQA SDC so that it automatically receives the new plan, runs the check, compares it with the original plan and creates a report – all in around two minutes," says Cabrera Santiago. "This saves us a lot of time and removes the need to do everything manually."
As an example of ThinkQA SDC's power to ease the treatment workflow, Rojas-López describes a paediatric brain tumour case at Hospital Almater. The young patient needed sedation during their treatment, requiring the physics team to optimize the treatment time for the entire adaptive radiotherapy workflow. "ThinkQA SDC served to analyse, in a fast mode, the treatment plan QA for each session. The measurements were reliable, enabling us to deliver all of the treatment sessions without any delay," he explains.
Indeed, the ability to use secondary dose checks for each treatment fraction provides time advantages for the entire clinical workflow over phantom-based pre-treatment QA. "Time in the bunker is very expensive," Rojas-López points out. "If you reduce the time required for QA, you can use the bunker for patient treatments instead and treat more patients during the clinical time. Secondary dose check can optimize the workflow in the entire department."
Importantly, in a recent study comparing patient-specific QA measurements using Sun Nuclear's ArcCheck with ThinkQA SDC calculations, Rojas-López and colleagues confirmed that the two techniques provided comparable results, with very similar gamma passing rates. As such, they are working to reduce phantom measurements and, in most cases, replace them with a secondary dose check using ThinkQA SDC.
The team at Hospital Almater concurs that ThinkQA SDC provides a reliable tool to evaluate radiation treatments, including the first fraction and all of the adaptive sessions. "You can use it for all anatomical sites, with reliable and confident results," Rojas-López notes. "And you can reduce the need for measurements using another patient-specific QA tool."
"I think that any centre doing adaptive radiotherapy should seriously consider using a tool like ThinkQA SDC," adds Cabrera Santiago.
*ThinkQA is manufactured by DOSIsoft S.A. and distributed by Elekta.
The post Secondary dose checks ensure safe and accurate delivery of adaptive radiotherapy appeared first on Physics World.
Chinese rocket maker Sepoch has carried out a first vertical liftoff and splashdown landing ahead of a potential orbital launch attempt later this year.
The post Chinese launch startup conducts vertical takeoff and splashdown test appeared first on SpaceNews.
The first high-resolution images of Bolivia's Uturuncu volcano have yielded unprecedented insights into whether this volcanic "zombie" is likely to erupt in the near future. The images were taken using a technique that combines seismology, rock physics and petrological analyses, and the scientists who developed it say it could apply to other volcanoes, too.
Volcanic eruptions occur when bubbles of gases such as SO2 and CO2 rise to the Earth's surface through dikes and sills in the planet's crust, bringing hot, molten rock known as magma with them. To evaluate the chances of this happening, researchers need to understand how much gas and melted rock have accumulated in the volcano's shallow upper crust. This is not easy, however, as the structures that convey gas and magma to the surface are complex and mapping them is challenging with current technologies.
In the new work, a team led by Mike Kendall of the University of Oxford, UK and Haijiang Zhang from the University of Science and Technology of China (USTC) employed a combination of seismological and petrophysical analyses to create such a map for Uturuncu. Located in the Central Andes, this volcano formed in the Pleistocene epoch (around 2.58 million to 11,700 years ago) as the oceanic Nazca plate was forced beneath the South American continental plate. It is made up of around 50 km3 of homogeneous, porphyritic dacite lava flows that are between 62% and 67% silicon dioxide (SiO2) by weight, and it sits atop the Altiplano–Puna magma body, which is the world's largest body of partially melted silicic rock.
Although Uturuncu has not erupted for nearly 250,000 years, it is not extinct. It regularly emits plumes of gas, and earthquakes are a frequent occurrence in the shallow crust beneath and around it. Previous geodetic studies also detected a 150-km-wide deformed region of rock centred around 3 km south-west of its summit. These signs of activity, coupled with Uturuncu's lack of a geologically recent eruption, have led some scientists to describe it as a "zombie".
To tease out the reasons for Uturuncu's semi-alive behaviour, the team turned to seismic tomography – a technique Kendall compares to medical imaging of a human body. The idea is to detect the seismic waves produced by earthquakes travelling through the Earth's crust, analyse their arrival times, and use this information to create three-dimensional images of what lies beneath the surface of the structure being studied.
Writing in PNAS, Kendall and colleagues explain that they used seismic tomography to analyse signals from more than 1700 earthquakes in the region around Uturuncu. They performed this analysis in two ways. First, they assumed that seismic waves travel through the crust at the same speed regardless of their direction of propagation. This isotropic form of tomography gave them a first image of the region's structure. In their second analysis, they took the directional dependence of the seismic waves' speed into account. This anisotropic tomography gave them complementary information about the structure.
The researchers then combined their tomographic measurements with previous geophysical imaging results to construct rock physics models. These models map the paths that hot fluids and gases take as they migrate to the surface. In Uturuncu's case, the models showed fluids and gases accumulating in shallow magma reservoirs directly below the volcano's crater and down to a depth of around 5 km. This movement of liquid and gas explains Uturuncu's unrest, the team say, but the good news is that it has a low probability of producing eruptions any time soon.
According to Kendall, the team's methods should be applicable to more than 1400 other potentially active volcanoes around the world. "It could also be applied to identifying potential geothermal energy sites and for critical metal recovery in volcanic fluids," he tells Physics World.
The post "Zombie" volcano reveals its secrets appeared first on Physics World.
NASA has decided to switch to a backup propellant line on its Psyche asteroid mission to allow the spacecraft to resume use of its electric propulsion system.
The post NASA switches to backup propellant line on Psyche spacecraft appeared first on SpaceNews.
The Japanese military has selected companies Space One and Space BD to launch a small satellite.
The post Space One and Space BD to launch satellite for Japanese military appeared first on SpaceNews.
The satellites are scheduled for delivery by 2031.
The post Space Force orders two more GPS IIIF satellites for $509.7 million appeared first on SpaceNews.
The launch of GPS III SV-08 â the eighth satellite in the GPS III constellation â was originally assigned to United Launch Alliance (ULA)
The post SpaceX to launch another GPS III satellite in record turnaround appeared first on SpaceNews.
Relationship focuses on co-developing high-performance, radiation-hardened computing systems for next-generation space missions
The post Frontgrade Technologies and VORAGO Announce Strategic Collaboration to Advance Space Computing Solutions for Autonomous Applications appeared first on SpaceNews.
China launched its second planetary exploration mission Wednesday, sending Tianwen-2 to sample a near Earth asteroid and later survey a main belt comet.
The post China launches Tianwen-2 mission to sample near Earth asteroid appeared first on SpaceNews.
Everyday life is three dimensional, with even a sheet of paper having a finite thickness. Shengxi Huang from Rice University in the US, however, is attracted by 2D materials, which are usually just one atomic layer thick. Graphene is perhaps the most famous example – a single layer of carbon atoms arranged in a hexagonal lattice. But since graphene was first isolated in 2004, all sorts of other 2D materials, notably boron nitride, have been created.
An electrical engineer by training, Huang did a PhD at the Massachusetts Institute of Technology and postdoctoral research at Stanford University before spending five years as an assistant professor at the Pennsylvania State University. Huang has been at Rice since 2022, where she is now an associate professor in the Department of Electrical and Computer Engineering, the Department of Material Science and NanoEngineering, and the Department of Bioengineering.
Her group at Rice currently has 12 people, including eight graduate students and four postdocs. Some are physicists, some are engineers, while others have backgrounds in materials science or chemistry. But they all share an interest in understanding the optical and electronic properties of quantum materials and seeing how they can be used, for example, as biochemical sensors. Lab equipment from PicoQuant is vital in helping in that quest, as Huang explains in an interview with Physics World.
I'm an electrical engineer by training, which is a very broad field. Some electrical engineers focus on things like communication and computing, but others, like myself, are more interested in how we can use fundamental physics to build useful devices, such as semiconductor chips. I'm particularly interested in using 2D materials for optoelectronic devices and as single-photon emitters.
The materials I am particularly interested in are transition metal dichalcogenides, which consist of a layer of transition-metal atoms sandwiched between two layers of chalcogen atoms – sulphur, selenium or tellurium. One of the most common examples is molybdenum disulphide, which in its monolayer form has a layer of sulphur on either side of a layer of molybdenum. In multi-layer molybdenum disulphide, the van der Waals forces between the tri-layers are relatively weak, meaning that the material is widely used as a lubricant – just like graphite, which is a many-layer version of graphene.
Transition metal dichalcogenides have some very useful optoelectronic properties. In particular, they emit light whenever the electron and hole that make up an "exciton" recombine. Now because these dichalcogenides are so thin, most of the light they emit can be used. In a 3D material, in contrast, most light is generated deep in the bulk of the material and doesn't penetrate beyond the surface. Such 2D materials are therefore very efficient and, what's more, can be easily integrated onto chip-based devices such as waveguides and cavities.
Transition metal dichalcogenide materials also have promising electronic applications, particularly as the active material in transistors. Over the years, we've seen silicon-based transistors get smaller and smaller as we've followed Moore's law, but we're rapidly reaching a limit where we can't shrink them any further, partly because the electrons in very thin layers of silicon move so slowly. In 2D transition metal dichalcogenides, in contrast, the electron mobility can actually be higher than in silicon of the same thickness, making them a promising material for future transistor applications.
Single photons are useful for quantum communication and quantum cryptography. Carrying information as zeros and ones, they basically function as qubits, providing a very secure communication channel. Single photons are also interesting for quantum sensing and even quantum computing. But it's vital that you have a highly pure source of photons. You don't want them mixed up with "classical photons", which – like those from the Sun – are emitted in bunches; otherwise the tasks you're trying to perform cannot be completed.
What we do is introduce atomic defects into a 2D material to give it optical properties that are different to what you'd get in the bulk. There are several ways of doing this. One is to irradiate a sample with ions or electrons, which can knock individual atoms out to generate "vacancy defects". Another option is to use plasmas, whereby atoms in the sample get replaced by atoms from the plasma.
We can probe defect emission using a technique called photoluminescence, which basically involves shining a laser beam onto the material. The laser excites electrons from the ground state to an excited state, prompting them to emit light. As the laser beam is about 500-1000 nm in diameter, we can see single photon emission from an individual defect if the defect density is suitable.
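That condition on defect density can be estimated with back-of-the-envelope arithmetic: to address a single emitter, there should be on average no more than about one defect inside the laser spot. A minimal sketch of that estimate (the function name and the one-defect-per-spot assumption are mine, not from the interview):

```python
import math

def max_defect_density(spot_diameter_nm):
    """Areal defect density (defects per square micron) at which a laser
    spot of the given diameter contains, on average, a single defect."""
    spot_area_um2 = math.pi * (spot_diameter_nm / 2000.0) ** 2  # radius in um, squared
    return 1.0 / spot_area_um2

# For the 500-1000 nm spots mentioned above: a 1000 nm spot covers
# ~0.79 square microns, so densities above ~1.3 defects per square micron
# risk exciting more than one emitter at a time.
```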
We start by engineering our materials at the atomic level to introduce the correct type of defect. We also try to strain the material, which can increase how many single photons are emitted at a time. Once we've confirmed we've got the correct defects in the correct location, we check the material is emitting single photons by carrying out optical measurements, such as photoluminescence. Finally, we characterize the purity of our single photons – ideally, they shouldn't be mixed up with classical photons but in reality, you never have a 100% pure source. As single photons are emitted one at a time, they have different statistical characteristics to classical light. We also check the brightness and lifetime of the source, the efficiency, how stable it is, and if the photons are polarized. In fact, we have a feedback loop: what improvements can we do at the atomic level to get the properties we're after?
It's pretty challenging. You want to add just one defect to an area that might be just one micron square, so you have to control the atomic structure very finely. It's made harder because 2D materials are atomically thin and very fragile. So if you don't do the engineering correctly, you may accidentally introduce other types of defects that you don't want, which will alter the defects' emission.
Because the defect concentration is so low, we cannot use methods that are typically used to characterise materials, such as X-ray photo-emission spectroscopy or scanning electron microscopy. Instead, the best and most practical way is to see if the defects generate the correct type of optical emission predicted by theory. But even that is challenging because our calculations, which we work on with computational groups, might not be completely accurate.
We have two main pieces of equipment – a MicroTime 100 photoluminescence microscope and a FluoTime 300 spectrometer. These have been customized to form a Hanbury Brown–Twiss interferometer, which measures the purity of a single-photon source. We also use the microscope and spectrometer to characterise the photoluminescence spectrum and lifetime. Essentially, if the material emits light, we can then work out how long it takes before the emission dies down.
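In a pulsed Hanbury Brown–Twiss measurement, the second-order correlation g²(0) is estimated by comparing coincidence counts in the zero-delay peak with those in the uncorrelated side peaks; purity is then often quoted as 1 − g²(0). A minimal sketch of that bookkeeping (the function name and example counts are invented; real analyses also correct for background and detector effects):

```python
import numpy as np

def single_photon_purity(coincidence_counts, zero_delay_bin):
    """Estimate g2(0) and purity from a pulsed HBT coincidence histogram.

    coincidence_counts: coincidences recorded in each pulse-delay bin
    zero_delay_bin: index of the zero-delay peak
    Returns (g2_zero, purity_in_percent).
    """
    counts = np.asarray(coincidence_counts, dtype=float)
    side_peaks = np.delete(counts, zero_delay_bin)        # uncorrelated pulses
    g2_zero = counts[zero_delay_bin] / side_peaks.mean()  # normalised correlation
    return g2_zero, 100.0 * (1.0 - g2_zero)
```

For example, side peaks of 1000 counts with 15 coincidences at zero delay give g²(0) = 0.015, i.e. a purity of 98.5%.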
It's more of a customised instrument with different components – lasers, microscopes, detectors and so on – connected together so we can do multiple types of measurement. I put in a request to PicoQuant, who discussed my requirements with me to work out how to meet my needs. The equipment has been very important for our studies as we can carry out high-throughput measurements over and over again. We've tailored it for our own research purposes basically.
The best single-photon source that we currently work with is boron nitride, which has a single-photon purity of 98.5% at room temperature. In other words, for every 200 photons only three are classical. With transition-metal dichalcogenides, we get a purity of 98.3% at cryogenic temperatures.
There's still lots to explore in terms of making better single-photon emitters and learning how to control them at different wavelengths. We also want to see if these materials can be used as high-quality quantum sensors. In some cases, if we have the right types of atomic defects, we get a high-quality source of single photons, which we can then entangle with their spin. The emitters can therefore monitor the local magnetic environment with better performance than is possible with classical sensing methods.
The post Shengxi Huang: how defects can boost 2D materials as single-photon emitters appeared first on Physics World.
In the evolving landscape of space technology, a pivotal transformation is quietly taking shape: the development of spacecraft autonomy. While launch capabilities often dominate headlines, the real innovation frontier lies […]
The post Overcoming conservatism in the autonomous space revolution appeared first on SpaceNews.
East Aurora, NY – Moog Inc. (NYSE: MOG.A and MOG.B), a worldwide designer, manufacturer and systems integrator of high-performance precision motion and fluid controls and control systems, announced today that […]
The post Air Force Research Laboratory Awards Moog Contract to Develop New Multimode Propulsion System to Enhance Dynamic Space Operations appeared first on SpaceNews.
The 2025 Shaw Prize in Astronomy has been awarded to Richard Bond and George Efstathiou "for their pioneering research in cosmology, in particular for their studies of fluctuations in the cosmic microwave background". The prize citation continues, "Their predictions have been verified by an armada of ground-, balloon- and space-based instruments, leading to precise determinations of the age, geometry, and mass–energy content of the universe".
Efstathiou is professor of astrophysics at the University of Cambridge in the UK. Bond is a professor at the Canadian Institute for Theoretical Astrophysics (CITA) and university professor at the University of Toronto in Canada. They share the $1.2m prize money equally.
The annual award is given by the Shaw Prize Foundation, which was founded in 2002 by the Hong Kong-based filmmaker, television executive and philanthropist Run Run Shaw (1907–2014). It will be presented at a ceremony in Hong Kong on 21 October. There are also Shaw Prizes for life sciences and medicine, and for mathematical sciences.
Bond studied mathematics and physics at Toronto. In 1979 he completed a PhD in theoretical physics at the California Institute of Technology (Caltech). He directed CITA from 1996 to 2006.
Efstathiou studied physics at Oxford before completing a PhD in astronomy at the UK's Durham University in 1979. He is currently director of the Institute of Astronomy in Cambridge.
The post Richard Bond and George Efstathiou share the 2025 Shaw Prize in Astronomy appeared first on Physics World.
Comics are regarded as an artform in France, where they account for a quarter of all book sales. Nevertheless, the graphic novel World Without End: an Illustrated Guide to the Climate Crisis was a surprise French bestseller when it first came out in 2022. Taking the form of a Socratic dialogue between French climate expert Jean-Marc Jancovici and acclaimed comic artist Christophe Blain, it's serious, scientific stuff.
Now translated into English by Edward Gauvin, the book follows the conventions of French-language comic strips or bandes dessinées. Jancovici is drawn with a small nose – denoting seriousness – while Blain's larger nose signals humour. The first half explores energy and consumption, with the rest addressing the climate crisis and possible solutions.
Overall, this is a Trojan horse of a book: what appears to be a playful comic is packed with dense, academic content. Though marketed as a graphic novel, it reads more like illustrated notes from a series of sharp, provocative university lectures. It presents a frightening vision of the future and the humour doesn't always land.
The book spans a vast array of disciplines – not just science and economics but geography and psychology too. In fact, there's so much to unpack that, had I Blain's skills, I might have reviewed it in the form of a comic strip myself. The old adage that "a picture is worth a thousand words" has never rung more true.
Absurd yet powerful visual metaphors feature throughout. We see a parachutist with a flaming main chute that represents our dependence on fossil fuels. The falling man jettisons his reserve chute – nuclear power – and tries to knit an alternative using clean energy, mid-fall. The message is blunt: nuclear may not be ideal, but it works.
The book is bold, arresting, provocative and at times polemical. Charts and infographics are presented to simplify complex issues, even if the details invite scrutiny. Explanations are generally clear and concise, though the author's claim that accidents like Chernobyl and Fukushima couldn't happen in France smacks of hubris.
Jancovici makes plenty of attention-grabbing statements. Some are sound, such as the notion that fossil fuels spared whales from extinction as we didn't need this animal's oil any more. Others are dubious – would a 4 °C temperature rise really leave a third of humanity unable to survive outdoors?
But Jancovici is right to say that the use of fossil fuels makes logical sense. Oil can be easily transported and one barrel delivers the equivalent of five years of human labour. A character called Armor Man (a parody of Iron Man) reminds us that fossil fuels are like having 200 mechanical slaves per person, equivalent to an additional 1.5 trillion people on the planet.
Fossil fuels brought prosperity – but now threaten our survival. For Jancovici, the answer is nuclear power, which is perhaps not surprising as it produces 72% of electricity in the author's homeland. But he cherry-picks data, accepting – for example – the United Nations figure that only about 50 people died from the Chernobyl nuclear accident.
While acknowledging that many people had to move following the disaster, the author downplays the fate of those responsible for "cleaning up" the site, the long-term health effects on the wider population and the staggering economic impact – estimated at €200–500bn. He also sidesteps nuclear-waste disposal and the cost and complexity of building new plants.
While conceding that nuclear is "not the whole answer", Jancovici dismisses hydrogen and views renewables like wind and solar as too intermittent – they require batteries to ensure electricity is supplied on demand – and diffuse. Imagine blanketing the Earth in wind turbines.
Still, his views on renewables seem increasingly out of step. They now supply nearly 30% of global electricity – 13% from wind and solar, ahead of nuclear at 9%. Renewables also attract 70% of all new investment in electricity generation and (unlike nuclear) continue to fall in price. It's therefore disingenuous of the author to say that relying on renewables would be like returning to pre-industrial life; today's wind turbines are far more efficient than anything back then.
Beyond his case for nuclear, Jancovici offers few firm solutions. Weirdly, he suggests "educating women" and providing pensions in developing nations – to reduce reliance on large families – to stabilize population growth. He also cites French journalist Sébastien Bohler, who thinks our brains are poorly equipped to deal with long-term threats.
But he says nothing about the need for more investment in nuclear fusion or for "clean" nuclear fission via, say, liquid fluoride thorium reactors (LFTRs), which generate minimal waste, won't melt down and cannot be weaponized.
Perhaps our survival depends on delaying gratification, resisting the lure of immediate comfort, and adopting a less extravagant but sustainable world. We know what changes are needed – yet we do nothing. The climate crisis is unfolding before our eyes, but we're paralysed by a global-scale bystander effect, each of us hoping someone else will act first. Jancovici's call for "energy sobriety" (consuming less) seems idealistic and futile.
Still, World Without End is a remarkable and deeply thought-provoking book that deserves to be widely read. I fear that it will struggle to replicate its success beyond France, though Raymond Briggs' When the Wind Blows – a Cold War graphic novel about nuclear annihilation – was once a British bestseller. If enough people engaged with the book, it would surely spark discussion and, one day, even lead to meaningful action.
The post No laughing matter: a comic book about the climate crisis appeared first on Physics World.
Satellite manufacturer Apex has unveiled its largest spacecraft yet, a bus designed to serve commercial and government constellation customers.
The post Apex announces Comet satellite bus for constellations appeared first on SpaceNews.
The 20th of May is World Metrology Day, and this year it was extra special because it was also the 150th anniversary of the treaty that established the metric system as the preferred international measurement standard. Known as the Metre Convention, the treaty was signed in 1875 in Paris, France, by representatives of all 17 nations that belonged to the Bureau International des Poids et Mesures (BIPM) at the time, making it one of the first truly international agreements. Though nations might come and go, the hope was that this treaty would endure "for all times and all peoples".
To celebrate the treaty's first century and a half, the BIPM and the United Nations Educational, Scientific and Cultural Organisation (UNESCO) held a joint symposium at the UNESCO headquarters in Paris. The event focused on the achievements of the BIPM as well as the international scientific collaborations the Metre Convention enabled. It included talks from the Nobel prize-winning physicist William Phillips of the US National Institute of Standards and Technology (NIST) and the BIPM director Martin Milton, as well as panel discussions on the future of metrology featuring representatives of other national metrology institutes (NMIs) and metrology professionals from around the globe.
The history of metrology dates back to ancient times. As UNESCO's Hu Shaofeng noted in his opening remarks, the Egyptians recognized the importance of precision measurements as long ago as the 21st century BCE. Like other early schemes, the Egyptians' system of measurement used parts of the human body as references, with units such as the fathom (the length of a pair of outstretched arms) and the foot. This was far from ideal since, as Phillips pointed out in his keynote address, people come in various shapes and sizes. These variations led to a profusion of units. By some estimates, pre-revolutionary France had a whopping 250,000 different measures, with differences arising not only between towns but also between professions.
The French Revolutionaries were determined to put an end to this mess. In 1795, just six years after the Revolution, the law of 18 Germinal An III (according to the new calendar of the French Republic) created a preliminary version of the world's first metric system. The new system tied length and mass to natural standards (the metre was originally one forty-millionth of the Paris meridian, while the kilogram was the mass of a cubic decimetre of water), and it became the standard for all of France in 1799. That same year, the system also became more practical, with units becoming linked, for the first time, to physical artefacts: a platinum metre and kilogram deposited in the French National Archives.
When the Metre Convention adopted this standard internationally 80 years later, it kick-started the construction of new length and mass standards. The new International Prototype of the Metre and International Prototype of the Kilogram were manufactured in 1879 and officially adopted as replacements for the Revolutionaries' metre and kilogram in 1889, though they continued to be calibrated against the old prototypes held in the National Archives.
The BIPM itself was originally conceived as a means of reconciling France and Germany after the 1870–1871 Franco-Prussian War. At first, its primary roles were to care for the kilogram and metre prototypes and to calibrate the standards of its member states. In the opening decades of the 20th century, however, it extended its activities to cover other kinds of measurements, including those related to electricity, light and radiation. Then, from the 1960s onwards, it became increasingly interested in improving the definition of length, thanks to new interferometer technology that made it possible to measure distance at a precision rivalling that of the physical metre prototype.
It was around this time that the BIPM decided to replace its expanded metric system with a framework encompassing the entire field of metrology. This new framework initially consisted of six base units – the metre, kilogram, second, ampere, degree Kelvin (later simply the kelvin) and candela, with the mole added in 1971 – plus a set of "derived" units (such as the newton, hertz, joule and watt) built from the base ones. Thus was born the International System of Units, or SI after the French initials for Système international d'unités.
The next major step – a "brilliant choice", in Phillips' words – came in 1983, when the BIPM decided to redefine the metre in terms of the speed of light. Henceforth, the Bureau decreed, the metre would officially be the length travelled by light in vacuum during a time interval of 1/299,792,458 of a second.
This decision set the stage for defining the rest of the seven base units in terms of natural fundamental constants. The most recent unit to join the club was the kilogram, which was defined in terms of the Planck constant, h, in 2019. In fact, the only base unit currently not defined in terms of a fundamental constant is the second, which is instead determined by the transition between the two hyperfine levels of the ground state of caesium-133. The international metrology community is, however, working to remedy this, with meetings being held on the subject in Versailles this month.
Measurement affects every aspect of our daily lives, and as the speakers at last week's celebrations repeatedly reminded the audience, a unified system of measurement has long acted as a means of building trust across international and disciplinary borders. The Metre Convention's survival for 150 years is proof that peaceful collaboration can triumph, and it has allowed humankind to advance in ways that would not have been possible without such unity. A lesson indeed for today's troubled world.
The post The evolution of the metre: How a product of the French Revolution became a mainstay of worldwide scientific collaboration appeared first on Physics World.
SpaceX's Starship suffered a loss of attitude control after reaching space on its latest test flight May 27, leading to an uncontrolled reentry and a third consecutive failure.
The post Starship breaks up on reentry after loss of attitude control appeared first on SpaceNews.
The deal positions Rocket Lab to compete for Golden Dome missile tracking satellite contracts
The post Rocket Lab to acquire satellite payload manufacturer Geost for $275 million appeared first on SpaceNews.