
Black-hole scattering calculations could shed light on gravitational waves

4 June 2025 at 17:00

By adapting mathematical techniques used in particle physics, researchers in Germany have developed an approach that could boost our understanding of the gravitational waves that are emitted when black holes collide. Led by Jan Plefka at the Humboldt University of Berlin, the team’s results could prove vital to the success of future gravitational-wave detectors.

Nearly a decade on from the first direct observations of gravitational waves, physicists are hopeful that the next generation of ground- and space-based observatories will soon allow us to study these ripples in space–time with unprecedented precision. But to ensure the success of upcoming projects like the LISA space mission, the increased sensitivity offered by these detectors will need to be accompanied by a deeper theoretical understanding of how gravitational waves are generated when two black holes merge.

In particular, physicists will need to predict more accurately the physical properties of the gravitational waves produced by any given colliding pair, accounting for factors including the black holes’ respective masses and orbital velocities. For this to happen, they will need more precise solutions to the relativistic two-body problem – a key application of the Einstein field equations, which relate the geometry of space–time to the distribution of matter within it.

No exact solution

“Unlike its Newtonian counterpart, which is solved by Kepler’s Laws, the relativistic two-body problem cannot be solved exactly,” Plefka explains. “There is an ongoing international effort to apply quantum field theory (QFT) – the mathematical language of particle physics – to describe the classical two-body problem.”

In their study, Plefka’s team started from state-of-the-art techniques used in particle physics for modelling the scattering of colliding elementary particles, while accounting for their relativistic properties. When viewed from far away, each black hole can be approximated as a single point that, much like an elementary particle, carries a mass, charge and spin.

Taking advantage of this approximation, the researchers modified existing techniques in particle physics to create a framework called worldline quantum field theory (WQFT). “The advantage of WQFT is a clean separation between classical and quantum physics effects, allowing us to precisely target the classical physics effects relevant for the vast distances involved in astrophysical observables,” Plefka says.

Ordinarily, doing calculations with such an approach would involve solving millions of integrals that sum up every single contribution to the black hole pair’s properties across all possible ways the interaction between them could occur. To simplify the problem, Plefka’s team used a new algorithm that identified relationships between the integrals. This reduced the problem to just 250 “master integrals”, making the calculation vastly more manageable.
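To get a feel for how such a reduction works, consider a toy example (in Python with sympy; the identities here are invented, and the team’s actual relations and reduction algorithm are far more elaborate): when linear identities relate the integrals in a family, solving them expresses every integral through a small independent set of masters.

```python
# Toy illustration (invented identities, not the team's actual algorithm) of
# reducing a family of integrals to a few "master integrals".
import sympy as sp

I1, I2, I3, I4 = sp.symbols('I1 I2 I3 I4')

# Two hypothetical linear identities relating the four integrals:
relations = [sp.Eq(I1 - 2*I2 + I3, 0),
             sp.Eq(I2 - 3*I4, 0)]

# Solving the identities eliminates I1 and I2, leaving I3 and I4 as the
# independent "masters" through which everything else is expressed.
masters_expansion = sp.solve(relations, [I1, I2])
print(masters_expansion)   # {I1: -I3 + 6*I4, I2: 3*I4}
```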

With these master integrals, the team could finally produce expressions for three key physical properties of black hole binaries within WQFT. These include the changes in momentum during the gravity-mediated scattering of two black holes and the total energy radiated by both bodies over the course of the scattering.

Genuine physical process

Altogether, the team’s WQFT framework produced the most accurate solution of the Einstein field equations to date. “In particular, the radiated energy we found contains a new class of mathematical functions known as ‘Calabi–Yau periods’,” Plefka explains. “While these functions are well-known in algebraic geometry and string theory, this marks the first time they have been shown to describe a genuine physical process.”

With its unprecedented insights into the structure of the relativistic two-body problem, the team’s approach could now be used to build more precise models of gravitational-wave formation, which could prove invaluable for the next generation of gravitational-wave detectors.

More broadly, however, Plefka predicts that the appearance of Calabi–Yau periods in their calculations could lead to an entirely new class of mathematical functions applicable to many areas beyond gravitational waves.

“We expect these periods to show up in other branches of physics, including collider physics, and the mathematical techniques we employed to calculate the relevant integrals will no doubt also apply there,” he says.

The research is described in Nature.


Harmonious connections: bridging the gap between music and science

4 June 2025 at 12:00

CP Snow’s classic The Two Cultures lecture, published in book form in 1959, is the usual go-to reference when exploring the divide between the sciences and humanities. It is a culture war that was raging long before the term became social-media shorthand for today’s tribal battles over identity, values and truth.

While Snow eloquently lamented the lack of mutual understanding between scientific and literary elites, the 21st-century version of the two-cultures debate often plays out with a little less decorum and a lot more profanity. Hip hop duo Insane Clown Posse certainly didn’t hold back in their widely memed 2010 track “Miracles”, which included the lyric “And I don’t wanna talk to a scientist / Y’all motherfuckers lying and getting me pissed”. An extreme example to be sure, but it hammers home the point: Snow’s two-culture concerns continue to resonate strongly almost 70 years after his influential lecture and writings.

A Perfect Harmony: Music, Mathematics and Science by David Darling is the latest addition to a growing genre that seeks to bridge that cultural rift. Like Peter Pesic’s Music and the Making of Modern Science, Susan Rogers and Ogi Ogas’ This Is What It Sounds Like, and Philip Ball’s The Music Instinct, Darling’s book adds to the canon that examines the interplay between musical creativity and the analytical frameworks of science (including neuroscience) and mathematics.

I’ve also contributed, in a nanoscopically small way, to this music-meets-science corpus with an analysis of the deep and fundamental links between quantum physics and heavy metal (When The Uncertainty Principle Goes To 11), and have a long-standing interest in music composed from maths and physics principles and constants (see my Lateral Thoughts articles from September 2023 and July 2024). Darling’s book, therefore, struck a chord with me.

Darling is not only a talented science writer with an expansive back-catalogue to his name but also an accomplished musician (check out his album Songs Of The Cosmos), and his enthusiasm for all things musical spills off the page. Furthermore, he is a physicist, with a PhD in astronomy from the University of Manchester. So if there’s a writer who can genuinely and credibly inhabit both sides of the arts–science cultural divide, it’s Darling.

But is A Perfect Harmony in tune with the rest of the literary ensemble, or marching to a different beat? In other words, is this a fresh new take on the music-meets-maths (meets pop sci) genre or, like too many bands I won’t mention, does it sound suspiciously like something you’ve heard many times before? Well, much like an old-school vinyl album, Darling’s work has the feel of two distinct sides. (And I’ll try to make that my final spin on groan-worthy musical metaphors. Promise.)

Not quite perfect pitch

Although the subtitle for A Perfect Harmony is “Music, Mathematics and Science”, the first half of the book is more of a history of the development and evolution of music and musical instruments in various cultures, rather than a new exploration of the underpinning mathematical and scientific principles. Engaging and entertaining though this is – and all credit to Darling for working in a reference to Van Halen in the opening lines of chapter 1 – it’s well-worn ground: Pythagorean tuning, the circle of fifths, equal temperament, Music of the Spheres (not the Coldplay album, mercifully), resonance, harmonics, etc. I found myself wishing, at times, for a take that felt a little more off the beaten track.

One case in point is Darling’s brief discussion of the theremin. If anything earns the title of “The Physicist’s Instrument”, it’s the theremin – a device that exploits the innate electrical capacitance of the human body to load a resonant circuit and thus produce an ethereal, haunting tone whose pitch can be varied, remarkably, without any physical contact.
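To put rough numbers on that capacitance effect, here is a toy calculation (my own simplification with assumed component values, not something from the book) of how a hand’s few picofarads detune an LC oscillator; real theremins heterodyne this radio-frequency shift against a fixed oscillator to produce the audible pitch.

```python
# Toy model (my own simplification, not from the book): a hand near the
# antenna adds a few picofarads to the oscillator's tank circuit, shifting
# its resonant frequency f = 1 / (2*pi*sqrt(L*C)). In a real theremin this
# radio-frequency shift is heterodyned against a fixed oscillator, so a few
# kilohertz of detuning becomes the audible beat note.
import math

L_TANK = 1e-3      # tank inductance in henries (assumed: 1 mH)
C_BASE = 50e-12    # fixed circuit capacitance in farads (assumed: 50 pF)

def resonant_freq(c_total):
    return 1.0 / (2 * math.pi * math.sqrt(L_TANK * c_total))

f0 = resonant_freq(C_BASE)   # ~712 kHz with the values assumed above
for c_hand in (0.5e-12, 1e-12, 2e-12):   # hand-to-antenna capacitance, assumed
    f = resonant_freq(C_BASE + c_hand)
    print(f"hand adds {c_hand * 1e12:.1f} pF -> oscillator at {f / 1e3:.1f} kHz, "
          f"beat note ~ {(f0 - f) / 1e3:.2f} kHz")
```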

While I give kudos to Darling for highlighting the theremin, the brevity of the description is arguably a lost opportunity when put in the broader context of the book’s aim to explain the deeper connections between music, maths and science. This could have been a novel and fascinating take on the links between electrical and musical resonance that went well beyond the familiar territory mapped out in standard physics-of-music texts.

As the book progresses, however, Darling moves into more distinctive territory, choosing a variety of inventive examples that are often fascinating and never short of thought-provoking. I particularly enjoyed his description of orbital resonance in the system of seven planets orbiting the red dwarf TRAPPIST-1, 41 light-years from Earth. The orbital periods have ratios, which, when mapped to musical intervals, correspond to a minor sixth, a major sixth, two perfect fifths, a perfect fourth and another perfect fifth. And it’s got to be said that using the music of the eclectic Australian band King Gizzard and the Lizard Wizard to explain microtonality is nothing short of inspired.
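The TRAPPIST-1 mapping is easy to verify. The sketch below uses approximate orbital periods for the seven planets (values assumed from published estimates; the underlying analysis is more careful) and matches each successive period ratio to the nearest just-intonation interval.

```python
# Match TRAPPIST-1 orbital-period ratios to just-intonation musical intervals.
# Periods in days are approximate values assumed from published estimates.
periods = {'b': 1.511, 'c': 2.422, 'd': 4.049, 'e': 6.101,
           'f': 9.208, 'g': 12.352, 'h': 18.773}
intervals = {'perfect fourth': 4 / 3, 'perfect fifth': 3 / 2,
             'minor sixth': 8 / 5, 'major sixth': 5 / 3}

names = list(periods)
for inner, outer in zip(names, names[1:]):
    ratio = periods[outer] / periods[inner]
    nearest = min(intervals, key=lambda name: abs(intervals[name] - ratio))
    print(f"{inner} -> {outer}: period ratio {ratio:.3f} ~ {nearest}")
# b -> c: 1.603 ~ minor sixth; c -> d: 1.672 ~ major sixth; then two perfect
# fifths, a perfect fourth and another perfect fifth, as described above.
```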

A Perfect Harmony doesn’t entirely close the cultural gap highlighted by Snow all those years ago, but it does hum along pleasantly in the space between. Though the subject matter occasionally echoes well-trodden themes, Darling’s perspective and enthusiasm lend it freshness. There’s plenty here to enjoy, especially for physicists inclined to tune into the harmonies of the universe.

  • 2025 Oneworld Publications 288pp £10.99 pb/£6.99 e-book



The Commercial Space Federation Announces the Creation of the Space Supply Chain Council (S2C2)

3 June 2025 at 20:00

June 3, 2025 – Washington, D.C. – In an effort to strengthen advocacy for the U.S. space industry, the Commercial Space Federation (CSF) is proud to announce the creation of […]


New analysis of M67 cluster helps decode the sound of stars

3 June 2025 at 16:00

Stars are cosmic musical instruments: they vibrate with complex patterns that echo through their interiors. These vibrations, known as pressure waves, ripple through the star much as seismic waves from earthquakes ripple through the Earth. The frequencies of these waves hold information about the star’s mass, age and internal structure.

In a study led by researchers at UNSW Sydney, Australia, astronomer Claudia Reyes and colleagues “listened” to the sound from stars in the M67 cluster and discovered a surprising feature: a plateau in their frequency pattern. This plateau appears during the subgiant and red giant phases, when stars expand and evolve after exhausting the hydrogen fuel in their cores. This feature, reported in Nature, reveals how deep the outer layers of the star have pushed into the interior and offers a new diagnostic to improve mass and age estimates of stars beyond the main sequence (the core-hydrogen-burning phase).

How do stars create sound?

Beneath the surface of stars, hot gases are constantly rising, cooling and sinking back down, much like hot bubbles in boiling water. This constant churning is called convection. As these rising and sinking gas blobs collide or burst at the stellar surface, they generate pressure waves. These are essentially acoustic waves, bouncing within the stellar interior to create standing wave patterns.

Stars do not vibrate at just one frequency; they oscillate simultaneously at multiple frequencies, producing a spectrum of sounds. These acoustic oscillations cannot be heard in space directly, but are observed as tiny fluctuations in the star’s brightness over time.

M67 cluster as stellar laboratory

Star clusters offer an ideal environment in which to study stellar evolution as all stars in a cluster form from the same gas cloud at about the same time with the same chemical compositions but with different masses. The researchers investigated stars from the open cluster M67, as this cluster has a rich population of evolved stars including subgiants and red giants with a chemical composition similar to the Sun’s. They measured acoustic oscillations in 27 stars using data from NASA’s Kepler/K2 mission.

Stars oscillate across a range of tones, and in this study the researchers focused on two key features in this oscillation spectrum: large and small frequency separations. The large frequency separation, which probes stellar density, is the frequency difference between oscillations of the same angular degree (ℓ) but consecutive radial orders (n). The small frequency separation refers to the frequency difference between modes of degrees ℓ and ℓ + 2 of consecutive orders of n. For main sequence stars, small separations are reliable age indicators because their changes during hydrogen burning are well understood. In later stages of stellar evolution, however, their relationship to the stellar interior remained unclear.
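In practice, both separations come straight from a table of identified mode frequencies. Here is a minimal sketch of the two definitions (not the authors’ pipeline; the frequencies below are hypothetical solar-like values in microhertz, for illustration only):

```python
# Compute large (delta-nu) and small (delta-nu02) frequency separations from
# mode frequencies nu[l][n] in microhertz. Hypothetical solar-like values,
# not data from the M67 study.
import numpy as np

def separations(nu):
    """nu: dict mapping angular degree l -> {radial order n: frequency}."""
    large = [nu[0][n + 1] - nu[0][n]          # same l = 0, consecutive n
             for n in nu[0] if n + 1 in nu[0]]
    small = [nu[0][n] - nu[2][n - 1]          # l = 0 minus l = 2 at order n - 1
             for n in nu[0] if n - 1 in nu[2]]
    return np.mean(large), np.mean(small)

nu = {0: {20: 2900.0, 21: 3035.0, 22: 3170.0},
      2: {19: 2890.0, 20: 3025.0, 21: 3161.0}}
d_large, d_small = separations(nu)
print(f"large ~ {d_large:.0f} uHz, small ~ {d_small:.1f} uHz")  # ~135 and ~10
```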

In 27 stars, Reyes and colleagues investigated the small separation between modes of degrees 0 and 2. Plotting a graph of small versus large frequency separations for each star, called a C–D diagram, they uncovered a surprising plateau in small frequency separations.

A surprising feature: C–D diagrams, made from stellar models, showing different evolutionary stages for stars of 1 (left) and 1.7 (right) solar masses. Each point represents a specific stage in stellar evolution from the main sequence (A) to the red giant (F). The plateau seen from points F to E during the post-main-sequence phase reveals a transition in the stellar interior. (Courtesy: CC BY 4.0/C Reyes et al. Nature 10.1038/s41586-025-08760-2)

The researchers traced this plateau to the evolution of the lower boundary of the star’s convective envelope. As the envelope expands and cools, this boundary sinks deeper into the interior. Along this boundary, the density and sound speed change rapidly due to the difference in chemical composition on either side. These steep changes cause acoustic glitches that disturb how the pressure waves move through the star and temporarily stall the evolution of the small frequency separations, observed as a plateau in the frequency pattern.

This stalling occurs at a specific stage in stellar evolution – when the convective envelope deepens enough to encompass nearly 80% of the star’s mass. To confirm this connection, the researchers varied the amount of convective boundary mixing in their stellar models. They found that the depth of the envelope directly influenced both the timing and shape of the plateau in the small separations.

A new window on galactic history

This plateau serves as a new diagnostic tool to identify a specific evolutionary stage in red giant stars and improve estimates of their mass and age.

“The discovery of the ‘plateau’ frequencies is significant because it represents one more corroboration of the accuracy of our stellar models, as it shows how the turbulent regions at the bottom of a star’s envelope affect the sound speed,” explains Reyes, who is now at the Australian National University in Canberra. “They also have great potential to help determine with ease and great accuracy the mass and age of a star, which is of great interest for galactic archaeology, the study of the history of our galaxy.”

The sounds of starquakes offer a new window to study the evolution of stars and, in turn, recreate the history of our galaxy. Clusters like M67 serve as benchmarks to study and test stellar models and understand the future evolution of stars like our Sun.

“We plan to look for stars in the field which have very well-determined masses and which are in their ‘plateau’ phase,” says Reyes. “We will use these stars to benchmark the diagnostic potential of the plateau frequencies as a tool, so it can later be applied to stars all over the galaxy.”


Bury it, don’t burn it: turning biomass waste into a carbon solution

3 June 2025 at 12:00

If a tree fell in a forest almost 4000 years ago, did it make a sound? Well, in the case of an Eastern red cedar in what is now Quebec, Canada, it’s certainly still making noise today.

That’s because in 2013, a team of scientists were digging a trench when they came across the 3775-year-old log. Despite being buried for nearly four millennia, the wood wasn’t rotten and useless. In fact, recent analysis unearthed an entirely different story.

The team, led by atmospheric scientist Ning Zeng of the University of Maryland in the US, found that the wood had only lost 5% of its carbon compared with a freshly cut Eastern red cedar log. “The wood is nice and solid – you could probably make a piece of furniture out of it,” says Zeng. The log had been preserved in such remarkable shape because the clay soil it was buried in was highly impermeable. That limited the amount of oxygen and water reaching the wood, suppressing the activity of micro-organisms that would otherwise have made it decompose.

Fortified and ancient: Ning Zeng and colleagues discovered this 3775-year-old preserved log while conducting a biomass burial pilot project in Quebec, Canada. (Courtesy: Mark Sherwood)

This ancient log is a compelling example of “biomass burial”. When plants decompose or are burnt, they release the carbon dioxide (CO2) they had absorbed from the atmosphere. One idea to prevent this CO2 being released back into the atmosphere is to bury the waste biomass under conditions that prevent or slow decomposition, thereby trapping the carbon underground for centuries.

In fact, Zeng and his colleagues discovered the cedar log while they were digging a huge trench to bury 35 tonnes of wood to test this very idea. Nine years later, when they dug up some samples, they found that the wood had barely decomposed. Further analysis suggested that if the logs had been left buried for a century, they would still hold 97% of the carbon that was present when they were felled.
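Those percentages translate directly into decay timescales. As a back-of-envelope exercise (a first-order-decay assumption of my own, not the team’s analysis), a measured carbon fraction remaining after a known burial time gives a decay constant and half-life:

```python
# Back-of-envelope conversion (assumes simple first-order decay, my own
# simplification rather than the team's model) from measured carbon
# retention to decay constant, half-life and projected retention.
import math

def decay_stats(fraction_remaining, years):
    k = -math.log(fraction_remaining) / years    # per-year decay constant
    return k, math.log(2) / k                    # (k, half-life in years)

# The ancient log: 95% of its carbon still present after 3775 years.
k, half_life = decay_stats(0.95, 3775)
print(f"half-life ~ {half_life:,.0f} years")                      # ~51,000
print(f"projected century retention ~ {math.exp(-100 * k):.1%}")  # ~99.9%

# The trench samples' projected 97% retention at 100 years implies:
k2, half_life2 = decay_stats(0.97, 100)
print(f"half-life ~ {half_life2:,.0f} years")                     # ~2,300
```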

Digging holes

To combat climate change, there is often much discussion about how to remove carbon from the atmosphere. As well as conventional techniques like restoring peatland and replanting forests, there are a variety of more technical methods being developed (figure 1). These include direct air capture (DAC) and ocean alkalinity enhancement, which involves tweaking the chemistry of oceans so that they absorb more CO2. But some scientists – like Sinéad Crotty, a managing director at the Carbon Containment Lab in Connecticut, US – think that biomass burial could be a simpler and cheaper way to sequester carbon.

1 Ready or not

(Adapted from Smith et al. (2024) State of Carbon Dioxide Removal – Edition 2. DOI: 10.17605/OSF.IO/F85QJ)

There are multiple methods being developed for capturing, converting and storing carbon dioxide (CO2), each at different stages of readiness for deployment, with varying removal capabilities and storage durability timescales.

This figure – adapted from the State of Carbon Dioxide Removal report – shows methods that are already deployed or analysed in research literature. They are categorized as either “conventional”, processes that are widely established and deployed at scale; or “novel”, those that are at a lower level of readiness and therefore only used on smaller scales. The figure also rates their Technology Readiness Level (TRL), maximum mitigation potential (how many gigatonnes (10⁹ tonnes) of CO2 can be sequestered per year), and storage timescale.

The report defines each technique as follows:

  • Afforestation – Conversion to forest of land that was previously not forest.
  • Reforestation – Conversion to forest of land that was previously deforested.
  • Agroforestry – Growing trees on agricultural land while maintaining agricultural production.
  • Forest management – Stewardship and use of existing forests. To count as carbon dioxide removal (CDR), forest management practices must enhance the long-term average carbon stock in the forest system.
  • Peatland and coastal wetland restoration – Assisted recovery of inland ecosystems that are permanently or seasonally flooded or saturated by water (such as peatlands) and of coastal ecosystems (such as tidal marshes, mangroves and seagrass meadows). To count as CDR, this recovery must lead to a durable increase in the carbon content of these systems.
  • Durable wood products – Wood products which meet a given threshold of durability, typically used in construction. These can include sawn wood, wood panels and composite beams, but exclude less durable products such as paper.
  • Biochar – Relatively stable, carbon-rich material produced by heating biomass in an oxygen-limited environment. Assumed to be applied as a soil amendment unless otherwise stated.
  • Mineral products – Production of solid carbonate materials for use in products such as aggregates, asphalt, cement and concrete, using CO2 captured from the atmosphere.
  • Enhanced rock weathering – Increasing the natural rate of removal of CO2 from the atmosphere by applying crushed rocks, rich in calcium and magnesium, to soil or beaches.
  • Biomass burial – Burial of biomass in land sites such as soils or exhausted mines. Excludes storage in the typical geological formations associated with carbon capture and storage (CCS).
  • Bio-oil storage – Oil made by biomass conversion and placed into geological storage.
  • Bioenergy with carbon capture and storage – Process by which biogenic CO2 is captured from a bioenergy facility, with subsequent geological storage.
  • Direct air carbon capture and storage – Chemical process by which CO2 is captured from the ambient air, with subsequent geological storage.
  • Ocean fertilization – Enhancement of nutrient supply to the near-surface ocean with the aim of sequestering additional CO2 from the atmosphere through stimulated biological production. Methods include direct addition of micro-nutrients or macro-nutrients. To count as CDR, the biomass must reach the deep ocean where the carbon has the potential to be sequestered durably.
  • Ocean alkalinity enhancement – Spreading of alkaline materials on the ocean surface to increase the alkalinity of the water and thus increase ocean CO2 uptake.
  • Biomass sinking – Sinking of terrestrial (e.g. straw) or marine (e.g. macroalgae) biomass in the marine environment. To count as CDR, the biomass must reach the deep ocean where the carbon has the potential to be sequestered durably.
  • Direct ocean carbon capture and storage – Chemical process by which CO2 is captured directly from seawater, with subsequent geological storage. To count as CDR, this capture must lead to increased ocean CO2 uptake.

The 3775-year-old log shows that carbon can be stored for centuries underground, but the wood has to be buried under specific conditions. “People tend to think, ‘Who doesn’t know how to dig a hole and bury some wood?’” Zeng says. “But think about how many wooden coffins were buried in human history. How many of them survived? For a timescale of hundreds or thousands of years, we need the right conditions.”

The key for scientists seeking to test biomass burial is to create dry, low-oxygen environments, similar to those in the Quebec clay soil. Last year, for example, Crotty and her colleagues dug more than 100 pits at a site in Colorado, in the US, filled them with woody material and then covered them up again. In five years’ time they plan to dig the biomass back out of the pits to see how much it has decomposed.

The pits vary in depth, and have been refilled and packed in different ways, to test how their build impacts carbon storage. The researchers will also be calculating the carbon emissions of processes such as transporting and burying the biomass – including the amount of carbon released from the soil when the pits are dug. “What we are trying to do here is build an understanding of what works and what doesn’t, but also how we can measure, report and verify that what we are doing is truly carbon negative,” Crotty says.

Over the next five years the team will continuously measure surface CO2 and methane fluxes from several of the pits, while every pit will have its CO2 and methane emissions measured monthly. There are also moisture sensors and oxygen probes buried in the pits, plus a full weather station on the site.

Crotty says that all this data will allow them to assess how different depths, packing styles and the local environment alter conditions in the chambers. When the samples are excavated in five years, the researchers will also explore what types of decomposition the burial did and did not suppress. This will include tests to identify different fungal and bacterial signatures, to uncover the micro-organisms involved in any decay.

The big questions

Experiments like Crotty’s will help answer one of the key concerns about terrestrial storage of biomass: how long can the carbon be stored?

In 2023 a team led by Lawrence Livermore National Laboratory (LLNL) carried out a large-scale analysis of the potential for CO2 removal in the US. The resulting Road to Removal report outlined how CO2 removal could be used to help the US achieve its net zero goals (these have since been revoked by the Trump administration), focusing on techniques like DAC, increasing carbon uptake in forests and agricultural lands, and converting waste biomass into fuels and CO2.

The report did not, however, look at biomass burial. One of the report authors, Sarah Baker – an expert in decarbonization and CO2 removal at LLNL – told Physics World that this was because of a lack of evidence around the durability of the carbon stored. The report’s minimum requirement for carbon storage was at least 100 years, and there were not enough data available to show how much carbon stored in biomass would remain after that period, Baker explains.

The US Department of Energy is also working to address this question. It has funded a set of projects, which Baker is involved with, to bridge some of the knowledge gaps on carbon-removal pathways. This includes one led by the National Renewable Energy Lab, measuring how long carbon in buried biomass remains stored under different conditions.

Bury the problem

Crotty’s Colorado experiment is also addressing another question: are all forms of biomass equally appropriate for burial? To test this, Crotty’s team filled the pits with a range of woody materials, including different types of wood and wood chip as well as compressed wood, and “slash” – small branches, leaves, bark and other debris created by logging and other forestry work.

Indeed, Crotty and her colleagues see biomass storage as crucial for those managing our forests. The western US states, in particular, have seen an increased risk of wildfires through a mix of climate change and aggressive fire-suppression policies that do not allow smaller fires to burn and thereby produce overgrown forests. “This has led to a build-up of fuels across the landscape,” Crotty says. “So, in a forest that would typically have a high number of low-severity fires, it’s changed the fire regime into a very high-intensity one.”

These concerns led the US Forest Service to announce a 10-year wildfire crisis plan in 2022 that seeks to reduce the risk of fires by thinning and clearing 50 million acres of forest land, in addition to 20 million acres already slated for treatment. But this creates a new problem.

“There are currently very few markets for the types of residues that need to come out of these forests – it is usually small-diameter, low-value timber,” explains Crotty. “They typically can’t pay their way out of the forests, so business as usual in many areas is to simply put them in a pile and burn them.”

Cheap but costly: typically, waste biomass from forest management is burnt, like this pile of slash at the edge of Coconino National Forest in Arizona – but doing so releases carbon dioxide. (Courtesy: Josh Goldstein/Coconino National Forest)

A recent study Crotty co-authored suggests that every year “pile burning” in US National Forests emits greenhouse gases equivalent to almost two million tonnes of CO2, and more than 11 million tonnes of fine particulate matter – air pollution that is linked to a range of health problems. Conservative estimates by the Carbon Containment Lab indicate that the material scheduled for clearance under the Forest Service’s 10-year crisis plan will contain around two gigatonnes (Gt) of CO2 equivalents. This is around 5% of current annual global CO2 emissions.

There are also cost implications. Crotty’s recent analysis found that piling and burning forest residue costs around $700 to $1300 per acre. By adding value to the carbon in the forest residues and keeping it out of the atmosphere, biomass storage may offer a solution to these issues, Crotty says.

As an incentive to remove carbon from the atmosphere, trading mechanisms exist whereby individuals, companies and governments can buy and sell carbon emissions. In essence, carbon has a price attached to it, meaning that someone who has emitted too much, say, can pay someone else to capture and store the equivalent amount of emissions, with an often-touted figure being $100 per tonne of CO2 stored. For a long time, this has been seen as the price at which carbon capture becomes affordable, enabling scale up to the volumes needed to tackle climate change.

“There is only so much capital that we will ever deploy towards [carbon removal] and thus the cheaper the solution, the more credits we’ll be able to generate, the more carbon we will be able to remove from the atmosphere,” explains Justin Freiberg, a managing director of the Carbon Containment Lab. “$100 is relatively arbitrary, but it is important to have a target and aim low on pricing for high quality credits.”

DAC has not managed to reach this magical price point. Indeed, the Swiss firm Climeworks – which is one of the biggest DAC companies – has stated that its costs might be around $300 per tonne by 2030.

A tomb in a mine

Another carbon-removal company, however, claims it has hit this benchmark using biomass burial. “We’re selling our first credits at $100 per tonne,” says Hannah Murnen, chief technology officer at Graphyte – a US firm backed by Bill Gates.

Graphyte is confident that there is significant potential in biomass burial. Based in Pine Bluff, Arkansas, the firm dries and compresses waste biomass into blocks before storage. “We dry it to below a level at which life can exist,” says Murnen, which effectively halts decomposition.

The company claims that it will soon be storing 50,000 tonnes of CO2 per year and is aiming for five million tonnes per year by 2030. Murnen acknowledges that these are “really significant figures”, particularly compared with what has been achieved in carbon capture so far. Nevertheless, she adds, if you look at the targets around carbon capture “this is the type of scale we need to get to”.

The need for carbon capture

The Intergovernmental Panel on Climate Change says that carbon capture is essential to limit global warming to 1.5 °C above pre-industrial levels.

To stay within the Paris Agreement’s climate targets, the 2024 State of Carbon Dioxide Removal report estimated that 7–9 gigatonnes (Gt) of CO2 removal will be needed annually by 2050. According to the report – which was put together by multiple institutions, led by the University of Oxford – currently two billion tonnes of CO2 are being removed per year, mostly through “conventional” methods like tree planting and wetland restoration. “Novel” methods – such as direct air capture (DAC), bioenergy with carbon capture, and ocean alkalinity enhancement – contribute 1.3 million tonnes of CO2 removal per year, less than 0.1% of the total.

Graphyte is currently working with sawmill residue and rice hulls, but in the future Murnen says it plans to accept all sorts of biomass waste. “One of the great things about biomass for the purpose of carbon removal is that, because we are not doing any sort of chemical transformation on the biomass, we’re very flexible to the type of biomass,” Murnen adds.

And there appears to be plenty available. Estimates by researchers in the UK and India (npj Climate and Atmospheric Science 2 35) suggest that every year around 140 Gt of biomass waste is generated globally from forestry and agriculture. Around two-thirds of the agricultural residues are from cereals, like wheat, rice, barley and oats, while sugarcane stems and leaves are the second largest contributors. The rest is made up of things like leaves, roots, peels and shells from other crops. Like forest residues, much of this waste ends up being burnt or left to rot, releasing its carbon.

Currently, Graphyte has one storage site about 30 km from Pine Bluff, where its compressed biomass blocks are stored underground, enclosed in an impermeable layer that prevents water ingress. “We took what used to be an old gravel mine – so basically a big hole in the ground – and we’ve created a lined storage tomb where we are placing the biomass and then sealing it closed,” says Murnen.

Big hole in the ground: Graphyte is using an old gravel mine 30 km from Pine Bluff in Arkansas to store its compressed biomass bricks. (Courtesy: Graphyte)

Once sealed, Graphyte monitors the CO2 and methane concentrations in the headspace of the vaults to check for any decomposition of the biomass. The company also analyses biomass as it enters the facility, to track how much carbon it is storing. Wood residues, like sawmill waste, are generally around 50% carbon, says Murnen, but rice hulls are closer to 35% carbon.
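The carbon bookkeeping behind such figures is straightforward: a tonne of stored carbon corresponds to 44/12 tonnes of CO2, the ratio of the molar masses. A minimal sketch (my own arithmetic, not Graphyte’s accounting method):

```python
# Convert stored biomass into CO2-equivalent tonnes via the 44/12 molar-mass
# ratio of CO2 to carbon. Tonnages here are placeholders for illustration.
CO2_PER_TONNE_C = 44.0 / 12.0

def co2_equivalent(biomass_tonnes, carbon_fraction):
    return biomass_tonnes * carbon_fraction * CO2_PER_TONNE_C

print(co2_equivalent(1000, 0.50))   # sawmill residue: ~1833 t CO2 per 1000 t
print(co2_equivalent(1000, 0.35))   # rice hulls:      ~1283 t CO2 per 1000 t
```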

Graphyte is confident that its storage is physically robust and could avoid any re-emission for what Murnen calls “a very long period of time”. However, it is also exploring how to prevent accidental disturbance of the biomass in the future – possibly long after the company ceases to exist. One option is to add a conservation easement to the site, a well-established US legal mechanism for adding long-term protection to land.

“We feel pretty strongly that the way we are approaching [carbon removal] is one of the most scalable ways,” Murnen says. “In as far as impediments or barriers to scale, we have a much easier permitting pathway, we don’t need pipelines, we are pretty flexible on the type of land that we can use for our storage sites, and we have a huge variety of feedstocks that we can take into the process.”

A simple solution

Back at LLNL, Baker says that although she hasn’t “run the numbers”, and there are a lot of caveats, she suspects that biomass burial is “true carbon removal because it is so simple”.

Once associated upstream and downstream emissions are taken into account, many techniques that people call carbon removal are probably not, she says, because they emit more fossil CO2 than they store.

Biomass burial is also cheap. As the Road to Removal analysis found, “thermal chemical” techniques, like pyrolysis, have great potential for removing and storing carbon while converting biomass into hydrogen and sustainable aviation fuel. But they require huge investment, with larger facilities potentially costing hundreds of millions of dollars. Biomass burial could even act as temporary storage until facilities are ready to convert the carbon into sustainable fuels. “Buy ourselves time and then use it later,” says Baker.

Either way, biomass burial has great potential for the future of carbon storage, and therefore our environment. “The sooner we can start doing these things the greater the climate impact,” Baker says.

We just need to know that the storage is durable – and if that 3775-year-old log is any indication, there’s the potential to store biomass for hundreds, maybe thousands of years.


Wireless e-tattoos help manage mental workload

3 June 2025 at 10:00

Managing one’s mental workload is a tricky balancing act that can affect cognitive performance and decision-making abilities. Too little engagement with an ongoing task can lead to boredom and mistakes; too much could cause a person to become overwhelmed.

For those performing safety-critical tasks, such as air traffic controllers or truck drivers, monitoring how hard their brain is working is even more important – lapses in focus could have serious consequences. But how can a person’s mental workload be assessed? A team at the University of Texas at Austin proposes the use of temporary face tattoos that can track when a person’s brain is working too hard.

“Technology is developing faster than human evolution. Our brain capacity cannot keep up and can easily get overloaded,” says lead author Nanshu Lu in a press statement. “There is an optimal mental workload for optimal performance, which differs from person to person.”

The traditional approach for monitoring mental workload is electroencephalography (EEG), which analyses the brain’s electrical activity. But EEG devices are wired, bulky and uncomfortable, making them impractical for real-world situations. Measurements of eye movements using electrooculography (EOG) are another option for assessing mental workload.

Lu and colleagues have developed an ultrathin wireless e-tattoo that records high-fidelity EEG and EOG signals from the forehead. The e-tattoo combines a disposable sticker-like electrode layer and a reusable battery-powered flexible printed circuit (FPC) for data acquisition and wireless transmission.

The serpentine-shaped electrodes and interconnects are made from low-cost, conductive graphite-deposited polyurethane, coated with an adhesive polymer composite to reduce contact impedance and improve skin attachment. The e-tattoo stretches and conforms to the skin, providing reliable signal acquisition, even during dynamic activities such as walking and running.

To assess the e-tattoo’s ability to record basic neural activities, the team used it to measure alpha brainwaves as a volunteer opened and closed their eyes. The e-tattoo captured equivalent neural spectra to that recorded by a commercial gel electrode-based EEG system with comparable signal fidelity.

The researchers next tested the e-tattoo on six participants while they performed a visuospatial memory task that gradually increased in difficulty. They analysed the signals collected by the e-tattoo during the tasks, extracting EEG band powers for delta, theta, alpha, beta and gamma brainwaves, plus various EOG features.

As the task got more difficult, the participants showed higher activity in the theta and delta bands, a feature associated with increased cognitive demand. Meanwhile, activity in the alpha and beta bands decreased, indicating mental fatigue.

The researchers built a machine learning model to predict the level of mental workload experienced during the tasks, training it on forehead EEG and EOG features recorded by the e-tattoo. The model could reliably estimate mental workload in each of the six subjects, demonstrating the feasibility of real-time cognitive state decoding.
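As a rough illustration of that kind of pipeline (with an assumed 250 Hz sampling rate, textbook band limits and synthetic placeholder signals rather than the team’s recordings), band powers can be extracted with Welch’s method and fed to an off-the-shelf classifier:

```python
# Minimal sketch of EEG band-power extraction plus workload classification.
# Sampling rate, band limits and data are assumptions, not the study's setup.
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier

FS = 250  # Hz, assumed sampling rate
BANDS = {'delta': (1, 4), 'theta': (4, 8), 'alpha': (8, 13),
         'beta': (13, 30), 'gamma': (30, 45)}

def band_powers(segment):
    """Mean power spectral density in each canonical EEG band."""
    freqs, psd = welch(segment, fs=FS, nperseg=FS * 2)
    return [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in BANDS.values()]

# Hypothetical training data: one 10 s segment per trial, labelled by
# task difficulty (random placeholders standing in for real recordings).
rng = np.random.default_rng(0)
X = np.array([band_powers(rng.standard_normal(FS * 10)) for _ in range(60)])
y = rng.integers(0, 3, size=60)          # workload level 0/1/2

clf = RandomForestClassifier(n_estimators=100).fit(X, y)
print(clf.predict(X[:5]))                # decoded workload levels
```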

“Our key innovation lies in the successful decoding of mental workload using a wireless, low-power, low-noise and ultrathin EEG/EOG e-tattoo device,” the researchers write. “It addresses the unique challenges of monitoring forehead EEG and EOG, where wearability, non-obstructiveness and signal stability are critical to assessing mental workload in the real world.”

They suggest that future applications could include real-time cognitive load monitoring in pilots, operators and healthcare professionals. “We’ve long monitored workers’ physical health, tracking injuries and muscle strain,” says co-author Luis Sentis. “Now we have the ability to monitor mental strain, which hasn’t been tracked. This could fundamentally change how organizations ensure the overall well-being of their workforce.”

The e-tattoo is described in Device.


South Korea’s Venus-focused cubesat advances as larger missions face NASA cuts

2 June 2025 at 23:19

South Korea’s state-backed Institute for Basic Science has ordered the first of five cubesats to study Venus from LEO starting next year, bolstering sustained planetary research as flagship missions face budget uncertainty.



Andromeda galaxy may not collide with the Milky Way after all

2 June 2025 at 17:00

Since 1912, we’ve known that the Andromeda galaxy is racing towards our own Milky Way at about 110 kilometres per second. A century later, in 2012, astrophysicists at the Space Telescope Science Institute (STScI) in Maryland, US, came to a striking conclusion. In four billion years, they predicted, a collision between the two galaxies was a sure thing.

Now, it’s not looking so sure.

Using the latest data from the European Space Agency’s Gaia astrometric mission, astrophysicists led by Till Sawala of the University of Helsinki, Finland, re-modelled the impending crash and found the odds of a collision are roughly 50/50.

This new result differs from the 2012 one because it considers the gravitational effect of an additional galaxy, the Large Magellanic Cloud (LMC), alongside the Milky Way, Andromeda and the nearby Triangulum spiral galaxy, M33. While M33’s gravity, in effect, adds to Andromeda’s motion towards us, Sawala and colleagues found that the LMC’s gravity tends to pull the Milky Way out of Andromeda’s path.

“We’re not predicting that the merger is not going to happen within 10 billion years, we’re just saying that from the data we have now, we can’t be certain of it,” Sawala tells Physics World.

“A step in the right direction”

While the LMC contains only around 10% of the Milky Way’s mass, Sawala and colleagues’ work indicates that it may nevertheless be massive enough to turn a head-on collision into a near-miss. Incorporating its gravitational effects into simulations is therefore “a step in the right direction”, says Sangmo Tony Sohn, a support scientist at the STScI and a co-author of the 2012 paper that predicted a collision.

Even with more detailed simulations, though, uncertainties in the motion and masses of the galaxies leave room for a range of possible outcomes. According to Sawala, the uncertainty with the greatest effect on merger probability lies in the so-called “proper motion” of Andromeda, which is its motion as it appears on our night sky. This motion is a mixture of Andromeda’s radial motion towards the centre of the Milky Way and the two galaxies’ transverse motion perpendicular to one another.

If the combined transverse motion is large enough, Andromeda will pass the Milky Way at a distance greater than 200 kiloparsecs (652,000 light years). This would avert a collision in the next 10 billion years, because even when the two galaxies loop back on each other, their next pass would still be too distant, according to the models.

Conversely, a smaller transverse motion would limit the distance at closest approach to less than 200 kiloparsecs. If that happens, Sawala says the two galaxies are “almost certain to merge” because of the dynamical friction effect, which arises from the diffuse halo of old stars and dark matter around galaxies. When two galaxies get close enough, these haloes begin interacting with each other, generating tidal and frictional heating that robs the galaxies of orbital energy and makes them fall ever closer.
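A gravity-free, straight-line estimate (purely illustrative; the team’s simulations model the full gravitational dynamics) shows why the transverse component dominates the verdict. With Andromeda roughly 780 kpc away and approaching at about 110 km/s, the closest-approach distance is d·v_t/√(v_r² + v_t²), which crosses the 200 kpc threshold at a transverse speed of only a few tens of km/s. The transverse speeds below are assumed values for comparison.

```python
# Purely illustrative straight-line geometry (no gravity, not the team's
# simulations): closest approach versus assumed transverse speed.
import math

D_KPC = 780          # distance to Andromeda, ~2.5 million light-years
V_RADIAL = 110       # km/s, approach speed from the article

for v_t in (20, 30, 40, 60):                     # km/s, assumed values
    b = D_KPC * v_t / math.hypot(V_RADIAL, v_t)  # closest approach, kpc
    verdict = "likely merger" if b < 200 else "near-miss"
    print(f"v_t = {v_t:2d} km/s -> closest approach ~ {b:3.0f} kpc ({verdict})")
```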

The LMC itself is an excellent example of how this works. “The LMC is already so close to the Milky Way that it is losing its orbital energy, and unlike [Andromeda], it is guaranteed to merge with the Milky Way,” Sawala says, adding that, similarly, M33 stands a good chance of merging with Andromeda.

“A very delicate task”

Because Andromeda is 2.5 million light years away, its proper motion is very hard to measure. Indeed, no-one had ever done it until the STScI team spent 10 years monitoring the galaxy, which is also known as M31, with the Hubble Space Telescope – something Sohn describes as “a very delicate task” that continues to this day.

Another area where there is some ambiguity is in the mass estimate of the LMC. “If the LMC is a little more massive [than we think], then it pulls the Milky Way off the collision course with M31 a little more strongly, reducing the possibility of a merger between the Milky Way and M31,” Sawala explains.

The good news is that these ambiguities won’t be around forever. Sohn and his team are currently analysing new Hubble data to provide fresh constraints on the Milky Way’s orbital trajectory, and he says their results have been consistent with the Gaia analyses so far. Sawala agrees that new data will help reduce uncertainties. “There’s a good chance that we’ll know more about what is going to happen fairly soon, within five years,” he says.

Even if the Milky Way and Andromeda don’t collide in the next 10 billion years, though, that won’t be the end of the story. “I would expect that there is a very high probability that they will eventually merge, but that could take tens of billions of years,” Sawala says.

The research is published in Nature Astronomy.

