‘Slapp addict’ Italian oil firm accused of trying to silence green activists

Eni has filed at least six defamation suits against journalists and NGOs since 2019 in what critics say is an intimidation campaign

When Antonio Tricarico was summoned to his local police station in October and told he was being investigated for defamation, he was stressed but not shocked. Months earlier, Tricarico, the director of the Italian environment NGO ReCommon, had filed a joint legal challenge against the country’s biggest oil company, Eni, which he knew had a history of using lawyers to clamp down on critics.

The company had previously limited itself to civil defamation lawsuits, including against ReCommon, but in Tricarico’s case it initiated criminal proceedings over statements he had made in a television interview.

© Photograph: Carlo Dojmi di Delupis/ReCommon

‘Legal bullying’: global protest rights on the line in Dutch court case, say activists

After US jury said it should pay oil pipeline firm $660m, Greenpeace is hoping to reclaim funds via EU anti-Slapp law

The outcome of a court case in the Netherlands could shape the right to protest around the globe for decades to come, campaigners have warned, as figures show a dramatic rise in legal action taken by fossil fuel companies against activists and journalists.

Greenpeace International is using a recently introduced EU directive to try to reclaim costs and damages it incurred when a US jury decided it should pay the oil pipeline corporation Energy Transfer more than $660m in damages earlier this year.

© Photograph: John L Mone/AP

Rise in legal action against renewables companies over minerals

Renewable energy and critical minerals projects often want to mine on sacred lands, but minority groups are fighting back through the courts

Located in Wikieup, Arizona, at the meeting point of the Mojave and Sonoran deserts, H’a’Kamwe’ has for centuries had sacred significance for the Hualapai tribe. They regard the hot spring, fed by water naturally stored underground in volcanic rocks, as a place for healing that symbolises their connection to the land.

So when an Australian mining company announced plans to begin exploratory drilling for lithium at 100 locations on Hualapai land, including some as close as 700 metres from H’a’Kamwe’, they regarded it as a potential desecration.

© Photograph: Bloomberg/Getty Images

King Charles to receive £132m next year after crown estate makes £1.1bn profit

Offshore wind power boom helps push profit from land and property to more than double what it was two years ago

King Charles is set to receive official annual income of £132m next year, after his portfolio of land and property made more than £1bn in profits thanks to a boom in the offshore wind sector.

Profits at the crown estate – which partly funds the monarchy – were flat at £1.1bn in its financial year to the end of March but more than double their level two years ago, at £442.6m.

© Photograph: Chris Jackson/Reuters

Bury it, don’t burn it: turning biomass waste into a carbon solution

If a tree fell in a forest almost 4000 years ago, did it make a sound? Well, in the case of an Eastern red cedar in what is now Quebec, Canada, it’s certainly still making noise today.

That’s because in 2013, a team of scientists were digging a trench when they came across the 3775-year-old log. Despite being buried for nearly four millennia, the wood wasn’t rotten and useless. In fact, recent analysis unearthed an entirely different story.

The team, led by atmospheric scientist Ning Zeng of the University of Maryland in the US, found that the wood had only lost 5% of its carbon compared with a freshly cut Eastern red cedar log. “The wood is nice and solid – you could probably make a piece of furniture out of it,” says Zeng. The log had been preserved in such remarkable shape because the clay soil it was buried in was highly impermeable. That limited the amount of oxygen and water reaching the wood, suppressing the activity of micro-organisms that would otherwise have made it decompose.

Fortified and ancient: Ning Zeng and colleagues discovered this 3775-year-old preserved log while conducting a biomass burial pilot project in Quebec, Canada. (Courtesy: Mark Sherwood)

This ancient log is a compelling example of “biomass burial”. When plants decompose or are burnt, they release the carbon dioxide (CO2) they had absorbed from the atmosphere. One idea to prevent this CO2 being released back into the atmosphere is to bury the waste biomass under conditions that prevent or slow decomposition, thereby trapping the carbon underground for centuries.

In fact, Zeng and his colleagues discovered the cedar log while they were digging a huge trench to bury 35 tonnes of wood to test this very idea. Nine years later, when they dug up some samples, they found that the wood had barely decomposed. Further analysis suggested that if the logs had been left buried for a century, they would still hold 97% of the carbon that was present when they were felled.
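
As a rough plausibility check on those figures, decomposition can be modelled as exponential decay, with the implied decay constant inferred from each observation. Here is a minimal Python sketch; the 5%-in-3775-years and 97%-after-a-century numbers are from the text, while the exponential model itself is an illustrative assumption, not the researchers’ analysis.

    import math

    # Assumed model: remaining carbon fraction f(t) = exp(-k * t)
    def decay_constant(fraction_remaining, years):
        """Infer the decay constant k (per year) from a single observation."""
        return -math.log(fraction_remaining) / years

    k_log = decay_constant(0.95, 3775)   # ancient cedar: 95% of carbon left after ~3775 years
    k_trial = decay_constant(0.97, 100)  # burial trial projection: 97% left after 100 years

    for name, k in [("ancient log", k_log), ("burial trial", k_trial)]:
        half_life = math.log(2) / k
        print(f"{name}: k = {k:.2e} per year, carbon half-life ~ {half_life:,.0f} years")

On these assumptions the carbon half-life comes out at thousands of years or more, consistent with the claim that the right burial conditions can trap carbon for centuries.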

Digging holes

To combat climate change, there is often much discussion about how to remove carbon from the atmosphere. As well as conventional techniques like restoring peatland and replanting forests, there are a variety of more technical methods being developed (figure 1). These include direct air capture (DAC) and ocean alkalinity enhancement, which involves tweaking the chemistry of oceans so that they absorb more CO2. But some scientists – like Sinéad Crotty, a managing director at the Carbon Containment Lab in Connecticut, US – think that biomass burial could be a simpler and cheaper way to sequester carbon.

1 Ready or not

[Figure 1: a chart of 15 carbon dioxide removal methods and their readiness for deployment. Adapted from Smith et al. (2024) State of Carbon Dioxide Removal – Edition 2. DOI: 10.17605/OSF.IO/F85QJ]

There are multiple methods being developed for capturing, converting and storing carbon dioxide (CO2), each at different stages of readiness for deployment, with varying removal capabilities and storage durability timescales.

This figure – adapted from the State of Carbon Dioxide Removal report – shows methods that are already deployed or analysed in the research literature. They are categorized as either “conventional” – processes that are widely established and deployed at scale – or “novel” – those at a lower level of readiness that are therefore only used on smaller scales. The figure also rates each method’s Technology Readiness Level (TRL), maximum mitigation potential (how many gigatonnes, i.e. 10⁹ tonnes, of CO2 can be sequestered per year) and storage timescale. A minimal sketch of this classification as a data structure follows the list below.

The report defines each technique as follows:

  • Afforestation – Conversion to forest of land that was previously not forest.
  • Reforestation – Conversion to forest of land that was previously deforested.
  • Agroforestry – Growing trees on agricultural land while maintaining agricultural production.
  • Forest management – Stewardship and use of existing forests. To count as carbon dioxide removal (CDR), forest management practices must enhance the long-term average carbon stock in the forest system.
  • Peatland and coastal wetland restoration – Assisted recovery of inland ecosystems that are permanently or seasonally flooded or saturated by water (such as peatlands) and of coastal ecosystems (such as tidal marshes, mangroves and seagrass meadows). To count as CDR, this recovery must lead to a durable increase in the carbon content of these systems.
  • Durable wood products – Wood products which meet a given threshold of durability, typically used in construction. These can include sawn wood, wood panels and composite beams, but exclude less durable products such as paper.
  • Biochar – Relatively stable, carbon-rich material produced by heating biomass in an oxygen-limited environment. Assumed to be applied as a soil amendment unless otherwise stated.
  • Mineral products – Production of solid carbonate materials for use in products such as aggregates, asphalt, cement and concrete, using CO2 captured from the atmosphere.
  • Enhanced rock weathering – Increasing the natural rate of removal of CO2 from the atmosphere by applying crushed rocks, rich in calcium and magnesium, to soil or beaches.
  • Biomass burial – Burial of biomass in land sites such as soils or exhausted mines. Excludes storage in the typical geological formations associated with carbon capture and storage (CCS).
  • Bio-oil storage – Oil made by biomass conversion and placed into geological storage.
  • Bioenergy with carbon capture and storage – Process by which biogenic CO2 is captured from a bioenergy facility, with subsequent geological storage.
  • Direct air carbon capture and storage – Chemical process by which CO2 is captured from the ambient air, with subsequent geological storage.
  • Ocean fertilization – Enhancement of nutrient supply to the near-surface ocean with the aim of sequestering additional CO2 from the atmosphere through stimulated biological production. Methods include direct addition of micro-nutrients or macro-nutrients. To count as CDR, the biomass must reach the deep ocean where the carbon has the potential to be sequestered durably.
  • Ocean alkalinity enhancement – Spreading of alkaline materials on the ocean surface to increase the alkalinity of the water and thus increase ocean CO2 uptake.
  • Biomass sinking – Sinking of terrestrial (e.g. straw) or marine (e.g. macroalgae) biomass in the marine environment. To count as CDR, the biomass must reach the deep ocean where the carbon has the potential to be sequestered durably.
  • Direct ocean carbon capture and storage – Chemical process by which CO2 is captured directly from seawater, with subsequent geological storage. To count as CDR, this capture must lead to increased ocean CO2 uptake.
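
As flagged above, here is a minimal sketch of the report’s classification expressed as a data structure. The category and storage labels follow the definitions in the list; the TRL and mitigation-potential numbers are placeholders for illustration, not values from the report.

    from dataclasses import dataclass

    @dataclass
    class CDRMethod:
        name: str
        category: str            # "conventional" or "novel"
        trl: int                 # Technology Readiness Level (placeholder)
        max_gt_per_year: float   # maximum mitigation potential, GtCO2/yr (placeholder)
        storage: str             # storage durability timescale

    methods = [
        CDRMethod("Afforestation", "conventional", 9, 3.0, "decades to centuries"),
        CDRMethod("Biomass burial", "novel", 5, 1.0, "centuries"),
        CDRMethod("Direct air carbon capture and storage", "novel", 6, 5.0, "millennia"),
    ]

    # Separate novel methods from conventional ones, as the report does
    for m in methods:
        if m.category == "novel":
            print(f"{m.name}: TRL {m.trl}, up to {m.max_gt_per_year} GtCO2/yr, {m.storage}")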

The 3775-year-old log shows that carbon can be stored for centuries underground, but the wood has to be buried under specific conditions. “People tend to think, ‘Who doesn’t know how to dig a hole and bury some wood?’” Zeng says. “But think about how many wooden coffins were buried in human history. How many of them survived? For a timescale of hundreds or thousands of years, we need the right conditions.”

The key for scientists seeking to test biomass burial is to create dry, low-oxygen environments, similar to those in the Quebec clay soil. Last year, for example, Crotty and her colleagues dug more than 100 pits at a site in Colorado, in the US, filled them with woody material and then covered them up again. In five years’ time they plan to dig the biomass back out of the pits to see how much it has decomposed.

The pits vary in depth, and have been refilled and packed in different ways, to test how their build impacts carbon storage. The researchers will also be calculating the carbon emissions of processes such as transporting and burying the biomass – including the amount of carbon released from the soil when the pits are dug. “What we are trying to do here is build an understanding of what works and what doesn’t, but also how we can measure, report and verify that what we are doing is truly carbon negative,” Crotty says.

Over the next five years the team will continuously measure surface CO2 and methane fluxes from several of the pits, while every pit will have its CO2 and methane emissions measured monthly. There are also moisture sensors and oxygen probes buried in the pits, plus a full weather station on the site.

Crotty says that all this data will allow them to assess how different depths, packing styles and the local environment alter conditions in the chambers. When the samples are excavated in five years, the researchers will also explore what types of decomposition the burial did and did not suppress. This will include tests to identify different fungal and bacterial signatures, to uncover the micro-organisms involved in any decay.
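
The “truly carbon negative” test Crotty describes reduces to a ledger: the carbon retained in the buried biomass must exceed every emission the project itself causes. A minimal sketch of that bookkeeping, with invented illustrative numbers (none of these values come from the Colorado experiment):

    # All quantities in tonnes of CO2-equivalent (illustrative values only)
    carbon_stored  = 100.0  # CO2e retained in the buried biomass over the accounting period
    transport_emis = 4.0    # hauling biomass to the site
    machinery_emis = 3.0    # digging and refilling the pits
    soil_emis      = 5.0    # carbon released from the disturbed soil
    leakage_emis   = 3.0    # slow decomposition, tracked via the CO2 and methane fluxes

    net_removal = carbon_stored - (transport_emis + machinery_emis
                                   + soil_emis + leakage_emis)
    print(f"Net removal: {net_removal:.1f} tCO2e")
    print("Carbon negative" if net_removal > 0 else "Not carbon negative")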

The big questions

Experiments like Crotty’s will help answer one of the key concerns about terrestrial storage of biomass: how long can the carbon be stored?

In 2023 a team led by Lawrence Livermore National Laboratory (LLNL) did a large-scale analysis of the potential for CO2 removal in the US. The resulting Road to Removal report outlined how CO2 removal could be used to help the US achieve its net zero goals (these have since been revoked by the Trump administration), focusing on techniques like direct air capture (DAC), increasing carbon uptake in forests and agricultural lands, and converting waste biomass into fuels and CO2.

The report did not, however, look at biomass burial. One of the report authors, Sarah Baker – an expert in decarbonization and CO2 removal at LLNL – told Physics World that this was because of a lack of evidence around the durability of the carbon stored. The report’s minimum requirement for carbon storage was at least 100 years, and there were not enough data available to show how much carbon stored in biomass would remain after that period, Baker explains.

The US Department of Energy is also working to address this question. It has funded a set of projects, which Baker is involved with, to bridge some of the knowledge gaps on carbon-removal pathways. This includes one led by the National Renewable Energy Lab, measuring how long carbon in buried biomass remains stored under different conditions.

Bury the problem

Crotty’s Colorado experiment is also addressing another question: are all forms of biomass equally appropriate for burial? To test this, Crotty’s team filled the pits with a range of woody materials, including different types of wood and wood chip as well as compressed wood, and “slash” – small branches, leaves, bark and other debris created by logging and other forestry work.

Indeed, Crotty and her colleagues see biomass storage as crucial for those managing our forests. The western US states, in particular, have seen an increased risk of wildfires through a mix of climate change and aggressive fire-suppression policies that do not allow smaller fires to burn and thereby produce overgrown forests. “This has led to a build-up of fuels across the landscape,” Crotty says. “So, in a forest that would typically have a high number of low-severity fires, it’s changed the fire regime into a very high-intensity one.”

These concerns led the US Forest Service to announce a 10-year wildfire crisis plan in 2022 that seeks to reduce the risk of fires by thinning and clearing 50 million acres of forest land, in addition to 20 million acres already slated for treatment. But this creates a new problem.

“There are currently very few markets for the types of residues that need to come out of these forests – it is usually small-diameter, low-value timber,” explains Crotty. “They typically can’t pay their way out of the forests, so business as usual in many areas is to simply put them in a pile and burn them.”

Cheap but costly: Typically, waste biomass from forest management is burnt, like this pile of slash at the edge of Coconino National Forest in Arizona – but doing so releases carbon dioxide. (Courtesy: Josh Goldstein/Coconino National Forest)

A recent study Crotty co-authored suggests that every year “pile burning” in US National Forests emits greenhouse gases equivalent to almost two million tonnes of CO2, and more than 11 million tonnes of fine particulate matter – air pollution that is linked to a range of health problems. Conservative estimates by the Carbon Containment Lab indicate that the material scheduled for clearance under the Forest Service’s 10-year crisis plan will contain around two gigatonnes (Gt) of CO2 equivalents. This is around 5% of current annual global CO2 emissions.

There are also cost implications. Crotty’s recent analysis found that piling and burning forest residue costs around $700 to $1300 per acre. By adding value to the carbon in the forest residues and keeping it out of the atmosphere, biomass storage may offer a solution to these issues, Crotty says.
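
Combining the figures in the last few paragraphs gives a sense of scale. In this sketch the per-acre costs, the 50 + 20 million acres and the roughly 2 GtCO2e locked in the residues are from the text; the global-emissions denominator of about 40 GtCO2 per year is an assumption used to reproduce the 5% figure.

    acres = 50e6 + 20e6              # acres slated for treatment under the 10-year plan
    cost_low, cost_high = 700, 1300  # dollars per acre to pile and burn residues
    print(f"Pile-and-burn cost: ${acres*cost_low/1e9:.0f}bn to ${acres*cost_high/1e9:.0f}bn")

    residue_co2e = 2.0    # GtCO2e contained in the material to be cleared
    global_annual = 40.0  # GtCO2 emitted globally per year (assumed)
    print(f"Residues hold ~{100*residue_co2e/global_annual:.0f}% of one year's global emissions")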

As an incentive to remove carbon from the atmosphere, trading mechanisms exist whereby individuals, companies and governments can buy and sell carbon emissions. In essence, carbon has a price attached to it, meaning that someone who has emitted too much, say, can pay someone else to capture and store the equivalent amount of emissions, with an often-touted figure being $100 per tonne of CO2 stored. For a long time, this has been seen as the price at which carbon capture becomes affordable, enabling scale up to the volumes needed to tackle climate change.

“There is only so much capital that we will ever deploy towards [carbon removal] and thus the cheaper the solution, the more credits we’ll be able to generate, the more carbon we will be able to remove from the atmosphere,” explains Justin Freiberg, a managing director of the Carbon Containment Lab. “$100 is relatively arbitrary, but it is important to have a target and aim low on pricing for high quality credits.”

DAC has not managed to reach this magical price point. Indeed, the Swiss firm Climeworks – which is one of the biggest DAC companies – has stated that its costs might be around $300 per tonne by 2030.
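
The arithmetic behind Freiberg’s point is simple division: a fixed pool of capital removes more carbon the cheaper each credit is. A small sketch, where the $100 and $300 price points come from the text and the $1bn budget is an arbitrary illustration:

    budget = 1e9  # hypothetical capital deployed towards carbon removal, in dollars

    for price_per_tonne in (100, 300):  # burial's target price vs projected DAC cost
        tonnes = budget / price_per_tonne
        print(f"At ${price_per_tonne}/tonne: {tonnes/1e6:.1f} Mt of CO2 removed")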

A tomb in a mine

Another carbon-removal company, however, claims it has hit this benchmark using biomass burial. “We’re selling our first credits at $100 per tonne,” says Hannah Murnen, chief technology officer at Graphyte – a US firm backed by Bill Gates.

Graphyte is confident that there is significant potential in biomass burial. Based in Pine Bluff, Arkansas, the firm dries and compresses waste biomass into blocks before storage. “We dry it to below a level at which life can exist,” says Murnen, which effectively halts decomposition.

The company claims that it will soon be storing 50,000 tonnes of CO2 per year and is aiming for five million tonnes per year by 2030. Murnen acknowledges that these are “really significant figures”, particularly compared with what has been achieved in carbon capture so far. Nevertheless, she adds, if you look at the targets around carbon capture “this is the type of scale we need to get to”.
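
Going from 50,000 tonnes per year to five million tonnes per year is a 100-fold scale-up. A quick sketch of the implied compound annual growth rate, assuming (purely for illustration) that the ramp runs from 2025 to 2030:

    start, target = 50_000, 5_000_000  # tonnes of CO2 per year, from the text
    years = 2030 - 2025                # assumed ramp period

    cagr = (target / start) ** (1 / years) - 1
    print(f"Implied growth: {cagr:.0%} per year ({target/start:.0f}x in {years} years)")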

The need for carbon capture

The Intergovernmental Panel on Climate Change says that carbon capture is essential to limit global warming to 1.5 °C above pre-industrial levels.

To stay within the Paris Agreement’s climate targets, the 2024 State of Carbon Dioxide Removal report estimated that 7–9 gigatonnes (Gt) of CO2 removal will be needed annually by 2050. According to the report – which was put together by multiple institutions, led by the University of Oxford – currently two billion tonnes of CO2 are being removed per year, mostly through “conventional” methods like tree planting and wetland restoration. “Novel” methods – such as direct air capture (DAC), bioenergy with carbon capture, and ocean alkalinity enhancement – contribute 1.3 million tonnes of CO2 removal per year, less than 0.1% of the total.
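
Written out as arithmetic, the gap the report describes looks like this (all figures are from the paragraph above):

    needed_low, needed_high = 7.0, 9.0  # GtCO2 removal needed per year by 2050
    current_total = 2.0                 # GtCO2 removed per year today, mostly conventional
    current_novel = 1.3e-3              # GtCO2 per year from novel methods (1.3 Mt)

    print(f"Novel methods supply {100*current_novel/current_total:.2f}% of current removal")
    print(f"Extra removal needed by 2050: "
          f"{needed_low - current_total:.0f}-{needed_high - current_total:.0f} GtCO2 per year")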

Graphyte is currently working with sawmill residue and rice hulls, but in the future Murnen says it plans to accept all sorts of biomass waste. “One of the great things about biomass for the purpose of carbon removal is that, because we are not doing any sort of chemical transformation on the biomass, we’re very flexible to the type of biomass,” Murnen adds.

And there appears to be plenty available. Estimates by researchers in the UK and India (npj Climate and Atmospheric Science 2 35) suggest that every year around 140 Gt of biomass waste is generated globally from forestry and agriculture. Around two-thirds of the agricultural residues are from cereals, like wheat, rice, barley and oats, while sugarcane stems and leaves are the second largest contributors. The rest is made up of things like leaves, roots, peels and shells from other crops. Like forest residues, much of this waste ends up being burnt or left to rot, releasing its carbon.

Currently, Graphyte has one storage site about 30 km from Pine Bluff, where its compressed biomass blocks are stored underground, enclosed in an impermeable layer that prevents water ingress. “We took what used to be an old gravel mine – so basically a big hole in the ground – and we’ve created a lined storage tomb where we are placing the biomass and then sealing it closed,” says Murnen.

Big hole in the ground: Graphyte is using an old gravel mine 30 km from Pine Bluff in Arkansas to store its compressed biomass bricks. (Courtesy: Graphyte)

Once sealed, Graphyte monitors the CO2 and methane concentrations in the headspace of the vaults, to check for any decomposition of the biomass. The company also analyses biomass as it enters the facility, to track how much carbon it is storing. Wood residues, like sawmill waste, are generally around 50% carbon, says Murnen, but rice hulls are closer to 35% carbon.
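
The accounting Murnen describes, from tonnes of biomass received to tonnes of CO2 kept out of the atmosphere, is a short calculation: mass times carbon fraction times the molar-mass ratio of CO2 to carbon (44/12). A sketch using the carbon fractions quoted above; the 1000-tonne batch size is illustrative:

    CO2_PER_TONNE_CARBON = 44.0 / 12.0  # molar-mass ratio of CO2 to carbon

    def co2_stored(biomass_tonnes, carbon_fraction):
        """Tonnes of CO2 kept out of the atmosphere by burying this biomass."""
        return biomass_tonnes * carbon_fraction * CO2_PER_TONNE_CARBON

    # Carbon fractions from the text; 1000-tonne batches are illustrative
    print(f"Sawmill residue: {co2_stored(1000, 0.50):,.0f} tonnes CO2")  # ~1833
    print(f"Rice hulls:      {co2_stored(1000, 0.35):,.0f} tonnes CO2")  # ~1283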

Graphyte is confident that its storage is physically robust and could avoid any re-emission for what Murnen calls “a very long period of time”. However, it is also exploring how to prevent accidental disturbance of the biomass in the future – possibly long after the company ceases to exist. One option is to add a conservation easement to the site, a well-established US legal mechanism for adding long-term protection to land.

“We feel pretty strongly that the way we are approaching [carbon removal] is one of the most scalable ways,” Murnen says. “In as far as impediments or barriers to scale, we have a much easier permitting pathway, we don’t need pipelines, we are pretty flexible on the type of land that we can use for our storage sites, and we have a huge variety of feedstocks that we can take into the process.”

A simple solution

Back at LLNL, Baker says that although she hasn’t “run the numbers”, and there are a lot of caveats, she suspects that biomass burial is “true carbon removal because it is so simple”.

Once associated upstream and downstream emissions are taken into account, many techniques that people call carbon removal are probably not, she says, because they emit more fossil CO2 than they store.

Biomass burial is also cheap. As the Road to Removal analysis found, “thermal chemical” techniques, like pyrolysis, have great potential for removing and storing carbon while converting biomass into hydrogen and sustainable aviation fuel. But they require huge investment, with larger facilities potentially costing hundreds of millions of dollars. Biomass burial could even act as temporary storage until facilities are ready to convert the carbon into sustainable fuels. “Buy ourselves time and then use it later,” says Baker.

Either way, biomass burial has great potential for the future of carbon storage, and therefore our environment. “The sooner we can start doing these things the greater the climate impact,” Baker says.

We just need to know that the storage is durable – and if that 3775-year-old log is any indication, there’s the potential to store biomass for hundreds, maybe thousands of years.

Plasma physics sets upper limit on the strength of ‘dark electromagnetism’

Physicists have set a new upper bound on the interaction strength of dark matter by simulating the collision of two clouds of interstellar plasma. The result, from researchers at Ruhr University Bochum in Germany, CINECA in Italy and the Instituto Superior Tecnico in Portugal, could force a rethink on theories describing this mysterious substance, which is thought to make up more than 85% of the mass in the universe.

Since dark matter has only ever been observed through its effect on gravity, we know very little about what it’s made of. Indeed, various theories predict that dark matter particles could have masses ranging from around 10⁻²² eV to around 10¹⁹ GeV – a staggering 50 orders of magnitude.
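
That “50 orders of magnitude” figure is easy to verify once both masses are expressed in the same units:

    import math

    m_low_ev = 1e-22        # lightest candidate mass, in eV
    m_high_ev = 1e19 * 1e9  # 10^19 GeV expressed in eV

    print(f"{math.log10(m_high_ev / m_low_ev):.0f} orders of magnitude")  # -> 50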

Another major unknown about dark matter is whether it interacts via forces other than gravity, either with itself or with other particles. Some physicists have hypothesized that dark matter particles might possess positive and negative “dark charges” that interact with each other via “dark electromagnetic forces”. According to this supposition, dark matter could behave like a cold plasma of self-interacting particles.

Bullet Cluster experiment

In the new study, the team searched for evidence of dark interactions in a cluster of galaxies located several billion light years from Earth. This galactic grouping is known as the Bullet Cluster, and it contains a subcluster that is moving away from the main body after passing through it at high speed.

Since the most basic model of dark-matter interactions relies on the same equations as ordinary electromagnetism, the researchers chose to simulate these interactions in the Bullet Cluster system using the same computational tools they would use to describe electromagnetic interactions in a standard plasma. They then compared their results with real observations of the Bullet Cluster.

Interaction strength: Constraints on the dark electromagnetic coupling constant αD based on observations of the Bullet Cluster. αD must lie below the blue, green and red regions; only the white region at the bottom right of the graph is not excluded by the measurements. Dashed lines show the reference value used for the dark matter mass of 1 TeV. (Courtesy: K Schoefler et al., “Can plasma physics establish a significant bound on long-range dark matter interactions?” Phys. Rev. D 111 L071701, https://doi.org/10.1103/PhysRevD.111.L071701)

The new work builds on a previous study in which members of the same team simulated the collision of two clouds of standard plasma passing through one another. This study found that as the clouds merged, electromagnetic instabilities developed. These instabilities had the effect of redistributing energy from the opposing flows of the clouds, slowing them down while also broadening the temperature range within them.
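
The instabilities that develop in such counter-streaming plasmas are close relatives of the textbook two-stream instability, whose fastest-growing mode for symmetric cold beams grows at a rate set by the electron plasma frequency. A minimal sketch under that cold-beam approximation; the density is an illustrative cluster-plasma value, not a parameter from the team’s simulations:

    import math

    # Illustrative parameters (not from the paper's Bullet Cluster runs)
    n_e = 1e4         # electron density, per cubic metre
    e = 1.602e-19     # elementary charge, C
    m_e = 9.109e-31   # electron mass, kg
    eps0 = 8.854e-12  # vacuum permittivity, F/m

    # Electron plasma frequency of the combined system
    omega_pe = math.sqrt(n_e * e**2 / (eps0 * m_e))

    # For cold, symmetric counter-streaming beams (each carrying half the density),
    # the fastest-growing two-stream mode grows at roughly omega_pe / (2*sqrt(2))
    gamma_max = omega_pe / (2 * math.sqrt(2))

    print(f"omega_pe = {omega_pe:.2e} rad/s")
    print(f"max growth rate ~ {gamma_max:.2e} 1/s (e-folding time ~ {1/gamma_max:.2e} s)")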

Ruling out many of the simplest dark matter theories

The latest study showed that, as expected, the plasma components of the subcluster and main body slowed down thanks to ordinary electromagnetic interactions. That, however, appeared to be all that happened, as the data contained no sign of additional dark interactions. While the team’s finding doesn’t rule out dark electromagnetic interactions entirely, team member Kevin Schoeffler explains that it does mean that these interactions, which are characterized by a parameter known as αD, must be far weaker than their ordinary-matter counterpart. “We can thus calculate an upper limit for the strength of this interaction,” he says.

This limit, which the team calculated as αD < 4 × 10⁻²⁵ for a dark matter particle with a mass of 1 TeV, rules out many of the simplest dark matter theories and will require them to be rethought, Schoeffler says. “The calculations were made possible thanks to detailed discussions with scientists working outside of our speciality of physics, namely plasma physicists,” he tells Physics World. “Throughout this work, we had to overcome the challenge of connecting with very different fields and interacting with communities that speak an entirely different language to ours.”
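
For perspective, the bound can be compared with the ordinary fine-structure constant, which is roughly 1/137:

    alpha_em = 1 / 137.036  # ordinary electromagnetic fine-structure constant
    alpha_d_max = 4e-25     # upper bound on the dark coupling at a mass of 1 TeV

    print(f"alpha_D must be at least {alpha_em / alpha_d_max:.1e} times weaker than alpha_EM")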

As for future work, the physicists plan to compare the results of their simulations with other astronomical observations, with the aim of constraining the upper limit of the dark electromagnetic interaction even further. More advanced calculations, such as those that include finer details of the cloud models, would also help refine the limit. “These more realistic setups would include other plasma-like electromagnetic scenarios and ‘slowdown’ mechanisms, leading to potentially stronger limits,” Schoeffler says.

The present study is detailed in Physical Review D.

European centre celebrates 50 years at the forefront of weather forecasting

What is the main role of the European Centre for Medium-Range Weather Forecasts (ECMWF)?

Making weather forecasts more accurate is at the heart of what we do at the ECMWF, working in close collaboration with our member states and their national meteorological services (see box below). That means enhanced forecasting for the weeks and months ahead as well as seasonal and annual predictions. We also have a remit to monitor the atmosphere and the environment – globally and regionally – within the context of a changing climate.

How does the ECMWF produce its weather forecasts?

Our task is to get the best representation, in a 3D sense, of the current state of the atmosphere in terms of key variables like wind, temperature, humidity and cloud cover. We do this via a process of reanalysis and data assimilation: combining the previous short-range weather forecast, and its component data, with the latest atmospheric observations – from satellites, ground stations, radars, weather balloons and aircraft. Unsurprisingly, using all this observational data is a huge challenge, with the exploitation of satellite measurements a significant driver of improved forecasting over the past decade.
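
The blending step described here can be illustrated with the simplest possible example: a scalar Kalman-filter update, which weights the forecast and the observation by their respective uncertainties. This is a toy sketch of the general principle only, not the ECMWF’s operational variational scheme, and all numbers are invented.

    # Toy data assimilation: one temperature value at one grid point
    forecast, forecast_var = 281.0, 1.0**2  # prior from the previous short-range forecast (K)
    obs, obs_var = 282.2, 0.5**2            # new observation, e.g. from a satellite (K)

    # The Kalman gain weights the observation by its relative uncertainty
    gain = forecast_var / (forecast_var + obs_var)
    analysis = forecast + gain * (obs - forecast)
    analysis_var = (1 - gain) * forecast_var

    print(f"analysis = {analysis:.2f} K (gain {gain:.2f}), "
          f"uncertainty {analysis_var**0.5:.2f} K")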

In what ways do satellite measurements help?

Consider the EarthCARE satellite that was launched in May 2024 by the European Space Agency (ESA) and is helping ECMWF to improve its modelling of clouds, aerosols and precipitation. EarthCARE has a unique combination of scientific instruments – a cloud-profiling radar, an atmospheric lidar, a multispectral imager and a broadband radiometer – to infer the properties of clouds and how they interact with solar radiation as well as thermal-infrared radiation emitted by different layers of the atmosphere.

How are you combining such data with modelling?

The ECMWF team is learning how to interpret and exploit the EarthCARE data to directly initialize our models. Put simply, mathematical models that better represent clouds will, in turn, yield more accurate forecasts. Indirectly, EarthCARE is also revealing a clearer picture of the fundamental physics governing cloud formation, distribution and behaviour. This is just one example of numerous developments taking advantage of new satellite data. We are looking forward, in particular, to fully exploiting next-generation satellite programmes from the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) – including the EPS-SG polar-orbiting system and the Meteosat Third Generation geostationary satellite for continuous monitoring over Europe, Africa and the Indian Ocean.

Big data, big opportunities: the ECMWF’s high-performance computing facility in Bologna, Italy, is the engine-room of the organization’s weather and climate modelling efforts. (Courtesy: ECMWF)

What other factors help improve forecast accuracy?

We talk of “a day, a decade” improvement in weather forecasting, such that a five-day forecast now is as good as a three-day forecast 20 years ago. A richer and broader mix of observational data underpins that improvement, with diverse data streams feeding into bigger supercomputers that can run higher-resolution models and better algorithms. Equally important is ECMWF’s team of multidisciplinary scientists, whose understanding of the atmosphere and climate helps to optimize our models and data assimilation methods. A case study in this regard is Destination Earth, an ambitious European Union initiative to create a series of “digital twins” – interactive computer simulations – of our planet by 2030. Working with ESA and EUMETSAT, the ECMWF is building the software and data environment for Destination Earth as well as developing the first two digital twins.

What are these two twins?

Our Digital Twin on Weather-Induced and Geophysical Extremes will assess and predict environmental extremes to support risk assessment and management. Meanwhile, in collaboration with others, the Digital Twin on Climate Change Adaptation complements and extends existing capabilities for the analysis and testing of “what if” scenarios – supporting sustainable development and climate adaptation and mitigation policy-making over multidecadal timescales.

What kind of resolution will these models have?

Both digital twins integrate the sea, atmosphere, land, hydrology and sea ice – and the deep connections between them – at a resolution currently impossible to reach. Right now, for example, the ECMWF’s operational forecasts cover the whole globe on a 9 km grid – effectively a localized forecast every 9 km. With Destination Earth, we’re experimenting with 4 km, 2 km and even 1 km grids.
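
Those grid spacings translate directly into computational cost, since halving the spacing roughly quadruples the number of grid columns covering the globe. A rough sketch, treating the grid as uniform over the Earth’s surface (a simplification):

    import math

    EARTH_SURFACE_KM2 = 4 * math.pi * 6371**2  # ~5.1e8 km^2

    for spacing_km in (9, 4, 2, 1):
        columns = EARTH_SURFACE_KM2 / spacing_km**2
        print(f"{spacing_km} km grid: ~{columns/1e6:,.0f} million columns")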

In February, the ECMWF unveiled a 10-year strategy to accelerate the use of machine learning and AI. How will this be implemented?

The new strategy prioritizes growing exploitation of data-driven methods anchored on established physics-based modelling – rapidly scaling up our previous deployment of machine learning and AI. There are also a variety of hybrid approaches combining data-driven and physics-based modelling.

What will this help you achieve?

On the one hand, data assimilation and observations will help us to directly improve as well as initialize our physics-based forecasting models – for example, by optimizing uncertain parameters or learning correction terms. We are also investigating the potential of applying machine-learning techniques directly on observations – in effect, to make another step beyond the current state-of-the-art and produce forecasts without the need for reanalysis or data assimilation.

How is machine learning deployed at the moment?

Progress in machine learning and AI has been dramatic over the past couple of years – so much so that we launched our Artificial Intelligence Forecasting System (AIFS) back in February. Trained on many years of reanalysis and using traditional data assimilation, AIFS is already an important addition to our suite of forecasts, though still working off the coat-tails of our physics-based predictive models. Another notable innovation is our Probability of Fire machine-learning model, which incorporates multiple data sources beyond weather prediction to identify regional and localized hot-spots at risk of ignition. Those additional parameters – among them human presence, lightning activity as well as vegetation abundance and its dryness – help to pinpoint areas of targeted fire risk, improving the model’s predictive skill by up to 30%.
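
As a purely illustrative sketch of how a model like this might combine weather and non-weather predictors (this is not the ECMWF’s actual model; the features and weights are invented), a logistic-regression-style combination could look like:

    import math

    def fire_probability(features, weights, bias=-4.0):
        """Toy logistic model: probability of ignition from weighted risk factors."""
        z = bias + sum(weights[k] * features[k] for k in weights)
        return 1 / (1 + math.exp(-z))

    # Invented weights and features, each feature scaled to the range 0..1
    weights = {"dryness": 3.0, "vegetation": 2.0, "lightning": 1.5, "human_presence": 1.0}
    cell = {"dryness": 0.9, "vegetation": 0.8, "lightning": 0.2, "human_presence": 0.7}

    print(f"P(fire) = {fire_probability(cell, weights):.2f}")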

What do you like most about working at the ECMWF?

Every day, the ECMWF addresses cutting-edge scientific problems – as challenging as anything you’ll encounter in an academic setting – by applying its expertise in atmospheric physics, mathematical modelling, environmental science, big data and other disciplines. What’s especially motivating, however, is that the ECMWF is a mission-driven endeavour with a straight line from our research outcomes to wider societal and economic benefits.

ECMWF at 50: new frontiers in weather and climate prediction

The European Centre for Medium-Range Weather Forecasts (ECMWF) is an independent intergovernmental organization supported by 35 states – 23 member states and 12 co-operating states. Established in 1975, the centre employs around 500 staff from more than 30 countries at its headquarters in Reading, UK, and sites in Bologna, Italy, and Bonn, Germany. As a research institute and 24/7 operational service, the ECMWF produces global numerical weather predictions four times per day and other data for its member/cooperating states and the broader meteorological community.

The ECMWF processes data from around 90 satellite instruments as part of its daily activities (yielding 60 million quality-controlled observations each day for use in its Integrated Forecasting System). The centre is a key player in Copernicus – the Earth observation component of the EU’s space programme – by contributing information on climate change for the Copernicus Climate Change Service; atmospheric composition to the Copernicus Atmosphere Monitoring Service; as well as flooding and fire danger for the Copernicus Emergency Management Service. This year, the ECMWF is celebrating its 50th anniversary and has a series of celebratory events scheduled in Bologna (15–19 September) and Reading (1–5 December).

Axion quasiparticle appears in a topological antiferromagnet

Physicists have observed axion quasiparticles for the first time in a two-dimensional quantum material. As well as having applications in materials science, the discovery could aid the search for fundamental axions, which are a promising (but so far hypothetical) candidate for the unseen dark matter pervading our universe.

Theorists first proposed axions in the 1970s as a way of solving a puzzle involving the strong nuclear force and charge-parity (CP) symmetry. In systems that obey this symmetry, the laws of physics are the same for a particle and the spatial mirror image of its oppositely charged antiparticle. Weak interactions are known to violate CP symmetry, and the theory of quantum chromodynamics (QCD) allows strong interactions to do so, too. However, no-one has ever seen evidence of this happening, and the so-called “strong CP problem” remains unresolved.

More recently, the axion has attracted attention as a potential constituent of dark matter – the mysterious substance that appears to make up more than 85% of matter in the universe. Axions are an attractive dark matter candidate because while they do have mass, and theory predicts that the Big Bang should have generated them in large numbers, they are much less massive than electrons, and they carry no charge. This combination means that axions interact only very weakly with matter and electromagnetic radiation – exactly the behaviour we expect to see from dark matter.

Despite many searches, though, axions have never been detected directly. Now, however, a team of physicists led by Jianxiang Qiu of Harvard University has proposed a new detection strategy based on quasiparticles that are axions’ condensed-matter analogue. According to Qiu and colleagues, these quasiparticle axions, as they are known, could serve as axion “simulators”, and might offer a route to detecting dark matter in quantum materials.

Topological antiferromagnet

To detect axion quasiparticles, the Harvard team constructed gated electronic devices made from several two-dimensional layers of manganese bismuth telluride (MnBi2Te4). This material is a rare example of a topological antiferromagnet – that is, a material that is insulating in its bulk while conducting electricity on its surface, and that has magnetic moments that point in opposite directions. These properties allow quasiparticles known as magnons (collective oscillations of spin magnetic moments) to appear in and travel through the MnBi2Te4. Two types of magnon mode are possible: one in which the spins oscillate in sync; and another in which they are out of phase.

Qiu and colleagues applied a static magnetic field across the plane of their MnBi2Te4 sheets and bombarded the devices with sub-picosecond light pulses from a laser. This technique, known as ultrafast pump-probe spectroscopy, allowed them to observe the 44 GHz coherent oscillation of the so-called condensed-matter field. This field is the condensed-matter analogue of the CP-violating term in QCD, and it is proportional to a material’s magnetoelectric coupling constant. “This is uniquely enabled by the out-of-phase magnon in this topological material,” explains Qiu. “Such coherent oscillations are the smoking-gun evidence for the axion quasiparticle and it is the combination of topology and magnetism in MnBi2Te4 that gives rise to it.”
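
In pump-probe experiments of this kind, the oscillation frequency is typically extracted by fitting the time-resolved signal to a damped cosine. A minimal sketch of that analysis step on synthetic data; the 44 GHz value is from the text, everything else is illustrative:

    import numpy as np
    from scipy.optimize import curve_fit

    def damped_cosine(t, amp, freq, tau, phase):
        """Coherent oscillation with exponential damping."""
        return amp * np.cos(2 * np.pi * freq * t + phase) * np.exp(-t / tau)

    # Synthetic pump-probe trace: 44 GHz oscillation, 50 ps damping time, added noise
    t = np.linspace(0, 100e-12, 2000)
    signal = damped_cosine(t, 1.0, 44e9, 50e-12, 0.1)
    signal += np.random.default_rng(0).normal(0, 0.05, t.size)

    popt, _ = curve_fit(damped_cosine, t, signal, p0=[1.0, 40e9, 30e-12, 0.0])
    print(f"fitted frequency: {popt[1]/1e9:.1f} GHz")  # ~44 GHz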

A laboratory for axion studies

Now that they have detected axion quasiparticles, Qiu and colleagues say their next step will be to do experiments that involve hybridizing them with particles such as photons. Such experiments would create a new type of “axion-polariton” that would couple to a magnetic field in a unique way – something that could be useful for applications in ultrafast antiferromagnetic spintronics, in which spin-polarized currents can be controlled with an electric field.

The axion quasiparticle could also be used to build an axion dark matter detector. According to the team’s estimates, the detection frequency for the quasiparticle is in the milli-electronvolt (meV) range. While several theories for the axion predict that it could have a mass in this range, most existing laboratory detectors and astrophysical observations search for masses outside this window.
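
The link between that meV mass window and a detection frequency is the Planck relation E = hf:

    H_EV_S = 4.1357e-15  # Planck constant in eV s

    for mass_ev in (1e-3, 2e-3, 5e-3):  # candidate masses in the meV range
        freq_hz = mass_ev / H_EV_S
        print(f"{mass_ev*1e3:.0f} meV -> {freq_hz/1e12:.2f} THz")

So a 1 meV axion corresponds to a frequency of about 0.24 THz, consistent with the point that most existing searches target masses outside this window.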

“The main technical barrier to building such a detector would be to grow high-quality, large crystals of MnBi2Te4 to maximize sensitivity,” Qiu tells Physics World. “In contrast to other high-energy experiments, such a detector would not require expensive accelerators or giant magnets, but it will require extensive materials engineering.”

The research is described in Nature.

Fluid electrodes make soft, stretchable batteries

Researchers at Linköping University in Sweden have developed a new fluid electrode and used it to make a soft, malleable battery that can recharge and discharge over 500 cycles while maintaining its high performance. The device, which continues to function even when stretched to twice its length, might be used in next-generation wearable electronics.

Futuristic wearables such as e-skin patches, e-textiles and even internal e-implants on the organs or nerves will need to conform far more closely to the contours of the human body than today’s devices can. To fulfil this requirement of being soft and stretchable as well as flexible, such devices will need to be made from mechanically pliant components powered by soft, supple batteries. Today’s batteries, however, are mostly rigid. They also tend to be bulky because long-term operations and power-hungry functions such as wireless data transfer, continuous sensing and complex processing demand plenty of stored energy.

To overcome these barriers, researchers led by the Linköping chemist Aiman Rahmanudin decided to rethink the very concept of battery electrode design. Instead of engineering softness and stretchability into a solid electrode, as was the case in most previous efforts, they made the electrode out of a fluid. “Bulky batteries compromise the mechanical compliance of wearable devices, but since fluids can be easily shaped into any configuration, this limitation is removed, opening up new design possibilities for next-generation wearables,” Rahmanudin says.

A “holistic approach”

Designing a stretchable battery requires a holistic approach, he adds, as all the device’s components need to be soft and stretchy. For example, the researchers used a modified version of the wood-based biopolymer lignin as the cathode and the conjugated polymer poly(1-amino-5-chloroanthraquinone) (PACA) as the anode. They made these electrodes fluid by dispersing them separately, together with conductive carbon fillers, in an aqueous electrolyte of 0.1 M HClO4.

To integrate these electrodes into a complete cell, they had to design a stretchable current collector and an ion-selective membrane to prevent the cathodic and anodic fluids from crossing over. They also encapsulated the fluids in a robust, low-permeability elastomer to prevent them from drying up.

Designing energy storage devices from the “inside out”

Previous flexible, high-performance electrode work by the Linköping team focused on engineering the mechanical properties of solid battery electrodes by varying their Young’s modulus. “For example, think of a rubber composite that can be stretched and bent,” explains Rahmanudin. “The thicker the rubber, however, the higher the force required to stretch it, which affects mechanical compliancy.

“Learning from our past experience and work on electrofluids (which are conductive particles dispersed in a liquid medium employed as stretchable conductors), we figured that mixing redox particles with conductive particles and suspending them in an electrolyte could potentially work as battery electrodes. And we found that it did.”

Rahmanudin tells Physics World that fluid-based electrodes could lead to entirely new battery designs, including batteries that could be moulded into textiles, embedded in skin-worn patches or integrated into soft robotics.

After reporting their work in Science Advances, the researchers are now working on increasing the voltage output of their battery, which currently stands at 0.9 V. “We are also looking into using Earth-abundant and sustainable materials like zinc and manganese oxide for future versions of our device and aim at replacing the acidic electrolyte we used with a safer pH neutral and biocompatible equivalent,” Rahmanudin says.
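
One reason the 0.9 V output matters: cell voltages add in series, so reaching the few volts typical of consumer electronics takes several cells. A trivial sketch (the 3.7 V target, typical of lithium-ion, is an illustrative choice):

    import math

    cell_voltage = 0.9  # volts per cell, from the text
    target = 3.7        # volts; a typical lithium-ion operating voltage (illustrative)

    cells_needed = math.ceil(target / cell_voltage)
    print(f"{cells_needed} cells in series -> {cells_needed * cell_voltage:.1f} V")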

Another exciting direction, he adds, will be to exploit the fluid nature of such materials to build batteries with more complex three-dimensional shapes, such as spirals or lattices, that are tailored for specific applications. “Since the electrodes can be poured, moulded or reconfigured, we envisage a lot of creative potential here,” Rahmanudin says.
