
Ray Dolby Centre opens at the University of Cambridge

9 May 2025 at 17:30

A ceremony has been held today to officially open the Ray Dolby Centre at the University of Cambridge. Named after the Cambridge physicist and sound pioneer Ray Dolby, who died in 2013, the facility is the new home of the Cavendish Laboratory and will feature 173 labs as well as lecture halls, workshops, cleanrooms and offices.

Designed by the architecture and interior design practice Jestico + Whiles (who also designed the UK’s £61m National Graphene Institute) and constructed by Bouygues UK, the centre has been funded by £85m from Dolby’s estate as well as £75m from the UK’s Engineering and Physical Sciences Research Council (EPSRC).

Spanning 33 000 m² across five floors, the new centre will house 1100 staff members and students.

The basement will feature microscopy and laser labs containing vibration-sensitive equipment as well as 2500 m² of clean rooms.

The Ray Dolby Centre will also serve as a national hub for physics, hosting the Collaborative R&D Environment – an EPSRC National Facility – that will foster collaboration between industry and university researchers and enhance public access to new research.

Parts of the centre will be open to the public, including a café as well as outreach and exhibition spaces that are organised around six courtyards.

The centre also provides a new home for the Cavendish Museum, which includes the model of DNA created by James Watson and Francis Crick as well as the cathode ray tube that was used to discover the electron.

The ceremony today was attended by Dagmar Dolby, president of the Ray and Dagmar Dolby Family Fund; Deborah Prentice, vice-chancellor of the University of Cambridge; and physicist Mete Atatüre, head of the Cavendish Laboratory.

“The greatest impacts on society – including the Cavendish’s biggest discoveries – have happened because of that combination of technological capability and human ingenuity,” notes Atatüre. “Science is getting more complex and technically demanding with progress, but now we have the facilities we need for our scientists to ask those questions, in the pursuit of discovering creative paths to the answers – that’s what we hope to create with the Ray Dolby Centre.”


Neutron Airy beams make their debut

9 May 2025 at 16:00

Physicists have succeeded in making neutrons travel in a curved parabolic waveform known as an Airy beam. This behaviour, which had previously been observed in photons and electrons but never in a non-elementary particle, could be exploited in fundamental quantum science research and in advanced imaging techniques for materials characterization and development.

In free space, beams of light propagate in straight lines. When they pass through an aperture, they diffract, becoming wider and less intense. Airy beams, however, are different. Named for the 19th-century British scientist George Biddell Airy, who developed the mathematics behind them while studying rainbows, they follow a parabola-shaped path – a property known as self-acceleration – and do not spread out as they travel. Airy beams are also “self-healing”, meaning that they reconstruct themselves after passing through an obstacle that blocked part of the beam.
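
For readers who want the underlying mathematics, the textbook one-dimensional paraxial Airy beam (the Berry–Balazs form, written here with a dimensionless transverse coordinate s = x/x₀ and propagation coordinate ξ = z/(kx₀²) – notation of our choosing, not taken from the Physics World article) is:

```latex
\psi(s,\xi) = \operatorname{Ai}\!\left(s - \tfrac{\xi^{2}}{4}\right)
              \exp\!\left[i\left(\tfrac{s\xi}{2} - \tfrac{\xi^{3}}{12}\right)\right],
\qquad
|\psi(s,\xi)|^{2} = \operatorname{Ai}^{2}\!\left(s - \tfrac{\xi^{2}}{4}\right).
```

The intensity keeps its shape as it propagates, while the main lobe rides the parabola s = ξ²/4 – exactly the self-acceleration described above.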

Scientists have been especially interested in Airy beams since 1979, when theoretical work by the physicist Michael Berry suggested several possible applications for them, says Dmitry Pushin, a physicist at the Institute for Quantum Computing (IQC) and the University of Waterloo, Canada. Researchers created the first Airy beams from light in 2007, followed by an electron Airy beam in 2013.

“Inspired by the unusual properties of these beams in optics and electron experiments, we wondered whether similar effects could be harnessed for neutrons,” Pushin says.

Making such beams out of neutrons turned out to be challenging, however. Because neutrons have no charge, they cannot be shaped by electric fields. Also, lenses that focus neutron beams do not exist.

A holographic approach

A team led by Pushin and Dusan Sarenac of the University at Buffalo’s Department of Physics in the US has now overcome these difficulties using a holographic approach based on a custom-microfabricated silicon diffraction grating. The team made this grating from an array of 6 250 000 micron-sized cubic phase patterns etched onto a silicon slab. “The grating modulates incoming neutrons into an Airy form and the resulting beam follows a curved trajectory, exhibiting the characteristics of a two-dimensional Airy profile at a neutron detector,” Sarenac explains.
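
A quick way to see why a cubic phase produces an Airy profile is that the far-field (Fraunhofer) pattern of a cubic phase mask is an Airy function – its Fourier transform. The sketch below demonstrates this numerically in one dimension; the grid, apodization and scaling are illustrative choices and are not parameters of the actual neutron grating.

```python
import numpy as np
from scipy.special import airy

# Sketch: the far field of a (truncated) cubic phase mask is an Airy profile.
# This mimics, in 1D, what the microfabricated grating does to the neutron
# wavefunction; all numbers here are illustrative, not the experiment's.
N = 2**16
x = np.linspace(-40, 40, N)                  # dimensionless transverse coordinate
dx = x[1] - x[0]

a = 0.005                                    # weak Gaussian apodization -> finite energy
mask = np.exp(-1j * x**3 / 3) * np.exp(-a * x**2)

# Fraunhofer (far-field) pattern ~ continuous Fourier transform of the mask
field = np.fft.fftshift(np.fft.fft(np.fft.ifftshift(mask))) * dx
k = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, d=dx))
intensity = np.abs(field)**2 / np.abs(field).max()**2

# Ideal (un-truncated) result is proportional to Ai(k)
reference = airy(k)[0]**2
reference /= reference.max()

# The main lobes should sit at (nearly) the same place, k ~ -1.02
print("main-lobe position, numerical vs ideal Airy:",
      k[np.argmax(intensity)], k[np.argmax(reference)])
```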

According to Pushin, it took years of work to figure out the correct dimensions for the array. Once the design was optimized, however, fabricating it took just 48 hours at the IQC’s nanofabrication facility. “Developing a precise wave phase modulation method using holography and silicon microfabrication allowed us to overcome the difficulties in manipulating neutrons,” he says.

The researchers say the self-acceleration and self-healing properties of Airy beams could improve existing neutron imaging techniques (including neutron scattering and diffraction), potentially delivering sharper and more detailed images. The new beams might even allow for new types of neutron optics and could be particularly useful, for example, when targeting specific regions of a sample or navigating around structures.

Creating the neutron Airy beams required access to international neutron science facilities such as the US National Institute of Standards and Technology’s Center for Neutron Research; the US Department of Energy’s Oak Ridge National Laboratory; and the Paul Scherrer Institute in Villigen, Switzerland. To continue their studies, the researchers plan to use the UK’s ISIS Neutron and Muon Source to explore ways of combining neutron Airy beams with other structured neutron beams (such as helical waves of neutrons or neutron vortices). This could make it possible to investigate complex properties such as the chirality, or handedness, of materials. Such work could be useful in drug development and materials science. Since a material’s chirality affects how its electrons spin, it could be important for spintronics and quantum computing, too.

“We also aim to further optimize beam shaping for specific applications,” Sarenac tells Physics World. “Ultimately, we hope to establish a toolkit for advanced neutron optics that can be tailored for a wide range of scientific and industrial uses.”

The present work is detailed in Physical Review Letters.


‘Chatty’ artificial intelligence could improve student enthusiasm for physics and maths, finds study

9 May 2025 at 13:38

Chatbots could boost students’ interest in maths and physics and make learning more enjoyable. So say researchers in Germany, who compared the emotional responses of students who used artificial intelligence (AI)-generated texts to learn physics with those of students who read only traditional textbooks. The team, however, found no difference in test performance between the two groups.

The study was led by Julia Lademann, a physics-education researcher at the University of Cologne, who wanted to see whether AI could boost students’ interest in physics. The team did this by creating a customized chatbot using OpenAI’s ChatGPT model, with a tone and language considered accessible to second-year high-school students in Germany.
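
For illustration only, here is how such an age-appropriate tone can be imposed on a general-purpose model through a system prompt, using OpenAI’s current Python SDK. The prompt wording and model name are our own placeholders; the article does not describe the study’s actual configuration.

```python
from openai import OpenAI

# Hypothetical sketch of a "chatty", age-appropriate physics tutor built on a
# general-purpose model via a system prompt. Prompt text and model name are
# illustrative; they are not the configuration used in the study.
client = OpenAI()  # expects an OPENAI_API_KEY environment variable

SYSTEM_PROMPT = (
    "You are a friendly physics tutor for 11- to 12-year-old students. "
    "Explain proportional relationships using short sentences, everyday "
    "examples and an encouraging, conversational tone."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": "Why does a spring stretch twice as far "
                                    "when I hang twice the weight on it?"},
    ],
)
print(response.choices[0].message.content)
```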

After testing the chatbot for factual accuracy and for its use of motivating language, the researchers prompted it to generate explanatory text on proportional relationships in physics and mathematics. They then split 214 students, who had an average age of 11.7, into two groups. One was given textbook material on the topic along with the chatbot text, while the control group received only the textbook material.

The researchers first surveyed the students’ interest in mathematics and physics and then gave them 15 minutes to review the learning material. Their interest was assessed again afterwards along with the students’ emotional state and “cognitive load” – the mental effort required to do the work – through a series of questionnaires.

Higher confidence

The chatbot was found to significantly enhance students’ positive emotions – including pleasure and satisfaction, interest in the learning material and self-belief in their understanding of the subject – compared with those who only used textbook text. “The text of the chatbot is more human-like, more conversational than texts you will find in a textbook,” explains Lademann. “It is more chatty.”

Chatbot text was also found to reduce cognitive load. “The group that used the chatbot explanation experienced higher positive feelings about the subject [and] they also had a higher confidence in their learning comprehension,” adds Lademann.

Tests taken within 30 minutes of the “learning phase” of the experiment, however, found no difference in performance between students who received the AI-generated explanatory text and the control group, despite the former receiving more information. Lademann says this could be due to the short study time of 15 minutes.

The researchers say that while their findings suggest that AI could provide a superior learning experience for students, further research is needed to assess its impact on learning performance and long-term outcomes. “It is also important that this improved interest manifests in improved learning performance,” Lademann adds.

Lademann would now like to see “longer term studies with a lot of participants and with children actually using the chatbot”. Such research would explore a potential key strength of chatbots: their ability to respond in real time to students’ queries and to adapt the level of instruction to each individual student.


Loop quantum cosmology may explain smoothness of cosmic microwave background

First light: The cosmic microwave background, as imaged by the European Space Agency’s Planck mission. (Courtesy: ESA and the Planck Collaboration)

In classical physics, gravity is universally attractive. At the quantum level, however, this may not always be the case. If vast quantities of matter are present within an infinitesimally small volume – at the centre of a black hole, for example, or during the very earliest moments of the universe – spacetime becomes curved at scales that approach the Planck length. This is the fundamental quantum unit of distance, and is around 10²⁰ times smaller than a proton.
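
As a quick back-of-the-envelope check of that factor (using the standard values for the Planck length and the proton charge radius, not numbers quoted in the article):

```latex
\ell_{\mathrm{P}} = \sqrt{\frac{\hbar G}{c^{3}}} \approx 1.6\times10^{-35}\,\mathrm{m},
\qquad
r_{\mathrm{p}} \approx 0.84\times10^{-15}\,\mathrm{m},
\qquad
\frac{r_{\mathrm{p}}}{\ell_{\mathrm{P}}} \approx 5\times10^{19} \sim 10^{20}.
```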

In these extremely curved regions, the classical theory of gravity – Einstein’s general theory of relativity – breaks down. However, research on loop quantum cosmology offers a possible solution. It suggests that gravity, in effect, becomes repulsive. Consequently, loop quantum cosmology predicts that our present universe began in a so-called “cosmic bounce”, rather than the Big Bang singularity predicted by general relativity.

In a recent paper published in EPL, Edward Wilson-Ewing, a mathematical physicist at the University of New Brunswick, Canada, explores the interplay between loop quantum cosmology and a phenomenon sometimes described as “the echo of the Big Bang”: the cosmic microwave background (CMB). This background radiation pervades the entire visible universe, and it stems from the moment the universe became cool enough for neutral atoms to form. At this point, light was suddenly able to travel through space without being continually scattered by the plasma of electrons and light nuclei that existed before. It is this freshly-liberated light that makes up the CMB, so studying it offers clues to what the early universe was like.

Cosmologist: Edward Wilson-Ewing uses loop quantum gravity to study quantum effects in the very early universe. (Courtesy: University of New Brunswick)

What was the motivation for your research?

Observations of the CMB show that the early universe (that is, the universe as it was when the CMB formed) was extremely homogeneous, with relative anisotropies of the order of one part in 10⁴. Classical general relativity has trouble explaining this homogeneity on its own, because a purely attractive version of gravity tends to drive things in the opposite direction. This is because if a region has a higher density than the surrounding area, then according to general relativity, that region will become even denser; there is more mass in that region and therefore particles surrounding it will be attracted to it. Indeed, this is how the small inhomogeneities we do see in the CMB grew over time to form stars and galaxies today.

The main way this gets resolved in classical general relativity is to suggest that the universe experienced an episode of super-rapid growth in its earliest moments. This super-rapid growth is known as inflation, and it can suffice to generate homogeneous regions. However, in general, this requires a very large amount of inflation (much more than is typically considered in most models).

Alternately, if for some reason there happens to be a region that is moderately homogeneous when inflation starts, this region will increase exponentially in size while also becoming further homogenized. This second possibility requires a little more than a minimal amount of inflation, but not much more.

My goal in this work was to explore whether, if gravity becomes repulsive in the deep quantum regime (as is the case in loop quantum cosmology), this will tend to dilute regions of higher density, leading to inhomogeneities being smoothed out. In other words, one of the main objectives of this work was to find out whether quantum gravity could be the source of the high degree of homogeneity observed in the CMB.

What did you do in the paper?

In this paper, I studied spherically symmetric spacetimes coupled to dust (a simple model for matter) in loop quantum cosmology.  These spacetimes are known as Lemaître-Tolman-Bondi spacetimes, and they allow arbitrarily large inhomogeneities in the radial direction. They therefore provide an ideal arena to explore whether homogenization can occur: they are simple enough to be mathematically tractable, while still allowing for large inhomogeneities (which, in general, are very hard to handle).

Loop quantum cosmology predicts several leading-order quantum effects. One of these effects is that spacetime, at the quantum level, is discrete: there are quanta of geometry just as there are quanta of matter.  This has implications for the equations of motion, which relate the geometry of spacetime to the matter in it: if we take into account the discrete nature of quantum geometry, we have to modify the equations of motion.

These modifications are captured by so-called effective equations, and in the paper I solved these equations numerically for a wide range of initial conditions. From this, I found that while homogenization doesn’t occur everywhere, it always occurs in some regions. These homogenized regions can then be blown up to cosmological scales by inflation (and inflation will further homogenize them).  Therefore, this quantum gravity homogenization process could indeed explain the homogeneity observed in the CMB.
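
To give a flavour of what “effective equations” means in practice, the sketch below integrates not the spherically symmetric Lemaître-Tolman-Bondi system studied in the paper, but the much simpler homogeneous loop-quantum-cosmology effective Friedmann equations for dust, in units where 8πG/3 = 1 and the critical density ρ_c = 1. The ρ/ρ_c correction is what makes gravity effectively repulsive near the critical density, turning the classical singularity into a bounce.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative sketch only: the *homogeneous* LQC effective dynamics for dust,
# not the Lemaitre-Tolman-Bondi effective equations solved in the paper.
# Units: 8*pi*G/3 = 1, critical density rho_c = 1.
#   H^2     = rho * (1 - rho/rho_c)        (effective Friedmann constraint)
#   dH/dt   = -(3/2) * rho * (1 - 2*rho/rho_c)
#   drho/dt = -3 * H * rho                 (dust continuity equation)
rho_c = 1.0

def rhs(t, y):
    a, H, rho = y
    return [a * H,                                    # da/dt
            -1.5 * rho * (1.0 - 2.0 * rho / rho_c),   # dH/dt
            -3.0 * H * rho]                           # drho/dt

rho0 = 0.01
H0 = -np.sqrt(rho0 * (1.0 - rho0 / rho_c))            # start in a contracting phase
sol = solve_ivp(rhs, (0.0, 12.0), [1.0, H0, rho0], rtol=1e-10, atol=1e-12)

a, H, rho = sol.y
print("maximum density reached:", rho.max())          # ~ rho_c, never diverges
print("H changes sign (a bounce):", bool(H[0] < 0 < H[-1]))
print("minimum scale factor:", a.min())               # stays finite: no singularity
```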

What do you plan to do next?

It is important to extend this work in several directions to check the robustness of the homogenization effect in loop quantum cosmology.  The restriction to spherical symmetry should be relaxed, although this will be challenging from a mathematical perspective. It will also be important to go beyond dust as a description of matter. The simplicity of dust makes calculations easier, but it is not particularly realistic.

Other relevant forms of matter include radiation and the so-called inflaton field, which is a type of matter that can cause inflation to occur. That said, in cosmology, the physics is to some extent independent of the universe’s matter content, at least at a qualitative level. This is because while different types of matter content may dilute more rapidly than others in an expanding universe, and the universe may expand at different rates depending on its matter content, the main properties of the cosmological dynamics (for example, the expanding universe, the occurrence of an initial singularity and so on) within general relativity are independent of the specific matter being considered.

I therefore think it is reasonable to expect that the quantitative predictions will depend on the matter content, but the qualitative features (in particular, that small regions are homogenized by quantum gravity) will remain the same. Still, further research is needed to test this expectation.



Molecular engineering and battery recycling: developing new technologies in quantum, medicine and energy

8 May 2025 at 15:39

This episode of the Physics World Weekly podcast comes from the Chicago metropolitan area – a scientific powerhouse that is home to two US national labs and some of the country’s leading universities.

Physics World’s Margaret Harris was there recently and met Nadya Mason. She is dean of the Pritzker School of Molecular Engineering at the University of Chicago, which focuses on quantum engineering; materials for sustainability; and immunoengineering. Mason explains how molecular-level science is making breakthroughs in these fields and she talks about her own research on the electronic properties of nanoscale and correlated systems.

Harris also spoke to Jeffrey Spangenberger who leads the Materials Recycling Group at Argonne National Laboratory, which is on the outskirts of Chicago. Spangenberger talks about the challenges of recycling batteries and how we could make it easier to recover materials from batteries of the future. Spangenberger leads the ReCell Center, a national collaboration of industry, academia and national laboratories that is advancing recycling technologies along the entire battery life-cycle.

On 13–14 May, The Economist is hosting Commercialising Quantum Global 2025 in London. The event is supported by the Institute of Physics – which brings you Physics World. Participants will join global leaders from business, science and policy for two days of real-world insights into quantum’s future. In London you will explore breakthroughs in quantum computing, communications and sensing, and discover how these technologies are shaping industries, economies and global regulation. Register now.


European centre celebrates 50 years at the forefront of weather forecasting

8 May 2025 at 15:01

What is the main role of the European Centre for Medium-Range Weather Forecasts (ECMWF)?

Making weather forecasts more accurate is at the heart of what we do at the ECMWF, working in close collaboration with our member states and their national meteorological services (see box). That means enhanced forecasting for the weeks and months ahead as well as seasonal and annual predictions. We also have a remit to monitor the atmosphere and the environment – globally and regionally – within the context of a changing climate.

How does the ECMWF produce its weather forecasts?

Our task is to get the best representation, in a 3D sense, of the current state of the atmosphere in terms of key metrics such as wind, temperature, humidity and cloud cover. We do this via a process of reanalysis and data assimilation: combining the previous short-range weather forecast, and its component data, with the latest atmospheric observations – from satellites, ground stations, radars, weather balloons and aircraft. Unsurprisingly, using all this observational data is a huge challenge, with the exploitation of satellite measurements a significant driver of improved forecasting over the past decade.
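
As a toy illustration of that blending step, here is a single Kalman-type analysis update with invented numbers – a minimal sketch of the principle, not ECMWF’s operational 4D-Var system.

```python
import numpy as np

# Toy data-assimilation "analysis" step: blend a background (previous short-range
# forecast) with new observations, weighted by their error covariances.
# All values below are invented for illustration.
x_b = np.array([280.0, 5.0, 0.7])          # background: temperature (K), wind (m/s), cloud fraction
B = np.diag([1.0, 0.5, 0.05])              # assumed background-error covariance

H = np.array([[1.0, 0.0, 0.0],             # observation operator: we observe temperature and wind only
              [0.0, 1.0, 0.0]])
y = np.array([281.2, 4.1])                 # the observations
R = np.diag([0.5, 0.3])                    # assumed observation-error covariance

# Analysis update: x_a = x_b + K (y - H x_b), with gain K = B H^T (H B H^T + R)^-1
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
x_a = x_b + K @ (y - H @ x_b)

print("background state:", x_b)
print("analysis state:  ", x_a)            # nudged toward the observations
```

The gain K weights each correction by the relative sizes of the background and observation errors – the essence of combining a prior forecast with new measurements.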

In what ways do satellite measurements help?

Consider the EarthCARE satellite that was launched in May 2024 by the European Space Agency (ESA) and is helping ECMWF to improve its modelling of clouds, aerosols and precipitation. EarthCARE has a unique combination of scientific instruments – a cloud-profiling radar, an atmospheric lidar, a multispectral imager and a broadband radiometer – to infer the properties of clouds and how they interact with solar radiation as well as thermal-infrared radiation emitted by different layers of the atmosphere.

How are you combining such data with modelling?

The ECMWF team is learning how to interpret and exploit the EarthCARE data to directly initialize our models. Put simply, that means mathematical models that better represent clouds and, in turn, yield more accurate forecasts. Indirectly, EarthCARE is also revealing a clearer picture of the fundamental physics governing cloud formation, distribution and behaviour. This is just one example of numerous developments taking advantage of new satellite data. We are looking forward, in particular, to fully exploiting next-generation satellite programmes from the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) – including the EPS-SG polar-orbiting system and the Meteosat Third Generation geostationary satellite for continuous monitoring over Europe, Africa and the Indian Ocean.

Big data, big opportunities: the ECMWF’s high-performance computing facility in Bologna, Italy, is the engine-room of the organization’s weather and climate modelling efforts (Courtesy: ECMWF)

What other factors help improve forecast accuracy?

We talk of “a day, a decade” improvement in weather forecasting, such that a five-day forecast now is as good as a three-day forecast 20 years ago. A richer and broader mix of observational data underpins that improvement, with diverse data streams feeding into bigger supercomputers that can run higher-resolution models and better algorithms. Equally important is ECMWF’s team of multidisciplinary scientists, whose understanding of the atmosphere and climate helps to optimize our models and data assimilation methods. A case study in this regard is Destination Earth, an ambitious European Union initiative to create a series of “digital twins” – interactive computer simulations – of our planet by 2030. Working with ESA and EUMETSAT, the ECMWF is building the software and data environment for Destination Earth as well as developing the first two digital twins.

What are these two twins?

Our Digital Twin on Weather-Induced and Geophysical Extremes will assess and predict environmental extremes to support risk assessment and management. Meanwhile, in collaboration with others, the Digital Twin on Climate Change Adaptation complements and extends existing capabilities for the analysis and testing of “what if” scenarios – supporting sustainable development and climate adaptation and mitigation policy-making over multidecadal timescales.


What kind of resolution will these models have?

Both digital twins integrate the sea, atmosphere, land, hydrology and sea ice – and the deep connections between them – at a resolution that is currently impossible to reach. Right now, for example, the ECMWF’s operational forecasts cover the whole globe on a 9 km grid – effectively a localized forecast every 9 km. With Destination Earth, we’re experimenting with 4 km, 2 km and even 1 km grids.
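
As a rough, back-of-the-envelope sense of what that costs (our own estimate, not an ECMWF figure), the number of horizontal grid columns scales as the Earth’s surface area divided by the square of the grid spacing:

```latex
N_{\text{columns}} \approx \frac{A_{\text{Earth}}}{(\Delta x)^{2}}:
\qquad
\frac{5.1\times10^{8}\,\mathrm{km}^{2}}{(9\,\mathrm{km})^{2}} \approx 6\times10^{6},
\qquad
\frac{5.1\times10^{8}\,\mathrm{km}^{2}}{(1\,\mathrm{km})^{2}} \approx 5\times10^{8},
```

roughly an 80-fold increase, before accounting for vertical levels and the shorter time steps that a finer grid demands.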

In February, the ECMWF unveiled a 10-year strategy to accelerate the use of machine learning and AI. How will this be implemented?

The new strategy prioritizes the growing exploitation of data-driven methods anchored in established physics-based modelling – rapidly scaling up our previous deployment of machine learning and AI. We are also pursuing a variety of hybrid approaches that combine data-driven and physics-based modelling.

What will this help you achieve?

On the one hand, data assimilation and observations will help us to directly improve, as well as initialize, our physics-based forecasting models – for example, by optimizing uncertain parameters or learning correction terms. We are also investigating the potential of applying machine-learning techniques directly to observations – in effect, to go a step beyond the current state of the art and produce forecasts without the need for reanalysis or data assimilation.

How is machine learning deployed at the moment?

Progress in machine learning and AI has been dramatic over the past couple of years – so much so that we launched our Artificial Intelligence Forecasting System (AIFS) back in February. Trained on many years of reanalysis and using traditional data assimilation, AIFS is already an important addition to our suite of forecasts, though still working off the coat-tails of our physics-based predictive models. Another notable innovation is our Probability of Fire machine-learning model, which incorporates multiple data sources beyond weather prediction to identify regional and localized hot-spots at risk of ignition. Those additional parameters – among them human presence, lightning activity as well as vegetation abundance and its dryness – help to pinpoint areas of targeted fire risk, improving the model’s predictive skill by up to 30%.
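
As a purely hypothetical sketch of the kind of model described here – a classifier that blends weather-driven predictors with non-weather inputs such as human presence, lightning and vegetation dryness – consider the following; the features, synthetic data and model choice are ours and do not reflect ECMWF’s actual Probability of Fire system.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical ignition-risk classifier combining weather and non-weather
# predictors. Features and synthetic training data are invented for
# illustration; this is not ECMWF's Probability of Fire model.
rng = np.random.default_rng(0)
n = 5000
X = np.column_stack([
    rng.uniform(0, 40, n),       # near-surface temperature (degC)
    rng.uniform(0, 100, n),      # fuel/vegetation dryness index
    rng.uniform(0, 1, n),        # human-presence proxy (population, roads)
    rng.poisson(0.2, n),         # lightning-strike count
    rng.uniform(0, 1, n),        # vegetation abundance
])

# Synthetic labels: risk rises with dryness, lightning and human presence
logit = 0.04 * X[:, 1] + 1.5 * X[:, 3] + 2.0 * X[:, 2] - 5.0
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

model = GradientBoostingClassifier().fit(X, y)
print("predicted ignition probability for the first grid cell:",
      model.predict_proba(X[:1])[0, 1])
```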

What do you like most about working at the ECMWF?

Every day, the ECMWF addresses cutting-edge scientific problems – as challenging as anything you’ll encounter in an academic setting – by applying its expertise in atmospheric physics, mathematical modelling, environmental science, big data and other disciplines. What’s especially motivating, however, is that the ECMWF is a mission-driven endeavour with a straight line from our research outcomes to wider societal and economic benefits.

ECMWF at 50: new frontiers in weather and climate prediction

The European Centre for Medium-Range Weather Forecasts (ECMWF) is an independent intergovernmental organization supported by 35 states – 23 member states and 12 co-operating states. Established in 1975, the centre employs around 500 staff from more than 30 countries at its headquarters in Reading, UK, and sites in Bologna, Italy, and Bonn, Germany. As a research institute and 24/7 operational service, the ECMWF produces global numerical weather predictions four times per day and other data for its member/cooperating states and the broader meteorological community.

The ECMWF processes data from around 90 satellite instruments as part of its daily activities (yielding 60 million quality-controlled observations each day for use in its Integrated Forecasting System). The centre is a key player in Copernicus – the Earth observation component of the European Union’s space programme – contributing information on climate change to the Copernicus Climate Change Service, atmospheric composition to the Copernicus Atmosphere Monitoring Service, and flood and fire danger to the Copernicus Emergency Management Service. This year the ECMWF is celebrating its 50th anniversary, with a series of celebratory events scheduled in Bologna (15–19 September) and Reading (1–5 December).

