
Studying the role of the quantum environment in attosecond science

3 December 2025, 11:00

Attosecond science is undoubtedly one of the fastest growing branches of physics today.

Its popularity was demonstrated by the award of the 2023 Nobel Prize in Physics to Anne L’Huillier, Paul Corkum and Ferenc Krausz for experimental methods that generate attosecond pulses of light for the study of electron dynamics in matter.

One of the most important processes in this field is dephasing. This happens when an electron loses its phase coherence because of interactions with its surroundings.

This loss of coherence can obscure the fine details of electron dynamics, making it harder to capture precise snapshots of these rapid processes.

The most common way to model this process in light-matter interactions is by using the relaxation time approximation. This approach greatly simplifies the picture as it avoids the need to model every single particle in the system.
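The relaxation time approximation can be sketched in a couple of lines: every coherence (off-diagonal density-matrix element) is assumed to decay exponentially with a single phenomenological constant, usually written T2. This is an illustrative sketch of that assumption, not the Ottawa team's model; the function name is our own.

```python
import numpy as np

# Minimal sketch (not the paper's model): in the relaxation-time
# approximation, every coherence decays with one phenomenological
# constant T2, regardless of the microscopic environment.
def coherence(t, rho01_0=0.5, T2=1.0):
    """Off-diagonal density-matrix element rho_01 after time t."""
    return rho01_0 * np.exp(-t / T2)

print(coherence(np.linspace(0.0, 5.0, 6)))  # monotonic decay toward zero
```

The simplification (and the danger) is that a single number T2 stands in for the entire many-body environment.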

Its use is fine for dilute gases, but it doesn’t work as well with intense lasers and denser materials, such as solids, because it greatly overestimates ionisation.

This is a significant problem as ionisation is the first step in many processes such as electron acceleration and high-harmonic generation.

To address this, a team led by researchers from the University of Ottawa has developed a new method of correcting for this overestimate.

By introducing a heat bath into the model they were able to represent the many-body environment that interacts with electrons, without significantly increasing the complexity.

This new approach should enable the identification of new effects in attosecond science or wherever strong electromagnetic fields interact with matter.

Read the full article

Strong field physics in open quantum systems – IOPscience

N. Boroumand et al., 2025 Rep. Prog. Phys. 88 070501

The post Studying the role of the quantum environment in attosecond science appeared first on Physics World.

Characterising quantum many-body states

3 December 2025, 10:59

Describing the non-classical properties (such as entanglement or coherence) of a complex many-body system is an important part of quantum technologies.

An ideal tool for this task would scale to large systems and be both easily computable and easily measurable. Unfortunately, no single tool yet meets all of these criteria in every situation.

With this goal in mind a team of researchers from Spain and Poland began work on a special type of quantum state used in quantum computing – graph states.

These states can be visualised as graphs or networks where each vertex represents a qubit, and each edge represents an interaction between pairs of qubits.

The team studied four different shapes of graph states using new mathematical tools they developed. They found that one of these in particular, the Turán graph, could be very useful in quantum metrology.
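The Turán graph T(n, r) singled out here is the complete multipartite graph on n vertices split into r parts of near-equal size. As a concrete sketch (the helper name is ours, not from the paper), its edge list can be built in a few lines, with each vertex standing for a qubit and each edge for an entangling interaction:

```python
# Hypothetical helper (not from the paper): build the edge list of the
# Turán graph T(n, r) -- the complete multipartite graph on n vertices
# split into r parts of near-equal size. In a graph state, each vertex
# is a qubit and each edge an interaction between a pair of qubits.
def turan_graph(n, r):
    # Distribute n vertices into r parts as evenly as possible.
    parts, start = [], 0
    for i in range(r):
        size = n // r + (1 if i < n % r else 0)
        parts.append(range(start, start + size))
        start += size
    part_of = {v: i for i, p in enumerate(parts) for v in p}
    # Connect every pair of vertices lying in different parts.
    return [(u, v) for u in range(n) for v in range(u + 1, n)
            if part_of[u] != part_of[v]]

edges = turan_graph(6, 3)   # T(6, 3) is the complete tripartite K_{2,2,2}
print(len(edges))           # 12 edges
```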

Their method is (relatively) straightforward and does not require many assumptions. This means that it could be applied to any shape of graph beyond the four studied here.

The results will be useful in various quantum technologies wherever precise knowledge of many-body quantum correlations is necessary.

Read the full article

Many-body quantum resources of graph states – IOPscience

M. Płodzień et al., 2025 Rep. Prog. Phys. 88 077601



The link between protein evolution and statistical physics

26 November 2025, 09:37

Proteins are made up of a sequence of building blocks called amino acids. Understanding these sequences is crucial for studying how proteins work, how they interact with other molecules, and how changes (mutations) can lead to diseases.

These mutations happen over vastly different time periods and are not completely random but strongly correlated, both in space (distinct sites along the sequences) and in time (subsequent mutations of the same site).

It turns out that these correlations are very reminiscent of disordered physical systems, notably glasses, emulsions, and foams.

A team of researchers from Italy and France has now used this similarity to build a new statistical model to simulate protein evolution. They went on to study the role of the different factors driving these mutations.

They found that the initial (ancestral) protein sequence has a significant influence on the evolution process, especially in the short term. This means that information from the ancestral sequence can be traced back over a certain period and is not completely lost.

The strength of interactions between different amino acids within the protein affects how long this information persists.

Although ultimately the team did find differences between the evolution of physical systems and that of protein sequences, this kind of insight would not have been possible without using the language of statistical physics, i.e. space-time correlations.

The researchers expect that their results will soon be tested in the lab thanks to upcoming advances in experimental techniques.

Read the full article

Fluctuations and the limit of predictability in protein evolution – IOPscience

S. Rossi et al., 2025 Rep. Prog. Phys. 88 078102


Quantum cryptography in practice

19 November 2025, 09:02

Quantum Conference Key Agreement (QCKA) is a cryptographic method that allows multiple parties to establish a shared secret key using quantum technology. This key can then be used for secure communication among the parties.

Unlike traditional methods that rely on classical cryptographic techniques, QCKA leverages the principles of quantum mechanics, particularly multipartite entanglement, to ensure security.

A key aspect of QCKA is creating and distributing entangled quantum states among the parties. These entangled states have unique properties that make it impossible for an eavesdropper to intercept the key without being detected.

Researchers measure the efficiency and performance of the key agreement protocol using a metric known as the key rate.

One problem with state-of-the-art QCKA schemes is that this key rate decreases exponentially with the number of users.

Previous solutions to this problem, based on single-photon interference, have come at the cost of requiring global phase locking, which makes them impractical to implement experimentally.

However, the authors of this new study have been able to circumvent this requirement by adopting an asynchronous pairing strategy. Put simply, this means that measurements taken by different parties in different places do not need to happen at exactly the same time.

Their solution effectively removes the need for global phase locking while still maintaining the favourable scaling of the key rate as in other protocols based on single-photon interference.

The new scheme represents an important step towards realising QCKA at long distances by allowing for much more practical experimental configurations.

Quantum conference key agreement
Schematic representation of quantum group network via circular asynchronous interference (Courtesy: Hua-Lei Yin)

Read the full article

Repeater-like asynchronous measurement-device-independent quantum conference key agreement – IOPscience

Yu-Shuo Lu et al., 2025 Rep. Prog. Phys. 88 067901


Using AI to find new particles at the LHC

12 November 2025, 09:03

The Standard Model of particle physics is a very well-tested theory that describes the fundamental particles and their interactions. However, it does have several key limitations. For example, it doesn’t account for dark matter or why neutrinos have masses.

One of the main aims of experimental particle physics at the moment is therefore to search for signs of new physical phenomena beyond the Standard Model.

Finding something new like this would point us towards a better theoretical model of particle physics: one that can explain things that the Standard Model isn’t able to.

These searches often involve looking for rare or unexpected signals in high-energy particle collisions such as those at CERN’s Large Hadron Collider (LHC).

In a new paper published by the CMS collaboration, a new analysis method was used to search for new particles produced in proton-proton collisions at the LHC.

These particles would decay into two jets, but with unusual internal structure not typical of known particles like quarks or gluons.

The researchers used advanced machine learning techniques to identify jets with different substructures, applying various anomaly detection methods to maximise sensitivity to unknown signals.

Unlike traditional strategies, anomaly detection methods allow the AI models to identify anomalous patterns in the data without being provided specific simulated examples, giving them increased sensitivity to a wider range of potential new particles.
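The core idea of unsupervised anomaly detection can be illustrated with a deliberately simple toy, far simpler than the machine-learning models CMS actually used: model the background from the data alone, then score how atypical each event is, with no simulated signal examples required. Here we use a Mahalanobis-distance score on two invented "jet features"; all names and numbers are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy only -- far simpler than the ML used by CMS.
# Unsupervised anomaly detection: learn the background distribution
# from data alone, then flag events that sit far outside it.
background = rng.normal(0.0, 1.0, size=(5000, 2))   # toy jet features
mu = background.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(background, rowvar=False))

def anomaly_score(x):
    """Squared Mahalanobis distance from the learned background."""
    d = x - mu
    return float(d @ cov_inv @ d)

typical = np.zeros(2)            # looks like background
weird = np.array([6.0, -6.0])    # unlike anything in the background
print(anomaly_score(typical) < anomaly_score(weird))  # True
```

No signal model appears anywhere above, which is exactly why such methods stay sensitive to a wide range of potential new particles.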

The researchers didn't find any significant deviations from the expected background. Although no new particles were observed, the results enabled the team to put several new theoretical models to the test for the first time, and to set upper bounds on the production rates of several hypothetical particles.

Most importantly, the study demonstrates that machine learning can significantly enhance the sensitivity of searches for new physics, offering a powerful tool for future discoveries at the LHC.


Probing the fundamental nature of the Higgs Boson

29 October 2025, 09:38

First proposed in 1964, the Higgs boson plays a key role in explaining why many elementary particles of the Standard Model have a rest mass. The particle was finally observed in 2012 by the ATLAS and CMS collaborations at the Large Hadron Collider (LHC), confirming the decades-old prediction.

This discovery made headline news at the time, but researchers certainly haven't stopped working on the Higgs. Since then, the two collaborations have performed a series of measurements to establish the fundamental nature of the new particle, of the Higgs field and of the quantum vacuum.

One key measurement comes from studying a process known as off-shell Higgs boson production. This is the creation of Higgs bosons with a mass significantly higher than their typical on-shell mass of 125 GeV.  This phenomenon occurs due to quantum mechanics, which allows particles to temporarily fluctuate in mass.

This kind of production is harder to detect but can reveal deeper insights into the Higgs boson's properties, especially its total width, which relates to how long it exists before decaying. This, in turn, allows us to test key predictions of the Standard Model of particle physics.

Previous observations of this process had been severely limited in their sensitivity. To improve on this, the ATLAS collaboration had to introduce a completely new way of interpreting their data.

They were able to provide evidence for off-shell Higgs boson production with a significance of 2.5σ (corresponding to a 99.38% confidence level), using events with four electrons or muons, compared with a significance of 0.8σ using traditional methods in the same channel.
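The conversion between a significance in sigma and the quoted confidence level is just the standard normal cumulative distribution function, which the Python standard library can evaluate via the error function:

```python
from math import erf, sqrt

# Convert a one-sided Gaussian significance (in sigma) to a confidence
# level using the standard normal CDF: Phi(z) = (1 + erf(z/sqrt(2))) / 2.
def significance_to_confidence(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

print(f"{significance_to_confidence(2.5):.2%}")  # ~99.38%, as quoted above
print(f"{significance_to_confidence(0.8):.2%}")
```

For reference, the "discovery" threshold of 5σ used in particle physics corresponds to a far more stringent confidence level still.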

The results mark an important step forward in understanding the Higgs boson as well as other high-energy particle physics phenomena.


A recipe for quantum chaos

22 October 2025, 11:44

The control of large, strongly coupled, multi-component quantum systems with complex dynamics is a challenging task.

It is, however, an essential prerequisite for the design of quantum computing platforms and for the benchmarking of quantum simulators.

A key concept here is that of quantum ergodicity. This is because quantum ergodic dynamics can be harnessed to generate highly entangled quantum states.

In classical statistical mechanics, an ergodic system evolving over time will explore all possible microstates uniformly. Mathematically, this means that a sufficiently large collection of random samples from an ergodic process can represent the average statistical properties of the entire process.

Quantum ergodicity is simply the extension of this concept to the quantum realm.

Closely related to this is the idea of chaos. A chaotic system is one that is very sensitive to its initial conditions: small changes are amplified over time, causing large differences in the future.

The ideas of chaos and ergodicity are intrinsically linked as chaotic dynamics often enable ergodicity.
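Both ideas can be seen in a classic classical example (the paper itself works with the quantum Bose–Hubbard model, not this map): the fully chaotic logistic map. Two nearby trajectories separate rapidly, yet the long-time average of a single trajectory still converges to the ensemble value of 1/2.

```python
# Classical illustration (the paper studies the quantum Bose-Hubbard
# model): the logistic map x -> 4x(1-x) is both chaotic and ergodic.
def step(x):
    return 4.0 * x * (1.0 - x)

# Chaos: two trajectories starting 1e-8 apart separate rapidly.
x, y = 0.2, 0.2 + 1e-8
max_sep = 0.0
for _ in range(50):
    x, y = step(x), step(y)
    max_sep = max(max_sep, abs(x - y))
print(max_sep)    # order 0.1-1: the tiny offset has been hugely amplified

# Ergodicity: the time average along one trajectory matches the
# ensemble average (1/2) over the map's invariant distribution.
x, total, n = 0.2, 0.0, 100_000
for _ in range(n):
    x = step(x)
    total += x
print(total / n)  # close to 0.5
```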

Until now, it has been very challenging to predict which experimentally preparable initial states will trigger quantum chaos and ergodic dynamics over a reasonable time scale.

In a new paper published in Reports on Progress in Physics, a team of researchers have proposed an ingenious solution to this problem using the Bose–Hubbard Hamiltonian.

They took as an example ultracold atoms in an optical lattice (a typical choice for experiments in this field) to benchmark their method.

The results show that there are certain tangible threshold values which must be crossed in order to ensure the onset of quantum chaos.

These results will be invaluable for experimentalists working across a wide range of quantum sciences.


Neural simulation-based inference techniques at the LHC

22 October 2025, 11:44

Precision measurements of theoretical parameters are a core element of the scientific program of experiments at the Large Hadron Collider (LHC) as well as other particle colliders. 

These are often performed using statistical techniques such as the method of maximum likelihood. However, given the size of datasets generated, reduction techniques, such as grouping data into bins, are often necessary. 

These can lead to a loss of sensitivity, particularly in non-linear cases like off-shell Higgs boson production and effective field theory measurements.  The non-linearity in these cases comes from quantum interference and traditional methods are unable to optimally distinguish the signal from background.

In this paper, the ATLAS collaboration pioneered the use of a neural network based technique called neural simulation-based inference (NSBI) to combat these issues. 

A neural network is a machine learning model originally inspired by how the human brain works. It’s made up of layers of interconnected units called neurons, which process information and learn patterns from data. Each neuron receives input, performs a simple calculation, and passes the result to other neurons. 
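That description maps directly onto a few lines of code. Below is a schematic two-layer network forward pass, vastly smaller than anything ATLAS uses; the weights are random placeholders, purely to show the layered weighted-sum-plus-nonlinearity structure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Schematic only -- vastly smaller than the networks used by ATLAS.
# A two-layer feed-forward network: each layer multiplies its input by
# a weight matrix, adds a bias, and applies a nonlinearity.
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)   # 3 inputs -> 4 hidden units
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)   # 4 hidden -> 1 output

def forward(x):
    h = np.tanh(W1 @ x + b1)                    # neurons: weighted sum + nonlinearity
    return 1 / (1 + np.exp(-(W2 @ h + b2)))     # sigmoid output in (0, 1)

score = forward(np.array([0.5, -1.0, 2.0]))
print(score)   # a single number between 0 and 1
```

Training consists of adjusting the weight matrices so that outputs like `score` match the patterns in the data.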

NSBI uses these neural networks to analyse each particle collision event individually, preserving more information and improving accuracy.

The framework developed here can handle many sources of uncertainty and includes tools to measure how confident scientists can be in their results.

The researchers benchmarked their method by using it to calculate the Higgs boson signal strength, comparing it with previous methods with impressive results.

The greatly improved sensitivity gained from using this method will be invaluable in the search for physics beyond the Standard Model in future experiments at ATLAS and beyond.

Read the full article

An implementation of neural simulation-based inference for parameter estimation in ATLAS – IOPscience

The ATLAS Collaboration, 2025 Rep. Prog. Phys. 88 067801


Further evidence for evolving dark energy?

15 October 2025, 11:34

Dark energy, a term first used in 1998, is a proposed form of energy that affects the universe on the largest scales. Its primary effect is to drive the accelerating expansion of the universe – an observation that was awarded the 2011 Nobel Prize in Physics.

Dark energy is now a well established concept and forms a key part of the standard model of Big Bang cosmology, the Lambda-CDM model.

The trouble is, we’ve never really been able to explain exactly what dark energy is, or why it has the value that it does.

Even worse, new data acquired by cutting-edge telescopes have suggested that dark energy might not even exist as we had imagined it.

This is where the new work by Mukherjee and Sen comes in. They combined two of these datasets, while making as few assumptions as possible, to understand what’s going on.

The first of these datasets came from baryon acoustic oscillations. These are patterns in the distribution of matter in the universe, created by sound waves in the early universe.

The second dataset comes from a survey of supernova observations taken over the last five years. Both sets of data can be used to track the expansion history of the universe by measuring distances at different snapshots in time.

The team’s results are in tension with the Lambda-CDM model at low redshifts. Put simply, the results disagree with the current model at recent times. This provides further evidence for the idea that dark energy, previously considered to have a constant value, is evolving over time.

Evolving dark energy
The tension in the expansion rate is most evident at low redshifts (Courtesy: P. Mukherjee)

This is far from the end of the story for dark energy. New observational data and new analyses such as this one are urgently needed to provide a clearer picture.

However, where there’s uncertainty, there’s opportunity. Understanding dark energy could hold the key to understanding quantum gravity, the Big Bang and the ultimate fate of the universe.



Searching for dark matter particles

15 October 2025, 11:34

Dark matter is a hypothesised form of matter that does not emit, absorb or reflect light, making it invisible to electromagnetic observations. Although it has never been detected directly, its existence is inferred from its gravitational effects on visible matter and on the large-scale structure of the universe.

The Standard Model of particle physics does not contain any dark matter particles, but several extensions of the model have been proposed that could include them. Some of these involve very low-mass particles such as the axion or the sterile neutrino.

Detecting these hypothesised particles is very challenging, however, due to the extreme sensitivity required.

Electromagnetic resonant systems, such as cavities and LC circuits, are widely used for this purpose, as well as to detect high-frequency gravitational waves.

When an external signal matches one of these systems’ resonant frequencies, the system responds with a large amplitude, making the signal possible to detect. However, there is always a trade-off between the sensitivity of the detector and the range of frequencies it is able to detect (its bandwidth).
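The sensitivity-bandwidth trade-off is visible in the textbook single-mode case (a toy, not the multi-mode scheme of the paper): the driven damped oscillator has a Lorentzian power response whose on-resonance gain grows with the quality factor Q while its bandwidth shrinks as 1/Q.

```python
import numpy as np

# Toy single-mode resonator (not the paper's multi-mode scheme):
# the driven damped oscillator has the Lorentzian power response
#   |chi(w)|^2 = 1 / ((w0^2 - w^2)^2 + (w0*w/Q)^2),
# so on-resonance gain grows with Q while the bandwidth shrinks.
def power_response(w, w0=1.0, Q=100.0):
    return 1.0 / ((w0**2 - w**2)**2 + (w0 * w / Q)**2)

def fwhm(w0=1.0, Q=100.0):
    """Full width at half maximum, found numerically on a fine grid."""
    w = np.linspace(0.5 * w0, 1.5 * w0, 200_001)
    p = power_response(w, w0, Q)
    above = w[p >= 0.5 * p.max()]          # points above half maximum
    return above[-1] - above[0]

for Q in (50.0, 500.0):
    print(Q, power_response(1.0, Q=Q), fwhm(Q=Q))
# Raising Q boosts the peak response but narrows the bandwidth
# (FWHM ~ w0/Q): the trade-off described above.
```

Multi-mode resonators are one way to escape this single-mode compromise.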

A natural way to overcome this compromise is to consider multi-mode resonators, which can be viewed as coupled networks of harmonic oscillators. Their scan efficiency can be significantly enhanced beyond the standard quantum limit of simple single-mode resonators.

In a recent paper, researchers demonstrated how multi-mode resonators can achieve both sensitive and broadband detection. By connecting adjacent modes inside the resonant cavity and tuning these interactions to comparable magnitudes, off-resonant (i.e. unwanted) frequency shifts are effectively cancelled, increasing the overall response of the system.

Their method allows us to search for these elusive dark matter particles in a faster, more efficient way.

Dark matter detection circuit
A multi-mode detector design, where the first mode couples to dark matter and the last mode is read out (Courtesy: Y. Chen)


A record-breaking anisotropic van der Waals crystal?

9 October 2025, 10:13

In general, when you measure material properties such as optical permittivity, your measurement doesn’t depend on the direction in which you make it.

However, recent research has shown that this is not the case for all materials. In some cases, their optical permittivity is directional. This is commonly known as in-plane optical anisotropy. A larger difference between optical permittivity in different directions means a larger anisotropy.

Materials with very large anisotropies have applications in a wide range of fields, from photonics and electronics to medical imaging. However, for most materials available today the value remains relatively low.

These potential applications combined with the current limitation has driven a large amount of research into novel anisotropic materials.

In this latest work, a team of researchers studied the quasi-one-dimensional van der Waals crystal Ta2NiSe5.

Van der Waals (vdW) crystals are made up of chains, ribbons, or layers of atoms that stick together through weak van der Waals forces.

In quasi-one-dimensional vdW crystals, the atoms are strongly connected along one direction, while the connections in the other directions are much weaker, making their properties very direction-dependent.

This structure makes quasi-one-dimensional vdW crystals a good place to search for large optical anisotropy values. The researchers studied the new crystal using a range of measurement techniques, such as ellipsometry and spectroscopy, as well as state-of-the-art first-principles computer simulations.

The results show that Ta2NiSe5 has a record-breaking in-plane optical anisotropy across the visible to infrared spectral region, representing the highest value reported among van der Waals materials to date.

The study therefore has large implications for next-generation devices in photonics and beyond.

Read the full article

Giant in-plane anisotropy in novel quasi-one-dimensional van der Waals crystal – IOPscience

Zhou et al., 2025 Rep. Prog. Phys. 88 050502


Floquet engineering made easy

24 September 2025, 11:30

Understanding periodically driven quantum systems is currently a major line of research.

These Floquet systems provide versatile platforms to investigate new physical phenomena such as time crystals, and can also be used to create fault-tolerant states for quantum computing.

What’s important here is the ability to precisely control the behaviour of the quantum system by designing its effective Hamiltonian – the mathematical object that governs how the system evolves over time.

When researchers want a system to behave in a very specific way, they engineer the Hamiltonian to match a desired target. This is called Floquet engineering.

Unfortunately, it’s not possible to create a simple (analytical) Floquet Hamiltonian for any given system, and mathematical tools such as the Magnus expansion are usually required to get a Hamiltonian that is sufficiently precise.
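The lowest-order Magnus term is just the time-averaged Hamiltonian, and the error it leaves behind is easy to see in a toy example (ours, not the authors' correction scheme): for a piecewise-constant two-level drive, compare the exact one-period evolution with evolution under the average Hamiltonian.

```python
import numpy as np

# Toy illustration (not the authors' scheme): compare the exact
# one-period evolution of a piecewise-constant drive with the
# first-order Magnus (time-average) Hamiltonian. The mismatch is the
# kind of error that Floquet-engineering corrections must suppress.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def expm_hermitian(H, t):
    """exp(-i H t) for Hermitian H, via eigendecomposition."""
    vals, vecs = np.linalg.eigh(H)
    return vecs @ np.diag(np.exp(-1j * vals * t)) @ vecs.conj().T

T = 0.1                                        # drive period
# H(t) = sx for the first half-period, sz for the second half.
U_exact = expm_hermitian(sz, T / 2) @ expm_hermitian(sx, T / 2)
U_magnus = expm_hermitian((sx + sz) / 2, T)    # first-order (average) term

err = np.linalg.norm(U_exact - U_magnus)
print(err)   # small but nonzero: higher-order Magnus terms are missing
```

The leftover error scales with the commutator of the two Hamiltonian pieces, which is why non-commuting drives need higher-order terms or engineered corrections.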

However, when you engineer a Hamiltonian using approximations, you get errors – not great for most applications and especially quantum computing.

Mitigating these errors is possible to some degree, although until now it has been a one-system-at-a-time approach. What is really needed is a systematic way of mitigating these errors for any given system.

This is the problem that the latest paper by researchers Xu and Guo tries to address.

They used symmetries (like rotational or mirror symmetry) to simplify the design of these correction terms. This makes the calculations more manageable and the system more predictable.

They also provided a numerical method to calculate these corrections efficiently, which is important for practical implementation.

They validated their method by creating Hamiltonians that are directly relevant for quantum computers.

The authors expect to further refine their method in the future, but this represents a big step forward towards practically engineering arbitrary Floquet Hamiltonians.

Read the full article

Perturbative framework for engineering arbitrary Floquet Hamiltonian – IOPscience

Xu and Guo, 2025 Rep. Prog. Phys. 88 037602

