
AI boom has caused same CO2 emissions in 2025 as New York City, report claims

Study author says tech companies are reaping benefits of artificial intelligence age but society is left to pay cost

The AI boom has caused as much carbon dioxide to be released into the atmosphere in 2025 as emitted by the whole of New York City, it has been claimed.

The global environmental impact of the rapidly spreading technology has been estimated in research published on Wednesday, which also found that AI-related water use now exceeds the entirety of global bottled-water demand.

Continue reading...

© Photograph: Federico Torres/NurPhoto/Shutterstock



Third of UK citizens have used AI for emotional support, research reveals

AI Security Institute report finds most common type of AI tech used was general purpose assistants such as ChatGPT and Amazon Alexa

A third of UK citizens have used artificial intelligence for emotional support, companionship or social interaction, according to the government’s AI security body.

The AI Security Institute (AISI) said nearly one in 10 people used systems like chatbots for emotional purposes on a weekly basis, and 4% daily.

Continue reading...

© Photograph: Jaque Silva/NurPhoto via Getty Images



Qubit ‘recycling’ gives neutral-atom quantum computing a boost

Errors are the bugbear of quantum computing, and they’re hard to avoid. While quantum computers derive their computational clout from the fact that their qubits can simultaneously court multiple values, the fragility of qubit states ramps up their error rates. Many research groups are therefore seeking to reduce or manage errors so they can increase the number of qubits without reducing the whole enterprise to gibberish.

A team at the US-based firm Atom Computing is now reporting substantial success in this area thanks to a multi-part strategy for keeping large numbers of qubits operational in quantum processors based on neutral atoms. “These capabilities allow for the execution of more complex, longer circuits that are not possible without them,” says Matt Norcia, one of the Atom Computing researchers behind this work.

While neutral atoms offer several advantages over other qubit types, they traditionally have significant drawbacks for one of the most common approaches to error correction. In this approach, some of the entangled qubits are set aside as so-called “ancillaries”, used for mid-circuit measurements that can indicate how a computation is going and what error correction interventions may be necessary.

In neutral-atom quantum computing, however, such interventions are generally destructive. Atoms that are not in their designated state are simply binned off – a profligate approach that makes it challenging to scale up atom-based computers. The tendency to discard atoms is particularly awkward because the traps that confine atoms are already prone to losing atoms, which introduces additional errors while reducing the number of atoms available for computations.

Reduce, re-use, replenish

As well as demonstrating protocols for performing measurements to detect errors in quantum circuits with little atom loss, the researchers at Atom Computing also showed they could re-use ancillary atoms – a double-pronged way of retaining more atoms for calculations. In addition, they demonstrated that they could replenish the register of atoms for the computation from a spatially separated stash in a magneto-optic trap without compromising the quantum state of the atoms already in the register.

Norcia says that these achievements — replacing atoms from a continuous source, while reducing the number of atoms needing replacement to begin with — are key to running computations without running out of atoms.  “To our knowledge, any useful quantum computations will require the execution of many layers of gates, which will not be possible unless the atom number can be maintained at a steady-state level throughout the computation,” he tells Physics World.

Cool and spaced out

Norcia and his collaborators at Microsoft Quantum, the Colorado School of Mines and Stanford University worked with ytterbium (Yb) atoms, which he describes as “natural qubits” since they have two ground states. A further advantage is that the transitions between these qubit states and other states used for imaging and cooling are weak, meaning the researchers could couple just one qubit state to these other states at a time. The team also leveraged a previously-developed approach for mid-circuit measurement that scatters light from only one qubit state and does not disturb the other, making it less destructive.

Still, Norcia tells Physics World, “the challenge was to re-use atoms, and key to this was cooling and performance.” To this end, they first had to shift the atoms undergoing mid-circuit measurements away from the atoms in the computational register, to avoid scattering laser light off the latter. They further avoided laser-related collateral damage by designing the register such that the measurement and cooling light was not at the resonant wavelength of the register atoms. Next, they demonstrated they could cool already-measured atoms for re-use in the calculation. Finally, they showed they could non-disruptively replenish these atoms with others from a magneto-optical trap positioned 300 nm below the tweezer arrays that held atoms for the computational register.

Mikhail Lukin, a physicist at Harvard University, US, who has also worked on the challenges of atom loss and re-use in scalable, fault-tolerant neutral atom computing, has likewise recently reported successful atom re-use and diminished atom loss. Although Lukin’s work differs from that of the Atom Computing team in various ways – using rubidium instead of ytterbium atoms and a different approach for low atom loss mid-circuit measurements, for starters – he says that the work by Norcia and his team “represents an important technical advance for the Yb quantum computing platform, complementing major progress in the neutral atom quantum computing community in 2025”.

The research appears in Physical Review X.

The post Qubit ‘recycling’ gives neutral-atom quantum computing a boost appeared first on Physics World.


The fall of Luminar: how a partnership with Volvo precipitated the bankruptcy of a lidar pioneer

At the start of 2023, everything was going Luminar’s way. After going public during the pandemic and signing strategic deals with Volvo, Mercedes-Benz and Polestar, the lidar sensor specialist seemed destined for a bright future. Austin Russell, founder and chief executive, was then talking about a decisive “inflection point” for his company, whose ... Read more

The post The fall of Luminar: how a partnership with Volvo precipitated the bankruptcy of a lidar pioneer appeared first on Fredzone.

Amazon in talks to invest $10bn in developer of ChatGPT

OpenAI seeking to strike latest deal in its efforts to pay for huge spending on datacentres

Amazon is in talks to invest more than $10bn (£7.5bn) in OpenAI, in the latest funding deal being struck by the startup behind ChatGPT.

If it goes ahead, the market valuation of OpenAI could rise above $500bn, according to The Information, a tech news site that revealed the negotiations.

Continue reading...

© Photograph: Matteo Della Torre/NurPhoto/Shutterstock



‘Music needs a human component to be of any value’: Guardian readers on the growing use of AI in music

AI promises to have far-reaching effects in music-making. While some welcome it as a compositional tool, many have deep concerns. Here are some of your responses

AI-generated music is flooding streaming platforms, and it seems to be here to stay. Last month, three AI songs reached the highest spots on Spotify and Billboard charts. Jorja Smith’s label has called for her to receive a share of royalties from a song thought to have trained its original AI-generated vocals on her catalogue, which were later re-recorded by a human singer.

With this in mind, we asked for your thoughts on music composed by AI, the use of AI as a tool in the creation of music, and what should be done to protect musicians. Here are some of your responses.

Continue reading...

© Photograph: Krisanapong Detraphiphat/Getty Images



This is Europe's secret weapon against Trump: it could burst his AI bubble | Johnny Ryan

Growth in the US economy – and the president’s political survival – rest on AI. The EU must use its leverage and stand up to him

The unthinkable has happened. The US is Europe’s adversary. The stark, profound betrayal contained in the Trump administration’s national security strategy should stop any further denial and dithering in Europe’s capitals. Cultivating “resistance” in European nations is now Washington’s stated policy.

But contained within this calamity is the gift of clarity. Europe will fight or it will perish. The good news is that Europe holds strong cards.

Johnny Ryan is director of Enforce, a unit of the Irish Council for Civil Liberties

Continue reading...

© Photograph: Bart van Overbeeke Fotografie/AS/Reuters



Musicians are deeply concerned about AI. So why are the major labels embracing it?

Companies such as Udio, Suno and Klay will let you use AI to make new music based on existing artists’ work. It could mean more royalties – but many are worried

This was the year that AI-generated music went from jokey curiosity to mainstream force. Velvet Sundown, a wholly AI act, generated millions of streams; AI-created tracks topped Spotify’s viral chart and one of the US Billboard country charts; AI “artist” Xania Monet “signed” a record deal. BBC Introducing is usually a platform for flesh-and-blood artists trying to make it big, but an AI-generated song by Papi Lamour was recently played on the West Midlands show. And jumping up the UK Top 20 this month is I Run, a track by dance act Haven, who have been accused of using AI to imitate British vocalist Jorja Smith (Haven claim they simply asked the AI for “soulful vocal samples”, and did not respond to an earlier request to comment).

The worry is that AI will eventually absorb all creative works in history and spew out endless slop that will replace human-made art and drive artists into penury. Those worries are being deepened by how the major labels, once fearful of the technology, are now embracing it – and heralding a future in which ordinary listeners have a hand in co-creating music with their favourite musicians.

Continue reading...

© Illustration: Velvet Sundown



Can fast qubits also be robust?

Qubit central: This work was carried out as part of the National Center of Competence in Research SPIN (NCCR SPIN), which is led by the University of Basel, Switzerland. NCCR SPIN focuses on creating scalable spin qubits in semiconductor nanostructures made of silicon and germanium, with the aim of developing small, fast qubits for a universal quantum computer. (Courtesy: A Efimov)

Qubits – the building blocks of quantum computers – are plagued by a seemingly insurmountable dilemma. If they’re fast, they aren’t robust. And if they’re robust, they aren’t fast. Both qualities are important, because all potentially useful quantum algorithms rely on being able to perform many manipulations on a qubit before its state decays. But whereas faster qubits are typically realized by strongly coupling them to the external environment, enabling them to interact more strongly with the driving field, robust qubits with long coherence times are typically achieved by isolating them from their environment.

These seemingly contradictory requirements made simultaneously fast and robust qubits an unsolved challenge – until now. In an article published in Nature Communications, a team of physicists led by Dominik Zumbühl from the University of Basel, Switzerland show that it is, in fact, possible to increase both the coherence time and operational speed of a qubit, demonstrating a pathway out of this long-standing impasse.

The magic ingredient

The key ingredient driving this discovery is something called the direct Rashba spin-orbit interaction. The best-known example of spin-orbit interaction comes from atomic physics. Consider a hydrogen atom, in which a single electron revolves around a single proton in the nucleus. During this orbital motion, the electron interacts with the static electric field generated by the positively charged nucleus. The electron in turn experiences an effective magnetic field that couples to the electron’s intrinsic magnetic moment, or spin. This coupling of the electron’s orbital motion to its spin is called spin-orbit (SO) interaction.
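For reference, the textbook form of this atomic spin-orbit term (standard symbols, not notation taken from the paper itself) is:

```latex
H_{\mathrm{SO}} = \frac{1}{2 m_e^{2} c^{2}}\,\frac{1}{r}\frac{dV}{dr}\;\mathbf{L}\cdot\mathbf{S}
```

where $V(r)$ is the nuclear Coulomb potential and $\mathbf{L}$ and $\mathbf{S}$ are the electron's orbital and spin angular momentum operators.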

Aided by collaborators at the University of Oxford, UK and TU Eindhoven in the Netherlands, Zumbühl and colleagues chose to replace this simple SO interaction with a far more complex landscape of electrostatic potential generated by a 10-nanometer-thick germanium wire coated with a thin silicon shell. By removing a single electron from this wire, they create states known as holes that can be used as qubits, with quantum information being encoded in the hole’s spin.

Importantly, the underlying crystal structure of the silicon-coated germanium wire constrains these holes to discrete energy levels called bands. “If you were to mathematically model a low-level hole residing in one of these bands using perturbation theory – a commonly applied method in which more remote bands are treated as corrections to the ground state – you would find a term that looks structurally similar to the spin–orbit interaction known from atomic physics,” explains Miguel Carballido, who conducted the work during his PhD at Basel, and is now a senior research associate at the University of New South Wales’ School of Electrical Engineering and Telecommunications in Sydney, Australia.

By encoding the quantum states in these energy levels, the spin-orbit interaction can be used to drive the hole-qubit between its two spin states. What makes this interaction special is that it can be tuned using an external electric field. Thus, by applying a stronger electric field, the interaction can be strengthened – resulting in faster qubit manipulation.

Uncompromising performance: Results showing qubit speed plateauing (top panel) and qubit coherence times peaking (bottom) at an applied electric field around 1330 mV, showing that qubit speed and coherence times can be simultaneously optimized. (CC BY ND 4.0 MJ Carballido et al. “Compromise-free scaling of qubit speed and coherence” 2025 Nat. Commun. 16 7616)

Reaching a plateau

This ability to make a qubit faster by tuning an external parameter isn’t new. The difference is that whereas in other approaches, a stronger interaction also means higher sensitivity to fluctuations in the driving field, the Basel researchers found a way around this problem. As they increase the electric field, the spin-orbit interaction increases up to a certain point. Beyond this point, any further increase in the electric field will cause the hole to remain stuck within a low energy band. This restricts the hole’s ability to interact with other bands to change its spin, causing the SO interaction strength to drop.

By tuning the electric field to this peak, they can therefore operate in a “plateau” region where the SO interaction is the strongest, but the sensitivity to noise is the lowest. This leads to high coherence times (see figure), meaning that the qubit remains in the desired quantum state for longer. By reaching this plateau, where the qubit is both fast and robust, the researchers demonstrate the ability to operate their device in the “compromise-free” regime.

So, is quantum computing now a solved problem? The researchers’ answer is “not yet”, as there are still many challenges to overcome. “A lot of the heavy lifting is being done by the quasi 1D system provided by the nanowire,” remarks Carballido, “but this also limits scalability.” He also notes that the success of the experiment depends on being able to fabricate each qubit device very precisely, and doing this reproducibly remains a challenge.

The post Can fast qubits also be robust? appeared first on Physics World.


Quantum computing: hype or hope?

Unless you’ve been living under a stone, you can’t have failed to notice that 2025 marks the first 100 years of quantum mechanics. A massive milestone, to say the least, about which much has been written in Physics World and elsewhere in what is the International Year of Quantum Science and Technology (IYQ). However, I’d like to focus on a specific piece of quantum technology, namely quantum computing.

I keep hearing about quantum computers, so people must be using them to do cool things, and surely they will soon be as commonplace as classical computers. But as a physicist-turned-engineer working in the aerospace sector, I struggle to get a clear picture of where things are really at. If I ask friends and colleagues when they expect to see quantum computers routinely used in everyday life, I get answers ranging from “in the next two years” to “maybe in my lifetime” or even “never”.

Before we go any further, it’s worth reminding ourselves that quantum computing relies on several key quantum properties, including superposition, which gives rise to the quantum bit, or qubit. The basic building block of a quantum computer – the qubit – exists as a combination of 0 and 1 states at the same time and is represented by a probabilistic wave function. Classical computers, in contrast, use binary digital bits that are either 0 or 1.
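The amplitudes of a superposed qubit can be made concrete in a few lines of NumPy (a generic statevector sketch, not tied to any hardware):

```python
import numpy as np

# A general qubit state a|0> + b|1>, normalized so |a|^2 + |b|^2 = 1.
state = np.array([1, 1], dtype=complex) / np.sqrt(2)  # equal superposition

# Born rule: squared amplitudes give the probabilities of measuring 0 or 1.
probs = np.abs(state) ** 2
print(probs)  # → [0.5 0.5]
```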

Also vital for quantum computers is the notion of entanglement, which is when two or more qubits are co-ordinated, allowing them to share their quantum information. In a highly correlated system, a quantum computer can explore many paths simultaneously. This “massive scale” parallel processing is how quantum computers may solve certain problems exponentially faster than a classical computer.
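Entanglement can be illustrated the same way with the two-qubit Bell state, in which the measurement outcomes of the two qubits are perfectly correlated (again, an illustrative sketch only):

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Bell state (|00> + |11>)/sqrt(2): the two qubits are perfectly correlated.
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

# Outcome probabilities for 00, 01, 10, 11: only 00 and 11 ever occur.
probs = np.abs(bell) ** 2
print(probs)  # → [0.5 0.  0.  0.5]
```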

The other key phenomenon for quantum computers is quantum interference. The wave-like nature of qubits means that when different probability amplitudes are in phase, they combine constructively to increase the likelihood of the right solution. Conversely, destructive interference occurs when amplitudes are out of phase, making it less likely to get the wrong answer.

Quantum interference is important in quantum computing because it allows quantum algorithms to amplify the probability of correct answers and suppress incorrect ones, making calculations much faster. Along with superposition and entanglement, it means that quantum computers could process and store vast numbers of probabilities at once, outstripping even the best classical supercomputers.
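The constructive and destructive interference described above shows up in the simplest of circuits: applying a Hadamard gate twice. The first creates a superposition; the second makes the two paths interfere so that the |1⟩ amplitudes cancel and the |0⟩ amplitudes reinforce (a minimal NumPy sketch):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
ket0 = np.array([1, 0], dtype=complex)

# One Hadamard creates an equal superposition...
superposed = H @ ket0
# ...a second makes the two paths interfere: the |1> amplitudes cancel
# (destructive) while the |0> amplitudes add up (constructive).
final = H @ superposed

print(np.abs(final) ** 2)  # → [1. 0.]: back in |0> with certainty
```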

Towards real devices

To me, it all sounds exciting, but what have quantum computers ever done for us so far? It’s clear that quantum computers are not ready to be deployed in the real world. Significant technological challenges need to be overcome before they become fully realisable. In any case, no-one is expecting quantum computers to displace classical computers “like for like”: they’ll both be used for different things.

Yet it seems that the very essence of quantum computing is also its Achilles heel. Superposition, entanglement and interference – the quantum properties that will make it so powerful – are also incredibly difficult to create and maintain. Qubits are also extremely sensitive to their surroundings. They easily lose their quantum state due to interactions with the environment, whether via stray particles, electromagnetic fields, or thermal fluctuations. Known as decoherence, it makes quantum computers prone to error.

That’s why quantum computers need specialized – and often cryogenically controlled – environments to maintain the quantum states necessary for accurate computation. Building a quantum system with lots of interconnected qubits is therefore a major, expensive engineering challenge, with complex hardware and extreme operating conditions. Developing “fault-tolerant” quantum hardware and robust error-correction techniques will be essential if we want reliable quantum computation.

As for the development of software and algorithms for quantum systems, there’s a long way to go, with a lack of mature tools and frameworks. Quantum algorithms require fundamentally different programming paradigms to those used for classical computers. Put simply, that’s why building reliable, real-world deployable quantum computers remains a grand challenge.

What does the future hold?

Despite the huge amount of work that still lies in store, quantum computers have already demonstrated some amazing potential. The US firm D-Wave, for example, claimed earlier this year to have carried out simulations of quantum magnetic phase transitions that wouldn’t be possible with the most powerful classical devices. If true, this was the first time a quantum computer had achieved “quantum advantage” for a practical physics problem (whether the problem was worth solving is another question).

There is also a lot of research and development going on around the world into solving the qubit stability problem. At some stage, there will likely be a breakthrough design for robust and reliable quantum computer architecture. There is probably a lot of technical advancement happening right now behind closed doors.

The first real-world applications of quantum computers will be akin to the giant classical supercomputers of the past. If you were around in the 1980s, you’ll remember Cray supercomputers: huge, inaccessible beasts owned by large corporations, government agencies and academic institutions to enable vast amounts of calculations to be performed (provided you had the money).

And, if I believe what I read, quantum computers will not replace classical computers, at least not initially, but work alongside them, as each has its own relative strengths. Quantum computers will be suited for specific and highly demanding computational tasks, such as drug discovery, materials science, financial modelling, complex optimization problems and increasingly large artificial intelligence and machine-learning models.

These are all things beyond the limits of classical computer resource. Classical computers will remain relevant for everyday tasks like web browsing, word processing and managing databases, and they will be essential for handling the data preparation, visualization and error correction required by quantum systems.

And there is one final point to mention, which is cyber security. Quantum computing poses a major threat to existing encryption methods, with potential to undermine widely used public-key cryptography. There are concerns that hackers nowadays are storing their stolen data in anticipation of future quantum decryption.

Having looked into the topic, I can now see why the timeline for quantum computing is so fuzzy and why I got so many different answers when I asked people when the technology would be mainstream. Quite simply, I still can’t predict how or when the tech stack will pan out. But as IYQ draws to a close, the future for quantum computers is bright.

The post Quantum computing: hype or hope? appeared first on Physics World.


Modular cryogenics platform adapts to new era of practical quantum computing

Modular and scalable: the ICE-Q cryogenics platform delivers the performance and reliability needed for professional computing environments while also providing a flexible and extendable design. The standard configuration includes a cooling module, a payload with a large sample space, and a side-loading wiring module for scalable connectivity (Courtesy: ICEoxford)

At the centre of most quantum labs is a large cylindrical cryostat that keeps the delicate quantum hardware at ultralow temperatures. These cryogenic chambers have expanded to accommodate larger and more complex quantum systems, but the scientists and engineers at UK-based cryogenics specialist ICEoxford have taken a radical new approach to the challenge of scalability. They have split the traditional cryostat into a series of cube-shaped modules that slot into a standard 19-inch rack mount, creating an adaptable platform that can easily be deployed alongside conventional computing infrastructure.

“We wanted to create a robust, modular and scalable solution that enables different quantum technologies to be integrated into the cryostat,” says Greg Graf, the company’s engineering manager. “This approach offers much more flexibility, because it allows different modules to be used for different applications, while the system also delivers the efficiency and reliability that are needed for operational use.”

The standard configuration of the ICE-Q platform has three separate modules: a cryogenics unit that provides the cooling power, a large payload for housing the quantum chip or experiment, and a patent-pending wiring module that attaches to the side of the payload to provide the connections to the outside world. Up to four of these side-loading wiring modules can be bolted onto the payload at the same time, providing thousands of external connections while still fitting into a standard rack. For applications where space is not such an issue, the payload can be further extended to accommodate larger quantum assemblies and potentially tens of thousands of radio-frequency or fibre-optic connections.

The cube-shaped form factor provides much improved access to these external connections, whether for designing and configuring the system or for ongoing maintenance work. The outer shell of each module consists of panels that are easily removed, offering a simple mechanism for bolting modules together or stacking them on top of each other to provide a fully scalable solution that grows with the qubit count.

The flexible design also offers a more practical solution for servicing or upgrading an installed system, since individual modules can be simply swapped over as and when needed. “For quantum computers running in an operational environment it is really important to minimize the downtime,” says Emma Yeatman, senior design engineer at ICEoxford. “With this design we can easily remove one of the modules for servicing, and replace it with another one to keep the system running for longer. For critical infrastructure devices, it is possible to have built-in redundancy that ensures uninterrupted operation in the event of a failure.”

Other features have been integrated into the platform to make it simple to operate, including a new software system for controlling and monitoring the ultracold environment. “Most of our cryostats have been designed for researchers who really want to get involved and adapt the system to meet their needs,” adds Yeatman. “This platform offers more options for people who want an out-of-the-box solution and who don’t want to get hands on with the cryogenics.”

Such a bold design choice was enabled in part by a collaborative research project with Canadian company Photonic Inc, funded jointly by the UK and Canada, that was focused on developing an efficient and reliable cryogenics platform for practical quantum computing. That R&D funding helped to reduce the risk of developing an entirely new technology platform that addresses many of the challenges that ICEoxford and its customers had experienced with traditional cryostats. “Quantum technologies typically need a lot of wiring, and access had become a real issue,” says Yeatman. “We knew there was an opportunity to do better.”

However, converting a large cylindrical cryostat into a slimline and modular form factor demanded some clever engineering solutions. Perhaps the most obvious was creating a frame that allows the modules to be bolted together while still remaining leak tight. Traditional cryostats are welded together to ensure a leak-proof seal, but for greater flexibility the ICEoxford team developed an assembly technique based on mechanical bonding.

The side-loading wiring module also presented a design challenge. To squeeze more wires into the available space, the team developed a high-density connector for the coaxial cables to plug into. An additional cold-head was also integrated into the module to pre-cool the cables, reducing the overall heat load generated by such large numbers of connections entering the ultracold environment.

Flexible for the future: the outer shell of the modules is covered with removable panels that make it easy to extend or reconfigure the system (Courtesy: ICEoxford)

Meanwhile, the speed of the cooldown and the efficiency of operation have been optimized by designing a new type of heat exchanger that is fabricated using a 3D printing process. “When warm gas is returned into the system, a certain amount of cooling power is needed just to compress and liquefy that gas,” explains Kelly. “We designed the heat exchangers to exploit the returning cold gas much more efficiently, which enables us to pre-cool the warm gas and use less energy for the liquefaction.”

The initial prototype has been designed to operate at 1 K, which is ideal for the photonics-based quantum systems being developed by ICEoxford’s research partner. But the modular nature of the platform allows it to be adapted to diverse applications, with a second project now underway with the Rutherford Appleton Lab to develop a module that will be used at the forefront of the global hunt for dark matter.

Already on the development roadmap are modules that can sustain temperatures as low as 10 mK – which is typically needed for superconducting quantum computing – and a 4 K option for trapped-ion systems. “We already have products for each of those applications, but our aim was to create a modular platform that can be extended and developed to address the changing needs of quantum developers,” says Kelly.

As these different options come onstream, the ICEoxford team believes that it will become easier and quicker to deliver high-performance cryogenic systems that are tailored to the needs of each customer. “It normally takes between six and twelve months to build a complex cryogenics system,” says Graf. “With this modular design we will be able to keep some of the components on the shelf, which would allow us to reduce the lead time by several months.”

More generally, the modular and scalable platform could be a game-changer for commercial organizations that want to exploit quantum computing in their day-to-day operations, as well as for researchers who are pushing the boundaries of cryogenics design with increasingly demanding specifications. “This system introduces new avenues for hardware development that were previously constrained by the existing cryogenics infrastructure,” says Kelly. “The ICE-Q platform directly addresses the need for colder base temperatures, larger sample spaces, higher cooling powers, and increased connectivity, and ensures our clients can continue their aggressive scaling efforts without being bottlenecked by their cooling environment.”

  • You can find out more about the ICE-Q platform by contacting the ICEoxford team at iceoxford.com, or via email at sales@iceoxford.com. They will also be presenting the platform at the UK’s National Quantum Technologies Showcase in London on 7 November, with a further launch at the American Physical Society meeting in March 2026.

The post Modular cryogenics platform adapts to new era of practical quantum computing appeared first on Physics World.


Quantum computing on the verge: correcting errors, developing algorithms and building up the user base

When it comes to building a fully functional “fault-tolerant” quantum computer, companies and government labs all over the world are rushing to be the first over the finish line. But a truly useful universal quantum computer capable of running complex algorithms would have to entangle millions of coherent qubits, which are extremely fragile. Because of environmental factors such as temperature, interference from other electronic systems in hardware, and even errors in measurement, today’s devices would fail under an avalanche of errors long before reaching that point.

So the problem of error correction is a key issue for the future of the market. It arises because errors in qubits can’t be corrected simply by keeping multiple copies, as they are in classical computers: quantum rules forbid the copying of qubit states while they are still entangled with others, and are thus unknown. To run quantum circuits with millions of gates, we therefore need new tricks to enable quantum error correction (QEC).
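The classical caricature of this trick is the three-bit repetition code: parity checks reveal where a single error sits without ever reading out the encoded value, which is essentially what QEC does quantum-mechanically with entangled syndrome measurements. A minimal sketch (purely classical, for illustration only):

```python
import random

def encode(bit):
    """Encode one logical bit into three physical bits (repetition code)."""
    return [bit, bit, bit]

def syndrome(bits):
    """Measure the two parity checks (bits 0-1 and bits 1-2). Crucially,
    the parities locate an error without revealing the encoded value --
    the quantum analogue only ever extracts this kind of information."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits):
    """Use the syndrome to identify and flip the single faulty bit."""
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(bits))
    if flip is not None:
        bits[flip] ^= 1
    return bits

def decode(bits):
    """Majority vote recovers the logical bit."""
    return int(sum(bits) >= 2)

# Inject a random single bit-flip and check recovery
random.seed(0)
code = encode(1)
code[random.randrange(3)] ^= 1   # one error somewhere
assert decode(correct(code)) == 1
```

The quantum version is harder because qubits can suffer continuous phase errors as well as bit-flips, and the parity checks must themselves be performed with noisy gates, but the logic of syndrome-then-correct carries over.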

Protected states

The general principle of QEC is to spread the information over many qubits so that an error in any one of them doesn’t matter too much. “The essential idea of quantum error correction is that if we want to protect a quantum system from damage then we should encode it in a very highly entangled state,” says John Preskill, director of the Institute for Quantum Information and Matter at the California Institute of Technology in Pasadena.

There is no unique way of achieving that spreading, however. Different error-correcting codes can depend on the connectivity between qubits – whether, say, they are coupled only to their nearest neighbours or to all the others in the device – which tends to be determined by the physical platform being used. However error correction is done, it must be done fast. “The mechanisms for error correction need to be running at a speed that is commensurate with that of the gate operations,” says Michael Cuthbert, founding director of the UK’s National Quantum Computing Centre (NQCC). “There’s no point in doing a gate operation in a nanosecond if it then takes 100 microseconds to do the error correction for the next gate operation.”

At the moment, dealing with errors is largely about compensation rather than correction: patching up the problems of errors in retrospect, for example by using algorithms that can throw out some results that are likely to be unreliable (an approach called “post-selection”). It’s also a matter of making better qubits that are less error-prone in the first place.

1 From many to few

Turning unreliable physical qubits into a logical qubit
(Courtesy: Riverlane via www.riverlane.com)

Qubits are so fragile that their quantum state is very susceptible to the local environment, and can easily be lost through the process of decoherence. Current quantum computers therefore have very high error rates – roughly one error in every few hundred operations. For quantum computers to be truly useful, this error rate will have to be reduced to the scale of one in a million – and larger, more complex algorithms would demand error rates of one in a billion or even one in a trillion. This requires real-time quantum error correction (QEC).

To protect the information stored in qubits, a multitude of unreliable physical qubits have to be combined in such a way that if one qubit fails and causes an error, the others can help protect the system. Essentially, by combining many physical qubits (shown above on the left), one can build a few “logical” qubits that are strongly resistant to noise.
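The error-rate targets quoted in the box above follow from simple arithmetic: if each of the N operations in a circuit fails independently with probability p, the whole run succeeds with probability (1 − p)^N. A quick back-of-the-envelope check (illustrative Python, not tied to any particular hardware):

```python
def success_probability(p, n_gates):
    """Probability that an n-gate circuit completes with no error,
    assuming independent errors of probability p per gate."""
    return (1 - p) ** n_gates

# Today's physical error rates (~1 in a few hundred) doom even a
# 1000-gate circuit, while a one-in-a-million rate keeps a
# million-gate circuit viable about a third of the time.
print(f"{success_probability(1/300, 1_000):.3f}")      # ≈ 0.036
print(f"{success_probability(1e-6, 1_000_000):.3f}")   # ≈ 0.368
```

This is why raw physical error rates of ~10⁻³ are nowhere near enough for the million-gate circuits that useful algorithms require, and why logical qubits with far lower effective error rates are needed.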

According to Maria Maragkou, commercial vice-president of quantum error-correction company Riverlane, the goal of full QEC has ramifications for the design of the machines all the way from hardware to workflow planning. “The shift to support error correction has a profound effect on the way quantum processors themselves are built, the way we control and operate them, through a robust software stack on top of which the applications can be run,” she explains. The “stack” includes everything from programming languages to user interfaces and servers.

With genuinely fault-tolerant qubits, errors can be kept under control and prevented from proliferating during a computation. Such qubits might be made in principle by combining many physical qubits into a single “logical qubit” in which errors can be corrected (see figure 1). In practice, though, this creates a large overhead: huge numbers of physical qubits might be needed to make just a few fault-tolerant logical qubits. The question is then whether errors in all those physical qubits can be checked faster than they accumulate (see figure 2).

That overhead has been steadily reduced over the past several years, and at the end of last year researchers at Google announced that their 105-qubit Willow quantum chip passed the break-even threshold at which the error rate gets smaller, rather than larger, as more physical qubits are used to make a logical qubit. This means that in principle such arrays could be scaled up without errors accumulating.
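The break-even behaviour can be illustrated with a commonly used (and here deliberately simplified) scaling model for codes such as the surface code: the logical error rate falls as A(p/p_th)^((d+1)/2), where d is the code distance, p the physical error rate and p_th the threshold. The constants below are invented purely for illustration:

```python
def logical_error(p, p_th, d, A=0.1):
    """Simplified surface-code scaling model (illustrative only):
    below threshold (p < p_th), the logical error rate is suppressed
    exponentially as the code distance d grows; above threshold,
    adding qubits makes matters worse."""
    return A * (p / p_th) ** ((d + 1) // 2)

# Below threshold: larger codes help
below = [logical_error(p=0.005, p_th=0.01, d=d) for d in (3, 5, 7)]
assert below[0] > below[1] > below[2]

# Above threshold: larger codes hurt
above = [logical_error(p=0.02, p_th=0.01, d=d) for d in (3, 5, 7)]
assert above[0] < above[1] < above[2]
```

Crossing this threshold is exactly what the Willow result demonstrated: once physical qubits are good enough, scaling up the code buys exponential improvement rather than an avalanche of errors.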

2 Error correction in action

Illustration of the error correction cycle
(Courtesy: Riverlane via www.riverlane.com)

The illustration gives an overview of quantum error correction (QEC) in action within a quantum processing unit. UK-based company Riverlane is building its Deltaflow QEC stack that will correct millions of data errors in real time, allowing a quantum computer to go beyond the reach of any classical supercomputer.

Fault-tolerant quantum computing is the ultimate goal, says Jay Gambetta, director of IBM research at the company's centre in Yorktown Heights, New York. He believes that to perform truly transformative quantum calculations, the system must go beyond demonstrating a few logical qubits – instead, you need arrays of at least 100 of them that can perform more than 100 million quantum operations (10⁸ QuOps). "The number of operations is the most important thing," he says.

It sounds like a tall order, but Gambetta is confident that IBM will achieve these figures by 2029. By building on what has been achieved so far with error correction and mitigation, he feels "more confident than I ever did before that we can achieve a fault-tolerant computer." Jerry Chow, former manager of the Experimental Quantum Computing group at IBM, shares that optimism. "We have a real blueprint for how we can build [such a machine] by 2029," he says (see figure 3).

Others suspect the breakthrough threshold may be a little lower: Steve Brierley, chief executive of Riverlane, believes that the first error-corrected quantum computer, with around 10 000 physical qubits supporting 100 logical qubits and capable of a million QuOps (a megaQuOp), could come as soon as 2027. Following on, gigaQuOp machines (10⁹ QuOps) should be available by 2030–32, and teraQuOp machines (10¹² QuOps) by 2035–37.
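These QuOp milestones translate directly into logical error budgets: for a run of N operations to succeed with, say, 50% probability, the per-operation logical error rate ε must satisfy (1 − ε)^N ≈ 0.5, giving ε ≈ −ln(0.5)/N. A quick sketch (illustrative arithmetic only):

```python
import math

def required_logical_error_rate(quops, target_success=0.5):
    """Rough per-operation error budget: a run of `quops` operations
    succeeds with probability (1 - eps)**quops, so for a given target
    success probability, eps ≈ -ln(target)/quops."""
    return -math.log(target_success) / quops

for name, n in [("megaQuOp", 10**6), ("gigaQuOp", 10**9), ("teraQuOp", 10**12)]:
    eps = required_logical_error_rate(n)
    print(f"{name}: per-operation logical error rate below ~{eps:.0e}")
```

A megaQuOp machine therefore needs logical error rates around 10⁻⁶ per operation, a teraQuOp machine around 10⁻¹² – which is why the error-suppression overhead of logical qubits, not raw qubit counts, dominates these roadmaps.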

Platform independent

Error mitigation and error correction are just two of the challenges for developers of quantum software. Fundamentally, to develop a truly quantum algorithm involves taking full advantage of the key quantum-mechanical properties such as superposition and entanglement. Often, the best way to do that depends on the hardware used to run the algorithm. But ultimately the goal will be to make software that is not platform-dependent and so doesn’t require the user to think about the physics involved.

“At the moment, a lot of the platforms require you to come right down into the quantum physics, which is a necessity to maximize performance,” says Richard Murray of photonic quantum-computing company Orca. Try to generalize an algorithm by abstracting away from the physics and you’ll usually lower the efficiency with which it runs. “But no user wants to talk about quantum physics when they’re trying to do machine learning or something,” Murray adds. He believes that ultimately it will be possible for quantum software developers to hide those details from users – but Brierley thinks this will require fault-tolerant machines.

“In due time everything below the logical circuit will be a black box to the app developers”, adds Maragkou over at Riverlane. “They will not need to know what kind of error correction is used, what type of qubits are used, and so on.” She stresses that creating truly efficient and useful machines depends on developing the requisite skills. “We need to scale up the workforce to develop better qubits, better error-correction codes and decoders, write the software that can elevate those machines and solve meaningful problems in a way that they can be adopted.” Such skills won’t come only from quantum physicists, she adds: “I would dare say it’s mostly not!”

Yet even now, working on quantum software doesn’t demand a deep expertise in quantum theory. “You can be someone working in quantum computing and solving problems without having a traditional physics training and knowing about the energy levels of the hydrogen atom and so on,” says Ashley Montanaro, who co-founded the quantum software company Phasecraft.

On the other hand, insights can flow in the other direction too: working on quantum algorithms can lead to new physics. “Quantum computing and quantum information are really pushing the boundaries of what we think of as quantum mechanics today,” says Montanaro, adding that QEC “has produced amazing physics breakthroughs.”

Early adopters?

Once we have true error correction, Cuthbert at the UK’s NQCC expects to see “a flow of high-value commercial uses” for quantum computers. What might those be?

In the arena of quantum chemistry and materials science, genuine quantum advantage – calculating something that is impossible using classical methods alone – is more or less here already, says Chow. Crucially, however, quantum methods needn't be used for the entire simulation but can be added to classical ones to give them a boost for particular parts of the problem.

IBM and RIKEN quantum systems
Joint effort In June 2025, IBM in the US and Japan’s national research laboratory RIKEN, unveiled the IBM Quantum System Two, the first to be used outside the US. It involved IBM’s 156-qubit IBM Heron quantum computing system (left) being paired with RIKEN’s supercomputer Fugaku (right) — one of the most powerful classical systems on Earth. The computers are linked through a high-speed network at the fundamental instruction level to form a proving ground for quantum-centric supercomputing. (Courtesy: IBM and RIKEN)

For example, last year researchers at IBM teamed up with scientists at several RIKEN institutes in Japan to calculate the minimum energy state of the iron–sulphur cluster (4Fe-4S) at the heart of the bacterial nitrogenase enzyme that fixes nitrogen. This cluster is too big and complex to be accurately simulated using the classical approximations of quantum chemistry. The researchers used a combination of quantum computing (with IBM's 72-qubit Heron chip) and RIKEN's Fugaku high-performance computer (HPC). This idea of "improving classical methods by injecting quantum as a subroutine" is likely to be a more general strategy, says Gambetta. "The future of computing is going to be heterogeneous accelerators [of discovery] that include quantum."

Likewise, Montanaro says that Phasecraft is developing “quantum-enhanced algorithms”, where a quantum computer is used, not to solve the whole problem, but just to help a classical computer in some way. “There are only certain problems where we know quantum computing is going to be useful,” he says. “I think we are going to see quantum computers working in tandem with classical computers in a hybrid approach. I don’t think we’ll ever see workloads that are entirely run using a quantum computer.” Among the first important problems that quantum machines will solve, according to Montanaro, are the simulation of new materials – to develop, for example, clean-energy technologies (see figure 4).
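The hybrid pattern Montanaro describes – a quantum routine called as a subroutine inside a classical loop – can be sketched as follows. Here the quantum hardware call is stubbed out with a classical cost function, so the numbers are purely illustrative; a real workflow would replace quantum_energy_estimate with circuit execution on actual hardware:

```python
import math

def quantum_energy_estimate(theta):
    """Stand-in for a call out to quantum hardware: in a real hybrid
    workflow this would prepare a parametrized circuit and measure an
    energy expectation value. Here a classical cosine landscape plays
    that role, for illustration only."""
    return 1.0 - math.cos(theta)

def classical_optimizer(f, theta=2.0, lr=0.2, steps=100):
    """The classical half of the loop: simple gradient descent using
    finite-difference gradients of the (quantum-evaluated) cost."""
    for _ in range(steps):
        grad = (f(theta + 1e-5) - f(theta - 1e-5)) / 2e-5
        theta -= lr * grad
    return theta, f(theta)

theta_opt, energy = classical_optimizer(quantum_energy_estimate)
print(f"minimum energy ≈ {energy:.4f} at theta ≈ {theta_opt:.3f}")
```

The division of labour is the point: the classical machine handles the outer optimization, bookkeeping and convergence checks, while the quantum processor is consulted only for the piece that is classically hard.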

“For a physicist like me,” says Preskill, “what is really exciting about quantum computing is that we have good reason to believe that a quantum computer would be able to efficiently simulate any process that occurs in nature.”

3 Structural insights

Modelling materials using quantum computing
(Courtesy: Phasecraft)

A promising application of quantum computers is simulating novel materials. Researchers from the quantum algorithms firm Phasecraft, for example, have already shown how a quantum computer could help simulate complex materials such as the polycrystalline compound LK-99, which was purported by some researchers in 2023 to be a room-temperature superconductor.

Using a classical/quantum hybrid workflow, together with the firm's proprietary material simulation approach to encode and compile materials on quantum hardware, Phasecraft researchers were able to establish a classical model of the LK-99 structure that allowed them to extract an approximate representation of the electrons within the material. The illustration above shows the green and blue electronic structure around red and grey atoms in LK-99.

Montanaro believes another likely near-term goal for useful quantum computing is solving optimization problems – both here and in quantum simulation, “we think genuine value can be delivered already in this NISQ era with hundreds of qubits.” (NISQ, a term coined by Preskill, refers to noisy intermediate-scale quantum computing, with relatively small numbers of rather noisy, error-prone qubits.)

One further potential benefit of quantum computing is that it tends to require less energy than classical high-performance computing, whose energy consumption is notoriously high. If the energy cost could be cut by even a few percent, it would be worth using quantum resources for that reason alone. "Quantum has real potential for an energy advantage," says Chow. One study in 2020 showed that a particular quantum-mechanical calculation carried out on an HPC system used many orders of magnitude more energy than when it was run on a quantum circuit. Such comparisons are not easy, however, in the absence of an agreed and well-defined metric for energy consumption.

Building the market

Right now, the quantum computing market is in a curious superposition of states itself – it has ample proof of principle, but today’s devices are still some way from being able to perform a computation relevant to a practical problem that could not be done with classical computers. Yet to get to that point, the field needs plenty of investment.

The fact that quantum computers, especially if used with HPC, are already unique scientific tools should establish their value in the immediate term, says Gambetta. “I think this is going to accelerate, and will keep the funding going.” It is why IBM is focusing on utility-scale systems of around 100 qubits or so and more than a thousand gate operations, he says, rather than simply trying to build ever bigger devices.

Montanaro sees a role for governments to boost the growth of the industry “where it’s not the right fit for the private sector”. One role of government is simply as a customer. For example, Phasecraft is working with the UK national grid to develop a quantum algorithm for optimizing the energy network. “Longer-term support for academic research is absolutely critical,” Montanaro adds. “It would be a mistake to think that everything is done in terms of the underpinning science, and governments should continue to support blue-skies research.”

IBM roadmap of quantum development
The road ahead IBM’s current roadmap charts how the company plans on scaling up its devices to achieve a fault-tolerant device by 2029. Alongside hardware development, the firm will also focus on developing new algorithms and software for these devices. (Courtesy: IBM)

It’s not clear, though, whether there will be a big demand for quantum machines that every user will own and run. Before 2010, “there was an expectation that banks and government departments would all want their own machine – the market would look a bit like HPC,” Cuthbert says. But that demand depends in part on what commercial machines end up being like. “If it’s going to need a premises the size of a football field, with a power station next to it, that becomes the kind of infrastructure that you only want to build nationally.” Even for smaller machines, users are likely to try them first on the cloud before committing to installing one in-house.

According to Cuthbert, the real challenge in supply-chain development is that many of today's technologies were developed for the science community – where, say, achieving millikelvin cooling or using high-power lasers is routine. "How do you go from a specialist scientific clientele to something that starts to look like a washing machine factory, where you can make them to a certain level of performance," while also being much cheaper and easier to use?

But Cuthbert is optimistic about bridging this gap to get to commercially useful machines, encouraged in part by looking back at the classical computing industry of the 1970s. “The architects of those systems could not imagine what we would use our computation resources for today. So I don’t think we should be too discouraged that you can grow an industry when we don’t know what it’ll do in five years’ time.”

Montanaro too sees analogies with those early days of classical computing. “If you think what the computer industry looked like in the 1940s, it’s very different from even 20 years later. But there are some parallels. There are companies that are filling each of the different niches we saw previously, there are some that are specializing in quantum hardware development, there are some that are just doing software.” Cuthbert thinks that the quantum industry is likely to follow a similar pathway, “but more quickly and leading to greater market consolidation more rapidly.”

However, while the classical computing industry was revolutionized by the advent of personal computing in the 1970s and 80s, it seems very unlikely that we will have any need for quantum laptops. Rather, we might increasingly see apps and services appear that use cloud-based quantum resources for particular operations, merging so seamlessly with classical computing that we don’t even notice.

That, perhaps, would be the ultimate sign of success: that quantum computing becomes invisible, no big deal but just a part of how our answers are delivered.

  • In the first instalment of this two-part article, Philip Ball explores the latest developments in the quantum-computing industry

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the year for more coverage of the IYQ.

Find out more on our quantum channel.

The post Quantum computing on the verge: correcting errors, developing algorithms and building up the user base appeared first on Physics World.


Quantum computing and AI join forces for particle physics

This episode of the Physics World Weekly podcast explores how quantum computing and artificial intelligence can be combined to help physicists search for rare interactions in data from an upgraded Large Hadron Collider.

My guest is Javier Toledo-Marín, and we spoke at the Perimeter Institute in Waterloo, Canada. As well as having an appointment at Perimeter, Toledo-Marín is also associated with the TRIUMF accelerator centre in Vancouver.

Toledo-Marín and colleagues have recently published a paper called “Conditioned quantum-assisted deep generative surrogate for particle–calorimeter interactions”.


This podcast is supported by Delft Circuits.

As gate-based quantum computing continues to scale, Delft Circuits provides the i/o solutions that make it possible.

The post Quantum computing and AI join forces for particle physics appeared first on Physics World.
