
MSI GeForce RTX 5080 16G Expert gets the CES 2026 ‘Best of Innovation’ award

9 November 2025 at 14:00

MSI has secured five honoree awards ahead of CES 2026, including a “Best of Innovation” award for its GeForce RTX 5080 16G Expert graphics card.

The annual competition run by the Consumer Technology Association (CTA) has given the RTX 5080 16G Expert graphics card the top honor in its category. Featuring a die-cast aluminum alloy shroud with an industrial aesthetic, this card uses MSI's updated “Flow Frozr 2” thermal system, specifically designed to maintain performance stability over long sessions.

Alongside the new GPU, MSI picked up Honoree awards for two complete systems. The Vision Elite RS AI 2nd is the company's new flagship pre-built desktop, pairing the latest Intel Core Ultra CPUs with high-end Nvidia RTX graphics inside a chassis dominated by tempered glass and managed by AI-driven cooling controls.

On the professional side, the MSI EdgeXpert also received a nod. This system is an edge AI computing device designed for local model training and inference, powered by the Nvidia GB10 Grace Blackwell Superchip. MSI has announced that the remaining award-winning products will be unveiled on January 4th, just ahead of the start of CES.

Discuss on our Facebook page, HERE.

KitGuru says: It will be interesting to see what the other award winners are when we touch down at CES in January.

The post MSI GeForce RTX 5080 16G Expert gets the CES 2026 ‘Best of Innovation’ award first appeared on KitGuru.

Far-fetched and yet true: a small town in Finland now heats itself with... sand!

8 November 2025 at 11:08
Heating with sand? The idea may sound completely unreal, but it has just proved itself in the small town of Pornainen, in southern Finland. An out-of-the-ordinary installation is making headlines and pushing gas and fuel oil into the background. But what lies behind this quiet revolution...

Quantum computing: hype or hope?

3 November 2025 at 15:00

Unless you’ve been living under a stone, you can’t have failed to notice that 2025 marks the first 100 years of quantum mechanics. A massive milestone, to say the least, about which much has been written in Physics World and elsewhere in what is the International Year of Quantum Science and Technology (IYQ). However, I’d like to focus on a specific piece of quantum technology, namely quantum computing.

I keep hearing about quantum computers, so people must be using them to do cool things, and surely they will soon be as commonplace as classical computers. But as a physicist-turned-engineer working in the aerospace sector, I struggle to get a clear picture of where things are really at. If I ask friends and colleagues when they expect to see quantum computers routinely used in everyday life, I get answers ranging from “in the next two years” to “maybe in my lifetime” or even “never”.

Before we go any further, it’s worth reminding ourselves that quantum computing relies on several key quantum properties, including superposition, which gives rise to the quantum bit, or qubit. The basic building block of a quantum computer – the qubit – exists as a combination of 0 and 1 states at the same time and is represented by a probabilistic wave function. Classical computers, in contrast, use binary digital bits that are either 0 or 1.
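To make that concrete, here is a minimal sketch (my own illustration, not from the article, assuming only Python and NumPy): a qubit is just a two-component complex vector, and the squared magnitudes of its amplitudes give the probabilities of measuring 0 or 1.

```python
import numpy as np

# Computational basis states |0> and |1> as two-component vectors
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# An equal superposition of 0 and 1 (what a Hadamard gate makes from |0>)
psi = (ket0 + ket1) / np.sqrt(2)

# Born rule: measurement probabilities are the squared amplitudes
probs = np.abs(psi) ** 2
print(probs)        # [0.5 0.5] -> 50% chance of 0, 50% chance of 1
print(probs.sum())  # 1.0, the state is normalized
```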

Also vital for quantum computers is the notion of entanglement, in which two or more qubits become correlated so that they share quantum information. In a highly correlated system, a quantum computer can explore many paths simultaneously. This “massive scale” parallel processing is how quantum computers may solve certain problems exponentially faster than a classical computer.
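Entanglement can be shown in the same toy picture (again just NumPy, an illustrative sketch rather than anything from the article): applying a Hadamard gate to one qubit and then a CNOT between the two produces a Bell state, whose measurement outcomes are perfectly correlated.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)               # controlled-NOT

# Both qubits start in |0>; put the first into superposition, then entangle
state = np.kron(H @ ket0, ket0)   # (|00> + |10>) / sqrt(2)
bell = CNOT @ state               # (|00> + |11>) / sqrt(2), a Bell state

# Only |00> and |11> are ever measured: the qubits' outcomes are correlated
print(np.abs(bell) ** 2)          # [0.5 0.  0.  0.5]
```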

The other key phenomenon for quantum computers is quantum interference. The wave-like nature of qubits means that when different probability amplitudes are in phase, they combine constructively to increase the likelihood of the right solution. Conversely, when amplitudes are out of phase they interfere destructively and cancel, reducing the likelihood of that outcome – a property quantum algorithms exploit to suppress the wrong answers.

Quantum interference is important in quantum computing because it allows quantum algorithms to amplify the probability of correct answers and suppress incorrect ones, making calculations much faster. Along with superposition and entanglement, it means that quantum computers could process and store vast numbers of probabilities at once, outstripping even the best classical supercomputers.
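Interference shows up even in the simplest circuit. In the sketch below (NumPy again, purely illustrative), applying a Hadamard gate twice returns the qubit to 0 with certainty: the two amplitude paths leading to 1 are out of phase and cancel, while the paths leading to 0 add up.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

after_one = H @ ket0       # equal superposition: amplitudes (0.707, 0.707)
after_two = H @ after_one  # amplitudes recombine: the paths to |1> cancel

print(np.abs(after_one) ** 2)  # [0.5 0.5]
print(np.abs(after_two) ** 2)  # [1. 0.]  constructive on 0, destructive on 1
```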

Towards real devices

To me, it all sounds exciting, but what have quantum computers ever done for us so far? It’s clear that quantum computers are not ready to be deployed in the real world. Significant technological challenges need to be overcome before they become fully realisable. In any case, no-one is expecting quantum computers to displace classical computers “like for like”: they’ll both be used for different things.

Yet it seems that the very essence of quantum computing is also its Achilles heel. Superposition, entanglement and interference – the quantum properties that will make it so powerful – are also incredibly difficult to create and maintain. Qubits are also extremely sensitive to their surroundings. They easily lose their quantum state due to interactions with the environment, whether via stray particles, electromagnetic fields, or thermal fluctuations. This loss, known as decoherence, makes quantum computers prone to error.

That’s why quantum computers need specialized – and often cryogenically controlled – environments to maintain the quantum states necessary for accurate computation. Building a quantum system with lots of interconnected qubits is therefore a major, expensive engineering challenge, with complex hardware and extreme operating conditions. Developing “fault-tolerant” quantum hardware and robust error-correction techniques will be essential if we want reliable quantum computation.

As for the development of software and algorithms for quantum systems, there’s a long way to go, with a lack of mature tools and frameworks. Quantum algorithms require fundamentally different programming paradigms to those used for classical computers. Put simply, that’s why building reliable, real-world deployable quantum computers remains a grand challenge.

What does the future hold?

Despite the huge amount of work that still lies in store, quantum computers have already demonstrated some amazing potential. The US firm D-Wave, for example, claimed earlier this year to have carried out simulations of quantum magnetic phase transitions that wouldn’t be possible with the most powerful classical devices. If true, this was the first time a quantum computer had achieved “quantum advantage” for a practical physics problem (whether the problem was worth solving is another question).

There is also a lot of research and development going on around the world into solving the qubit stability problem. At some stage, there will likely be a breakthrough design for robust and reliable quantum computer architecture. There is probably a lot of technical advancement happening right now behind closed doors.

The first real-world applications of quantum computers will be akin to the giant classical supercomputers of the past. If you were around in the 1980s, you’ll remember Cray supercomputers: huge, inaccessible beasts owned by large corporations, government agencies and academic institutions to enable vast amounts of calculations to be performed (provided you had the money).

And, if I believe what I read, quantum computers will not replace classical computers, at least not initially, but work alongside them, as each has its own relative strengths. Quantum computers will be suited for specific and highly demanding computational tasks, such as drug discovery, materials science, financial modelling, complex optimization problems and increasingly large artificial intelligence and machine-learning models.

These are all things beyond the limits of classical computing resources. Classical computers will remain relevant for everyday tasks like web browsing, word processing and managing databases, and they will be essential for handling the data preparation, visualization and error correction required by quantum systems.

And there is one final point to mention, which is cyber security. Quantum computing poses a major threat to existing encryption methods, with potential to undermine widely used public-key cryptography. There are concerns that hackers nowadays are storing their stolen data in anticipation of future quantum decryption.

Having looked into the topic, I can now see why the timeline for quantum computing is so fuzzy and why I got so many different answers when I asked people when the technology would be mainstream. Quite simply, I still can’t predict how or when the tech stack will pan out. But as IYQ draws to a close, the future for quantum computers is bright.

The post Quantum computing: hype or hope? appeared first on Physics World.

Notefox - Take notes directly on the sites you visit

By: Korben
3 November 2025 at 10:09

You're reading something, you have a brilliant idea as usual, and you tell yourself you'll note it down later... Except that if you're like me, later means never. Sniff!

Well, Notefox solves exactly that problem. It's a Firefox extension that lets you attach notes directly to the web pages / domains you visit. A bit like virtual sticky notes, only better put together...

With Notefox you can therefore create notes at two levels. Either a note for an entire domain (say, all your favourite passages from Korben.info, your favourite IT guy's favourite site), or a note for a specific page (say, that tutorial you just read on my site and about which you have a few small reservations to add). And that's it!! The notes are saved automatically, and when you come back to the site, there they are!

The extension offers import / export, a colour-label system, formatting (bold, italics, links... you know the drill), search, and of course the ability to delete things. And everything is synchronized between your machines. You can also customize the keyboard shortcuts if you like to move fast.

My one gripe, though, is that the default theme really isn't great. But before you run away, know that you can customize everything in the extension's options. Colours, appearance, font... it's all there, and after that it's spot on!

The extension is developed by Saverio Morelli and the code is open source on GitHub if you want to take a look under the hood.

If you're looking for alternatives, there are a few on Firefox, such as Notes by Firefox, Mozilla's official extension, but it's very minimalist and hasn't been updated since 2020.

In short, if you spend your time losing ideas between two tabs, it's worth a try!

Quantum computing on the verge: correcting errors, developing algorithms and building up the user base

31 October 2025 at 15:20

When it comes to building a fully functional “fault-tolerant” quantum computer, companies and government labs all over the world are rushing to be the first over the finish line. But a truly useful universal quantum computer capable of running complex algorithms would have to entangle millions of coherent qubits, which are extremely fragile. Because of environmental factors such as temperature, interference from other electronic systems in hardware, and even errors in measurement, today’s devices would fail under an avalanche of errors long before reaching that point.

So the problem of error correction is a key issue for the future of the market. It arises because errors in qubits can't be corrected simply by keeping multiple copies, as they are in classical computers: quantum rules forbid the copying of a qubit's state while it is still entangled with others and thus unknown. To run quantum circuits with millions of gates, we therefore need new tricks to enable quantum error correction (QEC).

Protected states

The general principle of QEC is to spread the information over many qubits so that an error in any one of them doesn’t matter too much. “The essential idea of quantum error correction is that if we want to protect a quantum system from damage then we should encode it in a very highly entangled state,” says John Preskill, director of the Institute for Quantum Information and Matter at the California Institute of Technology in Pasadena.

There is no unique way of achieving that spreading, however. Different error-correcting codes can depend on the connectivity between qubits – whether, say, they are coupled only to their nearest neighbours or to all the others in the device – which tends to be determined by the physical platform being used. However error correction is done, it must be done fast. “The mechanisms for error correction need to be running at a speed that is commensurate with that of the gate operations,” says Michael Cuthbert, founding director of the UK’s National Quantum Computing Centre (NQCC). “There’s no point in doing a gate operation in a nanosecond if it then takes 100 microseconds to do the error correction for the next gate operation.”

At the moment, dealing with errors is largely about compensation rather than correction: patching up the problems of errors in retrospect, for example by using algorithms that can throw out some results that are likely to be unreliable (an approach called “post-selection”). It’s also a matter of making better qubits that are less error-prone in the first place.

1 From many to few

Turning unreliable physical qubits into a logical qubit
(Courtesy: Riverlane via www.riverlane.com)

Qubits are so fragile that their quantum state is very susceptible to the local environment, and can easily be lost through the process of decoherence. Current quantum computers therefore have very high error rates – roughly one error in every few hundred operations. For quantum computers to be truly useful, this error rate will have to be reduced to the scale of one in a million, and larger, more complex algorithms would require error rates of one in a billion or even one in a trillion. This requires real-time quantum error correction (QEC).

To protect the information stored in qubits, a multitude of unreliable physical qubits have to be combined in such a way that if one qubit fails and causes an error, the others can help protect the system. Essentially, by combining many physical qubits (shown above on the left), one can build a few “logical” qubits that are strongly resistant to noise.

According to Maria Maragkou, commercial vice-president of quantum error-correction company Riverlane, the goal of full QEC has ramifications for the design of the machines all the way from hardware to workflow planning. “The shift to support error correction has a profound effect on the way quantum processors themselves are built, the way we control and operate them, through a robust software stack on top of which the applications can be run,” she explains. The “stack” includes everything from programming languages to user interfaces and servers.

With genuinely fault-tolerant qubits, errors can be kept under control and prevented from proliferating during a computation. Such qubits might be made in principle by combining many physical qubits into a single “logical qubit” in which errors can be corrected (see figure 1). In practice, though, this creates a large overhead: huge numbers of physical qubits might be needed to make just a few fault-tolerant logical qubits. The question is then whether errors in all those physical qubits can be checked faster than they accumulate (see figure 2).
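The simplest way to see the idea, stripped of the quantum subtleties, is the classical three-bit repetition code sketched below (plain Python; a deliberately crude stand-in, since real quantum codes such as the surface code must also handle phase errors and cannot simply copy qubit states). One logical bit is spread over three noisy physical bits and recovered by majority vote.

```python
import random

def encode(bit):
    """Spread one logical bit over three physical bits (repetition code)."""
    return [bit, bit, bit]

def noisy_channel(bits, p_flip):
    """Flip each physical bit independently with probability p_flip."""
    return [b ^ (random.random() < p_flip) for b in bits]

def decode(bits):
    """Majority vote: recovers the logical bit if at most one bit flipped."""
    return int(sum(bits) >= 2)

random.seed(0)
p = 0.05          # assumed physical error rate, for illustration only
trials = 100_000
failures = sum(decode(noisy_channel(encode(0), p)) != 0 for _ in range(trials))
print(failures / trials)   # ~0.007, well below the 5% physical error rate
```

Trading three noisy bits for one better one is exactly the kind of overhead described here, only far more severe in the quantum case.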

That overhead has been steadily reduced over the past several years, and at the end of last year researchers at Google announced that their 105-qubit Willow quantum chip passed the break-even threshold at which the error rate gets smaller, rather than larger, as more physical qubits are used to make a logical qubit. This means that in principle such arrays could be scaled up without errors accumulating.
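What “below threshold” means can be sketched with the rule-of-thumb scaling often quoted for surface-code-style schemes, in which the logical error rate shrinks exponentially with the code distance once the physical error rate sits below the threshold. The numbers below are illustrative assumptions, not Google's measured figures.

```python
# Rule-of-thumb scaling for the logical error rate of a distance-d code:
#     p_logical ~ A * (p_phys / p_threshold) ** ((d + 1) / 2)
# All values here are assumptions chosen for illustration.
p_phys = 1e-3        # assumed physical error rate per operation
p_threshold = 1e-2   # assumed error-correction threshold
A = 0.1              # order-one prefactor

for d in (3, 5, 7, 9, 11):
    p_logical = A * (p_phys / p_threshold) ** ((d + 1) / 2)
    print(f"distance {d:2d}: logical error rate ~ {p_logical:.1e}")

# Each step up in distance suppresses errors further, but only while
# p_phys stays below p_threshold; above it, adding qubits makes things worse.
```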

2 Error correction in action

Illustration of the error correction cycle
(Courtesy: Riverlane via www.riverlane.com)

The illustration gives an overview of quantum error correction (QEC) in action within a quantum processing unit. UK-based company Riverlane is building its Deltaflow QEC stack that will correct millions of data errors in real time, allowing a quantum computer to go beyond the reach of any classical supercomputer.

Fault-tolerant quantum computing is the ultimate goal, says Jay Gambetta, director of IBM research at the company’s centre in Yorktown Heights, New York. He believes that to perform truly transformative quantum calculations, the system must go beyond demonstrating a few logical qubits – instead, you need arrays of at least 100 of them, which can perform more than 100 million quantum operations (10⁸ QuOps). “The number of operations is the most important thing,” he says.

It sounds like a tall order, but Gambetta is confident that IBM will achieve these figures by 2029. By building on what has been achieved so far with error correction and mitigation, he feels “more confident than I ever did before that we can achieve a fault-tolerant computer.” Jerry Chow, previous manager of the Experimental Quantum Computing group at IBM, shares that optimism. “We have a real blueprint for how we can build [such a machine] by 2029,” he says (see figure 3).

Others suspect the breakthrough threshold may be a little lower: Steve Brierley, chief executive of Riverlane, believes that the first error-corrected quantum computer, with around 10,000 physical qubits supporting 100 logical qubits and capable of a million QuOps (a megaQuOp), could come as soon as 2027. Following on, gigaQuOp machines (10⁹ QuOps) should be available by 2030–32, and teraQuOp machines (10¹² QuOps) by 2035–37.

Platform independent

Error mitigation and error correction are just two of the challenges for developers of quantum software. Fundamentally, to develop a truly quantum algorithm involves taking full advantage of the key quantum-mechanical properties such as superposition and entanglement. Often, the best way to do that depends on the hardware used to run the algorithm. But ultimately the goal will be to make software that is not platform-dependent and so doesn’t require the user to think about the physics involved.

“At the moment, a lot of the platforms require you to come right down into the quantum physics, which is a necessity to maximize performance,” says Richard Murray of photonic quantum-computing company Orca. Try to generalize an algorithm by abstracting away from the physics and you’ll usually lower the efficiency with which it runs. “But no user wants to talk about quantum physics when they’re trying to do machine learning or something,” Murray adds. He believes that ultimately it will be possible for quantum software developers to hide those details from users – but Brierley thinks this will require fault-tolerant machines.

“In due time everything below the logical circuit will be a black box to the app developers”, adds Maragkou over at Riverlane. “They will not need to know what kind of error correction is used, what type of qubits are used, and so on.” She stresses that creating truly efficient and useful machines depends on developing the requisite skills. “We need to scale up the workforce to develop better qubits, better error-correction codes and decoders, write the software that can elevate those machines and solve meaningful problems in a way that they can be adopted.” Such skills won’t come only from quantum physicists, she adds: “I would dare say it’s mostly not!”

Yet even now, working on quantum software doesn’t demand a deep expertise in quantum theory. “You can be someone working in quantum computing and solving problems without having a traditional physics training and knowing about the energy levels of the hydrogen atom and so on,” says Ashley Montanaro, who co-founded the quantum software company Phasecraft.

On the other hand, insights can flow in the other direction too: working on quantum algorithms can lead to new physics. “Quantum computing and quantum information are really pushing the boundaries of what we think of as quantum mechanics today,” says Montanaro, adding that QEC “has produced amazing physics breakthroughs.”

Early adopters?

Once we have true error correction, Cuthbert at the UK’s NQCC expects to see “a flow of high-value commercial uses” for quantum computers. What might those be?

In the arena of quantum chemistry and materials science, genuine quantum advantage – calculating something that is impossible using classical methods alone – is more or less here already, says Chow. Crucially, however, quantum methods needn’t be used for the entire simulation but can be added to classical ones to give them a boost for particular parts of the problem.

Joint effort: In June 2025, IBM in the US and Japan’s national research laboratory RIKEN unveiled the IBM Quantum System Two, the first to be used outside the US. It pairs IBM’s 156-qubit Heron quantum computing system (left) with RIKEN’s supercomputer Fugaku (right), one of the most powerful classical systems on Earth. The computers are linked through a high-speed network at the fundamental instruction level to form a proving ground for quantum-centric supercomputing. (Courtesy: IBM and RIKEN)

For example, last year researchers at IBM teamed up with scientists at several RIKEN institutes in Japan to calculate the minimum energy state for the iron sulphide cluster (4Fe-4S) at the heart of the bacterial nitrogenase enzyme that fixes nitrogen. This cluster is too big and complex to be accurately simulated using the classical approximations of quantum chemistry. The researchers used a combination of quantum computing (with IBM’s 72-qubit Heron chip) and high-performance computing (HPC) on RIKEN’s Fugaku. This idea of “improving classical methods by injecting quantum as a subroutine” is likely to be a more general strategy, says Gambetta. “The future of computing is going to be heterogeneous accelerators [of discovery] that include quantum.”

Likewise, Montanaro says that Phasecraft is developing “quantum-enhanced algorithms”, where a quantum computer is used, not to solve the whole problem, but just to help a classical computer in some way. “There are only certain problems where we know quantum computing is going to be useful,” he says. “I think we are going to see quantum computers working in tandem with classical computers in a hybrid approach. I don’t think we’ll ever see workloads that are entirely run using a quantum computer.” Among the first important problems that quantum machines will solve, according to Montanaro, are the simulation of new materials – to develop, for example, clean-energy technologies (see figure 4).

“For a physicist like me,” says Preskill, “what is really exciting about quantum computing is that we have good reason to believe that a quantum computer would be able to efficiently simulate any process that occurs in nature.”

4 Structural insights

Modelling materials using quantum computing
(Courtesy: Phasecraft)

A promising application of quantum computers is simulating novel materials. Researchers from the quantum algorithms firm Phasecraft, for example, have already shown how a quantum computer could help simulate complex materials such as the polycrystalline compound LK-99, which was purported by some researchers in 2023 to be a room-temperature superconductor.

Using a classical/quantum hybrid workflow, together with the firm’s proprietary material simulation approach to encode and compile materials on quantum hardware, Phasecraft researchers were able to establish a classical model of the LK-99 structure that allowed them to extract an approximate representation of the electrons within the material. The illustration above shows the green and blue electronic structure around red and grey atoms in LK-99.

Montanaro believes another likely near-term goal for useful quantum computing is solving optimization problems – both here and in quantum simulation, “we think genuine value can be delivered already in this NISQ era with hundreds of qubits.” (NISQ, a term coined by Preskill, refers to noisy intermediate-scale quantum computing, with relatively small numbers of rather noisy, error-prone qubits.)

One further potential benefit of quantum computing is that it tends to require less energy than classical high-performance computing, whose energy consumption is notoriously high. If the energy cost could be cut by even a few percent, it would be worth using quantum resources for that reason alone. “Quantum has real potential for an energy advantage,” says Chow. One study in 2020 showed that a particular quantum-mechanical calculation carried out on an HPC machine used many orders of magnitude more energy than when it was simulated on a quantum circuit. Such comparisons are not easy, however, in the absence of an agreed and well-defined metric for energy consumption.

Building the market

Right now, the quantum computing market is in a curious superposition of states itself – it has ample proof of principle, but today’s devices are still some way from being able to perform a computation relevant to a practical problem that could not be done with classical computers. Yet to get to that point, the field needs plenty of investment.

The fact that quantum computers, especially if used with HPC, are already unique scientific tools should establish their value in the immediate term, says Gambetta. “I think this is going to accelerate, and will keep the funding going.” It is why IBM is focusing on utility-scale systems of around 100 qubits or so and more than a thousand gate operations, he says, rather than simply trying to build ever bigger devices.

Montanaro sees a role for governments to boost the growth of the industry “where it’s not the right fit for the private sector”. One role of government is simply as a customer. For example, Phasecraft is working with the UK national grid to develop a quantum algorithm for optimizing the energy network. “Longer-term support for academic research is absolutely critical,” Montanaro adds. “It would be a mistake to think that everything is done in terms of the underpinning science, and governments should continue to support blue-skies research.”

3 The road ahead

IBM’s current roadmap charts how the company plans to scale up its devices to achieve a fault-tolerant machine by 2029. Alongside hardware development, the firm will also focus on developing new algorithms and software for these devices. (Courtesy: IBM)

It’s not clear, though, whether there will be a big demand for quantum machines that every user will own and run. Before 2010, “there was an expectation that banks and government departments would all want their own machine – the market would look a bit like HPC,” Cuthbert says. But that demand depends in part on what commercial machines end up being like. “If it’s going to need a premises the size of a football field, with a power station next to it, that becomes the kind of infrastructure that you only want to build nationally.” Even for smaller machines, users are likely to try them first on the cloud before committing to installing one in-house.

According to Cuthbert, the real challenge in supply-chain development is that many of today’s technologies were developed for the science community – where, say, achieving millikelvin cooling or using high-power lasers is routine. “How do you go from a specialist scientific clientele to something that starts to look like a washing machine factory, where you can make them to a certain level of performance,” while also being much cheaper and easier to use?

But Cuthbert is optimistic about bridging this gap to get to commercially useful machines, encouraged in part by looking back at the classical computing industry of the 1970s. “The architects of those systems could not imagine what we would use our computation resources for today. So I don’t think we should be too discouraged that you can grow an industry when we don’t know what it’ll do in five years’ time.”

Montanaro too sees analogies with those early days of classical computing. “If you think what the computer industry looked like in the 1940s, it’s very different from even 20 years later. But there are some parallels. There are companies that are filling each of the different niches we saw previously, there are some that are specializing in quantum hardware development, there are some that are just doing software.” Cuthbert thinks that the quantum industry is likely to follow a similar pathway, “but more quickly and leading to greater market consolidation more rapidly.”

However, while the classical computing industry was revolutionized by the advent of personal computing in the 1970s and 80s, it seems very unlikely that we will have any need for quantum laptops. Rather, we might increasingly see apps and services appear that use cloud-based quantum resources for particular operations, merging so seamlessly with classical computing that we don’t even notice.

That, perhaps, would be the ultimate sign of success: that quantum computing becomes invisible, no big deal but just a part of how our answers are delivered.

  • In the first instalment of this two-part article, Philip Ball explores the latest developments in the quantum-computing industry

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the year for more coverage of the IYQ.

Find out more on our quantum channel.

The post Quantum computing on the verge: correcting errors, developing algorithms and building up the user base appeared first on Physics World.

Quantum computing on the verge: a look at the quantum marketplace of today

14 October 2025 at 17:40

“I’d be amazed if quantum computing produces anything technologically useful in ten years, twenty years, even longer.” So wrote University of Oxford physicist David Deutsch – often considered the father of the theory of quantum computing – in 2004. But, as he added in a caveat, “I’ve been amazed before.”

We don’t know how amazed Deutsch, a pioneer of quantum computing, would have been had he attended a meeting at the Royal Society in London in February on “the future of quantum information”. But it was tempting to conclude from the event that quantum computing has now well and truly arrived, with working machines that harness quantum mechanics to perform computations being commercially produced and shipped to clients. Serving as the UK launch of the International Year of Quantum Science and Technology (IYQ) 2025, it brought together some of the key figures of the field to spend two days discussing quantum computing as something like a mature industry, even if one in its early days.

Werner Heisenberg – who worked out the first proper theory of quantum mechanics 100 years ago – would surely have been amazed to find that the formalism he and his peers developed to understand the fundamental behaviour of tiny particles had generated new ways of manipulating information to solve real-world problems in computation. So far, quantum computing – which exploits phenomena such as superposition and entanglement to potentially achieve greater computational power than the best classical computers can muster – hasn’t tackled any practical problems that can’t be solved classically.

Although the fundamental quantum principles are well-established and proven to work, there remain many hurdles that quantum information technologies have to clear before this industry can routinely deliver resources with transformative capabilities. But many researchers think that moment of “practical quantum advantage” is fast approaching, and an entire industry is readying itself for that day.

Entangled marketplace

So what are the current capabilities and near-term prospects for quantum computing?

The first thing to acknowledge is that a booming quantum-computing market exists. Devices are being produced for commercial use by a number of tech firms: from the likes of IBM, Google, D-Wave and Rigetti, which have been in the field for a decade or more, to relative newcomers such as Nord Quantique (Canada), IQM (Finland), Quantinuum (UK and US), Orca (UK), PsiQuantum (US) and Silicon Quantum Computing (Australia) (see the box below, “The global quantum ecosystem”).

The global quantum ecosystem

Map showing global investments in quantum computing. (Courtesy: QURECA)

We are on the cusp of a second quantum revolution, with quantum science and technologies growing rapidly across the globe. This includes quantum computers; quantum sensing (ultra-high precision clocks, sensors for medical diagnostics); as well as quantum communications (a quantum internet). Indeed, according to the State of Quantum 2024 report, a total of 33 countries around the world currently have government initiatives in quantum technology, of which more than 20 have national strategies with large-scale funding. As of this year, worldwide investments in quantum tech – by governments and industry – exceed $55.7 billion, and the market is projected to reach $106 billion by 2040. With the multitude of ground-breaking capabilities that quantum technologies bring globally, it’s unsurprising that governments all over the world are eager to invest in the industry.

With data from a number of international reports and studies, quantum education and skills firm QURECA has summarized key programmes and efforts around the world. These include total government funding spent through 2025, as well as future commitments spanning 2–10 year programmes, varying by country. These initiatives generally represent government agencies’ funding announcements, related to their countries’ advancements in quantum technologies, excluding any private investments and revenues.

A supply chain is also organically developing, which includes manufacturers of specific hardware components, such as Oxford Instruments and Quantum Machines, and software developers like Riverlane, based in Cambridge, UK, and QC Ware in Palo Alto, California. At the end of this chain is a range of eager end-users, from finance companies such as J P Morgan and Goldman Sachs to pharmaceutical companies such as AstraZeneca and engineering firms like Airbus. Quantum computing is already big business, with around 400 active companies and current global investment estimated at around $2 billion.

But the immediate future of all this buzz is hard to assess. When the chief executive of computer giant Nvidia announced at the start of 2025 that “truly useful” quantum computers were still two decades away, the previously burgeoning share prices of some leading quantum-computing companies plummeted. They have since recovered somewhat, but such volatility reflects the fact that quantum computing has yet to prove its commercial worth.

The field is still new and firms need to manage expectations and avoid hype while also promoting an optimistic enough picture to keep investment flowing in. “Really amazing breakthroughs are being made,” says physicist Winfried Hensinger of the University of Sussex, “but we need to get away from the expectancy that [truly useful] quantum computers will be available tomorrow.”

The current state of play is often called the “noisy intermediate-scale quantum” (NISQ) era. That’s because the “noisy” quantum bits (qubits) in today’s devices are prone to errors for which no general and simple correction process exists. Current quantum computers can’t therefore carry out practically useful computations that could not be done on classical high-performance computing (HPC) machines. It’s not just a matter of better engineering either; the basic science is far from done.

Building up: Quantum computing behemoth IBM says that by 2029 its fault-tolerant system should accurately run 100 million gates on 200 logical qubits, thereby truly achieving quantum advantage. (Courtesy: IBM)

“We are right on the cusp of scientific quantum advantage – solving certain scientific problems better than the world’s best classical methods can,” says Ashley Montanaro, a physicist at the University of Bristol who co-founded the quantum software company Phasecraft. “But we haven’t yet got to the stage of practical quantum advantage, where quantum computers solve commercially important and practically relevant problems such as discovering the next lithium-ion battery.” It’s no longer if or how, but when that will happen.

Pick your platform

As the quantum-computing business is such an emerging area, today’s devices use wildly different types of physical systems for their qubits (see the box below, “Comparing computing modalities: from qubits to architectures”). There is still no clear sign as to which of these platforms, if any, will emerge as the winner. Indeed, many researchers believe that no single qubit type will ever dominate.

The top-performing quantum computers, like those made by Google (with its 105-qubit Willow chip) and IBM (which has made the 1,121-qubit Condor), use qubits in which information is encoded in the wavefunction of a superconducting material. Until recently, the strongest competing platform seemed to be trapped ions, where the qubits are individual ions held in electromagnetic traps – a technology being developed into working devices by the US company IonQ, spun out from the University of Maryland, among others.

Comparing computing modalities: from qubits to architectures

Table listing out the different types of qubit, the advantages of each and which company uses which qubit
(Courtesy: PatentVest)

Much like classical computers, quantum computers have a core processor and a control stack – the difference being that the core depends on the type of qubit being used. Currently, quantum computing is not based on a single platform, but rather a set of competing hardware approaches, each with its own physical basis for creating and controlling qubits and keeping them stable.

The data above – taken from the August 2025 report Quantum Computing at an Inflection Point: Who’s Leading, What They Own, and Why IP Decides Quantum’s Future by US firm PatentVest – show the key “quantum modalities”, which refers to the different types of qubits and architectures used to build these quantum systems. Differing qubits each have their own pros and cons, with varying factors including the temperature at which they operate, coherence time, gate speed, and how easy they might be to scale up.

But over the past few years, neutral trapped atoms have emerged as a major contender, thanks to advances in controlling the positions and states of these qubits. Here the atoms are prepared in highly excited electronic states, becoming so-called Rydberg atoms, which can be entangled with one another over distances of a few microns. A Harvard startup called QuEra is developing this technology, as is the French start-up Pasqal. In September a team from the California Institute of Technology announced a 6100-qubit array made from neutral atoms. “Ten years ago I would not have included [neutral-atom] methods if I were hedging bets on the future of quantum computing,” says Deutsch’s Oxford colleague, the quantum information theorist Andrew Steane. But like many, he thinks differently now.

Some researchers believe that optical quantum computing, using photons as qubits, will also be an important platform. One advantage here is that photonic signals need no complex conversion to travel through existing telecommunications networks to or from the processing units, which is also handy for photonic interconnections between chips. What’s more, photonic circuits can work at room temperature, whereas trapped ions and superconducting qubits need to be cooled. Photonic quantum computing is being developed by firms like PsiQuantum, Orca, and Xanadu.

Other efforts, for example at Intel and Silicon Quantum Computing in Australia, make qubits from either quantum dots (Intel) or precision-placed phosphorus atoms (SQC), both in good old silicon, which benefits from a very mature manufacturing base. “Small qubits based on ions and atoms yield the highest quality processors”, says Michelle Simmons of the University of New South Wales, who is the founder and CEO of SQC. “But only atom-based systems in silicon combine this quality with manufacturability.”

Spinning around: Intel’s silicon spin qubits are now being manufactured on an industrial scale. (Courtesy: Intel Corporation)

And it’s not impossible that entirely new quantum computing platforms might yet arrive. At the start of 2025, researchers at Microsoft’s laboratories in Washington State caused a stir when they announced that they had made topological qubits from semiconducting and superconducting devices, which are predicted to be less error-prone than the qubits currently in use. The announcement left some scientists disgruntled because it was not accompanied by a peer-reviewed paper providing the evidence for these long-sought entities. But in any event, most researchers think it would take a decade or more for topological quantum computing to catch up with the platforms already out there.

Each of these quantum technologies has its own strengths and weaknesses. “My personal view is that there will not be a single architecture that ‘wins’, certainly not in the foreseeable future,” says Michael Cuthbert, founding director of the UK’s National Quantum Computing Centre (NQCC), which aims to facilitate the transition of quantum computing from basic research to an industrial concern. Cuthbert thinks the best platform will differ for different types of computation: cold neutral atoms might be good for quantum simulations of molecules, materials and exotic quantum states, say, while superconducting and trapped-ion qubits might be best for problems involving machine learning or optimization.

Measures and metrics

Given these pros and cons of different hardware platforms, one difficulty in assessing their merits is finding meaningful metrics for making comparisons. Should we be comparing error rates, coherence times (basically how long qubits remain entangled), gate speeds (how fast a single computational step can be conducted), circuit depth (how many steps a single computation can sustain), number of qubits in a processor, or what? “The metrics and measures that have been put forward so far tend to suit one or other platform more than others,” says Cuthbert, “such that it becomes almost a marketing exercise rather than a scientific benchmarking exercise as to which quantum computer is better.”

The NQCC evaluates the performance of devices using a factor known as the “quantum operation” (QuOp). This is simply the number of quantum operations that can be carried out in a single computation, before the qubits lose their coherence and the computation dissolves into noise. “If you want to run a computation, the number of coherent operations you can run consecutively is an objective measure,” Cuthbert says. If we want to get beyond the NISQ era, he adds, “we need to progress to the point where we can do about a million coherent operations in a single computation. We’re now at the level of maybe a few thousand. So we’ve got a long way to go before we can run large-scale computations.”
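A back-of-envelope estimate shows why a million coherent operations is so demanding. The numbers below are assumptions picked for illustration, not NQCC benchmarks: the usable operation count is capped both by how many gates fit inside the coherence time and by how quickly per-gate errors pile up.

```python
# Illustrative, assumed figures -- not measurements of any particular machine.
gate_time_s = 100e-9        # assumed two-qubit gate time: 100 ns
coherence_time_s = 200e-6   # assumed qubit coherence time: 200 microseconds
error_per_gate = 1e-3       # assumed error probability per gate operation

ops_before_decoherence = coherence_time_s / gate_time_s  # ~2000 operations
ops_before_errors_pile_up = 1 / error_per_gate           # ~1000 operations

quop_budget = min(ops_before_decoherence, ops_before_errors_pile_up)
print(f"rough coherent-operation budget: ~{quop_budget:.0f} QuOps")
# Getting from ~1000 to a million QuOps needs error correction,
# not just marginally better qubits.
```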

One important issue is how amenable the platforms are to making larger quantum circuits. Cuthbert contrasts the issue of scaling up – putting more qubits on a chip – with “scaling out”, whereby chips of a given size are linked in modular fashion. Many researchers think it unlikely that individual quantum chips will have millions of qubits like the silicon chips of today’s machines. Rather, they will be modular arrays of relatively small chips linked at their edges by quantum interconnects.

Having made the Condor, IBM now plans to focus on modular architectures (scaling out) – a necessity anyway, since superconducting qubits are micron-sized, so a chip with millions of them would be “bigger than your dining room table”, says Cuthbert. But superconducting qubits are not easy to scale out because microwave frequencies that control and read out the qubits have to be converted into optical frequencies for photonic interconnects. Cold atoms are easier to scale up, as the qubits are small, while photonic quantum computing is easiest to scale out because it already speaks the same language as the interconnects.

To be able to build so-called “fault-tolerant” quantum computers, quantum platforms must solve the issue of error correction, which will enable more extensive computations without the results becoming degraded into mere noise.

In part two of this feature, we will explore how this is being achieved and meet the various firms developing quantum software. We will also look into the potential high-value commercial uses for robust quantum computers – once such devices exist.

  • This article was updated with additional content on 22 October 2025.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the year for more coverage of the IYQ.

Find out more on our quantum channel.

The post Quantum computing on the verge: a look at the quantum marketplace of today appeared first on Physics World.

Is materials science the new alchemy for the 21st century?

6 October 2025 at 14:00

For many years, I’ve been a judge for awards and prizes linked to research and innovation in engineering and physics. It’s often said that it’s better to give than to receive, and it’s certainly true in this case. But another highlight of my involvement with awards is learning about cutting-edge innovations I either hadn’t heard of or didn’t know much about.

One area that never fails to fascinate me is the development of new and advanced materials. I’m not a materials scientist – my expertise lies in creating monitoring systems for engineering – so I apologize for any over-simplification in what follows. But I do want to give you a sense of just how impressive, challenging and rewarding the field of materials science is.

It’s all too easy to take advanced materials for granted. We are in constant contact with them in everyday life, whether it’s through applications in healthcare, electronics and computing or energy, transport, construction and process engineering. But what are the most important materials innovations right now – and what kinds of novel materials can we expect in future?

Drivers of innovation

There are several – and all equally important – drivers when it comes to materials development. One is the desire to improve the performance of products we’re already familiar with. A second is the need to develop more sustainable materials, whether that means replacing less environmentally friendly solutions or enabling new technology. Third, there’s the drive for novel developments, which is where some of the most ground-breaking work is occurring.

On the environmental front, we know that there are many products with components that could, in principle, be recycled. However, the reality is that many products end up in landfill because of how they’ve been constructed. I was recently reminded of this conundrum when I heard a research presentation about the difficulties of recycling solar panels.

Green problem: Solar panels often fail to be recycled at the end of their life despite containing reusable materials. (Courtesy: iStock/Milos Muller)

Photovoltaic cells become increasingly inefficient with time and most solar panels aren’t expected to last more than about 30 years. Trouble is, solar panels are so robustly built that recycling them requires specialized equipment and processes. More often than not, solar panels just get thrown away despite mostly containing reusable materials such as glass, plastic and metals – including aluminium and silver.

It seems ironic that solar panels, which enable sustainable living, could also contribute significantly to landfill. In fact, the problem could escalate dramatically if left unaddressed. There are already an estimated 1.8 million solar panels in use in the UK, and potentially billions around the world, with a rapidly increasing install base. Making solar panels more sustainable is surely a grand challenge in materials science.

Waste not, want not

Another vital issue concerns our addiction to new tech, which means we rarely hang on to objects until the end of their life; I mean, who hasn’t been tempted by a shiny new smartphone even though the old one is perfectly adequate? That urge for new objects means we need more materials and designs that can be readily re-used or recycled, thereby reducing waste and resource depletion.

As someone who works in the aerospace industry, I know first-hand how companies are trying to make planes more fuel efficient by developing composite materials that are stronger and can survive higher temperatures and pressures – for example carbon fibre and ceramic matrix composites. The industry also uses “additive manufacturing” to enable more intricate component design with less resultant waste.

Plastics are another key area of development. Many products are made from single-type, recyclable materials, such as polyethylene or polypropylene, which benefit from being light, durable and capable of withstanding chemicals and heat. Trouble is, while polyethylene and polypropylene can be recycled, they both create the tiny “microplastics” that, as we know all too well, are not good news for the environment.

Sustainable challenge: Materials scientists will need to find practical bio-based alternatives to conventional plastics to stop polluting microplastics from entering the seas and oceans. (Courtesy: iStock/Dmitriy Sidor)

Bio-based materials are becoming more common for everyday items. Think about polylactic acid (PLA), which is a plant-based polymer derived from renewable resources such as cornstarch or sugar cane. Typically used for food or medical packaging, it’s usually said to be “compostable”, although this is a term we need to view with caution.

Sadly, PLA does not degrade readily in natural environments or landfill. To break it down, you need high-temperature, high-moisture industrial composting facilities. So whilst PLA comes from natural plants, it is not straightforward to recycle, which is why single-use disposable items, such as plastic cutlery, drinking straws and plates, are no longer permitted to be made from it.

Thankfully, we’re also seeing greater use of more sustainable, natural fibre composites, such as flax, hemp and bamboo (have you tried bamboo socks or cutlery?). All of which brings me to an interesting urban myth, which is that in 1941 legendary US car manufacturer Henry Ford built a car apparently made entirely of a plant-based plastic – dubbed the “soybean” car (see box).

The soybean car: fact or fiction?

Crazy or credible? Soybean car frame patent signed by Henry Ford and Eugene Turenne Gregorie. (Courtesy: image in public domain)

Henry Ford’s 1941 “soybean” car, which was built entirely of a plant-based plastic, was apparently motivated by a need to make vehicles lighter (and therefore more fuel efficient), less reliant on steel (which was in high demand during the Second World War) and safer too. The exact ingredients of the plastic are, however, not known since there were no records kept.

Speculation is that it was a combination of soybeans, wheat, hemp, flax and ramie (a kind of flowering nettle). Lowell Overly, a Ford designer who had major involvement in creating the car, said it was “soybean fibre in a phenolic resin with formaldehyde used in the impregnation”. Despite being a mix of natural and synthetic materials – and not entirely made of soybeans – the car was nonetheless a significant advancement for the automotive industry more than eight decades ago.

Avoiding the “solar-panel trap”

So what technology developments do we need to take materials to the next level? The key will be to avoid what I call the “solar-panel trap” and find materials that are sustainable from cradle to grave. We have to create an environmentally sustainable economic system that’s based on the reuse and regeneration of materials or products – what some dub the “circular economy”.

Sustainable composites will be essential. We’ll need composites that can be easily separated, such as adhesives that dissolve in water or a specific solvent, so that we can cleanly, quickly and cheaply recover valuable materials from complex products. We’ll also need recycled composites, using recycled carbon fibre, or plastic combined with bio-based resins made from renewable sources like plant-based oils, starches and agricultural waste (rather than fossil fuels).

Vital too will be eco-friendly composites that combine sustainable composite materials (such as natural fibres) with bio-based resins. In principle, these could be used to replace traditional composite materials and to reduce waste and environmental impact.

Another important trend is developing novel metals and complex alloys. As well as enhancing traditional applications, these are addressing future requirements for what may become commonplace applications, such as wide-scale hydrogen manufacture, transportation and distribution.

Soft and stretchy

Then there are “soft composites”. These are advanced, often biocompatible materials that combine softer, rubbery polymers with reinforcing fibres or nanoparticles to create flexible, durable and functional materials that can be used for soft robotics, medical implants, prosthetics and wearable sensors. These materials can be engineered for properties like stretchability, self-healing, magnetic actuation and tissue integration, enabling innovative and patient-friendly healthcare solutions.

Medical magic: Wearable electronic materials could transform how we monitor human health. (Courtesy: Shutterstock/Guguart)

And have you heard of e-textiles, which integrate electronic components into everyday fabrics? These materials could be game-changing for healthcare applications by offering wearable, non-invasive monitoring of physiological information such as heart rate and respiration.

Further applications could include advanced personal protective equipment (PPE), smart bandages and garments for long-term rehabilitation and remote patient care. Smart textiles could revolutionize medical diagnostics, therapy delivery and treatment by providing personalized digital healthcare solutions.

Towards “new gold”

I realize I have only scratched the surface of materials science – an amazing cauldron of ideas where physics, chemistry and engineering work hand in hand to deliver groundbreaking solutions. It’s a hugely and truly important discipline. With far greater success than the original alchemists, materials scientists are adept at creating the “new gold”.

Their discoveries and inventions are making major contributions to our planet’s sustainable economy, from the design, deployment and decommissioning of everyday items, to finding novel solutions that will positively impact the way we live today. Surely it’s an area we should celebrate and, as physicists, become more closely involved in.

The post Is materials science the new alchemy for the 21st century? appeared first on Physics World.
