
Microsoft says everyone will be a boss in the future – of AI employees

Tech company predicts rise of ‘frontier firms’ – where a human worker directs AI agents to carry out tasks

Microsoft has good news for anyone with corner office ambitions. In the future we’re all going to be bosses – of AI employees.

The tech company is predicting the rise of a new kind of business, called a “frontier firm”, where ultimately a human worker directs autonomous artificial intelligence agents to carry out tasks.

Continue reading...

© Photograph: Roman Lacheev/Alamy


Elon Musk’s xAI accused of pollution over Memphis supercomputer

Hearing scheduled for Friday as residents receive anonymous leaflets that downplay pollution dangers

Elon Musk’s artificial intelligence (AI) company is stirring controversy in Memphis, Tennessee. That’s where he’s building a massive supercomputer to power his company xAI. Community residents and environmental activists say that since the supercomputer was fired up last summer it has become one of the biggest air polluters in the county. But some local officials have championed the billionaire, saying he is investing in Memphis.

The first public hearing with the health department is scheduled for Friday, where county officials will hear from all sides of the debate. In the run-up to the hearing, secretive fliers claiming xAI has low emissions were sent to residents of historically Black neighborhoods; at the same time, environmental groups have been amassing data about how much pollution the AI company is likely generating.

Continue reading...

© Photograph: Steve Jones/Flight by Southwings for Southern Environmental Law Center


AI images of child sexual abuse getting ‘significantly more realistic’, says watchdog

Internet Watch Foundation report shows 380% increase in illegal AI-generated imagery in 2024, most of it ‘category A’

Images of child sexual abuse created by artificial intelligence are becoming “significantly more realistic”, according to an online safety watchdog.

The Internet Watch Foundation (IWF) said advances in AI are being reflected in illegal content created and consumed by paedophiles, saying: “In 2024, the quality of AI-generated videos improved exponentially, and all types of AI imagery assessed appeared significantly more realistic as the technology developed.”

Continue reading...

© Photograph: Doug Armand/Getty Images


On the path towards a quantum economy

The high-street bank HSBC has worked with the NQCC, hardware provider Rigetti and the Quantum Software Lab to investigate the advantages that quantum computing could offer for detecting the signs of fraud in transactional data. (Courtesy: Shutterstock/Westend61 on Offset)

Rapid technical innovation in quantum computing is expected to yield an array of hardware platforms that can run increasingly sophisticated algorithms. In the real world, however, such technical advances will remain little more than a curiosity if they are not adopted by businesses and the public sector to drive positive change. As a result, one key priority for the UK’s National Quantum Computing Centre (NQCC) has been to help companies and other organizations to gain an early understanding of the value that quantum computing can offer for improving performance and enhancing outcomes.

To meet that objective the NQCC has supported several feasibility studies that enable commercial organizations in the UK to work alongside quantum specialists to investigate specific use cases where quantum computing could have a significant impact within their industry. One prime example is a project involving the high-street bank HSBC, which has been exploring the potential of quantum technologies for spotting the signs of fraud in financial transactions. Such fraudulent activity, which affects millions of people every year, now accounts for about 40% of all criminal offences in the UK and in 2023 generated total losses of more than £2.3 bn across all sectors of the economy.

Banks like HSBC currently exploit classical machine learning to detect fraudulent transactions, but these techniques require a large computational overhead to train the models and deliver accurate results. Quantum specialists at the bank have therefore been working with the NQCC, along with hardware provider Rigetti and the Quantum Software Lab at the University of Edinburgh, to investigate the capabilities of quantum machine learning (QML) for identifying the tell-tale indicators of fraud.
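As a point of reference, a minimal classical anomaly detector can be sketched in a few lines: it scores each transaction by its distance from the bulk of historical data, and flags the outliers. This is an illustrative toy with invented numbers, not HSBC's production model; QML approaches replace this kind of scoring function with a parametrized quantum circuit.

```python
import numpy as np

# Toy anomaly detector (illustrative only): score a transaction by its
# standardized distance from historical data under a diagonal covariance.
rng = np.random.default_rng(42)
# Synthetic "historical" transactions: columns = amount, time-of-day band
normal = rng.normal(loc=[50.0, 2.0], scale=[20.0, 1.0], size=(1000, 2))
mu, sigma = normal.mean(axis=0), normal.std(axis=0)

def anomaly_score(x):
    # Larger score = further from the bulk of past transactions
    z = (x - mu) / sigma
    return float(np.sqrt((z ** 2).sum()))

print(anomaly_score(np.array([55.0, 2.1])))   # typical transaction: low score
print(anomaly_score(np.array([900.0, 2.0])))  # outlier: high score
```

The computational burden the article mentions comes from training far richer models than this on very large datasets; the quantum question is whether a QML scoring function can reach the same accuracy with less training overhead.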

“HSBC’s involvement in this project has brought transactional fraud detection into the realm of cutting-edge technology, demonstrating our commitment to pushing the boundaries of quantum-inspired solutions for near-term benefit,” comments Philip Intallura, Group Head of Quantum Technologies at HSBC. “Our philosophy is to innovate today while preparing for the quantum advantage of tomorrow.”

Another study focused on a key problem in the aviation industry that has a direct impact on fuel consumption and the amount of carbon emissions produced during a flight. In this logistical challenge, the aim was to find the optimal way to load cargo containers onto a commercial aircraft. One motivation was to maximize the amount of cargo that can be carried, the other was to balance the weight of the cargo to reduce drag and improve fuel efficiency.

“Even a small shift in the centre of gravity can have a big effect,” explains Salvatore Sinno of technology solutions company Unisys, who worked on the project along with applications engineers at the NQCC and mathematicians at the University of Newcastle. “On a Boeing 747 a displacement of just 75 cm can increase the carbon emissions on a flight of 10,000 miles by four tonnes, and also increases the fuel costs for the airline company.”
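As a toy illustration of the trade-off described above (invented numbers, not the Unisys formulation), the problem can be written as an objective that rewards total loaded weight while penalizing the centre-of-gravity offset. A quantum annealer would encode the same kind of objective as a QUBO; here a brute-force search over a tiny instance makes the structure visible.

```python
import itertools

# Toy cargo-loading problem: for each container choose "leave out" (0),
# "fore" (-1) or "aft" (+1). Maximize loaded weight, penalize CG offset.
weights = [4.0, 3.0, 2.5, 2.0, 1.5]   # container weights, arbitrary units
capacity = 10.0                        # maximum total load
balance_penalty = 5.0                  # cost per unit of CG offset

best = None
for choice in itertools.product([0, -1, +1], repeat=len(weights)):
    loaded = sum(w for w, c in zip(weights, choice) if c != 0)
    if loaded > capacity:
        continue                       # infeasible: over the weight limit
    cg = sum(w * c for w, c in zip(weights, choice))  # signed moment
    score = loaded - balance_penalty * abs(cg)
    if best is None or score > best[0]:
        best = (score, choice, loaded, cg)

score, choice, loaded, cg = best
print(f"load {loaded} units, CG offset {cg}")
```

The real problem has vastly more containers and positions, so the search space explodes combinatorially; that is where the hybrid quantum-classical approach described below the image comes in.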

A hybrid quantum–classical solution has been used to optimize the configuration of air freight, which can improve fuel efficiency and lower carbon emissions. (Courtesy: Shutterstock/supakitswn)

With such a large number of possible loading combinations, classical computers cannot produce an exact solution for the optimal arrangement of cargo containers. In their project the team improved the precision of the solution by combining quantum annealing with high-performance computing, a hybrid approach that Unisys believes can offer immediate value for complex optimization problems. “We have reached the limit of what we can achieve with classical computing, and with this work we have shown the benefit of incorporating an element of quantum processing into our solution,” explains Sinno.

The HSBC project team also found that a hybrid quantum–classical solution could provide an immediate performance boost for detecting anomalous transactions. In this case, a quantum simulator running on a classical computer was used to run quantum algorithms for machine learning. “These simulators allow us to execute simple QML programmes, even though they can’t be run to the same level of complexity as we could achieve with a physical quantum processor,” explains Marco Paini, the project lead for Rigetti. “These simulations show the potential of these low-depth QML programmes for fraud detection in the near term.”

The team also simulated more complex QML approaches using a similar but smaller-scale problem, demonstrating a further improvement in performance. This outcome suggests that running deeper QML algorithms on a physical quantum processor could deliver an advantage for detecting anomalies in larger datasets, even though the hardware does not yet provide the performance needed to achieve reliable results. “This initiative not only showcases the near-term applicability of advanced fraud models, but it also equips us with the expertise to leverage QML methods as quantum computing scales,” comments Intallura.

Indeed, the results obtained so far have enabled the project partners to develop a roadmap that will guide their ongoing development work as the hardware matures. One key insight, for example, is that even a fault-tolerant quantum computer would struggle to process the huge financial datasets produced by a bank like HSBC, since a finite amount of time is needed to run the quantum calculation for each data point. “From the simulations we found that the hybrid quantum–classical solution produces more false positives than classical methods,” says Paini. “One approach we can explore would be to use the simulations to flag suspicious transactions and then run the deeper algorithms on a quantum processor to analyse the filtered results.”

This particular project also highlighted the need for agreed protocols to navigate the strict rules on data security within the banking sector. For this project the HSBC team was able to run the QML simulations on its existing computing infrastructure, avoiding the need to share sensitive financial data with external partners. In the longer term, however, banks will need reassurance that their customer information can be protected when processed using a quantum computer. Anticipating this need, the NQCC has already started to work with regulators such as the Financial Conduct Authority, which is exploring some of the key considerations around privacy and data security, with that initial work feeding into international initiatives that are starting to consider the regulatory frameworks for using quantum computing within the financial sector.

For the cargo-loading project, meanwhile, Sinno says that an important learning point has been the need to formulate the problem in a way that can be tackled by the current generation of quantum computers. In practical terms that means defining constraints that reduce the complexity of the problem, but that still reflect the requirements of the real-world scenario. “Working with the applications engineers at the NQCC has helped us to understand what is possible with today’s quantum hardware, and how to make the quantum algorithms more viable for our particular problem,” he says. “Participating in these studies is a great way to learn and has allowed us to start using these emerging quantum technologies without taking a huge risk.”

Indeed, one key feature of these feasibility studies is the opportunity they offer for different project partners to learn from each other. Each project includes an end-user organization with a deep knowledge of the problem, quantum specialists who understand the capabilities and limitations of present-day solutions, and academic experts who offer an insight into emerging theoretical approaches as well as methodologies for benchmarking the results. The domain knowledge provided by the end users is particularly important, says Paini, to guide ongoing development work within the quantum sector. “If we only focused on the hardware for the next few years, we might come up with a better technical solution but it might not address the right problem,” he says. “We need to know where quantum computing will be useful, and to find that convergence we need to develop the applications alongside the algorithms and the hardware.”

Another major outcome from these projects has been the ability to make new connections and identify opportunities for future collaborations. As a national facility NQCC has played an important role in providing networking opportunities that bring diverse stakeholders together, creating a community of end users and technology providers, and supporting project partners with an expert and independent view of emerging quantum technologies. The NQCC has also helped the project teams to share their results more widely, generating positive feedback from the wider community that has already sparked new ideas and interactions.

“We have been able to network with start-up companies and larger enterprise firms, and with the NQCC we are already working with them to develop some proof-of-concept projects,” says Sinno. “Having access to that wider network will be really important as we continue to develop our expertise and capability in quantum computing.”

The post On the path towards a quantum economy appeared first on Physics World.


New entanglement approach could boost photonic quantum computing

Deterministic entanglement through holonomy: a system of four coupled optical waveguides (A, C, E, W) whose three inter-waveguide coupling coefficients (k_A, k_E, k_W) vary in such a way as to define a closed path γ. (Courtesy: Reprinted with permission from http://dx.doi.org/10.1103/PhysRevLett.134.080201)

Physicists at the Georgia Institute of Technology in the US have introduced a novel way to generate entanglement between photons – an essential step in building scalable quantum computers that use photons as quantum bits (qubits). Their research, published in Physical Review Letters, leverages a mathematical concept called non-Abelian quantum holonomy to entangle photons in a deterministic way, without relying on strong nonlinear interactions or inherently probabilistic quantum measurements.

Entanglement is fundamental to quantum information science, distinguishing quantum mechanics from classical theories and serving as a pivotal resource for quantum technologies. Existing methods for entangling photons often suffer from inefficiencies, however, requiring additional particles such as atoms or quantum dots and additional steps such as post-selection that eliminate all outcomes of a quantum measurement in which a desired event does not occur.

While post-selection is a common strategy for entangling non-interacting quantum particles, protocols that prepare entangled states this way are inherently non-deterministic: they rely on measurements, and a measurement yields the desired state only with some probability.

Non-Abelian holonomy

The new approach provides a direct and deterministic alternative. In it, the entangled photons occupy distinguishable spatial modes of optical waveguides, making entanglement more practical for real-world applications. To develop it, Georgia Tech’s Aniruddha Bhattacharya and Chandra Raman took inspiration from a 2023 experiment by physicists at Universität Rostock, Germany, that involved coupled photonic waveguides on a fused silica chip. Both works exploit a property known as non-Abelian holonomy, which is essentially a geometric effect that occurs when a quantum system evolves along a closed path in parameter space (more precisely, it is a matrix-valued generalization of a pure geometric phase).
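In general terms (this is the textbook statement, not the specific derivation in the paper), the holonomy is the unitary acquired by a set of degenerate states transported around a closed loop in parameter space:

```latex
% Non-Abelian holonomy: \mathcal{P} denotes path ordering along the loop
% \gamma, and A_\mu is the matrix-valued connection with elements
% (A_\mu)_{mn} = i \langle m(\lambda) | \partial_\mu | n(\lambda) \rangle
% taken over the degenerate subspace.
U(\gamma) = \mathcal{P} \exp\!\left( i \oint_{\gamma} A_\mu \, d\lambda^{\mu} \right)
```

Because the connection matrices at different points of the loop generally do not commute, U(γ) is a genuine rotation within the degenerate subspace rather than a single overall phase – which is what makes it the matrix-valued generalization of a geometric phase referred to above.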

In Bhattacharya and Raman’s approach, photons evolve in a waveguide system where their quantum states undergo a controlled transformation that leads to entanglement. The pair derive an analytical expression for the holonomic transformation matrix, showing that the entangling operation corresponds to a unitary rotation within an effective pseudo-angular momentum space. Because this process is fully unitary, it does not require measurement or external interventions, making it inherently robust.

Beyond the Hong-Ou-Mandel effect

A classic example of photon entanglement is the Hong–Ou–Mandel (HOM) effect, where two identical photons interfere at a beam splitter, leading to quantum correlations between them. The new method extends such interference effects beyond two photons, allowing deterministic entanglement of multiple photons and even higher-dimensional quantum states known as qudits (d-level systems) instead of qubits (two-level systems). This could significantly improve the efficiency of quantum information protocols.
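The HOM cancellation can be checked directly. For two identical photons entering the two inputs of a beam splitter, the amplitude for a coincidence (one photon at each output) is the permanent of the 2×2 unitary, which vanishes for a 50/50 splitter:

```python
import numpy as np

# 50/50 beam splitter unitary acting on the two optical modes
U = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)

def coincidence_amplitude(U):
    # For two identical photons entering modes 0 and 1, the amplitude of
    # detecting one photon at each output is the permanent of U:
    # perm(U) = U00*U11 + U01*U10.
    return U[0, 0] * U[1, 1] + U[0, 1] * U[1, 0]

amp = coincidence_amplitude(U)
print(abs(amp) ** 2)  # → 0.0: the photons bunch, never one at each output
```

The two terms of the permanent interfere destructively, so both photons always exit through the same port – the quantum correlation the text describes.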

Because state preparation and measurement are relatively straightforward in this approach, Bhattacharya and Raman say it is well-suited for quantum computing. Since the method relies on geometric principles, it naturally protects against certain types of noise, making it more robust than traditional approaches. They add that their technique could even be used to construct an almost universal set of near-deterministic entangling gates for quantum computation with light. “This innovative use of non-Abelian holonomy could shift the way we think about photonic quantum computing,” they say.

By providing a deterministic and scalable entanglement mechanism, Bhattacharya and Raman add that their method opens the door to more efficient and reliable photonic quantum technologies. The next steps will be to validate the approach experimentally and explore practical implementations in quantum communication and computation. Further in the future, it will be necessary to find ways of integrating this approach with other quantum systems, such as matter-based qubits, to enable large-scale quantum networks.

The post New entanglement approach could boost photonic quantum computing appeared first on Physics World.


Operating system for quantum networks is a first

Researchers in the Netherlands, Austria, and France have created what they describe as the first operating system for networking quantum computers. Called QNodeOS, the system was developed by a team led by Stephanie Wehner at Delft University of Technology. The system has been tested using several different types of quantum processor and it could help boost the accessibility of quantum computing for people without an expert knowledge of the field.

In the 1960s, the development of early operating systems such as OS/360 and UNIX represented a major leap forward in computing. By providing a level of abstraction in its user interface, an operating system enables users to program and run applications without having to worry about how to reconfigure the transistors in the computer processors. This advance laid the groundwork for many of the digital technologies that have revolutionized our lives.

“If you needed to directly program the chip installed in your computer in order to use it, modern information technologies would not exist,” Wehner explains. “As such, the ability to program and run applications without needing to know what the chip even is has been key in making networks like the Internet actually useful.”

Quantum and classical

The users of nascent quantum computers would also benefit from an operating system that allows quantum (and classical) computers to be connected in networks – not least because most people are not familiar with the intricacies of quantum information processing.

However, quantum computers are fundamentally different from their classical counterparts, and this means a host of new challenges faces those developing network operating systems.

“These include the need to execute hybrid classical–quantum programs, merging high-level classical processing (such as sending messages over a network) with quantum operations (such as executing gates or generating entanglement),” Wehner explains.

Within these hybrid programs, quantum computing resources would only be used when specifically required. Otherwise, routine computations would be offloaded to classical systems, making it significantly easier for developers to program and run their applications.

No standardized architecture

In addition, Wehner’s team considered that, unlike the transistor circuits used in classical systems, quantum operations currently lack a standardized architecture – and can be carried out using many different types of qubits.

Wehner’s team addressed these design challenges by creating QNodeOS, a hybrid network operating system that combines classical and quantum “blocks” to provide users with a platform for performing quantum operations.

“We implemented this architecture in a software system, and demonstrated that it can work with different types of quantum hardware,” Wehner explains. The qubit types used by the team included the electronic spin states of nitrogen–vacancy defects in diamond and the energy levels of individual trapped ions.

Multi-tasking operation

“We also showed how QNodeOS can perform advanced functions such as multi-tasking. This involved the concurrent execution of several programs at once, including compilers and scheduling algorithms.”

QNodeOS is still a long way from having the same impact as UNIX and other early operating systems. However, Wehner’s team is confident that QNodeOS will accelerate the development of future quantum networks.

“It will allow for easier software development, including the ability to develop new applications for a quantum Internet,” she says. “This could open the door to a new area of quantum computer science research.”

The research is described in Nature.

The post Operating system for quantum networks is a first appeared first on Physics World.


5G and Immersive Technologies – Nokia / Laval Virtual 2019

In the case of augmented reality, users will be able to download over 5G the 3D maps processed in real time from Edge Computing infrastructure. The information associated with these 3D maps, meanwhile, will come from a more distant data source: the cloud.

The article 5G and Immersive Technologies – Nokia / Laval Virtual 2019 appeared first on Réalité Augmentée – Augmented Reality.


Microsoft’s Chetan Nayak on topological qubits, the physics of bigger splashes

Last week I had the pleasure of attending the Global Physics Summit (GPS) in Anaheim, California, where I rubbed shoulders with 15,000 fellow physicists. The best part of being there was chatting with lots of different people, and in this podcast I share two of those conversations.

First up is Chetan Nayak, who is a senior researcher at Microsoft’s Station Q quantum computing research centre here in California. In February, Nayak and colleagues claimed a breakthrough in the development of topological quantum bits (qubits) based on Majorana zero modes. In principle, such qubits could enable the development of practical quantum computers, but not all physicists were convinced, and the announcement remains controversial – despite further results presented by Nayak in a packed session at the GPS.

I caught up with Nayak after his talk and asked him about the challenges of achieving Microsoft’s goal of a superconductor-based topological qubit. That conversation is the first segment of today’s podcast.

Distinctive jumping technique

Up next, I chat with Atharva Lele about the physics of manu jumping, which is a competitive aquatic sport that originates from the Māori and Pasifika peoples of New Zealand. Jumpers are judged by the height of their splash when they enter the water, and the best competitors use a very distinctive technique.

Lele is an undergraduate student at the Georgia Institute of Technology in the US, and is part of a team that analysed manu techniques in a series of clever experiments that included plunging robots. He explains how to make a winning manu jump while avoiding the pain of a belly flop.

This article forms part of Physics World‘s contribution to the 2025 International Year of Quantum Science and Technology (IYQ), which aims to raise global awareness of quantum physics and its applications.

Stay tuned to Physics World and our international partners throughout the next 12 months for more coverage of the IYQ.

Find out more on our quantum channel.

The post Microsoft’s Chetan Nayak on topological qubits, the physics of bigger splashes appeared first on Physics World.


Quantum computers extend lead over classical machines in random circuit sampling

Researchers in China have unveiled a 105-qubit quantum processor that can solve in minutes a quantum computation problem that would take billions of years using the world’s most powerful classical supercomputers. The result sets a new benchmark for claims of so-called “quantum advantage”, though some previous claims have faded after classical algorithms improved.

The fundamental promise of quantum computation is that it will reduce the computational resources required to solve certain problems. More precisely, it promises to reduce the rate at which resource requirements grow as problems become more complex. Evidence that a quantum computer can solve a problem faster than a classical computer – quantum advantage – is therefore a key measure of success.

The first claim of quantum advantage came in 2019, when researchers at Google reported that their 53-qubit Sycamore processor had solved a problem known as random circuit sampling (RCS) in just 200 seconds. Xiaobo Zhu, a physicist at the University of Science and Technology of China (USTC) in Hefei who co-led the latest work, describes RCS as follows: “First, you initialize all the qubits, then you run them in single-qubit and two-qubit gates and finally you read them out,” he says. “Since this process includes every key element of quantum computing, such as initialization, gate operations and readout, unless you have really good fidelity at each step you cannot demonstrate quantum advantage.”
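The three steps Zhu describes can be sketched with a toy statevector simulation (illustrative only, and tiny compared with the 105-qubit experiments; note that the classical memory cost grows as 2^n, which is exactly why large instances defy supercomputers):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4                         # qubits; classical cost scales as 2**n
psi = np.zeros(2 ** n, dtype=complex)
psi[0] = 1.0                  # step 1: initialize all qubits to |0...0>

def apply_1q(psi, gate, q):
    # Apply a single-qubit gate to qubit q of the statevector
    psi = psi.reshape([2] * n)
    psi = np.tensordot(gate, psi, axes=([1], [q]))
    psi = np.moveaxis(psi, 0, q)
    return psi.reshape(-1)

def apply_cz(psi, q1, q2):
    # Controlled-Z: flip the sign where both qubits are 1
    psi = psi.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[q1] = 1
    idx[q2] = 1
    psi[tuple(idx)] *= -1
    return psi.reshape(-1)

for _ in range(5):                       # step 2: layers of random gates
    for q in range(n):                   # random single-qubit rotations
        th, ph = rng.uniform(0, 2 * np.pi, 2)
        gate = np.array([[np.cos(th), -np.exp(1j * ph) * np.sin(th)],
                         [np.exp(-1j * ph) * np.sin(th), np.cos(th)]])
        psi = apply_1q(psi, gate, q)
    for q in range(0, n - 1, 2):         # entangling two-qubit gates
        psi = apply_cz(psi, q, q + 1)

probs = np.abs(psi) ** 2                 # step 3: readout
probs = probs / probs.sum()              # guard against float drift
samples = rng.choice(2 ** n, size=5, p=probs)
print([format(int(s), f"0{n}b") for s in samples])
```

An RCS experiment asks the hardware to draw bitstrings from exactly this kind of distribution; the quantum advantage claim is that for ~100 qubits no classical machine can reproduce the sampling in any reasonable time.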

At the time, the Google team claimed that the best supercomputers would take 10,000 years to solve this problem. However, subsequent improvements to classical algorithms reduced this to less than 15 seconds. This pattern has continued ever since, with experimentalists pushing quantum computing forward even as information theorists make quantum advantage harder to achieve by improving techniques used to simulate quantum algorithms on classical computers.

Recent claims of quantum advantage

In October 2024, Google researchers announced that their 67-qubit Sycamore processor had solved an RCS problem that would take an estimated 3600 years for the Frontier supercomputer at the US’s Oak Ridge National Laboratory to complete. In the latest work, published in Physical Review Letters, Jian-Wei Pan, Zhu and colleagues set the bar even higher. They show that their new Zuchongzhi 3.0 processor can complete in minutes an RCS calculation that they estimate would take Frontier billions of years using the best classical algorithms currently available.

To achieve this, they redesigned the readout circuit of their earlier Zuchongzhi processor to improve its efficiency, modified the structures of the qubits to increase their coherence times and increased the total number of superconducting qubits to 105. “We really upgraded every aspect and some parts of it were redesigned,” Zhu says.

Google’s latest processor, Willow, also uses 105 superconducting qubits, and in December 2024 researchers there announced that they had used it to demonstrate quantum error correction. This achievement, together with complementary advances in Rydberg atom qubits from Harvard University’s Mikhail Lukin and colleagues, was named Physics World’s Breakthrough of the Year in 2024. However, Zhu notes that Google has not yet produced any peer-reviewed research on using Willow for RCS, making it hard to compare the two systems directly.

The USTC team now plans to demonstrate quantum error correction on Zuchongzhi 3.0. This will involve using an error correction code such as the surface code to combine multiple physical qubits into a single “logical qubit” that is robust to errors. “The requirements for error-correction readout are much more difficult than for RCS,” Zhu notes. “RCS only needs one readout, whereas error correction needs readout many times with very short readout times… Nevertheless, RCS can be a benchmark to show we have the tools to run the surface code. I hope that, in my lab, within a few months we can demonstrate a good-quality error correction code.”

“How progress gets made”

Quantum information theorist Bill Fefferman of the University of Chicago in the US praises the USTC team’s work, describing it as “how progress gets made”. However, he offers two caveats. The first is that recent demonstrations of quantum advantage do not have efficient classical verification schemes – meaning, in effect, that classical computers cannot check the quantum computer’s work. While the USTC researchers simulated a smaller problem on both classical and quantum computers and checked that the answers matched, Fefferman doesn’t think this is sufficient. “With the current experiments, at the moment you can’t simulate it efficiently, the verification doesn’t work anymore,” he says.

The second caveat is that the rigorous hardness arguments proving that the classical computational power needed to solve an RCS problem grows exponentially with the problem’s complexity apply only to situations with no noise. This is far from the case in today’s quantum computers, and Fefferman says this loophole has been exploited in many past quantum advantage experiments.

Still, he is upbeat about the field’s prospects. “The fact that the original estimates the experimentalists gave did not match some future algorithm’s performance is not a failure: I see that as progress on all fronts,” he says. “The theorists are learning more and more about how these systems work and improving their simulation algorithms and, based on that, the experimentalists are making their systems better and better.”

The post Quantum computers extend lead over classical machines in random circuit sampling appeared first on Physics World.


D-Wave Systems claims quantum advantage, but some physicists are not convinced

D-Wave Systems has used quantum annealing to simulate quantum magnetic phase transitions. The company claims that some of its calculations would be beyond the capabilities of the most powerful conventional (classical) computers – an achievement referred to as quantum advantage. This would mark the first time a quantum computer has achieved such a feat for a practical physics problem.

However, the claim has been challenged by two independent groups of researchers in Switzerland and the US, who have published papers on the arXiv preprint server that report that similar calculations could be done using classical computers. D-Wave’s experts believe these classical results fall well short of the company’s own accomplishments, and some independent experts agree with D-Wave.

While most companies trying to build practical quantum computers are developing “universal” or “gate model” quantum systems, US-based D-Wave has principally focused on quantum annealing devices. While such systems are less programmable than gate model systems, the approach has allowed D-Wave to build machines with many more quantum bits (qubits) than any of its competitors. Whereas researchers at Google Quantum AI and researchers in China have, independently, recently unveiled 105-qubit universal quantum processors, some of D-Wave’s processors have more than 5000 qubits. Moreover, D-Wave’s systems are already in practical use, with hardware owned by the Japanese mobile phone company NTT Docomo being used to optimize cell tower operations. Systems are also being used for network optimization at motor companies, food producers and elsewhere.

Trevor Lanting, the chief development officer at D-Wave, explains the central principles behind quantum-annealing computation: “You have a network of qubits with programmable couplings and weights between those devices and then you program in a certain configuration – a certain bias on all of the connections in the annealing processor,” he says. The quantum annealing algorithm places the system in a superposition of all possible states of the system. When the couplings are slowly switched off, the system settles into its most energetically favoured state – which is the desired solution.

Quantum hiking

Lanting compares this to a hiker in the mountains searching for the lowest point in a landscape: “As a classical hiker, all you can really do is start going downhill until you get to a minimum,” he explains. “The problem is that, because you’re not doing a global search, you could get stuck in a local valley that isn’t at the minimum elevation.” By starting out in a quantum superposition of all possible states (or locations in the mountains), however, quantum annealing is able to find the global potential minimum.
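The landscape Lanting describes is an Ising energy function over spins s_i = ±1, with the programmable couplings and biases as its parameters. A purely classical sketch of the same search is simulated annealing, which plays the role of the hiker: mostly downhill moves, with occasional uphill steps to escape local valleys. (Illustrative only, with arbitrary random couplings; the quantum device instead explores the landscape via superposition and tunnelling.)

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20
J = np.triu(rng.normal(size=(n, n)), 1)   # random programmable couplings J_ij
h = rng.normal(size=n)                    # random per-qubit biases h_i

def energy(s):
    # Ising objective the annealer minimizes:
    # E(s) = sum_{i<j} J_ij s_i s_j + sum_i h_i s_i
    return s @ J @ s + h @ s

s = rng.choice([-1, 1], size=n)           # random starting "location"
e_start = energy(s)
T = 2.0                                   # "temperature" controlling uphill moves
for _ in range(20000):
    i = rng.integers(n)
    s_new = s.copy()
    s_new[i] *= -1                        # propose a single spin flip
    dE = energy(s_new) - energy(s)
    if dE < 0 or rng.random() < np.exp(-dE / T):
        s = s_new                         # downhill always; uphill sometimes
    T *= 0.9997                           # slowly "cool" toward the lowest valley
print(e_start, energy(s))                 # final energy is far below the start
```

Classical heuristics like this can get trapped or need many restarts on rugged landscapes; the claim for quantum annealing is that superposition gives it a fundamentally different route to the global minimum.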

In the new work, researchers at D-Wave and elsewhere set out to show that their machines could use quantum annealing to solve practical physics problems beyond the reach of classical computers. The researchers used two different 1200-qubit processors to model magnetic quantum phase transitions. This is a similar problem to one studied in gate-model systems by researchers at Google and Harvard University in independent work announced in February.

“When water freezes into ice, you can sometimes see patterns in the ice crystal, and this is a result of the dynamics of the phase transition,” explains Andrew King, who is senior distinguished scientist at D-Wave and the lead author of a paper describing the work. “The experiments that we’re demonstrating shed light on a quantum analogue of this phenomenon taking place in a magnetic material that has been programmed into our quantum processors and a phase transition driven by a magnetic field.” Understanding such phase transitions is important in the discovery and design of new magnetic materials.

Quantum versus classical

The researchers studied multiple configurations, comprising ever-more spins arranged in ever-more complex lattice structures. The company says that its system performed the most complex simulation in minutes. They also ascertained how long it would take to do the simulations using several leading classical computation techniques, including neural network methods, and how the time to achieve a solution grew with the complexity of the problem. Based on this, they extrapolated that the most complex lattices would require almost a million years on Frontier, which is one of the world’s most powerful supercomputers.
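The million-year figure comes from an extrapolation of this kind: measure how classical solution time grows with problem size, fit a scaling law, and project it out to the largest lattices. A minimal sketch of the procedure, using entirely made-up runtimes (not D-Wave's data) and assuming exponential scaling:

```python
import math

# Hypothetical classical runtimes (seconds) for increasing lattice sizes.
# These numbers are invented for illustration only.
sizes = [100, 200, 300, 400]
times = [1.0, 30.0, 900.0, 27000.0]

# Assume exponential scaling and fit log(time) = a + b * size
# by ordinary least squares.
n = len(sizes)
logs = [math.log(t) for t in times]
b = (n * sum(s * l for s, l in zip(sizes, logs)) - sum(sizes) * sum(logs)) / \
    (n * sum(s * s for s in sizes) - sum(sizes) ** 2)
a = (sum(logs) - b * sum(sizes)) / n

def predicted_seconds(size):
    return math.exp(a + b * size)

# Extrapolate far beyond the measured range.
print(predicted_seconds(1200) / (3600 * 24 * 365))  # years, for this toy data
```

The independent critiques described below attack exactly this step: if the true classical scaling is milder than the fitted law, the extrapolated runtime collapses.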

However, two independent groups – one at EPFL in Switzerland and one at the Flatiron Institute in the US – have posted papers on the arXiv preprint server claiming to have done some of the less complex calculations using classical computers. They argue that their results should scale simply to larger sizes; the implication being that classical computers could solve the more complicated problems addressed by D-Wave.

King has a simple response: “You don’t just need to do the easy simulations, you need to do the hard ones as well, and nobody has demonstrated that.” Lanting adds: “I see this as a healthy back and forth between quantum and classical methods, but I really think that, with these results, we’re pulling ahead of classical methods on the biggest scales we can calculate.”

Very interesting work

Frank Verstraete of the University of Cambridge is unsurprised by some scientists’ scepticism. “D-Wave have historically been the absolute champions at overselling what they did,” he says. “But now it seems they’re doing something nobody else can reproduce, and in that sense it’s very interesting.” He does note, however, that the specific problem chosen is not, in his view, an interesting one from a physics perspective, and has been chosen purely to be difficult for a classical computer.

Daniel Lidar of the University of Southern California, who has previously collaborated with D-Wave on similar problems but was not involved in the current work, says: “I do think this is quite the breakthrough… The ability to anneal very fast on the timescales of the coherence times of the qubits has now become possible, and that’s really a game changer here.” He concludes that “the arms race is destined to continue between quantum and classical simulations, and because, in all likelihood, these are problems that are extremely hard classically, I think the quantum win is going to become more and more indisputable.”

The D-Wave research is described in Science. The Flatiron Institute preprint is by Joseph Tindall and colleagues, and the EPFL preprint is by Linda Mauron and Giuseppe Carleo.

The post D-Wave Systems claims quantum advantage, but some physicists are not convinced appeared first on Physics World.


Cat qubits open a faster track to fault-tolerant quantum computing

Researchers from the Amazon Web Services (AWS) Center for Quantum Computing have announced what they describe as a “breakthrough” in quantum error correction. Their method uses so-called cat qubits to reduce the total number of qubits required to build a large-scale, fault-tolerant quantum computer, and they claim it could shorten the time required to develop such machines by up to five years.

Quantum computers are promising candidates for solving complex problems that today’s classical computers cannot handle. Their main drawback is the tendency for errors to crop up in the quantum bits, or qubits, they use to perform computations. Just like classical bits, the states of qubits can erroneously flip from 0 to 1, which is known as a bit-flip error. In addition, qubits can suffer inadvertent changes to their phase – the parameter that characterizes their quantum superposition – known as phase-flip errors. A further complication is that whereas classical bits can be copied in order to detect and correct errors, the quantum nature of qubits makes such copying impossible. Hence, errors need to be dealt with in other ways.

One error-correction scheme involves building physical or “measurement” qubits around each logical or “data” qubit. The job of the measurement qubits is to detect phase-flip or bit-flip errors in the data qubits without destroying their quantum nature. In 2024, a team at Google Quantum AI showed that this approach is scalable in a system of a few dozen qubits. However, a truly powerful quantum computer would require around a million data qubits and an even larger number of measurement qubits.
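The logic of this scheme has a classical analogue in the three-bit repetition code: the data are stored redundantly, and the checks read only the *parities* of neighbouring pairs, never the bits themselves – loosely mirroring how measurement qubits flag errors without collapsing the data qubits. A minimal sketch (a classical toy, not the quantum protocol itself):

```python
def encode(bit):
    # Store one logical bit in three data bits.
    return [bit, bit, bit]

def syndrome(data):
    # Parity checks on pairs (0,1) and (1,2); a 1 marks a disagreement.
    # Crucially, the syndrome reveals *where* bits disagree, not their values.
    return (data[0] ^ data[1], data[1] ^ data[2])

def correct(data):
    # A single bit-flip produces a unique syndrome, so it can be undone.
    s = syndrome(data)
    if s == (1, 0):   data[0] ^= 1   # first bit flipped
    elif s == (1, 1): data[1] ^= 1   # middle bit flipped
    elif s == (0, 1): data[2] ^= 1   # last bit flipped
    return data

word = encode(1)
word[0] ^= 1                 # inject a single bit-flip error
print(correct(word))         # -> [1, 1, 1]
```

The quantum version must additionally handle phase flips and perform the parity checks without measuring the data qubits directly, which is why the qubit overhead grows so quickly – and why reducing it, as the AWS work aims to do, matters.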

Cat qubits to the rescue

The AWS researchers showed that it is possible to reduce this total number of qubits. They did this by using a special type of qubit called a cat qubit. Named after the Schrödinger’s cat thought experiment that illustrates the concept of quantum superposition, cat qubits use the superposition of coherent states to encode information in a way that resists bit flips. Doing so may increase the number of phase-flip errors, but special error-correction algorithms can deal with these efficiently.

The AWS team got this result by building a microchip containing an array of five cat qubits. These are connected to four transmon qubits, which are a type of superconducting qubit with a reduced sensitivity to charge noise (a major source of errors in quantum computations). Here, the cat qubits serve as data qubits, while the transmon qubits measure and correct phase-flip errors. The cat qubits were further stabilized by connecting each of them to a buffer mode that uses a non-linear process called two-photon dissipation to ensure that their noise bias is maintained over time.

According to Harry Putterman, a senior research scientist at AWS, the team’s foremost challenge (and innovation) was to ensure that the system did not introduce too many bit-flip errors. This was important because the system uses a classical repetition code as its “outer layer” of error correction, which left it with no redundancy against residual bit flips. With this aspect under control, the researchers demonstrated that their superconducting quantum circuit suppressed errors from 1.75% per cycle for a three-cat qubit array to 1.65% per cycle for a five-cat qubit array. Achieving this degree of error suppression with larger error-correcting codes previously required tens of additional qubits.

On a scalable path

AWS’s director of quantum hardware, Oskar Painter, says the result will reduce the development time for a full-scale quantum computer by 3-5 years. This is, he says, a direct outcome of the system’s simple architecture as well as its 90% reduction in the “overhead” required for quantum error correction. The team does, however, need to reduce the error rates of the error-corrected logical qubits. “The two most important next steps towards building a fault-tolerant quantum computer at scale is that we need to scale up to several logical qubits and begin to perform and study logical operations at the logical qubit level,” Painter tells Physics World.

According to David Schlegel, a research scientist at the French quantum computing firm Alice & Bob, which specializes in cat qubits, this work marks the beginning of a shift from noisy, classically simulable quantum devices to fully error-corrected quantum chips. He says the AWS team’s most notable achievement is its clever hybrid arrangement of cat qubits for quantum information storage and traditional transmon qubits for error readout.

However, while Schlegel calls the research “innovative”, he says it is not without limitations. Because the AWS chip incorporates transmons, it still needs to address both bit-flip and phase-flip errors. “Other cat qubit approaches focus on completely eliminating bit flips, further reducing the qubit count by more than a factor of 10,” Schlegel says. “But it remains to be seen which approach will prove more effective and hardware-efficient for large-scale error-corrected quantum devices in the long run.”

The research is published in Nature.

The post Cat qubits open a faster track to fault-tolerant quantum computing appeared first on Physics World.
