IBM today unveiled an expanded roadmap to what it claims will be the world’s first large-scale fault-tolerant quantum computer.
As part of its roadmap, IBM has published two new technical papers that it says will be key to creating the necessary architecture.
The roadmap culminates in two new, progressively more advanced quantum systems: Starling, which IBM claims will be the world’s first large-scale fault-tolerant quantum computer, due to be completed by 2029, followed by Blue Jay, due by 2033. Both machines will be located in IBM’s quantum data centre in Poughkeepsie, New York.
The first new scientific paper builds on IBM’s approach to error correction, initially published in the journal Nature in March 2024, which introduced quantum low-density parity check (qLDPC) codes and demonstrated a 90% reduction in the physical qubits needed for error correction.
Quantum error correction requires encoding quantum information into more qubits, or quantum bits, than would otherwise be needed. The new paper from IBM demonstrates how to reduce the number of physical qubits required for error correction and identifies the resources needed to run large-scale quantum programs.
A qubit is the basic unit of information in quantum computing. A logical qubit, according to QuEra.com, “refers to a qubit that is encoded using a collection of physical qubits to protect against errors. Unlike a physical qubit, which represents the actual quantum hardware, a logical qubit is a higher-level abstraction used in fault-tolerant quantum computing.”
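To illustrate the idea in the simplest possible terms, the sketch below classically simulates the textbook three-bit repetition code, which protects one logical bit against random bit-flips by majority vote. It is purely illustrative: IBM’s qLDPC codes are far more sophisticated, and real quantum codes must also correct phase errors without directly reading out the qubits, but the principle of spending extra physical units to obtain one more reliable logical unit is the same.

import random

def encode(logical_bit):
    """Encode one logical bit into three physical bits."""
    return [logical_bit] * 3

def apply_noise(bits, p):
    """Flip each physical bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    """Recover the logical bit by majority vote."""
    return int(sum(bits) >= 2)

trials, p = 100_000, 0.05
failures = sum(decode(apply_noise(encode(0), p)) for _ in range(trials))
# An unprotected bit fails with probability p = 5%; the encoded
# logical bit fails only when two or more of its three physical
# copies flip, which happens roughly 0.7% of the time.
print(f"logical error rate: {failures / trials:.4f}")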
The second paper IBM has just published introduces a new heuristic decoder, Relay-BP, which focuses on decoding information from physical qubits and correcting errors in real time using conventional computing resources.
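Continuing the toy repetition-code example above: a decoder maps the outcomes of parity-check measurements (the “syndrome”) to the most likely error, without ever reading the data bits themselves. Relay-BP does this with belief propagation over large qLDPC parity-check matrices; the lookup table below only illustrates the syndrome-to-correction step on the three-bit code.

def syndrome(bits):
    """Two parity checks: (b0 XOR b1, b1 XOR b2)."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

# Each syndrome value points at the most likely single flipped bit.
CORRECTION = {
    (0, 0): None,  # no error detected
    (1, 0): 0,     # bit 0 flipped
    (1, 1): 1,     # bit 1 flipped
    (0, 1): 2,     # bit 2 flipped
}

def correct(bits):
    """Measure the syndrome and undo the most likely error."""
    flipped = CORRECTION[syndrome(bits)]
    if flipped is not None:
        bits[flipped] ^= 1
    return bits

print(correct([0, 1, 0]))  # middle bit flipped -> restored to [0, 0, 0]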
During a media call, Matthias Steffen, head of Quantum Processor Technology, IBM, said: “In the first paper, we show how the qubit advantage is retained even when we build large logical circuits on modular quantum system architecture. In the second paper, we show how we identify and correct errors in real-time using conventional computing resources. When taken together, these papers will demonstrate the essential criteria for a large-scale error correction approach.”
Steffen added that the result would be a platform stable enough for “meaningful algorithms to succeed”.
The first of the new quantum machines, IBM Quantum Starling, is expected to be capable of 100 million quantum operations using 200 logical qubits – 20 000 times more than the 5 000 operations quantum computers based on IBM’s Heron architecture are capable of today.
To put the Starling figure into perspective, IBM says that representing the computational state of its 200 logical qubits would require the memory of more than a quindecillion (10^48) of the world’s most powerful classical supercomputers.
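IBM has not published the assumptions behind the quindecillion comparison, but the raw arithmetic of the headline numbers can be checked in a few lines of Python (a rough illustration of scale, not IBM’s own calculation):

# Starling's target versus today's Heron-based systems.
heron_ops = 5_000
starling_ops = 100_000_000
print(starling_ops // heron_ops)   # 20000, i.e. 20 000x more operations

# A classical state-vector simulation of n qubits needs 2**n complex
# amplitudes; at 200 qubits that is ~1.6e60 numbers, far beyond the
# combined memory of any fleet of classical supercomputers.
print(f"{2 ** 200:.2e}")           # ~1.61e+60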
Four years after Starling becomes operational, by 2033, IBM expects to have built its Blue Jay quantum system, which will be capable of 1 billion quantum operations using more than 2 000 logical qubits.
To reach that stage, IBM is taking a modular approach to the hardware and has been building out progressively more capable quantum computers, as well as linking them with classical high-performance computers. IBM claims to run the largest fleet of quantum computers in the world, with more than 10 machines.
Jay Gambetta, VP, IBM Quantum, said the IBM Quantum System Two debuted in 2023, and a second version of the same system will be running by 2026.
Other developments IBM outlined in its roadmap for the next four years include the Loon processor, expected this year, which allows the testing of architecture components for the qLDPC code; the Kookaburra multi-chip processor, expected in 2026, which will be IBM’s first modular processor designed to store and process encoded information, combining quantum memory with logic operations; and Cockatoo, due in 2027, which will connect two Kookaburra modules. This architecture will link quantum chips together and reduce the need to build “impractically large” chips.
Why the excitement about quantum computing?
Quantum computing is widely expected to be the next revolution in computing, but despite ongoing development work, quantum computers have not yet reached the sophistication needed to overtake the capabilities of classical computers.
Developing quantum computers is expensive and challenging, not least because of the conditions qubits need to function reliably: temperatures approaching absolute zero (minus 273.15 degrees Celsius, or zero Kelvin). As such, quantum computing will be offered as a cloud service, providing remote access to the resources.
Where quantum computing is expected to excel is in solving problems that would be too hard, or take too long, for classical computers. One example is the simulation of carbon capture with 52 to 65 molecular orbitals, which would be out of reach for a classical supercomputer but is estimated to take a quantum computer around 3.5 days.
Seen as a complement to classical computing rather than a replacement for it, quantum computing is suited to specific tasks and challenges. These include healthcare and life sciences, materials science, physics, drug discovery and molecular modelling, optimisation problems, climate and weather modelling, financial modelling and risk analysis, and machine learning and AI.
Perhaps the most famous example of where quantum computing is set to have an impact is in the world of cryptography.
Much of today’s cryptography is based on RSA 2048, which is expected to be broken by a sufficiently powerful quantum computer. Last year, the US National Institute of Standards and Technology approved its first quantum-safe cryptography standards, setting out a migration timeline under which RSA 2048 is to be deprecated by 2030 and disallowed by 2035.
On the subject, Gambetta said that beyond the work being undertaken at IBM on developing quantum systems, of equal importance is the work taking place among partners to develop algorithms that run on the new technology.
“What’s going to accelerate the time for breaking encryption is the algorithm research. There was a recent paper from the Google team where they showed improvements in running that algorithm on the 2048 version of the code. So, we’ve already seen, on the 2048 version of Shor’s algorithm, that it required less physical qubits. So, you combine all these parts together, it (breaking of encryption) will happen, but what will accelerate the time will be more the algorithm research. And if you’re not already making the transition to being quantum safe (implementing quantum safe cryptography), you should be doing it.”