Noise is currently quantum computing's biggest challenge and its most significant limitation. IBM aims to reduce that noise over the next few years through several forms of quantum error management until true quantum error correction (QEC) is attained.
Here is why reducing noise is so important. A quantum bit (qubit) is the basic unit of information for quantum computers, and the longer a qubit can maintain its quantum state, the more computational operations it can perform. Unfortunately, qubits are very sensitive to environmental noise, which can come from a quantum computer’s control electronics, wiring, cryogenics, other qubits, heat, and even external factors such as cosmic radiation. Noise is problematic because it can cause a qubit’s quantum state to collapse (a condition called decoherence) and thus create errors. An uncorrected error can cascade into an avalanche of errors and destroy an entire computation.
This type of noise originates at the atomic level, and even though it can’t be completely eliminated, it can be managed.
Quantum advantage and noise
Despite media hype, there is no documented evidence that current quantum computers are more powerful than classical supercomputers. Even so, quantum computers hold a clear theoretical advantage for certain classes of problems, and most experts believe it is only a matter of time before quantum computing demonstrates that superiority in practice. When that occurs, quantum computing will have achieved what is commonly referred to as "quantum advantage."
IBM defines quantum advantage as a significant improvement in quantum algorithm runtime for practical cases over the best classical algorithm. Its blog further states that the algorithm needed to prove quantum advantage must have an efficient representation as quantum circuits, and there should be no classical algorithm capable of simulating those circuits efficiently.
But here’s the dilemma: For quantum computers to achieve quantum advantage, besides improving qubit coherence, gate fidelities, and speed of circuit execution, we must also significantly increase computational qubit counts. But upping the number of qubits also increases noise and qubit errors. Therefore, managing noise and qubit errors is critical to the long-term success of quantum computing.
Although error correction is common in classical computers and in certain types of memory hardware, we can’t use the same techniques in quantum computers because the laws of quantum mechanics make it impossible to clone unknown quantum states.
Quantum error correction (QEC) is a complex engineering and physics problem. Despite its importance and the years already invested in the search for a solution, QEC remains elusive and still appears to be many years away. Until full error correction becomes available, IBM is researching other error management solutions.
Quantum error management
IBM has published a chart comparing the exponential scaling of error-mitigated quantum circuits to the exponential scaling of classical computers. The crossover point is where quantum error mitigation becomes competitive with classical solutions.
IBM has a long history of error correction research, beginning with David DiVincenzo's investigations in 1996. In 2015, it developed the first system to detect both quantum bit-flip and phase-flip errors. Today, given the importance of quantum error correction, almost every corporate and academic quantum computing program has some form of error correction research in progress.
IBM currently looks at quantum error management through the lens of three methods: error suppression, error mitigation, and error correction. Setting aside error correction for the moment, let’s consider the other two approaches.
Error suppression is one of the earliest and most basic methods of handling errors. It typically modifies a circuit, applies pulses of energy to keep a qubit in its quantum state longer, or directs pulses at idle qubits to undo unwanted effects caused by neighboring qubits. These pulse-based techniques are known as dynamical decoupling.
Error mitigation is the method that IBM believes will bridge the gap between today’s error-prone hardware and tomorrow’s fault-tolerant quantum computers. Error mitigation’s interim purpose is to enable early achievement of quantum advantage.
IBM has done more continuous error mitigation research than any other institution. Through that work, IBM has developed multiple approaches to error mitigation, including probabilistic error cancellation (PEC) and zero-noise extrapolation (ZNE).
- PEC functions much like noise-canceling headsets where noise is extracted and analyzed, then inverted before being mixed with the original noise signal to cancel it out. One significant difference for PEC is that, rather than using single samples as in audio noise-canceling algorithms, PEC uses averages calculated from a collection of circuits.
- ZNE reduces noise in a quantum circuit by running the quantum program at different noise levels, then extrapolating the computation to determine an estimated value at a zero-noise level.
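As a toy illustration of ZNE's core idea, the sketch below fakes a noisy expectation value that degrades linearly with a noise scale factor, measures it at scale factors 1, 2, and 3, fits a line, and extrapolates back to the zero-noise limit. The noise model and all numbers are invented for illustration; real ZNE implementations amplify hardware noise by stretching gates or pulses rather than simulating it.

```python
import random

TRUE_VALUE = 1.0  # ideal expectation value (assumed for this toy model)

def noisy_expectation(scale, decay=0.15, shots=20000):
    """Fake a measured expectation that decays linearly with the noise scale."""
    ideal = TRUE_VALUE * (1 - decay * scale)
    # statistical shot noise around the decayed value
    return ideal + random.gauss(0, 1 / shots ** 0.5)

def zero_noise_extrapolate(scales, values):
    """Least-squares linear fit, evaluated at scale = 0."""
    n = len(scales)
    mx = sum(scales) / n
    my = sum(values) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(scales, values)) \
            / sum((x - mx) ** 2 for x in scales)
    return my - slope * mx  # intercept = estimate at zero noise

random.seed(0)
scales = [1, 2, 3]
values = [noisy_expectation(s) for s in scales]
estimate = zero_noise_extrapolate(scales, values)
print(f"noisy at scale 1: {values[0]:.3f}, extrapolated: {estimate:.3f}")
```

The extrapolated estimate lands much closer to the ideal value than any single noisy measurement, which is the entire point of the technique.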
Effective quantum error correction would eliminate almost all noise-related errors. QEC suppresses errors exponentially as the code size grows, but at any finite code size some errors will always remain. For optimum results, it will be important to pick a code size that suppresses errors just enough for the target application.
But until QEC becomes available, it appears that quantum error mitigation provides the fastest path to quantum advantage.
Dialable error reduction
IBM recently announced the integration of error suppression and error mitigation into Qiskit Runtime primitives Sampler and Estimator. As beta features, these allow a user to trade speed for fewer errors. IBM’s roadmap projects the final release of these features in 2025.
There is overhead associated with compiling, executing, and classical post-processing of error mitigation techniques. The amount of overhead varies depending on the type of error mitigation used. IBM introduced a new simplified option for the primitives called a “resilience level” that allows users to dial in the cost/accuracy tradeoff needed for their work. Sampler and Estimator will automatically apply dynamical decoupling error suppression to circuits at optimization levels 1 through 3. Resilience 0 offers no error mitigation, Resilience 1 is measurement error mitigation, Resilience 2 provides biased error mitigation (via ZNE), and Resilience 3 enables unbiased estimators (via PEC).
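In Qiskit Runtime, the resilience level is dialed in through the primitives' options. The configuration sketch below assumes the qiskit-ibm-runtime package and an IBM Quantum account; exact option names and defaults have evolved across releases, so treat it as illustrative rather than definitive.

```python
# Illustrative sketch; requires qiskit-ibm-runtime and IBM Quantum credentials.
# Option names have changed across releases, so check your installed version.
from qiskit_ibm_runtime import QiskitRuntimeService, Session, Estimator, Options

options = Options(
    optimization_level=3,  # levels 1-3 apply dynamical decoupling suppression
    resilience_level=2,    # 0: none, 1: measurement, 2: ZNE, 3: PEC
)

service = QiskitRuntimeService()
with Session(service=service, backend="ibmq_qasm_simulator") as session:
    estimator = Estimator(session=session, options=options)
    # job = estimator.run(circuits, observables)
```

Raising resilience_level from 0 to 3 trades runtime for accuracy, matching the cost/accuracy dial described above.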
Error mitigation will be available on all cloud-accessible IBM systems. As the resilience number increases, so does the cost. Resilience 3 produces the fewest errors but could require 100,000X more computation time.
Dr. Blake Johnson, IBM Quantum Platform Lead, explained the rationale for IBM’s implementation of this option for error mitigation services.
“We have some very advanced users that want to do everything themselves,” Dr. Johnson said. “They don’t want us touching their circuits. That’s fine with us, so we make that possible. But we are seeing more and more users who look at a quantum computer like you would look at a toaster. They don’t understand how it works. They just want to push a button and make the right thing happen. So, we decided to enable certain things as defaults if it doesn’t have a sampling overhead and if there isn’t an additional cost to run it.”
Quantum error correction
Thanks to error correction research conducted by the entire quantum computing community, significant progress has been made on QEC over the past decade. Even so, it is likely that years of more research will be required to find a workable solution.
One of the early challenges of quantum error correction was determining if an error had been made without destroying a qubit’s quantum state by measuring it. In 1995, Peter Shor developed a breakthrough solution to circumvent the problem. Rather than storing the quantum state in a single qubit, Shor’s system encoded quantum information in a logical qubit distributed across nine physical qubits. The scheme enabled errors to be detected by monitoring the system’s parity rather than destroying the quantum state with direct measurements.
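The parity-check idea behind Shor's scheme can be illustrated classically. The sketch below protects one logical bit with a three-bit repetition code, a simplified classical stand-in for the bit-flip portion of Shor's nine-qubit code: two parity checks locate a single flipped bit without ever reading a data bit on its own.

```python
def encode(bit):
    """Encode one logical bit into three physical bits (repetition code)."""
    return [bit, bit, bit]

def syndrome(bits):
    """Parity of neighboring pairs; reveals the error, not the data value."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits):
    """Locate and fix a single bit flip from the two-bit syndrome."""
    flip_position = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(bits))
    if flip_position is not None:
        bits[flip_position] ^= 1
    return bits

def decode(bits):
    """Majority vote recovers the logical bit."""
    return int(sum(bits) >= 2)

# Flip each physical bit in turn and confirm the logical bit survives.
for position in range(3):
    codeword = encode(1)
    codeword[position] ^= 1  # inject a single bit-flip error
    print(position, syndrome(codeword), decode(correct(codeword)))
```

The quantum version measures analogous parity operators on ancilla qubits, so the encoded superposition is never collapsed by a direct measurement.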
IBM is currently investigating many approaches to quantum error correction, including some that, like Shor's code, detect errors through parity checks. One such class is quantum low-density parity-check (qLDPC) codes. LDPC itself is not new; it is used in many classical error correction applications, such as Wi-Fi and 5G.
According to IBM, qLDPC offers the following advantages:
- Only a few physical qubits are needed for each logical qubit, rather than the hundreds that are needed for 2-D surface code.
- Only a limited number of qubits are exposed if a faulty operation occurs.
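The "low-density" in qLDPC refers to the sparsity of the code's parity-check matrix: each check touches only a few bits, and each bit participates in only a few checks. The toy classical example below (an invented 3x6 sparse matrix, not a real qLDPC code) computes a syndrome and shows how a single flipped bit fires only the few checks that touch it.

```python
# Toy classical LDPC-style parity-check matrix (invented for illustration).
# Each row is one parity check; each check touches only 3 of the 6 bits.
H = [
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [0, 0, 1, 1, 0, 1],
]

def syndrome(H, bits):
    """Mod-2 parity of each check; all zeros means no error detected."""
    return [sum(h * b for h, b in zip(row, bits)) % 2 for row in H]

codeword = [0, 0, 0, 0, 0, 0]   # trivially valid codeword
print(syndrome(H, codeword))    # no checks fire

corrupted = codeword[:]
corrupted[2] ^= 1               # flip bit 2
print(syndrome(H, corrupted))   # only the two checks touching bit 2 fire
```

Because each check involves so few bits, a faulty check operation can damage only a handful of qubits, which is the second advantage IBM cites above.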
The research opportunities and diverse methods for quantum error correction are too numerous to cover here, but having many options is a good problem to have. If a fault-tolerant quantum computer is ever to be realized, we must find a solution for error correction, and the more options we have, the better our chances.
IBM’s quantum roadmap reflects the complexity of the problem. It shows an error correction solution becoming available sometime beyond 2026. Indeed, it will likely take several years beyond that.
As quantum hardware continues to improve, there is a high probability that quantum error mitigation, as implemented in IBM's roadmap, will facilitate the early achievement of quantum advantage. Presently, error mitigation carries a runtime overhead that grows exponentially with the number of qubits and the circuit depth, but improvements in speed, qubit fidelity, and mitigation methods are expected to considerably reduce that overhead.
It is IBM's goal for error mitigation to provide a continuous development path to error correction. Once QEC is attained, it will enable us to build fault-tolerant quantum machines running millions of qubits in a quantum-centric supercomputing environment. These machines will have the ability to simulate large many-body systems, optimize complex supply chain logistics, create new drugs and materials, model and react to sophisticated financial market behavior, and much more.
Fault-tolerant quantum computers will signal that a new era of quantum-centric scientific investigation has arrived. And with that new capability will come the potential to responsibly change the world.
- Despite media hype about the power of quantum computers, it has yet to be demonstrated that quantum has a clear computational superiority over classical supercomputers.
- Quantinuum recently published two important QEC proofs-of-concept. Its researchers developed a logical entangling circuit with higher fidelity than its physical counterparts. Researchers also entangled two logical qubits with gates applied in a fully fault-tolerant manner.
- IBM announced that dynamic circuits will also be available on its systems along with error mitigation. Dynamic circuits are expected to play an important role in quantum low-density parity-check (qLDPC) error correction codes.
- In preparation for quantum advantage, IBM began scaling up its processors with the recently announced 433-qubit Osprey processor. The Osprey has more than three times as many qubits as the current 127-qubit Eagle processor.
- In addition to IBM’s error suppression and error mitigation initiatives, these are the major highlights in IBM’s quantum roadmap that provide a path to quantum advantage:
- 2023 — Further scaling occurs with the release of the Condor processor, with 1121 qubits. Work also continues on initiatives to improve system-wide speed and quality.
- 2024 — IBM will begin to integrate and test key technologies that enable future scaling such as classical parallelization, couplers, multi-chip quantum processors, and quantum parallelization.
- 2025 — Implementation of modular quantum hardware, new control electronics, and cryogenic infrastructure are the final near-term hardware pieces needed for attaining quantum advantage.
- 2026 — IBM will have the capability of scaling up future systems to 10K–100K qubits. By then, it should also have significantly increased the system speed and quality. A mature implementation of quantum error mitigation will make it possible to attain quantum advantage. Significant advances in quantum error correction will also have been made.
Follow Paul Smith-Goodson on Twitter for current information and insights about Quantum, AI, Electromagnetics, and Space
Moor Insights & Strategy, like all research and tech industry analyst firms, provides or has provided paid services to technology companies. These services include research, analysis, advising, consulting, benchmarking, acquisition matchmaking, and speaking sponsorships. The company has had or currently has paid business relationships with 8×8, Accenture, A10 Networks, Advanced Micro Devices, Amazon, Amazon Web Services, Ambient Scientific, Anuta Networks, Applied Brain Research, Applied Micro, Apstra, Arm, Aruba Networks (now HPE), Atom Computing, AT&T, Aura, Automation Anywhere, AWS, A-10 Strategies, Bitfusion, Blaize, Box, Broadcom, C3.AI, Calix, Campfire, Cisco Systems, Clear Software, Cloudera, Clumio, Cognitive Systems, CompuCom, Cradlepoint, CyberArk, Dell, Dell EMC, Dell Technologies, Diablo Technologies, Dialogue Group, Digital Optics, Dreamium Labs, D-Wave, Echelon, Ericsson, Extreme Networks, Five9, Flex, Foundries.io, Foxconn, Frame (now VMware), Fujitsu, Gen Z Consortium, Glue Networks, GlobalFoundries, Revolve (now Google), Google Cloud, Graphcore, Groq, Hiregenics, Hotwire Global, HP Inc., Hewlett Packard Enterprise, Honeywell, Huawei Technologies, IBM, Infinidat, Infosys, Inseego, IonQ, IonVR, Inseego, Infosys, Infiot, Intel, Interdigital, Jabil Circuit, Keysight, Konica Minolta, Lattice Semiconductor, Lenovo, Linux Foundation, Lightbits Labs, LogicMonitor, Luminar, MapBox, Marvell Technology, Mavenir, Marseille Inc, Mayfair Equity, Meraki (Cisco), Merck KGaA, Mesophere, Micron Technology, Microsoft, MiTEL, Mojo Networks, MongoDB, MulteFire Alliance, National Instruments, Neat, NetApp, Nightwatch, NOKIA (Alcatel-Lucent), Nortek, Novumind, NVIDIA, Nutanix, Nuvia (now Qualcomm), onsemi, ONUG, OpenStack Foundation, Oracle, Palo Alto Networks, Panasas, Peraso, Pexip, Pixelworks, Plume Design, PlusAI, Poly (formerly Plantronics), Portworx, Pure Storage, Qualcomm, Quantinuum, Rackspace, Rambus, Rayvolt E-Bikes, Red Hat, Renesas, Residio, Samsung 
Electronics, Samsung Semi, SAP, SAS, Scale Computing, Schneider Electric, SiFive, Silver Peak (now Aruba-HPE), SkyWorks, SONY Optical Storage, Splunk, Springpath (now Cisco), Spirent, Sprint (now T-Mobile), Stratus Technologies, Symantec, Synaptics, Syniverse, Synopsys, Tanium, Telesign, TE Connectivity, TensTorrent, Tobii Technology, Teradata, T-Mobile, Treasure Data, Twitter, Unity Technologies, UiPath, Verizon Communications, VAST Data, Ventana Micro Systems, Vidyo, VMware, Wave Computing, Wellsmith, Xilinx, Zayo, Zebra, Zededa, Zendesk, Zoho, Zoom, and Zscaler.
Moor Insights & Strategy founder, CEO, and Chief Analyst Patrick Moorhead is an investor in dMY Technology Group Inc. VI, Dreamium Labs, Groq, Luminar Technologies, MemryX, and Movand
Note: Moor Insights & Strategy writers and editors may have contributed to this article.