What’s the Big Deal between 99% and 99.99%?

In late 2025, IonQ broke the news heard around the quantum computing industry: it had achieved 99.99% two-qubit gate fidelity. Press releases called it a “watershed moment” and a “world record.” Stock prices moved.

 

But what exactly does this mean, and why does it matter? 

The Machine That Fails Constantly

One of the biggest challenges facing quantum computers is the error rate when running an algorithm. To give you a sense of the scale of this problem: a typical NISQ-era quantum processor has two-qubit gate fidelities in the range of 95% to 99%. That means an error occurs somewhere between 1 in 20 and 1 in 100 operations. The best laboratory results have reached 99.9% (1 in 1,000), and IonQ’s recent announcement pushed that to 99.99% (1 in 10,000).

 

For comparison, a classical computer processor has an error rate closer to one in a billion or one in a trillion operations. By that measure, quantum computers are orders of magnitude less reliable than the devices we use every day.

 

This matters because quantum algorithms require many operations chained together. With error rates above 0.1% per gate, a quantum circuit can only execute around 1,000 gates before the accumulated noise drowns out the signal. That constraint limits the depth and complexity of any algorithm you can run. Most of the applications that would make quantum computing genuinely useful, such as simulating molecules for drug discovery or breaking certain encryption schemes, require far deeper circuits than current hardware can reliably execute.

 

Improving fidelity changes this calculus and opens the door to more widespread applications, because it allows you to run longer algorithms before errors accumulate to the point of failure. It also reduces the overhead required for error correction. At 99% fidelity, you might need thousands of physical qubits to protect a single logical qubit. At 99.99%, that number drops significantly, which affects both the cost and the timeline for building machines that can solve real problems.

Fidelity = how close the actual result is to the intended result

 

99% fidelity = 1 error per 100 operations

99.9% fidelity = 1 error per 1,000 operations

99.99% fidelity = 1 error per 10,000 operations

 

Since quantum algorithms require thousands of gate operations chained together, a 1% error rate means your computation results in garbage.
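The arithmetic behind that claim is simple compounding: if each gate succeeds with probability f, a circuit of n gates runs error-free with probability roughly f^n. A quick sketch (the 50% cutoff is an arbitrary illustration, not a hardware spec):

```python
import math

# Probability that an n-gate circuit runs with no errors, assuming each
# gate independently succeeds with probability f (the gate fidelity).
def circuit_success(f: float, n_gates: int) -> float:
    return f ** n_gates

# Deepest circuit whose success probability stays above a cutoff.
def max_depth(f: float, cutoff: float = 0.5) -> int:
    return int(math.log(cutoff) / math.log(f))

for f in (0.99, 0.999, 0.9999):
    print(f"{f}: P(1000 gates) = {circuit_success(f, 1000):.2e}, "
          f"depth at 50% success = {max_depth(f)}")
```

At 99% fidelity, a 1,000-gate circuit succeeds with probability around 0.00004; at 99.99%, it succeeds about 90% of the time. That is the practical meaning of each additional nine.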

Physical Qubits, Logical Qubits, and the Overhead Problem

A physical qubit is the actual hardware: a trapped ion, a superconducting circuit, a photon, or whatever system the quantum computer uses to store and manipulate quantum information. These are the qubits that engineers build and calibrate. They are also inherently noisy. Environmental interference, imperfect control signals, and the fragility of quantum states all introduce errors.

 

A logical qubit is an abstraction built on top of multiple physical qubits. By encoding quantum information across a group of physical qubits and continuously checking for errors, you can create a single logical qubit that is far more reliable than any of its components. This is quantum error correction in practice. The tradeoff is overhead: depending on the error correction scheme and the quality of the underlying hardware, you might need anywhere from a dozen to several thousand physical qubits to maintain one logical qubit.
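The mechanism can be made concrete with the simplest error-correcting code there is: a distance-d repetition code, where one logical bit is spread across d physical qubits and read out by majority vote. This is a toy model (real machines use richer codes such as the surface code), but it shows how more noisy qubits can yield a more reliable logical qubit:

```python
from math import comb

def logical_error_rate(p: float, d: int) -> float:
    """Majority-vote logical error rate for a distance-d repetition code
    protecting against bit flips, with independent physical error rate p.
    A logical error occurs when more than half of the d qubits flip."""
    return sum(comb(d, k) * p**k * (1 - p)**(d - k)
               for k in range((d + 1) // 2, d + 1))

# At p = 1% (99% fidelity), adding qubits suppresses the logical error:
for d in (1, 3, 5, 9):
    print(d, logical_error_rate(0.01, d))
```

With p = 1%, going from one qubit to three cuts the logical error rate from 1 in 100 to roughly 3 in 10,000, and each further increase in distance helps exponentially, provided p stays below the code's threshold.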

 

This is why fidelity matters so much at the physical level. The higher the fidelity of your physical operations, the fewer physical qubits you need to protect each logical qubit. At 99% fidelity, the overhead is enormous. At 99.99%, it becomes more manageable. The goal is to get physical error rates low enough that error correction can do its job efficiently, rather than being overwhelmed by the constant stream of new errors.

 

When researchers talk about “two-qubit gate fidelity,” they are measuring how accurately the processor performs operations that entangle two qubits. Single-qubit gates, which manipulate one qubit at a time, are easier to execute and typically have higher fidelity. Two-qubit gates are where most of the errors occur, and they are also essential for any meaningful quantum algorithm. Entanglement between qubits is what gives quantum computers their computational power, so the accuracy of two-qubit operations sets a practical ceiling on what the machine can do. This is why two-qubit gate fidelity has become the standard benchmark for comparing quantum hardware.

Why 99.99% Matters

There is a threshold in quantum error correction: keep the physical error rate below it and the whole scheme works; go above it and it falls apart (in theory). The principle is simple: if your error rate is too high, the process of detecting and correcting errors will itself introduce new errors faster than it removes them. You end up worse off than if you had done nothing.

 

Stay tuned for a future article on how error correction works – subscribe for free!

 

For most practical error correction schemes, that threshold sits somewhere around a 1% error rate (99% fidelity). Once your error rate is comfortably below that threshold, adding more physical qubits to your error correction code actually makes the system more reliable. Above it, adding qubits just adds more noise.

 

But being just below the threshold is not enough. At 99% fidelity, you might need thousands of physical qubits to maintain a single logical qubit. The overhead is so large that building a useful machine becomes impractical. As fidelity improves, the overhead drops. At 99.9%, the numbers start to look more reasonable. At 99.99%, they improve dramatically. IonQ has estimated that the difference between 99.9% and 99.99% fidelity translates to a ten-billion-fold reduction in logical error rates for equivalent systems. Each additional “nine” is not a linear improvement. It is an exponential one.
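A commonly used rough approximation for surface-code performance makes the "each nine is exponential" point concrete: the logical error rate scales as p_L ≈ A · (p/p_th)^((d+1)/2), with roughly 2d² physical qubits per logical qubit at code distance d. The constants here (A = 0.1, p_th = 1%) are illustrative textbook values, not IonQ's numbers, so treat the outputs as a sketch of the scaling, not a forecast:

```python
def physical_qubits_needed(p: float, target: float,
                           p_th: float = 0.01, A: float = 0.1) -> int:
    """Rough surface-code estimate: find the smallest odd distance d with
    A * (p/p_th)**((d+1)/2) <= target, then count ~2*d**2 physical qubits.
    Constants are illustrative only."""
    d = 3
    while A * (p / p_th) ** ((d + 1) / 2) > target:
        d += 2
    return 2 * d * d

# Physical qubits per logical qubit to reach a 1e-12 logical error rate:
for p in (5e-3, 1e-3, 1e-4):
    print(f"physical error rate {p}: {physical_qubits_needed(p, 1e-12)} qubits")
```

Under these toy assumptions, halving the distance to threshold barely helps, while dropping the physical error rate from 0.5% to 0.01% shrinks the per-logical-qubit overhead from tens of thousands of qubits to a few hundred. That is the lever the four-nines result pulls on.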

 

This is why the quantum computing industry talks about “four nines” (99.99%) as a milestone. It marks the point where fault-tolerant quantum computing starts to become practical, not just theoretically possible.

 

The IonQ Announcement

In October 2025, IonQ announced that it had achieved 99.99% two-qubit gate fidelity, becoming the first company to cross the four-nines threshold. The previous record, 99.97%, was set in 2024 by Oxford Ionics, which IonQ has since acquired. The result was achieved using a technology called Electronic Qubit Control (EQC), which replaces the bulky laser systems traditionally used in trapped-ion quantum computers with microwave signals delivered through chip-integrated electronics. 

 

Subscribe to be the first to read our report about the IonQ acquisition of Oxford Ionics, and why it matters for the future of FTQC.

The company has stated that this level of hardware performance is sufficient to scale toward millions of qubits by 2030. That timeline is ambitious, to say the least, and there are significant engineering challenges ahead. But the news does represent measurable progress toward a future where quantum computing might, one day, become useful.