How can quantum error mitigation improve noisy intermediate-scale devices?

Noisy intermediate-scale quantum (NISQ) devices are limited by decoherence, imperfect gates and readout errors that corrupt quantum computations. John Preskill at the California Institute of Technology framed the NISQ era as one where noise dominates but useful experiments remain possible. Quantum error mitigation does not replace full fault-tolerant error correction; instead it applies algorithmic, calibration and post-processing techniques to reduce the impact of noise so that useful results can be extracted from devices with dozens to a few hundred qubits.

Techniques and experimental evidence

Several practical approaches have emerged. Zero-noise extrapolation deliberately amplifies the circuit's noise by known factors (for example by stretching gate pulses or folding gates into identity-equivalent sequences) and extrapolates the measured expectation values back to the zero-noise limit. Probabilistic error cancellation uses a quasi-probability decomposition of noisy gates to statistically cancel errors; the theoretical foundations and algorithmic description were developed by Kristan Temme, Sergey Bravyi, and Jay Gambetta at IBM Research. Symmetry verification enforces conserved quantities of a problem Hamiltonian to detect and reject erroneous runs. Experimental demonstrations on chemistry problems and variational algorithms come from Abhinav Kandala at IBM Research and collaborators, showing that error mitigation improves observable estimates on current hardware. These methods are often combined with readout-error calibration and circuit recompilation to maximize benefit.
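The zero-noise extrapolation step above can be sketched in a few lines. The sketch below is a minimal illustration, not any particular library's implementation: it assumes the same observable has been estimated at several known noise-amplification factors, then fits a polynomial and evaluates it at zero noise. The measurement values are hypothetical.

```python
import numpy as np

def zero_noise_extrapolate(scale_factors, expectation_values, degree=1):
    """Fit a polynomial to expectation values measured at amplified
    noise levels and evaluate it at the zero-noise limit (scale -> 0)."""
    coeffs = np.polyfit(scale_factors, expectation_values, degree)
    return np.polyval(coeffs, 0.0)

# Hypothetical data: the same observable estimated after amplifying
# the noise by 1x, 2x, and 3x (e.g. via gate folding).
scales = [1.0, 2.0, 3.0]
values = [0.81, 0.66, 0.54]   # illustrative noisy expectation values

mitigated = zero_noise_extrapolate(scales, values, degree=1)
# The linear fit extrapolates to ~0.94, above every raw measurement,
# approximating what the noiseless circuit would have returned.
```

In practice the choice of extrapolation model (linear, polynomial, exponential) matters: an ill-suited model can over- or under-correct, which is one reason noise characterization is emphasized below.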

Relevance, causes and consequences

Error mitigation matters because building large-scale, fault-tolerant quantum computers remains technically and economically challenging. By addressing dominant noise sources—finite coherence times, gate infidelities and measurement bias—mitigation extends the useful algorithmic depth of NISQ hardware and makes near-term applications in quantum chemistry, materials and optimization more attainable. However, mitigation has limits: it typically requires increased sampling, classical post-processing and accurate noise characterization, and its sampling cost generally grows exponentially as noise rates or circuit size increase. Correlated and non-Markovian errors reduce effectiveness, and some techniques assume error models that real devices only approximately satisfy.
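The exponential sampling cost can be made concrete for probabilistic error cancellation. In that scheme, each mitigated gate carries a quasi-probability one-norm gamma >= 1, and keeping the statistical error fixed requires roughly gamma**(2 * num_gates) times more shots than the unmitigated circuit. The numbers below are illustrative, not measured device parameters.

```python
def pec_shot_overhead(gamma_per_gate, num_gates):
    """Multiplicative increase in required shots, relative to running the
    unmitigated circuit, when every gate is mitigated with one-norm gamma."""
    return gamma_per_gate ** (2 * num_gates)

# Illustrative: gamma = 1.01 per gate (roughly a sub-percent error rate).
overhead_100 = pec_shot_overhead(1.01, 100)    # ~7.3x more shots
overhead_1000 = pec_shot_overhead(1.01, 1000)  # ~4.4e8x more shots
```

The jump from a modest factor at 100 gates to an astronomical one at 1000 gates is exactly why the paragraph above notes that mitigation costs grow quickly with system size, and why full error correction is still needed for deep circuits.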

Human and institutional factors shape progress: academic labs, national labs and technology companies prioritize different trade-offs between hardware scale and noise control, influencing which mitigation strategies are practical. Environmentally, mitigation shifts resource needs toward more classical computation and repeated experiments rather than dramatically larger cryogenic platforms. In sum, error mitigation provides a pragmatic bridge: it improves the immediate utility of NISQ devices while highlighting the technical gaps that full quantum error correction must eventually close.