IBM Achieves Record Quantum Calculation Fidelity, Extending Stability by Over 50%

IBM researchers, in collaboration with RWTH Aachen University and Quantum Elements, have broken the previous record for sustained high-fidelity quantum computation on superconducting qubits. The breakthrough, published in Nature Communications on February 27th, tackles a core challenge in quantum computing: maintaining stable calculations long enough to execute complex algorithms.

The Problem of Quantum Instability

Quantum computers rely on qubits, the quantum equivalent of bits, to process information. Unlike classical bits, qubits are inherently fragile, susceptible to noise from even minute vibrations or environmental disturbances. This fragility forces scientists to group multiple physical qubits together into “logical qubits” as a form of redundancy, but even this approach is vulnerable to “logical errors” — where multiple physical qubits fail simultaneously, corrupting the calculation.
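A minimal way to see both why redundancy helps and why it can still fail is the three-qubit repetition code: a single bit-flip on one physical qubit is corrected by majority vote, but two simultaneous flips produce exactly the kind of "logical error" described above. The sketch below is a simplified classical analogy (it ignores quantum phases and phase-flip errors), not IBM's encoding:

```python
from collections import Counter

def encode(bit):
    """Encode one logical bit into three physical copies (repetition code)."""
    return [bit] * 3

def apply_flips(physical, flip_indices):
    """Corrupt (flip) the physical bits at the given indices."""
    return [b ^ 1 if i in flip_indices else b for i, b in enumerate(physical)]

def decode(physical):
    """Majority vote recovers the logical bit."""
    return Counter(physical).most_common(1)[0][0]

# A single physical error is corrected by the redundancy...
assert decode(apply_flips(encode(0), {1})) == 0
# ...but two simultaneous errors flip the majority: a logical error.
assert decode(apply_flips(encode(0), {0, 2})) == 1
```

Real quantum codes must also handle phase errors without directly measuring the data qubits, which is what makes scaling them up so much harder than this classical picture suggests.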

The issue is especially acute in IBM’s superconducting processors, such as the 127-qubit “Kyiv” and 156-qubit “Marrakesh” chips, which suffer from a specific type of noise called “ZZ crosstalk.” Traditional error-correction methods struggle to scale effectively without introducing additional errors.

The Solution: Normalizer Dynamical Decoupling (NDD)

The research team developed a novel hybrid error-suppression protocol called Normalizer Dynamical Decoupling (NDD). Instead of applying noise-reduction pulses at the hardware level alone, NDD adjusts the timing of these pulses to synchronize with the quantum code being executed. This requires a mathematical “normalizer” that dynamically tunes the pulses, allowing them to counteract noise more efficiently.
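The simplest member of the dynamical-decoupling family, the spin echo, illustrates the underlying pulse idea: a single refocusing pulse at the midpoint of an idle period inverts subsequent phase accumulation, so slowly varying noise cancels itself out. The toy model below demonstrates only this generic echo effect; the paper's normalizer-based pulse timing is not reproduced here:

```python
import math
import random

def accumulated_phase(detuning_hz, total_time_s, echo=False):
    """Phase (radians) a qubit accumulates under a quasi-static detuning.

    With echo=True, a refocusing (pi) pulse at the midpoint flips the
    sign of later phase accumulation, cancelling slow noise -- the core
    mechanism of dynamical decoupling.
    """
    half = total_time_s / 2
    if not echo:
        return 2 * math.pi * detuning_hz * total_time_s
    # First half accumulates +phase; after the pi pulse, -phase.
    return 2 * math.pi * detuning_hz * half - 2 * math.pi * detuning_hz * half

random.seed(0)
detuning = random.gauss(0, 1e3)   # unknown slow noise, ~kHz scale (assumed)
free = accumulated_phase(detuning, 55e-6)               # free evolution
echoed = accumulated_phase(detuning, 55e-6, echo=True)  # with one pulse
print(abs(echoed) < abs(free))    # the echo removes the quasi-static phase
```

NDD's contribution, per the article, is synchronizing when such pulses fire with the structure of the quantum code itself, rather than spacing them by a fixed hardware schedule.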

The results are significant:
* Peak encoding fidelity reached 98.05% — higher than any previously recorded.
* This fidelity was sustained at 84.87% for 55 microseconds, more than double the previous record of 27 microseconds.

Why This Matters

The longer a quantum computer can maintain high fidelity, the more complex the calculations it can perform. A sustained 55-microsecond window allows roughly 4,500 to 5,500 consecutive quantum operations before the data degrades. While microseconds may sound brief, doubling the stable computation window is a substantial improvement.
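The 4,500 to 5,500 figure follows from dividing the stable window by a per-operation time. The gate durations below are assumed typical values for superconducting hardware, chosen to reproduce the article's range, not numbers from the paper:

```python
window_s = 55e-6                  # sustained high-fidelity window (55 us)
gate_times_s = (10e-9, 12e-9)     # assumed per-operation times: 10-12 ns

# Number of back-to-back operations that fit in the window at each speed.
ops = [int(window_s / t) for t in gate_times_s]
print(ops)  # -> [5500, 4583], matching the ~4,500-5,500 estimate
```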

Quantum computing’s ultimate goal is to tackle problems impossible for classical computers, like breaking modern encryption. Factoring the large numbers behind today’s encryption with Shor’s algorithm could take weeks or months on a sufficiently capable quantum system; the same task would take a classical machine trillions of years.

This milestone brings that future closer, demonstrating that sustained high-fidelity quantum computation is achievable. The team’s success underscores the importance of hybrid error-suppression techniques and dynamic optimization in advancing quantum technology.