New research from the California Institute of Technology suggests that quantum computers capable of breaking modern cryptography may require significantly fewer qubits than previously thought. The study, published Monday, was conducted in collaboration with Oratomic, a Pasadena-based quantum computing startup founded by Caltech researchers. The work centers on a neutral-atom system in which individual atoms are trapped and manipulated using lasers to function as qubits. This approach could allow a fault-tolerant quantum computer to execute Shor’s algorithm with as few as 10,000 reconfigurable atomic qubits.
Shor’s algorithm is capable of deriving private keys from the public keys used in Bitcoin’s elliptic-curve cryptography, posing a direct threat to the security of the network. Historically, estimates of the number of qubits needed to run the algorithm at scale have run into the billions. Oratomic co-founder and CEO Dolev Bluvstein, a visiting associate in physics at Caltech, noted that just over a decade ago the best laboratory systems contained roughly five qubits, while estimates for running Shor’s algorithm called for around one billion. The rapid compression of that gap, he said, is accelerating pressure to adopt quantum-resistant cryptography.
A key factor driving earlier estimates higher has been the overhead associated with error correction. Today’s most common error-correction systems typically require around 1,000 physical qubits to produce a single reliable logical qubit, the error-corrected unit used in actual computations. That ratio has pushed projections for practical fault-tolerant machines into the millions of qubits, slowing the perceived timeline for threats to encryption standards used by Bitcoin and Ethereum. The new research suggests that overhead can be substantially reduced using the neutral-atom approach.
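The arithmetic behind those shifting projections can be sketched in a few lines. The 1,000:1 overhead ratio and the 10,000-qubit figure come from the article; the logical-qubit count and the reduced overhead ratio below are illustrative placeholders, not figures from the study.

```python
# Back-of-the-envelope sketch of how error-correction overhead drives
# total qubit estimates. Physical qubits = logical qubits x overhead.

def physical_qubits_needed(logical_qubits: int, overhead: int) -> int:
    """Total physical qubits for a given logical-qubit count and
    physical-per-logical error-correction overhead."""
    return logical_qubits * overhead

# Illustrative assumption: an algorithm needing a few thousand
# error-corrected logical qubits.
logical = 2_000

# Conventional overhead cited in the article: ~1,000 physical qubits
# per logical qubit -- projections land in the millions.
print(physical_qubits_needed(logical, 1_000))  # 2,000,000

# A hypothetical reduced overhead of ~5:1 would bring the same
# workload near the 10,000-atom scale the new research describes.
print(physical_qubits_needed(logical, 5))      # 10,000
```

The point of the sketch is that the total scales linearly with overhead, so any reduction in the physical-to-logical ratio cuts the required machine size by the same factor.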
Bluvstein pointed out that current laboratory systems are already approaching or exceeding 6,000 physical qubits. In September, Caltech researchers demonstrated a neutral-atom quantum computer with 6,100 qubits operating at 99.98% accuracy and coherence times of 13 seconds. That milestone renewed concerns about the long-term security of blockchain networks and other systems relying on elliptic-curve and RSA cryptography. The gap between current capabilities and the 10,000-qubit threshold is narrowing faster than many experts anticipated.
Despite the shrinking qubit gap, Bluvstein cautioned that reaching the required physical-qubit count is only one part of the challenge. Building a functional fault-tolerant machine involves highly complex engineering tasks that go well beyond assembling components. Maintaining extremely low error rates while scaling quantum systems remains a significant technical hurdle. Still, he said, a practical quantum computer could emerge before the end of the decade.
The findings arrive alongside separate research from Google, whose scientists reported on Tuesday that future quantum computers could break elliptic-curve cryptography using fewer resources than previously estimated. Together, the two studies add urgency to calls for a broad transition to post-quantum cryptography before such machines become operational. Governments and technology firms have already begun exploring encryption systems designed to withstand quantum attacks, though the migration remains in early stages across most sectors.
Bluvstein emphasized that the threat extends well beyond cryptocurrency networks. He described the risk as spanning the entire global digital infrastructure, including internet-of-things devices, routers, satellites, and general internet communications. The complexity of updating so many interconnected systems makes the transition to post-quantum standards a formidable undertaking. The challenge, he said, is not unique to blockchain but is shared by virtually every corner of the modern digital world.
Originally reported by Decrypt.
