Quantum Observer #5 — The Walls Are Closing In (From Both Sides)
In a single week, three papers rewrote the math on how soon quantum computers could break your encryption. And a quiet silicon result may have added another path to get there.
In this edition: The biggest week in quantum resource estimation just happened, and I don’t think most people have processed what it means. Google Quantum AI published a landmark paper showing that fewer than 500,000 superconducting qubits could break Bitcoin’s ECC-256 cryptography in under nine minutes. On the same day, Oratomic and Caltech showed the same attack could work with just 10,000 neutral atom qubits, at the cost of months of runtime. Weeks earlier, the French team behind last year’s RSA breakthrough turned its optimization pipeline on elliptic curves. I unpack all three papers and explain why the convergence matters more than any single result.

I also introduce my new CRQC Scorecard, a modality-by-modality assessment of how close each quantum platform actually is to cryptographic relevance, and the updated Q-Day Estimator tool. On the policy front, the U.S. intelligence community just elevated quantum to the same threat tier as AI, and Google drew a hard line: full PQC migration by 2029. To help with that migration, I’ve released the complete Applied Quantum PQC Migration Framework under Creative Commons, freely available at PQCframework.com.

Then in the second half, I cover the story that nobody noticed: silicon quantum computing just ticked its last box on the fault-tolerance checklist, and it happened in Shenzhen.
The Resource Estimation Revolution: Three Papers That Redrew the Map
For years, the standard answer to “how many qubits to break encryption?” was a comfortably large number. Gidney and Ekerå’s 2021 estimate put RSA-2048 at roughly 20 million physical qubits. That figure became a security blanket for organizations that wanted to delay action — 20 million qubits felt safely impossible.
In the span of a few weeks this March, three separate research teams tore that security blanket to shreds. Not by building bigger quantum computers, but by finding dramatically more efficient ways to use smaller ones. And critically, the three papers aren’t independent — they build on and reference each other, forming a coherent body of evidence that the algorithmic side of the CRQC problem is advancing far faster than most threat models assume.
Google’s Bitcoin Paper: 500,000 Qubits, Nine Minutes
The headline-grabber landed on March 31. Google Quantum AI published what I consider the most significant quantum cryptanalysis resource estimate since Gidney’s 2021 RSA paper. The target: the elliptic curve discrete logarithm problem on 256-bit curves — the cryptography that protects Bitcoin, Ethereum, and the overwhelming majority of TLS sessions on the internet.
The core numbers: fewer than 500,000 superconducting physical qubits to solve ECDLP-256 in under nine minutes on a fast-clock architecture. That’s roughly a 10x reduction over the best prior estimates for this problem.
How? The improvements aren’t a single trick but an entire pipeline of optimizations — more efficient windowed arithmetic, better modular inversion circuits, tighter Toffoli-to-T compilations, and crucially, a compressed spacetime volume that shifts difficulty from qubit count to circuit depth. The paper also includes something genuinely novel: a zero-knowledge proof that the published circuits are correct, allowing independent verification without revealing proprietary compilation details.
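To make the spacetime-volume framing concrete, here is a toy conserved-volume model in Python. The qubit-minutes constant is reverse-engineered from the headline figures (500,000 qubits, nine minutes); a real resource estimate does not trade off this cleanly, so treat every number as illustrative rather than as figures from the paper.

```python
# Toy illustration of the space-time tradeoff behind these estimates.
# Treat the attack's "spacetime volume" (physical qubits x runtime) as
# roughly conserved, and see how shrinking one axis inflates the other.

def runtime_minutes(physical_qubits: int, spacetime_volume: float) -> float:
    """Runtime implied by a fixed qubit-minutes budget."""
    return spacetime_volume / physical_qubits

# Volume chosen so 500,000 qubits -> 9 minutes, matching the headline shape.
VOLUME = 500_000 * 9

for qubits in (500_000, 100_000, 10_000):
    print(f"{qubits:>8,} qubits -> {runtime_minutes(qubits, VOLUME):,.0f} minutes")
```

The point is directional: once the spacetime volume is compressed, an architect can spend it on fewer qubits or on shorter runtimes, which is exactly the tradeoff the Oratomic paper exploits in the other direction.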
But here’s the nuance that matters for CISOs reading this: a 9-minute attack window against ECC-256 has very different implications depending on the protocol. For harvest-now-decrypt-later (HNDL) attacks against stored data, runtime barely matters — an attacker with a CRQC can take as long as they need. For real-time attacks against Bitcoin transactions, the 9-minute window is operationally significant. Bitcoin’s confirmation time is roughly 10 minutes, meaning a sufficiently powerful quantum computer could theoretically forge a transaction before it’s confirmed. The full analysis is in my deep dive on the Google paper.
Oratomic’s Bombshell: 10,000 Qubits (But Read the Fine Print)
On the same day, and this is not a coincidence, a team from Oratomic, Caltech, and UC Berkeley published a paper claiming that Shor’s algorithm can be executed with as few as 10,000 reconfigurable neutral atom qubits. The names on the paper are a who’s-who of quantum error correction: Dolev Bluvstein (who led Harvard’s landmark neutral atom experiments), John Preskill (one of the founders of quantum error correction theory), and Hsin-Yuan Huang (Caltech).
The qubit reduction is staggering: 50x lower than Google’s superconducting estimate for the same ECC-256 attack, and roughly 100x lower than Gidney’s updated RSA-2048 estimate of roughly 1 million qubits. The key innovation is high-rate quantum low-density parity-check (qLDPC) codes with approximately 30% encoding rate, compared to the roughly 4% achieved by the surface codes Google and Gidney used. In practical terms, each physical qubit does far more useful work.
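A back-of-envelope sketch of what encoding rate alone implies for physical counts, using the approximately 4% and 30% rates quoted above. The logical-qubit demand below is a hypothetical placeholder, and real estimates add ancilla, routing, and magic-state factory overhead on top.

```python
# Physical qubits implied by a code's encoding rate (logical / physical).
# Ignores ancilla, routing, and distillation overhead, which real
# resource estimates must include.

def physical_qubits(logical_qubits: int, encoding_rate: float) -> int:
    return round(logical_qubits / encoding_rate)

LOGICAL = 1_200  # hypothetical logical-qubit demand, not a figure from the paper

print("surface code (~4% rate):", physical_qubits(LOGICAL, 0.04))  # 30000
print("qLDPC (~30% rate):      ", physical_qubits(LOGICAL, 0.30))  # 4000
```

Whatever the logical workload, a 30% rate needs about 7.5x fewer physical qubits than a 4% rate on this axis alone; the rest of the 50x gap comes from the other architectural choices in the paper.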
But, and this is critical, the paper trades qubits for time. The 10,000-qubit architecture would take days to months to complete the computation, depending on the target and parallelism configuration. For RSA-2048 in the space-efficient regime, the runtime stretches into months. This is not a 9-minute attack. It’s a different point on the space-time tradeoff curve — fewer qubits, vastly more time.
Does that make it less threatening? Not necessarily. An adversary running a months-long computation against harvested encrypted traffic doesn’t care about speed. And the paper’s balanced architecture — roughly 13,000 qubits with more parallelism — brings the ECC-256 attack down to days rather than months. These are still hypothetical machines. But “hypothetical machine with 13,000 qubits” is a very different planning target than “hypothetical machine with 20 million qubits.” My full analysis of the Oratomic paper covers the runtime tradeoffs, the qLDPC code innovations, and what this means for threat timeline assessment.
The INRIA Optimization Pipeline: ECC Gets the Same Treatment as RSA
These two papers didn’t emerge from nowhere. Weeks earlier, a French team at INRIA Rennes (Clémence Chevignard, Pierre-Alain Fouque, and André Schrottenloher) published a new algorithm that further shrinks the quantum attack surface for elliptic curve cryptography. This is the same team whose CRYPTO 2025 paper on RSA factoring optimizations was subsequently used by Google’s Craig Gidney to bring RSA-2048 estimates from 20 million down to roughly 1 million physical qubits.
Now they’ve turned the same optimization pipeline on the elliptic curve discrete logarithm problem. The gate count reductions are significant, and the work, submitted to EUROCRYPT 2026, feeds directly into the resource estimates that Google and Oratomic subsequently published. This is the pattern I keep emphasizing: algorithmic improvements compound. Each optimization at the circuit level cascades through every subsequent resource estimate.
The Big Picture: Convergence, Not Individual Breakthroughs
Any one of these papers alone would be notable. Together, they represent something more important: a convergence of separate research groups arriving at dramatically lower resource estimates through different approaches.
The resource estimate trajectory tells the story. For RSA-2048: 20 million physical qubits (Gidney & Ekerå, 2021) → roughly 1 million (Gidney, 2025) → under 100,000 with the Pinnacle architecture (February 2026) → as low as 11,000–13,000 neutral atom qubits with qLDPC codes (Oratomic, March 2026). For ECC-256: millions of qubits in prior estimates → under 500,000 (Google, March 2026) → as low as 10,000 (Oratomic, March 2026). Different targets, different architectures, different tradeoffs — but the same direction. The cost of breaking today’s cryptography has dropped by orders of magnitude in five years, and the algorithmic pipeline shows no signs of slowing.
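The RSA-2048 trajectory above, tabulated. I take the round upper figures from the text (100,000 for the Pinnacle architecture, 13,000 for Oratomic) purely to show cumulative reduction factors:

```python
# Published RSA-2048 physical-qubit estimates, as listed in the text,
# with cumulative reduction relative to the 2021 baseline.
estimates = [
    ("Gidney & Ekera, 2021", 20_000_000),
    ("Gidney, 2025",          1_000_000),
    ("Pinnacle arch., 2026",    100_000),
    ("Oratomic qLDPC, 2026",     13_000),
]

baseline = estimates[0][1]
for label, qubits in estimates:
    print(f"{label:<22} {qubits:>12,}  ({baseline / qubits:,.0f}x below 2021)")
```

That is more than a 1,500x drop in five years, with most of it coming from algorithms and codes rather than from hardware.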
This is why the argument that Q-Day is safely decades away needs constant re-examination. The hardware gap is real: nobody has a machine with 500,000 physical qubits, and nobody has a 10,000-qubit neutral atom machine with the error correction quality these papers assume. But the algorithmic side of the equation is compressing faster than most forecasters assumed. And the algorithmic improvements don’t require building anything — they’re available to anyone with a good idea and a preprint server.
The CRQC Scorecard: Measuring the Gap That Actually Matters
To make sense of this rapidly shifting landscape, I’ve published what I think is the most comprehensive modality-by-modality assessment of how close each quantum computing platform is to cryptographic relevance: The CRQC Scorecard.
The scorecard evaluates five major quantum modalities (superconducting, trapped-ion, neutral-atom, photonic, and silicon spin) against three metrics that define the CRQC threat: Logical Qubit Capacity (LQC), Logical Operations Budget (LOB), and Quantum Operations Throughput (QOT). These map directly to the CRQC Capability Framework I developed years ago, but compressed into three executive-level levers that non-specialists can track.
The key finding: no modality is close to CRQC capability today. But the rate of progress varies enormously across platforms, and the new resource estimates change the goalposts significantly. When the target was 20 million qubits, every platform was equally far away. When the target drops to 500,000, or 10,000 for architectures that support qLDPC codes, the relative positions shift dramatically.
I’ve also updated the CRQC Readiness Benchmark (Q-Day Estimator) tool on PostQuantum.com. The tool now incorporates the latest resource estimates from all three papers and lets you model different scenarios: pick a resource estimation paper, select a quantum modality, set your own growth assumptions, and see the projected Q-Day timeline. Conservative, median, and aggressive presets are available, or you can plug in vendor roadmap claims and see whether they hold up against historical delivery rates.
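The core scenario logic of any Q-Day estimator reduces to a crossover calculation: when does an assumed qubit growth curve reach a paper’s resource target? A minimal sketch follows, with inputs that are my illustrative assumptions, not the actual PostQuantum.com model.

```python
# Crossover-year sketch: exponential qubit growth vs a fixed resource target.
import math

def crossover_year(start_year: int, current_qubits: float,
                   annual_growth: float, target_qubits: float) -> float:
    """Year when current_qubits * annual_growth**t first reaches the target."""
    years = math.log(target_qubits / current_qubits) / math.log(annual_growth)
    return start_year + years

# Illustrative: 1,500 physical qubits today, doubling yearly, 500k-qubit target.
print(round(crossover_year(2026, 1_500, 2.0, 500_000), 1))  # 2034.4
```

Swap in a slower growth assumption (say 1.5x per year) or a 10,000-qubit qLDPC target and the crossover year moves by five or more years in either direction, which is exactly why the tool exposes those levers.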
Try it yourself: CRQC Readiness Benchmark (Q-Day Estimator). I think it’s the most rigorous public tool for scenario-based Q-Day forecasting, and I welcome challenges to the methodology.
The Intelligence Community Gets It: Quantum Is Now a Tier-1 Threat
If you needed a signal that the quantum threat has graduated from theoretical curiosity to active national security concern, the U.S. intelligence community just provided one. In the 2026 Annual Threat Assessment, quantum computing received its own dedicated section — alongside AI, not buried under a generic “cyber” heading.
This matters more than it might seem. The ATA is the single most-read intelligence product in the U.S. government. It shapes budgets, policy priorities, and attention at the highest levels. Previous editions mentioned quantum in passing. This year, quantum got equal billing with artificial intelligence as a transformative technology threat.
Perhaps more significantly, the ATA expanded the quantum threat definition beyond cryptographic attack. It explicitly acknowledges quantum sensing, quantum networking, and the broader implications of quantum advantage across intelligence collection, defense, and economic competition. This tracks with the argument I’ve been making on PostQuantum.com: the quantum threat isn’t just about Q-Day. It’s about a fundamental shift in what’s computationally possible. My full analysis of the ATA’s quantum sections is here.
Google’s 2029 Deadline: The Ecosystem Argument in Action
And speaking of deadlines being set: Google has now publicly committed to completing PQC migration across all its products and infrastructure by 2029. Not started. Completed.
This is exactly the dynamic I described in my Q-Day Deadlines Are Set analysis. The reason to act on post-quantum cryptography isn’t a specific Q-Day prediction — it’s that the ecosystem is setting deadlines regardless of when a CRQC arrives. When Google, the company that arguably understands the quantum threat better than any other (it’s simultaneously building the quantum computers and defending against them), sets a 2029 completion target for PQC migration, that becomes a de facto standard for every organization in its supply chain.
If your cryptographic infrastructure interfaces with Google’s (and whose doesn’t?), you now have a deadline whether you set one or not.
The PQC Migration Framework: Now Open and Free
Speaking of migration: I’ve published an update to something I’ve been working on for a long time. The complete Applied Quantum PQC Migration Framework — the full methodology for migrating enterprise cryptography to post-quantum standards — is now freely available under Creative Commons (CC BY 4.0).
This is not another repackaging of NIST guidance or a theoretical migration model. It’s an 8-phase lifecycle covering everything from executive mandate and business case through discovery, CBOM, risk scoring, roadmap, pilots, infrastructure modernization, and vendor governance. It includes cross-cutting sections on crypto-agility architecture, maturity models, metrics, and regulatory mapping. And it comes with four sector-specific extensions: Financial Services, Telecommunications, Government & Defense, and Critical Infrastructure / OT.
I embedded some hard-earned lessons into it. The framework deliberately diverges from conventional industry approaches where practical experience has shown they don’t work — minimum-viable CBOM instead of exhaustive inventories, risk-driven discovery scoping instead of boil-the-ocean audits, vendor governance first rather than as an afterthought. Where I take these more pragmatic positions, I defend each one with evidence, and we’ve worked with regulators who have accepted and in some cases adopted these approaches.
If you’re a CISO figuring out how to start, a program manager staring at a multi-year migration, a security architect navigating hybrid deployment, or a consultant helping clients get quantum-ready — this is for you. Publishing under CC BY 4.0 means anyone can use it, including commercially, with attribution. The full framework is at pqcframework.com.
Silicon Just Ticked Its Last Box — and Nobody Noticed
While Google’s cryptocurrency paper and the neutral atom hype cycle dominated headlines this month, a team in Shenzhen quietly demonstrated something that no silicon quantum processor had ever done: universal logical operations.
The SZIQA team at Southern University of Science and Technology used five phosphorus nuclear spins in isotopically purified silicon to encode two logical qubits, implement the complete universal gate set (including the notoriously difficult T gate), and run a variational quantum eigensolver on those encoded qubits, computing the ground-state energy of a water molecule. They also prepared magic states above the Bravyi-Kitaev distillation threshold, the gateway to fault-tolerant universal computation. The work was published in Nature Nanotechnology, and it came from China rather than from the Australian groups that have led donor silicon work for two decades.
On its own, this is a nice result with modest fidelities. Two logical qubits isn’t going to threaten anyone’s cryptography.
But here’s why I think it matters far more than the headlines suggest: this was the last fundamental capability that silicon hadn’t demonstrated. I went back through the record and realized that over the past four years, silicon spin qubits have systematically checked every single box on the fault-tolerance checklist — gates above threshold (2022), error correction protocols (2022), multi-qubit algorithms above threshold (2025), modular multi-register scaling (2025), stabilizer-based error detection (2026), and now universal logical operations with distillable magic states (2026). No single result was a blockbuster. But the aggregate is remarkable.
So I wrote a comprehensive analysis of silicon’s position in the quantum computing race: The Dark Horse: How Silicon Quietly Assembled Every Building Block for Fault-Tolerant Quantum Computing.
The Manufacturing Argument Nobody Else Is Making
The thesis: silicon is the platform everyone underestimates because it’s always a step behind the leaders on scale — superconducting qubits have more qubits, neutral atoms have demonstrated error correction at larger distances. But silicon is the only platform where every demonstrated capability has a credible path to industrial-scale manufacturing. The same EUV scanners, CVD chambers, and cleanroom protocols that produce today’s processors can, with targeted modifications, produce quantum chips. No other qubit platform can say this.
Though as I note in the article, “targeted modifications” is doing some heavy lifting — you need isotopically purified ²⁸Si, ultra-clean gate oxides, and fabrication precision beyond standard CMOS. The bridge from classical to quantum silicon is real, but it’s an engineering bridge, not a flat road.
The Biased Noise Finding the Security Community Missed
There’s also a finding from the SZIQA experiments that I think the security community has completely missed: silicon donor qubits exhibit strongly biased noise — phase-flip errors dominate while bit-flip errors are essentially absent. Theoretical work shows this can push fault-tolerance thresholds from ~1% to over 5%, which means silicon could need significantly fewer physical qubits per logical qubit than standard CRQC resource estimates assume. I wrote a separate deep dive on the biased noise advantage because I think it deserves its own analysis.
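To see why a higher threshold matters so much, here is the standard surface-code scaling heuristic: logical error ≈ A·(p/p_th)^((d+1)/2), with about 2d² physical qubits per logical qubit at distance d. The constant A, the physical error rate, and the logical error target below are illustrative assumptions, not numbers from the SZIQA paper.

```python
# How the fault-tolerance threshold p_th drives physical-qubit overhead,
# under the textbook surface-code error-suppression heuristic.
import math

def min_distance(p: float, p_th: float, target: float, a: float = 0.1) -> int:
    """Smallest odd code distance d with a*(p/p_th)**((d+1)/2) <= target."""
    k = math.ceil(math.log(target / a) / math.log(p / p_th) - 1e-9)
    return 2 * k - 1

P, TARGET = 1e-3, 1e-12            # assumed physical rate and logical target
for p_th in (0.01, 0.05):          # the ~1% vs ~5% thresholds from the text
    d = min_distance(P, p_th, TARGET)
    print(f"threshold {p_th:.0%}: distance {d}, ~{2 * d * d:,} physical qubits/logical")
```

Under these assumptions, the jump from a 1% to a 5% threshold cuts the distance from 21 to 13 and the per-logical-qubit overhead from roughly 900 to roughly 340 physical qubits, about 2.6x. Bias-tailored codes can do better still, but even this crude model shows why threshold is one of the highest-leverage numbers in the stack.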
What This Means for Your Threat Model
What does this mean for your quantum threat planning? It doesn’t change near-term timelines. But it adds another credible pathway to a CRQC — and silicon is classified as a “fast-clock” architecture alongside superconducting and photonic qubits, meaning a silicon-based quantum computer would execute cryptographic attacks in minutes, not hours. If your risk model has been quietly discounting silicon as “too far behind,” the dark horse article lays out why that assumption needs updating.
The full analysis, with five things to watch for and a comparison against every other platform, is here. It’s the anchor of a ten-article series covering the complete progression of silicon quantum computing. Individual milestone papers are linked from this one.
The Bottom Line
Take a step back and look at what happened in March 2026 alone. Resource estimates dropped by another order of magnitude. A new quantum modality (neutral atoms with qLDPC codes) entered the cryptanalytic conversation at shockingly low qubit counts. Silicon, the platform most analysts ignore, completed its fault-tolerance checklist. The U.S. intelligence community elevated quantum to a Tier-1 threat. And Google set a 2029 migration deadline.
None of these developments individually mean Q-Day is imminent. But collectively, they demolish the case for inaction. The algorithmic walls are closing in from one side. The hardware is advancing from the other. And the ecosystem of regulators, insurers, supply chains, and intelligence agencies isn’t waiting for the two to meet.
If you haven’t started your PQC migration planning, the question is no longer “when should we start?” It’s “how do we explain why we haven’t?” The PQC Migration Framework is free, comprehensive, and ready to use. The Q-Day Estimator can help you model the timeline. The excuses are running out.
If you found this edition useful, forward it to a colleague who’s still on the fence about quantum risk. If I got something wrong, hit reply — I read everything and correct publicly. And if you’re new here, you can browse the full PostQuantum.com resource library at postquantum.com for the deep dives behind every topic covered above.
— Marin


