Quantum Observer #6 — The Third Lever
Architecture joins algorithms in shrinking the path to CRQC. Cloudflare matches Google's 2029 deadline. And China's quantum strategy is more coordinated than the West admits.
In this edition: After last edition's coverage of three papers that rewrote the resource estimation math, I jokingly asked quantum researchers to give me a few days to catch up before publishing the next breakthrough. They did not listen, and this edition may be even more consequential.

- This week's lead is a paper I consider more strategically important than any of March's blockbusters: Q-CTRL published a heterogeneous architecture design that cuts RSA-2048 to 190,000–381,000 physical qubits — not by inventing a new algorithm, but by organizing the hardware like a classical computer. Architecture is now the third independent lever compressing CRQC resource estimates, alongside algorithmic and QEC code innovation.
- Cloudflare joined Google in setting 2029 as its PQC migration deadline, explicitly citing last month's papers as the catalyst. Two companies that collectively operate much of the internet's infrastructure now agree on the timeline.
- QuiX Quantum put photonics on the board with the first below-threshold error mitigation result, one day after my CRQC Scorecard said photonics had scored zero on every metric.
- I also completed one of my most ambitious projects yet: a 10-article Deep Dive series examining every dimension of China's quantum program, with a capstone arguing that if current trends hold, China will win the quantum race.
- The PQC Migration Framework crossed 10,000 downloads, and the most common feedback revealed something troubling about how organizations are approaching migration.
- And in the "quantum flapdoodle" department: the physics of Ghost Murmur don't survive contact with a calculator.
The Paper Nobody Is Talking About Enough: Architecture as the Third Lever
Last month’s parade of resource estimation papers (Google’s 500,000-qubit Bitcoin result, Oratomic’s 10,000-qubit Shor’s, the INRIA ECC optimization pipeline) all shared a common lever: making the algorithm more efficient. A second line of work, qLDPC codes and similar constructions, attacks the error correction overhead. Both are important, and I spent the last newsletter unpacking them.
This week, Q-CTRL published a paper that demonstrates a third independent lever: architecture. Not a new algorithm. Not exotic error correction codes. Just a smarter way to organize the same computation across specialized hardware. The result: 190,000–381,000 physical qubits for RSA-2048, with a runtime of under 10 days.
The core insight is almost embarrassingly simple: in Gidney’s efficient implementation of Shor’s algorithm, each qubit sits idle for roughly 97% of all clock cycles. In a monolithic quantum computer, those idle qubits occupy the same expensive, actively error-corrected hardware as the working qubits. They accumulate errors, consume cryogenic cooling, and demand dense wiring, all while doing nothing. Q-CTRL’s Q-NEXUS architecture fixes this by separating processing from storage: a tiny superconducting QPU (three logical qubits per core) handles the fast gates, while idle qubits get shipped to quantum memory, connected via a quantum bus.
I’ve been arguing for years that the most likely path to a CRQC is a heterogeneous system rather than a monolithic chip. Q-CTRL’s paper is the most rigorous validation of this thesis on a real cryptographic benchmark.
Why Three Levers Matter More Than Any Single Paper
What I want security planners to internalize: these three levers — algorithms, QEC codes, and architecture — are largely independent. They multiply rather than add. The RSA-2048 trajectory tells the story: 20 million qubits (2021) → ~1 million (2025) → ~100,000 with qLDPC codes (2026) → 190,000–381,000 with architecture alone on surface codes (2026) → potentially well under 100,000 combining all three. That last number is starting to overlap with industry roadmaps for the late 2020s to early 2030s.
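To see why independence matters, here's a back-of-the-envelope sketch. The arithmetic is mine, not from any of the papers, and the "fully independent" assumption is optimistic — treat the result as an upper bound on combined compression, not a prediction:

```python
# Back-of-the-envelope: multiplying independent compression levers.
# Figures are the rounded milestones quoted above; full independence
# of the levers is my simplifying assumption.
baseline = 1_000_000                 # ~2025 algorithmically optimized estimate
arch_factor = baseline / 381_000     # architecture alone, conservative end (~2.6x)
qldpc_factor = baseline / 100_000    # qLDPC codes (~10x)

combined = baseline / (arch_factor * qldpc_factor)
print(f"{combined:,.0f} physical qubits")  # ~38,100 -> "well under 100,000"
```

Even taking the conservative end of the architecture range, multiplying it against the qLDPC lever lands far below 100,000 — which is why the levers together matter more than any single paper.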
Q-CTRL also introduces the concept of the Application-Specific QPU — the quantum equivalent of a GPU or ASIC. They found that 70% of the RSA-2048 runtime is consumed by a single subroutine, the Adder. A dedicated 37-logical-qubit accelerator for that operation cuts runtime by 46% for a 13% hardware increase. An adversary building a CRQC wouldn’t construct a general-purpose quantum mainframe — they’d build a purpose-optimized system, just as Bitcoin miners use ASICs rather than CPUs.
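The Adder numbers are a textbook Amdahl's-law trade. The per-subroutine speedup isn't quoted above, so the ~2.9× below is back-solved from the stated 46% overall reduction — a sketch of the arithmetic, not the paper's model:

```python
def relative_runtime(fraction: float, speedup: float) -> float:
    """Amdahl's law: total relative runtime when `fraction` of the
    work is accelerated by `speedup` and the rest is unchanged."""
    return (1 - fraction) + fraction / speedup

# ~70% of RSA-2048 runtime sits in the Adder subroutine.
# A ~2.9x accelerator on that subroutine alone (my back-solved
# figure, not a number from the paper) reproduces the quoted cut:
reduction = 1 - relative_runtime(0.70, 2.9)
print(f"{reduction:.0%} overall runtime reduction")  # ~46%
```

The same law explains why an Application-Specific QPU pays off so asymmetrically: when one subroutine dominates the runtime, a modest dedicated accelerator buys a large overall speedup for a small hardware increase.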
The caveats are real: the quantum bus (high-fidelity Bell-pair transfer between modules) hasn’t been demonstrated at scale, the memory technologies are early-stage, and the compiler is simulated. I document all of them in my full analysis. But these are engineering gaps, not physics barriers.
The strategic implication: the race to Q-Day may now be fundamentally a quantum interconnect race. The traditional threat indicator has been “how many qubits can platform X fabricate?” The new one should be “who has demonstrated high-fidelity quantum state transfer between a fast-clock QPU and a slow-clock memory?” I’ll be updating my CRQC Scorecard to add quantum interconnect maturity as a new tracking metric.
Cloudflare Joins Google: The 2029 Consensus Is Forming
Two weeks ago, Google set 2029 as its deadline for completing PQC migration. I wrote at the time that the question was whether this would remain an outlier or become the benchmark.
We got our answer in thirteen days.
Cloudflare announced it is accelerating its post-quantum roadmap to match Google’s 2029 target, explicitly citing last month’s Google ECC paper and Oratomic’s 10,000-qubit result as the catalysts. This isn’t a marketing gesture. Cloudflare handles DNS, CDN, DDoS protection, and reverse-proxy services for a substantial fraction of global web traffic. Between Google and Cloudflare, the companies that actually operate the internet’s plumbing have converged on the same number.
Three details in Cloudflare’s announcement deserve attention.
First, Cloudflare reports that over 65% of human-initiated traffic to its network already uses post-quantum encryption. But the new roadmap targets the harder problem: post-quantum authentication. Digital signatures, certificates, identity infrastructure. This is the domain of Trust Now, Forge Later (TNFL), a threat I first described in 2018 that’s finally getting the attention it warrants.
Second, the announcement quotes Scott Aaronson’s warning that researchers working on Shor’s algorithm resource estimates may have already stopped publishing their findings. If the world’s top quantum algorithm researchers are sitting on optimizations that make the published numbers even worse, the public resource estimates are an upper bound on difficulty, not a best estimate. That should concern anyone using published qubit counts for threat planning.
Third, this is exactly the dynamic I described in Q-Day Predictions Are Irrelevant — Deadlines Are Set. The reason to act on PQC isn't a specific Q-Day prediction — it's that the ecosystem is setting deadlines regardless of when a CRQC arrives. If your cryptographic infrastructure interfaces with Google's or Cloudflare's (and whose doesn't?), you now have a deadline whether you set one or not.
Photonics Puts Its First Points on the Board
I published my CRQC Scorecard on April 1st. The photonic section was the starkest: zero demonstrated logical qubits, zero logical gates, effectively the entire journey remaining on every metric.
That lasted exactly one day.
QuiX Quantum demonstrated below-threshold error mitigation on a photonic quantum computer — the first time any photonic platform has shown it can remove more errors than it introduces. The work, in collaboration with NASA, the University of Twente, and Freie Universität Berlin, used a 20-mode silicon-nitride processor to perform photon distillation: a technique that cleans up individual photons through quantum interference before computation.
Let me be precise about what this is and isn’t. Photon distillation is not quantum error correction. It is not a logical qubit. It doesn’t demonstrate syndrome extraction, magic state production, or any of the system-level capabilities in my CRQC Quantum Capability Framework. Photonics still has the largest gap to a CRQC of any modality I track.
But it’s the first time photonics has put points on the board. The modeling suggests photon distillation could reduce photon source requirements per logical qubit by up to 4×, directly attacking photonics’ biggest scalability bottleneck, since photon sources constitute the vast majority of components in a photonic quantum computer.
There’s also a detail worth noting: the project was partially funded by the Netherlands Ministry of Defense through a project called QSHOR. The name tells you what the defense establishment is interested in: photonic paths toward running Shor’s algorithm.
Full analysis in my QuiX deep dive.
China’s Quantum Ambition: The Deep Dive Is Complete
I’ve just completed one of the most ambitious research projects I’ve undertaken on PostQuantum.com: a 10-article Deep Dive series that took months of investigation and runs to roughly 80,000 words across industrial policy, investment architecture, the Hefei National Laboratory, talent pipelines, quantum networking and QKD, computing hardware, quantum sensing, supply chain self-sufficiency, and a capstone synthesis.
The capstone, Underestimating China: Why Beijing Could Win the Quantum Race, makes a case I expect will be contested: if nothing changes in current trend lines, China will win the quantum cold war.
That isn’t because China’s quantum hardware is better today. It isn’t — the error correction gap is real, and the private sector ecosystem is weaker. The case rests on something more structural. Across EVs, 5G, drones, robotics, solar, and shipbuilding, China has executed the same playbook: massive coordinated investment, long-horizon industrial policy, rapid talent scaling, and systematic conversion of Western export controls into accelerants for domestic self-sufficiency. That playbook is now pointed at quantum computing. China leads in 69 of 74 critical technologies tracked by ASPI. It operates the world’s only carrier-grade quantum network. It has exported its first quantum computer.
What makes this particularly dangerous is the asymmetry of institutional response. While China mobilizes government, military, state enterprises, and academia as a single coordinated system, the U.S. proposed cutting NSF quantum research funding by 37%. Scientists are leaving U.S. institutions for Chinese universities. Over 1,000 U.S. faculty signed a letter warning that America’s own policies have been more effective at driving talent to China than any recruitment program Beijing ever ran.
I documented China’s weaknesses honestly — the growing scientific isolation, the dependence on Western cryogenic components, the weak translation from research to commercial products. But the series also shows how systematically those weaknesses are being addressed, often with timelines measured in years rather than decades.
I lived and worked in China for years. I’ve watched industry after industry where Western executives dismissed Chinese competition, confident that quality gaps and IP challenges would hold. They were wrong every time. The full series is at postquantum.com/chinas-quantum-ambition, and the geopolitical dimensions are explored further in my forthcoming book, Quantum Sovereignty.
10,000 Downloads — and a Pattern in the Feedback
The PQC Migration Framework crossed 10,000 downloads this week. What people told me about it was more interesting than the number itself.
The most common response wasn’t gratitude or even disagreement — it was some version of alarm. Teams that thought they had a handle on PQC migration discovered that their internal approach was missing entire categories of work: cryptographic discovery beyond certificate inventories, vendor dependency analysis that typically defines the real critical path, hybrid deployment patterns that don’t break interoperability, governance structures for a multi-year program rather than a one-off project.
That pattern in the feedback confirmed something I’d suspected: many organizations that started thinking about PQC migration this year are working from a mental model of the problem that’s an order of magnitude too simple. The complexity isn’t in swapping one algorithm for another. It’s in finding every place cryptography lives in your environment — hardcoded keys, embedded protocols, third-party dependencies buried four layers deep in your supply chain — and managing a migration across thousands of systems with different vendor timelines, different regulatory requirements, and different tolerance for disruption.
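To make "finding every place cryptography lives" concrete, here's a deliberately minimal sketch of the very first discovery pass: a source-tree scan for quantum-vulnerable primitive names. Everything here — the pattern list, the file extensions — is illustrative only; a real inventory also needs binary analysis, network capture, certificate and HSM inspection, and vendor questionnaires, none of which a text scan can reach.

```python
import pathlib
import re

# Quantum-vulnerable primitives to flag (illustrative, not exhaustive).
VULNERABLE = re.compile(r"\b(RSA|ECDSA|ECDH|DSA|DH|secp256r1|prime256v1|x25519)\b")
SCAN_SUFFIXES = {".py", ".go", ".java", ".conf", ".cfg", ".yaml", ".yml", ".pem"}

def discover(root: str) -> dict[str, list[str]]:
    """Return {file_path: [primitives found]} for text files under root."""
    hits: dict[str, list[str]] = {}
    for path in pathlib.Path(root).rglob("*"):
        if not (path.is_file() and path.suffix in SCAN_SUFFIXES):
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        found = sorted(set(VULNERABLE.findall(text)))
        if found:
            hits[str(path)] = found
    return hits
```

Even this toy version makes the point: the output is a map of where cryptography lives, not a list of algorithms to swap — and the map, not the swap, is where most migration programs underestimate the work.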
The framework is free, open-source (CC BY 4.0), requires no registration, and is available at pqcframework.com. If you’ve started your quantum readiness journey, or think you have, stress-test your approach against it. The teams that had to restart weren’t behind — they’d been solving a simpler problem than the one they actually face. For a comprehensive guide to the organizational strategy behind all of this, my forthcoming book Quantum Ready covers the full picture.
Quantum Flapdoodle: Ghost Murmur and the Limits of Journalism
Every edition needs a reminder that the word “quantum” doesn’t make a claim true. This week delivered a spectacular example.
The New York Post reported that the CIA used a system called “Ghost Murmur”, described as a quantum magnetometer built by Lockheed Martin’s Skunk Works, to detect a downed American airman’s heartbeat from 40 miles away in the Iranian desert. The story went viral across defense blogs, investment sites, and cable news. Lockheed Martin’s stock ticked up. Nobody called a physicist.
The problem isn’t that quantum magnetometry is fake — it’s genuinely impressive science. The problem is arithmetic. The heart produces a magnetic field of roughly 25 picotesla at chest contact, which is already two million times weaker than Earth’s background field. That signal decays as the cube of distance. At 40 miles, you’re looking at a signal roughly 190 quadrillion times weaker than it is at the chest. This isn’t below the noise floor. It’s below the Heisenberg limit — the absolute measurement floor that quantum mechanics permits for any sensor, built by anyone, using any technology, now or in the future.
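The arithmetic is short enough to check yourself. The ~0.1 m effective source-to-sensor distance below is my assumption (the article doesn't state one); with it, inverse-cube decay over 40 miles gives an attenuation on the order of 10^17, the same order as the "roughly 190 quadrillion" figure:

```python
# Inverse-cube falloff of a magnetic dipole field (the cardiac field).
# The 0.1 m reference distance is an assumption for illustration.
B_chest_pT = 25.0            # cardiac field at chest contact, picotesla
d_ref_m = 0.1                # assumed effective measurement distance
d_far_m = 40 * 1609.344      # 40 miles in metres

attenuation = (d_far_m / d_ref_m) ** 3
print(f"signal weaker by ~{attenuation:.1e}x")  # ~2.7e+17
```

No sensor design recovers a signal that far below both the ambient field and the quantum measurement limit — which is the whole point of the calculator test.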
Scientific American has since published a thorough debunking, quoting physicists who range from diplomatic (“may overstate the maturity of the technology”) to blunt (“somebody yanking a reporter’s chain”). Ghost Murmur is most likely either deliberate disinformation designed to project capability to adversaries, or the intelligence community’s idea of an inside joke.
The actual quantum sensing research happening around the world is remarkable and worth covering seriously.
The Signal Through the Noise
Take a step back and look at what’s happened since the last newsletter.
Resource estimates for breaking cryptography gained a third compression lever — architecture — that’s independent of and multiplies with algorithmic and QEC code improvements. A second internet infrastructure giant converged on 2029 as the PQC migration deadline, explicitly because the research scared them. A quantum modality that had scored zero on every CRQC metric put its first points on the board. And a comprehensive 10-article analysis of China’s quantum program suggests the West’s biggest competitor is executing a more coordinated strategy than most observers realize.
How many signals does one need?
The arguments for inaction are running out. The PQC Migration Framework is free. The Q-Day Estimator can help you model the timeline. The Practical Steps to Quantum Readiness guide will walk you through where to start. And if your leadership still thinks this is a problem for next year, remind them: the deadlines are already set.
If you found this edition useful, forward it to a colleague who’s still on the fence about quantum risk. If I got something wrong, hit reply — I read everything and correct publicly. And if you’re new here, the full PostQuantum.com resource library is at postquantum.com for the deep dives behind every topic covered above.
— Marin


