Quantum Computers Operate Colder Than Outer Space. The Root Goes Back to a 1981 Rant by Richard Feynman.
Google's Willow chip performed a calculation in 5 minutes that would take a classical supercomputer 10 septillion years. The idea that got us here started with Richard Feynman complaining that nature isn't classical.
Key Takeaways
- Google's Willow chip (2024) performed a computation in 5 minutes that would take a classical supercomputer 10 septillion years
- IBM's Condor processor broke 1,000 qubits in 2023, but quality matters more than quantity
- Shor's algorithm can theoretically break RSA encryption — driving a global migration to post-quantum cryptography
- Quantum processors operate at 15 millikelvins — 180 times colder than outer space
Root Connection
In 1981, Richard Feynman told a room of physicists that classical computers would never simulate quantum systems efficiently. His solution: build computers that are themselves quantum. More than four decades later, Google, IBM, and Microsoft are racing to prove him right.
[Chart: Quantum Processor Qubit Count (Gate-Based). IBM's Condor broke 1,000 qubits in 2023, but quality matters more than quantity. Source: IBM, Google, company announcements.]
Timeline
1935: Einstein, Podolsky, and Rosen publish the EPR paradox; Einstein would later deride entanglement as "spooky action at a distance"
1981: Richard Feynman proposes quantum simulation at MIT: "Nature isn't classical, dammit"
1985: David Deutsch describes a universal quantum computer at Oxford
1994: Peter Shor publishes his quantum factoring algorithm; cryptographers panic
1998: First 2-qubit quantum computer demonstrated using NMR
2019: Google claims quantum supremacy with the 53-qubit Sycamore processor
2023: IBM unveils Condor, the first processor to break 1,000 qubits
2024: Google's Willow chip demonstrates exponential quantum error correction
2025: Microsoft announces Majorana 1, billed as the first topological qubit processor
In December 2024, Google announced a chip called Willow. It had 105 qubits — a modest number by recent standards. But what it did with those qubits made physicists sit up straight.
Willow performed a specific quantum computation in under five minutes. Google estimated that the same calculation would take the world's most powerful classical supercomputer approximately 10 septillion years. That is 10,000,000,000,000,000,000,000,000 years — a number so large it exceeds the age of the universe by a factor of roughly 700 trillion.
More importantly, Willow demonstrated something that had never been shown experimentally before: adding more qubits to its error-correcting code made the logical error rate go down, not up. In the quantum world, where fragility is the defining challenge, this was the first clear evidence that error correction can scale. It was, by many accounts, the most significant quantum computing milestone since Google's original supremacy claim in 2019.
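To see what "errors go down as qubits go up" means in numbers, here is a back-of-the-envelope sketch in Python. It assumes the surface-code scaling behavior Google described for Willow: each time the code distance grows by two, the logical error rate drops by a suppression factor (Google reported a factor of roughly 2.14). The base rate and distances below are illustrative placeholders, not Willow's actual figures.

```python
# Exponential error suppression in a surface code, sketched numerically.
# Assumption: growing the code distance d by 2 divides the logical error
# rate by a constant suppression factor (reported as ~2.14 for Willow).

def logical_error_rate(base_rate: float, base_distance: int,
                       distance: int, suppression: float) -> float:
    """Logical error per cycle, extrapolated from a measured base point."""
    steps = (distance - base_distance) // 2   # d grows in steps of 2
    return base_rate / suppression ** steps

# Hypothetical base rate for illustration only.
for d in (3, 5, 7, 9, 11):
    rate = logical_error_rate(base_rate=3e-3, base_distance=3,
                              distance=d, suppression=2.14)
    print(f"distance {d:2d}: ~{rate:.2e} logical errors per cycle")
```

Each extra ring of qubits buys a constant multiplicative reduction in errors, which is what makes the scaling exponential rather than incremental.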
But the idea that made all of this possible didn't start with a chip. It started with a complaint.
In May 1981, at a conference on Physics and Computation at MIT, Nobel laureate Richard Feynman delivered a keynote that would reshape computing. His argument was simple and devastating: classical computers are fundamentally incapable of efficiently simulating quantum mechanical systems. Describing a system of just 50 or so interacting quantum particles exactly requires more memory than any classical machine on Earth holds. "Nature isn't classical, dammit," Feynman said, "and if you want to make a simulation of nature, you'd better make it quantum mechanical."
He proposed building computers that operate on quantum principles. At the time, this sounded like science fiction.
To understand why quantum computers are different, you need three concepts. The first is superposition. A classical bit is either 0 or 1. A qubit — quantum bit — can exist as 0, 1, or both simultaneously. Think of a coin spinning in the air: neither heads nor tails until it lands. With 300 qubits in full superposition, you can represent more states simultaneously than there are atoms in the observable universe.
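A few lines of NumPy make that scaling concrete. This is a toy statevector sketch, not how real hardware works: it just shows that describing n qubits takes 2^n complex amplitudes, which is why classical simulation hits a wall so quickly.

```python
import numpy as np

# Toy statevector sketch (illustrative only): n qubits need 2**n
# complex amplitudes, which is why classical simulation blows up.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

def uniform_superposition(n: int) -> np.ndarray:
    """Put n qubits into an equal superposition of all 2**n basis states."""
    op = H
    for _ in range(n - 1):          # H on every qubit: H ⊗ H ⊗ ... ⊗ H
        op = np.kron(op, H)
    state = np.zeros(2 ** n)
    state[0] = 1.0                  # start in |00...0>
    return op @ state

for n in (1, 2, 10, 20, 300):
    print(f"{n:3d} qubits -> state vector of {2 ** n:,} amplitudes")

print(uniform_superposition(2))     # four equal amplitudes of 0.5
```

At n = 300 the printout is a 91-digit number of amplitudes, the article's atoms-in-the-universe comparison in raw form.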
The second is entanglement. When two qubits become entangled, measuring one instantly determines the state of the other, regardless of distance. Einstein called it "spooky action at a distance" and spent decades trying to disprove it. Experiments by Alain Aspect in 1982, and later by Anton Zeilinger — both Nobel Prize winners in 2022 — confirmed it is real. In computation, entanglement allows qubits to coordinate in ways that have no classical parallel.
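Entanglement can also be mimicked in a toy simulator. The sketch below (plain NumPy, illustrative only) prepares a Bell pair and samples measurements using the Born rule: each individual outcome is random, yet the two qubits always agree.

```python
import numpy as np

# Toy simulation of an entangled Bell pair: measuring the two qubits
# always yields matching results, even though each outcome is random.
rng = np.random.default_rng(0)

bell = np.zeros(4)
bell[0b00] = bell[0b11] = 1 / np.sqrt(2)    # (|00> + |11>) / sqrt(2)

probs = np.abs(bell) ** 2                   # Born rule: |amplitude|^2
outcomes = rng.choice(4, size=10, p=probs)
for o in outcomes:
    q0, q1 = (o >> 1) & 1, o & 1
    print(f"qubit A = {q0}, qubit B = {q1}")  # always equal
```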
The third is interference. Quantum algorithms are designed so that wrong answers cancel each other out (destructive interference) while correct answers reinforce each other (constructive interference). The art of quantum programming is arranging computations so the right answer has the highest probability when you measure.
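Interference shows up even in a one-qubit example. Applying a Hadamard gate twice to |0⟩ returns exactly |0⟩, because the two paths leading to |1⟩ carry opposite signs and cancel; the NumPy sketch below traces the amplitudes.

```python
import numpy as np

# Interference in miniature: apply Hadamard twice to |0>. The two paths
# into |1> carry opposite signs and cancel (destructive interference),
# while the paths into |0> add up (constructive interference).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

state = np.array([1.0, 0.0])      # |0>
after_one = H @ state             # equal superposition: [0.707, 0.707]
after_two = H @ after_one         # back to |0>: [1.0, 0.0]

print("after one H:", after_one)
print("after two H:", after_two)  # amplitude of |1> cancelled to zero
```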
Four years after Feynman's talk, David Deutsch at Oxford formally described a universal quantum computer, a quantum analog of the Turing machine. With Richard Jozsa in 1992, he created the first algorithm proven to run exponentially faster on a quantum computer than any deterministic classical alternative. The problem it solved was artificial, but the proof was real: quantum speedup was no longer a conjecture. It was mathematical fact.
Then in 1994, mathematician Peter Shor at AT&T Bell Labs dropped a bomb. He published an algorithm for integer factorization that runs in polynomial time on a quantum computer; the best known classical algorithms need super-polynomial time for the same task. This might sound abstract until you realize what it means: the entire RSA encryption system — which secures virtually all internet commerce, banking, and government communications — relies on the assumption that factoring large numbers is computationally infeasible. Shor's algorithm, on a powerful enough quantum computer, could crack RSA-2048 in hours instead of the billions of years a classical machine would need.
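The number theory underneath Shor's algorithm is surprisingly compact. The sketch below shows the classical skeleton of the reduction from factoring to order finding; the find_order function, done here by brute force, stands in for the quantum period-finding subroutine, the only step that needs a quantum computer. Names and structure are illustrative, not a production implementation.

```python
from math import gcd
from random import randrange

def find_order(a: int, n: int) -> int:
    """Smallest r > 0 with a**r = 1 (mod n). Brute force stands in for
    the quantum period-finding subroutine, the exponentially hard part."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n: int) -> int:
    """Return a nontrivial factor of n (odd composite, not a prime power)."""
    while True:
        a = randrange(2, n)
        g = gcd(a, n)
        if g > 1:
            return g                      # lucky: a already shares a factor
        r = find_order(a, n)
        # If r is even and a**(r/2) is not -1 mod n, we get a factor.
        if r % 2 == 0 and pow(a, r // 2, n) != n - 1:
            return gcd(pow(a, r // 2, n) - 1, n)

print(shor_factor(15))   # prints 3 or 5
```

Everything here runs in polynomial time except find_order; replacing that one loop with quantum period finding is the whole trick.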
Overnight, quantum computing went from a physics curiosity to a national security crisis. Intelligence agencies worldwide began funding quantum research. NIST launched a post-quantum cryptography standardization effort, finalizing the first three standards in August 2024 — mathematical systems believed to be resistant even to quantum attack.
Two years after Shor, Lov Grover at Bell Labs published a quantum search algorithm offering a quadratic speedup for unsorted databases. Where a classical computer needs on the order of N steps to search N items, Grover's algorithm needs only about the square root of N. Less dramatic than Shor's exponential gain, but broadly applicable to almost any search problem.
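Grover's amplitude amplification is simple enough to simulate directly. The toy below searches N = 8 items for a marked index: each iteration flips the sign of the marked amplitude (the oracle) and reflects all amplitudes about their mean (the diffusion step), and after about (π/4)·√N iterations nearly all the probability sits on the answer.

```python
import numpy as np

# Toy Grover search over N = 8 items (3 qubits), looking for index 5.
N, marked = 8, 5
state = np.full(N, 1 / np.sqrt(N))          # uniform superposition

iterations = int(round(np.pi / 4 * np.sqrt(N)))   # 2 iterations for N = 8
for _ in range(iterations):
    state[marked] *= -1                      # oracle: flip marked amplitude
    state = 2 * state.mean() - state         # diffusion: invert about mean

print(np.abs(state) ** 2)   # ~94% of the probability lands on index 5
```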
The first physical quantum computers arrived in 1998 — 2-qubit machines using nuclear magnetic resonance, built independently by teams at IBM/MIT and Oxford. In 2001, IBM and Stanford used a 7-qubit NMR computer to factor the number 15 using Shor's algorithm. It was a toy demonstration, but it proved the theory worked on real hardware.
The race that followed has been extraordinary. In 2016, IBM put a 5-qubit quantum computer on the cloud — the first time anyone in the world could run quantum circuits on real hardware for free. Google's 53-qubit Sycamore claimed quantum supremacy in October 2019, performing a random circuit sampling task in 200 seconds that Google estimated would take the Summit supercomputer 10,000 years (IBM disputed this, claiming 2.5 days). China's team led by Jian-Wei Pan built Jiuzhang, a photonic quantum computer that in 2020 performed Gaussian boson sampling with 76 photons, and Zuchongzhi, a 66-qubit superconducting processor.
IBM's roadmap tells the scaling story: Falcon (27 qubits, 2019), Eagle (127 qubits, 2021), Osprey (433 qubits, 2022), Condor (1,121 qubits, 2023). But IBM learned that raw qubit count is misleading. Their 133-qubit Heron processor, with dramatically better error rates and a tunable-coupler architecture, outperforms the larger Condor on practical tasks. The future is modular — linking multiple high-quality processors together rather than building one massive chip.
In February 2025, Microsoft announced Majorana 1, which it bills as the world's first processor built on topological qubits. These encode information in exotic quasiparticles that behave like the self-conjugate particles Ettore Majorana predicted in 1937, storing quantum states in global, geometric properties that are inherently more resistant to environmental noise. If the approach works at scale, it could dramatically reduce the number of physical qubits needed for error correction.
The biggest challenge remains decoherence. Superconducting qubits must be cooled to 15 millikelvins — about 180 times colder than outer space — in dilution refrigerators. Even then, qubits lose their quantum state in microseconds to milliseconds. Current gate error rates hover around 0.1% to 1%, compared to classical computers at about one error per 10 quintillion operations. To run Shor's algorithm on meaningful key sizes, published estimates range from a few million to roughly 20 million physical qubits, depending on assumptions about error rates and architecture. The largest gate-based machine today has about 1,100.
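A little arithmetic shows why those gate error rates are so punishing. If each gate fails independently with probability p, an entire circuit runs error-free with probability (1 − p) raised to the number of gates; the illustrative numbers below are why error correction, not just better qubits, is considered essential.

```python
# Why gate errors are the bottleneck: the chance a circuit finishes with
# no error at all is (1 - p) ** gates. At today's ~0.1%-1% error rates,
# even modest circuits are overwhelmingly likely to fail without
# error correction. Illustrative arithmetic only.

for p in (1e-3, 1e-2):                    # per-gate error rate
    for gates in (100, 1_000, 10_000):
        ok = (1 - p) ** gates
        print(f"p = {p:.0e}, {gates:6,d} gates -> "
              f"{ok:.1%} chance of an error-free run")
```

At a 0.1% error rate, a 1,000-gate circuit succeeds only about 37% of the time, and useful algorithms need millions of gates.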
We are in what Caltech physicist John Preskill calls the NISQ era — Noisy Intermediate-Scale Quantum. Current machines can do interesting things on narrow problems but cannot yet outperform classical computers on any practical task. That milestone — quantum advantage, as distinct from quantum supremacy — remains the holy grail.
The applications waiting on the other side are profound. Drug discovery: molecules are quantum systems, and simulating protein folding is a natural fit. Materials science: designing better batteries, catalysts, and superconductors. Cryptography: both the threat (breaking encryption) and the defense (quantum key distribution). Financial modeling: quadratic speedups for Monte Carlo simulations and portfolio optimization. Climate science: higher-resolution models of atmospheric and ocean systems.
Intelligence agencies are already conducting "harvest now, decrypt later" operations — collecting encrypted data today, planning to decrypt it once quantum computers are powerful enough. This is not a future threat. It is happening now. The migration to post-quantum cryptography has begun: Apple has implemented it in iMessage, Signal has adopted it, and NIST's standards are rolling out globally.
Feynman's 1981 complaint has become a $35-billion global industry. Google, IBM, Microsoft, IonQ, Rigetti, Quantinuum, D-Wave, Atom Computing, PsiQuantum, and dozens of others are pursuing at least five fundamentally different hardware approaches. The diversity is a strength — it increases the odds that at least one path reaches fault-tolerant, error-corrected quantum computing. IBM targets 200 logical qubits by the early 2030s. Google aims for a useful, error-corrected machine by the end of this decade.
Quantum computers will not replace your laptop. They will not run your email or your browser. They are specialized machines for specific problem types where quantum mechanics provides a structural advantage. Think of them less as faster computers and more as entirely different instruments — telescopes for computational spaces that classical machines cannot see.
But for the problems they can solve — simulating nature, cracking codes, optimizing complex systems — they will be transformative. Feynman was right. Nature isn't classical. And more than four decades on, we are finally building machines that speak its language.