IBM Just Hit 1,000 Qubits. The Physicist Who Demanded Quantum Computers Died Before Seeing One.
Richard Feynman told physicists in 1981 that simulating nature requires a quantum computer. He died in 1988 — 10 years before the first one was built.
Key Takeaways
- Feynman proposed quantum computers at MIT in 1981 and died in 1988 — a decade before the first one was built
- Peter Shor's 1994 algorithm showed quantum computers could break RSA encryption
- IBM's Condor processor crossed the 1,000-qubit threshold in 2023
- Google's Willow chip completed a computation that would take a classical supercomputer 10 septillion years
- The quantum computing market was valued at over $32 billion as of 2025
Root Connection
Richard Feynman's 1981 MIT lecture → IBM Condor 1,121-qubit processor and Google Willow chip (2024)
ROOT
On May 7, 1981, at the Massachusetts Institute of Technology, a 62-year-old physicist with a thick Queens accent and a reputation for being the most brilliant troublemaker in science stood before a room of colleagues at the First Conference on the Physics of Computation. His name was Richard Phillips Feynman — Nobel laureate, Manhattan Project veteran, bongo player, safecracker, and the man who would that afternoon articulate an idea that the world wouldn't catch up to for four decades.
His lecture was titled "Simulating Physics with Computers." The argument was elegant and devastating in its simplicity. Feynman had been thinking about a fundamental problem: if you want to simulate a quantum mechanical system — say, the behavior of electrons in a molecule — on a classical computer, the computational requirements explode exponentially. Each particle you add doubles the number of amplitudes a classical machine must track; a system of a few hundred quantum particles requires more classical states than there are atoms in the observable universe. Classical computers don't just struggle with quantum simulation. They are architecturally incapable of doing it efficiently.
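The scaling is easy to see with back-of-envelope arithmetic. The sketch below (an illustration, not anything from Feynman's lecture) estimates the memory a classical machine would need just to store the state vector of an n-qubit system, assuming 16 bytes per complex amplitude:

```python
# Memory needed to hold the full state vector of an n-qubit quantum
# system on a classical machine: 2**n complex amplitudes.

def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """2**n amplitudes, each stored as a 128-bit complex number."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (10, 40, 100, 300):
    print(f"{n:>3} qubits -> {state_vector_bytes(n):.3e} bytes")
```

At 40 qubits the state vector already needs about 17 terabytes; by 300 qubits the byte count exceeds the estimated number of atoms in the observable universe.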
Feynman's solution: build a computer that operates on quantum mechanical principles. Let the computation itself be quantum. Use superposition — the ability of quantum bits to exist in multiple states simultaneously — and entanglement — the spooky correlation between quantum particles — as computational resources rather than obstacles. He proposed what he called a "universal quantum simulator": a programmable quantum system that could simulate any other quantum system.
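Both resources can be illustrated with a hand-rolled two-qubit state vector in plain Python — a toy, not Feynman's proposed simulator. A Hadamard gate puts one qubit into superposition; a CNOT then entangles the pair into a Bell state:

```python
# Two-qubit register as 4 amplitudes, ordered [|00>, |01>, |10>, |11>].
from math import sqrt

state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

def hadamard_q0(s):
    """Hadamard on qubit 0: mixes |0x> with |1x> amplitudes."""
    a, b, c, d = s
    h = 1 / sqrt(2)
    return [h * (a + c), h * (b + d), h * (a - c), h * (b - d)]

def cnot(s):
    """CNOT, qubit 0 as control: flips qubit 1 when qubit 0 is |1>."""
    a, b, c, d = s
    return [a, b, d, c]

bell = cnot(hadamard_q0(state))
print(bell)  # ~[0.707, 0, 0, 0.707]: only |00> and |11> survive
```

The resulting state is a superposition of |00> and |11> only — measuring either qubit instantly determines the other, which is exactly the entanglement Feynman wanted to use as a resource.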
The audience was skeptical. Most physicists in 1981 thought quantum computing was a fascinating thought experiment with no practical path to realization. The engineering challenges seemed insurmountable. Quantum states are fragile — they collapse when observed, they decohere when disturbed by heat or electromagnetic noise. Building a computer that maintains quantum coherence long enough to perform useful calculations seemed physically impossible.
Feynman didn\'t care about skepticism. He\'d spent his career ignoring conventional wisdom. He\'d reinvented quantum electrodynamics using path integrals when everyone said the existing formulation was fine. He\'d diagnosed the Challenger disaster when NASA didn\'t want to hear the answer. But this time, he wouldn\'t live to see his idea vindicated. Richard Feynman died on February 15, 1988, at age 69, from kidney cancer and a rare form of abdominal cancer. He was ten years too early.
In 1985, British physicist David Deutsch at Oxford published the first formal description of a universal quantum computer, putting Feynman's intuition on rigorous footing: a single programmable quantum machine could in principle simulate any physical system. But the paper remained largely theoretical. Nobody knew what a quantum computer would actually be good for beyond simulating physics.
That changed in 1994 when mathematician Peter Shor, working at Bell Labs, published an algorithm that sent shockwaves through both the computing and intelligence communities. Shor's algorithm could factor large numbers in polynomial time — something no known classical algorithm can do. This mattered because the entire security infrastructure of the internet — RSA encryption, used by banks, governments, and militaries worldwide — depends on the assumption that factoring large numbers is computationally intractable. A sufficiently large quantum computer running Shor's algorithm could break RSA encryption like tissue paper.
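The quantum speedup lives in a single step: finding the period r of f(x) = a^x mod N. Everything else in Shor's algorithm is classical number theory, sketched below with a brute-force period search that is only feasible for toy-sized N (a real quantum computer replaces `order` with quantum phase estimation):

```python
# Classical half of Shor's algorithm: reduce factoring N to finding the
# period r of a**x mod N, then extract factors from gcd computations.
from math import gcd

def order(a: int, n: int) -> int:
    """Smallest r > 0 with a**r % n == 1 (brute force, toy sizes only)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n: int, a: int):
    g = gcd(a, n)
    if g != 1:
        return g, n // g        # lucky guess: a shares a factor with n
    r = order(a, n)             # the step a quantum computer does fast
    if r % 2 == 1:
        return None             # odd period: retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None             # trivial square root: retry
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_classical(15, 7))  # (3, 5)
```

For N = 15 and a = 7 the period is 4, and gcd(7^2 - 1, 15) and gcd(7^2 + 1, 15) recover the factors 3 and 5. Classically the period search is as hard as factoring itself; that is the step Shor made exponentially faster.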
The intelligence agencies started paying attention. The race began. In 1998, physicists Isaac Chuang (then at IBM's Almaden Research Center) and Neil Gershenfeld (MIT) built the first working quantum computer: a 2-qubit system using nuclear magnetic resonance (NMR) on a molecule of chloroform. It successfully ran a quantum algorithm. It was primitive — 2 qubits can't do anything useful — but it proved the concept was physically real. Feynman had been dead for ten years.
TODAY
The quantum computing landscape in 2025-2026 is a mix of genuine breakthroughs and sobering reality. IBM's Condor processor, unveiled in December 2023, contains 1,121 superconducting qubits — crossing the 1,000-qubit threshold for the first time. IBM's Quantum System Two, a modular quantum computer the size of a large room, is commercially available and accessible via the cloud. Over 2,000 organizations have run quantum experiments on IBM's platform.
Google's quantum team achieved a landmark in 2024 with their Willow chip. The breakthrough wasn't raw qubit count — Willow has 105 qubits — but quantum error correction. For the first time, Google demonstrated that adding more qubits to an error-correcting code actually reduced the logical error rate rather than increasing it. This is the fundamental threshold quantum computing needs to cross to become practically useful. Google also showed Willow completing a benchmark computation in under 5 minutes that would take the world's fastest classical supercomputer 10 septillion (10^25) years, a span that dwarfs the age of the universe.
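The counterintuitive claim — more qubits, fewer errors — has a simple classical analogue. A repetition code stores one logical bit in n physical copies and takes a majority vote; when the per-copy error rate p sits below the code's threshold, the voted result fails less often as n grows. (Google's surface code is far more sophisticated; this sketch is only the underlying intuition, not their method.)

```python
# Logical error rate of an n-copy repetition code with majority voting:
# the logical bit is wrong only if more than half the copies flip.
from math import comb

def logical_error_rate(p: float, n: int) -> float:
    """Probability that a majority of n independent copies flip."""
    k = n // 2 + 1
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

p = 0.01  # 1% physical error rate, well below this code's threshold
for n in (1, 3, 5, 9):
    print(n, logical_error_rate(p, n))
```

With p = 1%, three copies already cut the logical error rate to roughly 0.03%, and each additional pair of copies suppresses it further — the same qualitative behavior Willow demonstrated for real quantum codes.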
The competitive landscape is intense. IBM and Google use superconducting qubits cooled to near absolute zero (15 millikelvins — colder than outer space). IonQ uses trapped ion qubits, which have longer coherence times but are slower. Microsoft is betting on topological qubits — a fundamentally different approach that, if it works, would be inherently error-resistant. Rigetti Computing, Quantinuum (formed from Honeywell Quantum Solutions and Cambridge Quantum), and several Chinese firms including Origin Quantum are all in the race.
The quantum computing market was valued at over $32 billion as of 2025, but context is critical. We are still in what physicist John Preskill dubbed the "NISQ era" — Noisy Intermediate-Scale Quantum. Today's quantum computers are powerful enough to demonstrate quantum advantage on specific benchmarks but too error-prone for most practical applications. The qubits are noisy, decoherence times are short, and error rates, while improving, still limit useful computation.
Practical applications are in advanced testing phases. Drug discovery: pharmaceutical companies including Roche and Merck are using quantum computers to simulate molecular interactions for drug candidates. Materials science: BMW and Daimler are exploring quantum simulation for battery chemistry. Financial modeling: Goldman Sachs and JPMorgan have quantum computing teams working on portfolio optimization and risk analysis. Cryptography: NIST finalized its first post-quantum cryptography standards in 2024, preparing for the day quantum computers can break current encryption — a transition that will take a decade and cost billions.