Quantum computing may be the technology where the gap between what is claimed and what actually exists is widest. That makes it an unusually clean application of the book's Hype vs. Reality framework — and a useful corrective to the tendency to treat every emerging technology as either imminent revolution or permanent fantasy.
Quantum computing uses the principles of quantum mechanics — superposition, entanglement, and interference — to perform certain types of computation that classical computers cannot do efficiently. The key word is "certain." Quantum computers are not faster computers. They are different computers, suited to problems with specific mathematical structures.
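To make "different, not faster" concrete, here is a minimal sketch, assuming only numpy, of the single-qubit state-vector math behind superposition and interference: a Hadamard gate puts a qubit into an equal superposition, and a second Hadamard makes the two computational paths interfere so the qubit returns deterministically to its starting state.

```python
import numpy as np

# Single-qubit state vectors: |0> = (1, 0), |1> = (0, 1).
ket0 = np.array([1.0, 0.0])

# Hadamard gate: sends |0> to an equal superposition of |0> and |1>.
H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)

superposed = H @ ket0
print(np.abs(superposed) ** 2)   # [0.5, 0.5] -- both outcomes equally likely

# Applying H again makes the paths to |1> cancel (destructive interference)
# and the paths to |0> reinforce: we get |0> back with certainty.
recombined = H @ superposed
print(np.abs(recombined) ** 2)   # [~1.0, ~0.0] -- interference, not randomness
```

Quantum algorithms choreograph exactly this effect at scale: arranging amplitudes so that paths to wrong answers cancel and paths to right answers reinforce. Problems whose structure permits that choreography get a speedup; problems that do not, do not.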
In 2019, Google claimed "quantum supremacy": its Sycamore processor had performed a calculation in 200 seconds that, by Google's estimate, would take the best classical supercomputer 10,000 years. The claim was contested (IBM argued a classical machine with better algorithms could finish in days), and the calculation itself had no practical application. But it demonstrated that quantum processors had reached a regime where classical machines could keep pace, if at all, only with enormous effort.
Since then, progress has continued but the timeline for practical impact has stretched. IBM has published an ambitious roadmap for scaling quantum processors. Google, Microsoft, and others are pursuing different qubit technologies. Error correction, the ability to maintain fragile quantum states long enough to complete useful computations, remains the central technical challenge: approaches are advancing, but the problem is not solved.
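To convey why error correction is so demanding, here is a classical analogy, not actual quantum error correction (real schemes such as the surface code are far more involved): a three-copy repetition code with majority-vote decoding. The logical error rate drops below the physical rate only when physical errors are already rare, which is why roadmaps pair better qubits with many physical qubits per logical one.

```python
import random

def noisy_copy(bit, p):
    """Flip the bit with probability p (a crude noise model)."""
    return bit ^ (random.random() < p)

def logical_error_rate(p, trials=100_000):
    """Encode one bit as three noisy copies, decode by majority vote."""
    errors = 0
    for _ in range(trials):
        copies = [noisy_copy(0, p) for _ in range(3)]
        if sum(copies) >= 2:      # majority decodes to 1: a logical error
            errors += 1
    return errors / trials

for p in (0.01, 0.05, 0.2):
    # Analytically: 3p^2(1-p) + p^3, which beats p whenever p < 1/2.
    print(f"physical {p:.2f}  ->  logical {logical_error_rate(p):.4f}")
```

The redundancy buys reliability only above a quality threshold, and quantum states are harder to copy and check than classical bits. That is the core reason "more qubits" and "useful qubits" are different milestones.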
Where quantum computing would actually matter is in specific domains: simulating molecular behavior for drug discovery and materials science, breaking certain cryptographic systems (most notably RSA, via Shor's algorithm), and solving optimization problems in logistics and finance. Of these, the cryptographic implications are the nearest-term concern. A sufficiently powerful quantum computer could break the public-key encryption that secures most of the internet. The threat is taken seriously enough that governments and industry are already transitioning to "post-quantum" cryptographic standards.
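The RSA threat rests on a specific piece of number theory. Shor's algorithm's only quantum ingredient is finding the period of modular exponentiation quickly; the rest is classical, sketched below with toy values (N = 15, a = 7) and hypothetical helper names. The brute-force `order` function stands in for the quantum step that makes the attack feasible at cryptographic sizes.

```python
from math import gcd

def order(a, N):
    """Smallest r > 0 with a^r = 1 (mod N). Brute force is exponential
    in the bit-length of N; the quantum speedup lives entirely here."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N, a):
    """Classical wrapper of Shor's algorithm: turn the period of
    a mod N into a nontrivial factor of N (when the period is usable)."""
    if gcd(a, N) != 1:
        return gcd(a, N)          # lucky guess: a already shares a factor
    r = order(a, N)
    if r % 2 == 1:
        return None               # odd period: retry with another a
    candidate = gcd(pow(a, r // 2) - 1, N)
    return candidate if candidate not in (1, N) else None

print(shor_factor(15, 7))   # 3 -- and 15 / 3 = 5
```

A classical machine can run this wrapper today; what it cannot do is compute the period for a 2048-bit N. That gap is exactly what a fault-tolerant quantum computer would close, and why the migration to post-quantum standards has started before such a machine exists.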
The book's Occam's Razor discipline (counting the assumptions required for a prediction to come true) is perfectly suited to quantum computing discourse. Claims that quantum computers will "revolutionize everything" require a stack of assumptions: that error correction will be solved at scale, that enough logical qubits can be maintained coherently, that useful algorithms can be developed for real-world problems, and that the advantages justify the enormous cost. Each assumption is individually plausible, but probabilities multiply, and the combination is less certain than headlines suggest.
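To see how stacked assumptions deflate a forecast, treat each one as a probability and multiply. The numbers below are illustrative placeholders, not estimates, and the arithmetic assumes (itself another assumption) that the claims are independent.

```python
# Illustrative only: made-up placeholder probabilities, not estimates.
assumptions = {
    "error correction solved at scale":      0.8,
    "enough coherent logical qubits":        0.7,
    "useful algorithms for real problems":   0.7,
    "advantages justify the enormous cost":  0.6,
}

joint = 1.0
for claim, p in assumptions.items():
    joint *= p      # all claims must hold for the headline to come true
    print(f"{claim:<40s} p = {p:.1f}")

print(f"\nJoint probability if all must hold: {joint:.2f}")  # ~0.24
```

Four claims that each look 60 to 80 percent likely combine to under one in four. The discipline is not pessimism; it is bookkeeping.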
This does not mean quantum computing is hype. The science is real, the progress is genuine, and the cryptographic implications are serious enough to warrant immediate action on post-quantum encryption. But the Hype vs. Reality framework helps distinguish between "this technology will matter for specific, defined applications on a timeline of years to decades" and "this technology will change everything overnight" — and the former is far more likely than the latter.
The Technological Convergence dimension is relevant too. Quantum computing's most promising applications are in combination with other technologies: quantum simulation of molecules for drug design (convergence with biology), quantum optimization for AI training (convergence with machine learning), and quantum-safe cryptography (convergence with cybersecurity). It is an amplifier and enabler, not a standalone revolution.