Comment by yoan9224

2 days ago

Aaronson's take is characteristically grounded. The Willow chip announcement was technically impressive, but the media coverage predictably overshot into "RSA is dead" territory when the actual achievement was an improvement in error correction. The relevant timeline question is: when do quantum computers solve commercially useful problems faster than classical computers (not just contrived benchmarks)?

The error correction milestone matters because it's the gate to scaling. Previous quantum systems had error rates that grew as you added qubits, so scaling up made things worse rather than better, which ruled out large-scale quantum computing. If Willow actually demonstrates below-threshold error rates at scale (I'd want independent verification), that unblocks the path to systems with 1000+ logical qubits. But we're still probably 5-7 years from "useful quantum advantage" on problems like drug discovery or materials simulation.
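To make "below threshold" concrete, here's a rough back-of-the-envelope sketch (Python, with illustrative constants, not Google's actual numbers) of the standard surface-code heuristic: once the physical error rate drops below the threshold, increasing the code distance suppresses the logical error rate exponentially, whereas above threshold, adding qubits actively hurts.

```python
# Rough surface-code scaling heuristic (constants are illustrative only):
#   p_logical ~ prefactor * (p_physical / p_threshold) ** ((d + 1) / 2)
# Below threshold, growing the code distance d suppresses logical errors
# exponentially; above threshold, more qubits make things worse.

def logical_error_rate(p_physical, d, p_threshold=1e-2, prefactor=0.1):
    """Toy model of the logical error rate per cycle for a distance-d surface code."""
    return prefactor * (p_physical / p_threshold) ** ((d + 1) / 2)

for d in (3, 5, 7, 9, 11):
    below = logical_error_rate(5e-3, d)  # physical rate below threshold
    above = logical_error_rate(2e-2, d)  # physical rate above threshold
    print(f"d={d:2d}  below-threshold p_L={below:.2e}  above-threshold p_L={above:.2e}")
```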

The economic argument is underrated. Even if quantum computers achieve a theoretical advantage, they need to beat rapidly improving classical algorithms running on cheaper hardware. Every year the timeline slips, classical GPUs get faster and classical algorithms get better, while quantum algorithms are still being adapted to near-term noisy hardware. The crossover point might arrive later, and the window of advantage might be narrower, than people expect.

What I find fascinating is the potential for hybrid classical-quantum algorithms where quantum computers handle specific subroutines (like sampling from complex distributions or solving linear algebra problems) while classical computers do the pre- and post-processing. That's probably the first commercial application: not replacing classical computers entirely, but augmenting them at specific bottlenecks. Imagine a drug discovery pipeline where the 3D protein folding simulation runs on quantum hardware while everything else stays classical.
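To show the shape of that hybrid pattern, here's a minimal self-contained sketch (Python/numpy, toy one-qubit problem, all names made up for illustration): a classical optimizer drives the outer loop and calls a "quantum" subroutine to estimate a cost. On real hardware the estimate_energy() stub would be swapped for a call out to a QPU service.

```python
# Minimal sketch of a hybrid classical-quantum loop (VQE-style).
# The "quantum" part is simulated with numpy; estimate_energy() stands in
# for a call to real quantum hardware. Toy Hamiltonian and all names are
# illustrative assumptions, not any vendor's API.
import numpy as np

Z = np.array([[1, 0], [0, -1]], dtype=complex)  # toy Hamiltonian H = Pauli-Z

def ansatz_state(theta):
    """|psi(theta)> = Ry(theta)|0>, the parameterized 'circuit'."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)

def estimate_energy(theta):
    """Quantum-subroutine stand-in: expectation value <psi|H|psi>."""
    psi = ansatz_state(theta)
    return float(np.real(psi.conj() @ Z @ psi))

# Classical outer loop: parameter-shift gradient descent on theta.
theta, lr = 0.1, 0.4
for _ in range(50):
    grad = (estimate_energy(theta + np.pi / 2) - estimate_energy(theta - np.pi / 2)) / 2
    theta -= lr * grad

print(f"theta={theta:.3f}, energy={estimate_energy(theta):.4f} (exact ground state: -1.0)")
```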

First? Try only. I'd be willing to wager a sizeable amount of money that no one outside a few niche research institutions working on quantum computing itself will ever use fully quantum setups.

QC is not a panacea. There are only a handful of problems believed to be in BQP but not in P, and most of those don't show up in tasks the average person frequently engages in. At the same time, quantum computers necessarily come with complications (decoherence, error-correction overhead, extreme isolation requirements) that classical computers lack. Put together, I doubt people will ever be using purely quantum computers.