Comment by FiloSottile

4 hours ago

See https://bas.westerbaan.name/notes/2026/04/02/factoring.html and https://scottaaronson.blog/?p=9665#comment-2029013, which are linked in the first section of the article.

> Sure, papers about an abacus and a dog are funny and can make you look smart and contrarian on forums. But that’s not the job, and those arguments betray a lack of expertise. As Scott Aaronson said:

> Once you understand quantum fault-tolerance, asking “so when are you going to factor 35 with Shor’s algorithm?” becomes sort of like asking the Manhattan Project physicists in 1943, “so when are you going to produce at least a small nuclear explosion?”

To summarize, the hard part of scalable quantum computation is error correction. Without it, you can't factor essentially anything. Once you have any practical error correction, the distance between 32-bit RSA and 2048-bit RSA is small. It's similar to how the hard part is achieving a self-sustaining fission chain reaction; once you have that, making the bomb bigger is not the hard part.

This is what the experts know, and why they give the timelines they do. We'd do better not to dismiss them by being smug about our layperson's understanding of their progress curve.

The thing is, producing the right isotopes of uranium is mostly a linear process. It goes faster as you scale up, of course, but each day a reactor produces a given amount; double the number of reactors and you produce twice as much, and so on.

There is no such equivalent for qubits or error correction. You can't say: we produce this much extra error correction per day, so we will hit the target by such-and-such a date.

There is also something weird in the graph at https://bas.westerbaan.name/notes/2026/04/02/factoring.html. It suggests that even with the best error correction shown, it is impossible to factor RSA-4 with fewer than 10^4 qubits, which seems very odd. At the same time, Scott Aaronson wrote: "you actually can now factor 6- or 7-digit numbers with a QC", which per the graph would mean that either error rates must already be very low, or quantum computers with an insane number of qubits exist.

Something doesn't add up here.

  • We are stretching the metaphor thin, but surely the progress towards an atomic bomb was not measured only in uranium production, in the same way that the progress towards a QC is not measured only in construction time of the machine.

    At the theory level, there were only theories, then a few breakthroughs, then some linear production time, then a big boom.

    > Something doesn't add up here.

    Please consider it might be your (and my) lack of expertise in the specific sub-field. (I do realize I am saying this on Hacker News.)

  • You can already factor a 6-digit number with a QC, but not with an algorithm that scales polynomially. The linked graph is for optimized variants of Shor's algorithm.
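For readers unfamiliar with what Shor's algorithm actually buys you: factoring reduces classically to order-finding, and the quantum part only speeds up the order-finding step. A minimal sketch of that classical reduction, with brute-force order-finding standing in for the quantum subroutine (so it only works for tiny numbers), might look like this:

```python
from math import gcd

def order(a, n):
    # Multiplicative order of a mod n, by brute force.
    # This is the step a quantum computer would accelerate.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n):
    # Classical skeleton of Shor's algorithm: reduce factoring
    # of an odd composite n to order-finding.
    for a in range(2, n):
        g = gcd(a, n)
        if g > 1:
            return g  # lucky: a already shares a factor with n
        r = order(a, n)
        if r % 2 == 0:
            y = pow(a, r // 2, n)
            if y != n - 1:
                f = gcd(y - 1, n)
                if 1 < f < n:
                    return f
    return None

print(shor_factor(15))  # 3
print(shor_factor(35))  # 7
```

The point of the sketch: everything except `order` is cheap classical arithmetic, which is why "can factor a 6-digit number somehow" and "can run scalable Shor" are very different claims.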

> produce at least a small nuclear explosion

The Manhattan Project scientists actually did this before anybody broke ground at Los Alamos. It was called the Chicago Pile. And if the control rods were removed and the SCRAM disabled, it absolutely would have created a "small nuclear explosion" in the middle of a major university campus.

Given the level of hype and how long it's been going on, I think it's totally reasonable for the wider world to ask the quantum crypto-breaking people to build a Chicago Pile first.

https://en.wikipedia.org/wiki/Chicago_Pile-1