Comment by Ardren
8 hours ago
> Shor of Damocles
What is the biggest number factored using Shor's algorithm?
Last time I looked it was very unimpressive.
Edit: It's gotten worse: the record is still 21, from 2012, and "Replication of Quantum Factorisation Records with an 8-bit Home Computer, an Abacus, and a Dog" says the claimed factorization of 35 in 2019 actually failed.
I will let Scott Aaronson speak. (See https://scottaaronson.blog/?p=9668)
> Sometimes these days, I'll survey the spectacular recent progress in fault-tolerance, 2-qubit gate fidelities, programmable hundred-qubit systems, etc., only to be answered with a sneer: "What's the biggest number that Shor's algorithm has factored? Still 15 after all these years? Haha, apparently the emperor has no clothes!" I've commented that this is sort of like dismissing the Manhattan Project as hopelessly stalled in 1944, on the ground that so far it hasn't produced even a tiny nuclear explosion... If there's a reason why you think it can't work beyond a certain scale, say so. But don't fixate on one external benchmark and ignore everything happening under the hood, if the experts are telling you that under the hood is where all the action now is, and your preferred benchmark is only relevant later.
> If there's a reason why you think it can't work beyond a certain scale, say so
I'm not saying it can't work. Just that in 14 years no one has managed to factor a larger number than 21. Seemingly focus has shifted to other factoring algorithms that don't have performance improvements over conventional computing.
I'm not the one implying that Shor's algorithm will break encryption "a few years from now".
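For readers wondering what these factoring demos actually involve: Shor's algorithm only uses quantum hardware for the order-finding step; the reduction from factoring to order-finding, and the post-processing, are classical number theory. Here's a minimal sketch with the order brute-forced classically (function names are my own, for illustration):

```python
from math import gcd

def order(a, n):
    """Multiplicative order of a mod n -- the one step Shor's algorithm
    runs on quantum hardware; brute-forced classically here."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Classical skeleton of Shor's algorithm: given the order r of
    a mod n, try to split n via gcd(a**(r//2) - 1, n)."""
    g = gcd(a, n)
    if g != 1:
        return g          # lucky: a already shares a factor with n
    r = order(a, n)
    if r % 2:
        return None       # odd order: retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None       # trivial square root: retry with another a
    return gcd(y - 1, n)

print(shor_classical(21, 2))  # order of 2 mod 21 is 6 -> 7 (21 = 3 x 7)
print(shor_classical(15, 7))  # order of 7 mod 15 is 4 -> 3 (15 = 3 x 5)
```

The whole difficulty is that `order()` takes exponential time classically; the quantum part replaces exactly that function, and it only pays off at sizes far beyond 21.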
The concern is that there's a large enough chance that it might that it's worth planning for that outcome. The chance doesn't need to be high for planning to be worthwhile. And there's good reason to believe that the size of the numbers factored so far is not a reliable indicator that the growth rate will remain slow.
(The analogy with the Manhattan Project is apt: an adversary learning about it would have been wise to plan around the possibility of it succeeding, even if they judged that success was not a given.)
> [...] no one has managed to factor a larger number than 21.
Small correction: no one has PUBLICLY managed to factor a larger number than 21.
There could be advances (foreign and domestic) that just don't get published, because they represent an upper hand in cryptography. So, from a game-theory perspective, not making waves is in the interest of nation states. They'll even try to be dismissive about concerns.
> dismissing the Manhattan Project as hopelessly stalled in 1944
Then again, there are enough examples of failed projects. Why should this be comparable to the Manhattan Project? In 1944, it was only two years underway, whereas Shor's algorithm is over 30 years old. Tons of articles have been published on quantum computing, while the A-bomb was kept as secret as possible, making learning from other countries, sometimes even from colleagues, impossible. In 1942, an atomic explosion was still hypothetical, whereas quantum computing had its first commercial service 7 years ago. Etc.
So, while in principle lack of progress doesn't guarantee failure, a comparison to the Manhattan Project is stylistic bullshit.
> Then again, there are enough examples of failed projects. Why should this be comparable to the Manhattan project? In 1944, it was only two years underway, whereas Shor's algorithm is over 30.
1944 is a bit arbitrary. Szilard for one was thinking about it earlier:
> […] He conceived the nuclear chain reaction in 1933, and patented the idea in 1936. In late 1939 he wrote the letter for Albert Einstein's signature that resulted in the Manhattan Project that built the atomic bomb….
* https://en.wikipedia.org/wiki/Leo_Szilard
Partly inspired in 1932 by reading Wells' book, published in 1914:
* https://en.wikipedia.org/wiki/The_World_Set_Free
How long was humanity thinking about flying before the Wright brothers and 1903? We had Babbage's analytical engine (and Lovelace) in 1837, with Zuse's Z2 and the British bombes both in 1940; Zuse's Z3 in 1941.
The main point is that just as you can't ask for a tiny nuclear explosion, because nuclear physics just doesn't work that way, you also can't ask for a factoring of 21 with Shor's algorithm. Quantum computing just doesn't work that way, sorry.
4 replies →
I talked to a guy who did his doctoral degree on quantum computing and he was not worried at all. In fact he thought it was wildly overhyped, and like cold fusion, self driving cars, or string theory, always just around the corner. Just give us five more years and another grant, please.
Meanwhile Waymo has 200 million autonomous miles under its belt.
4 replies →
N=1 sample size.
I talked to another guy with the same degree in the same field and he was concerned.
Scott used to be that guy.
I said this about LLMs a few years ago, and now here we are.
Yeah, 70 years ago, right.
The abacus thing is pretty funny, but it's dangerously uninformed. https://bas.westerbaan.name/notes/2026/04/02/factoring.html
Is there a better benchmark to use?
Honest question.
How can a lay person track the real-world progress of quantum computers?
Most approaches have missing "capabilities" that can be tracked. Adam Zalcman lays them out for superconducting qubits here. https://westerbaan.name/~bas/rwpqc2026/adam.pdf
For the neutral-atoms approach in particular, there no longer seems to be a clear missing capability on the road to a full-scale CRQC (cryptographically relevant quantum computer): each of the separate components has been demonstrated. Of course, when they try to put everything together they'll undoubtedly hit unexpected integration issues. Wish I could be a fly on the wall at those labs.