Comment by sanxiyn

9 hours ago

I will let Scott Aaronson speak. (See https://scottaaronson.blog/?p=9668)

> Sometimes these days, I'll survey the spectacular recent progress in fault-tolerance, 2-qubit gate fidelities, programmable hundred-qubit systems, etc., only to be answered with a sneer: "What's the biggest number that Shor's algorithm has factored? Still 15 after all these years? Haha, apparently the emperor has no clothes!" I've commented that this is sort of like dismissing the Manhattan Project as hopelessly stalled in 1944, on the ground that so far it hasn't produced even a tiny nuclear explosion... If there's a reason why you think it can't work beyond a certain scale, say so. But don't fixate on one external benchmark and ignore everything happening under the hood, if the experts are telling you that under the hood is where all the action now is, and your preferred benchmark is only relevant later.

> If there's a reason why you think it can't work beyond a certain scale, say so

I'm not saying it can't work. Just that in 14 years no one has managed to factor a number larger than 21. Seemingly, focus has shifted to other factoring algorithms that offer no performance improvement over conventional computing.
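(For context on what "factoring with Shor's algorithm" involves: the quantum speedup lives entirely in the order-finding step; everything else is classical number theory. Here is a minimal sketch where the order finding is done classically by brute force, which is exactly the part a quantum computer would replace. The function names are my own, purely for illustration.)

```python
from math import gcd

def order(a, n):
    # Brute-force the multiplicative order r of a mod n, i.e. the
    # smallest r with a^r = 1 (mod n). This is the step Shor's
    # algorithm performs with quantum period finding.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    # Shor's classical reduction: factor n via the order of a mod n.
    g = gcd(a, n)
    if g != 1:
        return g  # lucky guess: a already shares a factor with n
    r = order(a, n)
    if r % 2 != 0:
        return None  # odd order: retry with a different a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None  # trivial square root: retry with a different a
    return gcd(y - 1, n)

print(shor_classical(15, 7))  # -> 3 (order of 7 mod 15 is 4)
print(shor_classical(21, 2))  # -> 7 (order of 2 mod 21 is 6)
```

Factoring 15 or 21 this way is of course trivial; the hard part is doing the order finding coherently on a quantum device, which is why the demonstrated record number says little by itself.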

I'm not the one implying that Shor's algorithm will break encryption "a few years from now".

  • The concern is that there's a large enough chance that it will work that it's worth planning for that outcome. That chance doesn't need to be high for planning to be justified. And there's good reason to believe that the size of the numbers factored so far is not a reliable indicator that the growth rate will remain slow.

    (The analogy with the Manhattan Project is apt: an adversary learning about it would have been wise to adjust their planning around the possibility of it succeeding, even if they judged that success was not a given.)

  • > [...] no one has managed to factor a larger number than 21.

    Small correction: no one has PUBLICLY managed to factor a larger number than 21.

    There could be advances (foreign and domestic) that simply don't get published because they represent an upper hand in cryptography. So, from a game-theory perspective, not making waves is in the interest of nation states. They'll even try to be dismissive about the concerns.

> dismissing the Manhattan Project as hopelessly stalled in 1944

Then again, there are plenty of examples of failed projects. Why should this be comparable to the Manhattan Project? In 1944 it was only two years underway, whereas Shor's algorithm is over 30 years old. Tons of articles have been published on quantum computing, while the A-bomb was kept as secret as possible, making it impossible to learn from other countries, and sometimes even from colleagues. In 1942 an atomic explosion was still hypothetical, whereas quantum computing had its first commercial service 7 years ago. Etc.

So, while in principle lack of progress doesn't guarantee failure, a comparison to the Manhattan Project is stylistic bullshit.

  • > Then again, there are enough examples of failed projects. Why should this be comparable to the Manhattan project? In 1944, it was only two years underway, whereas Shor's algorithm is over 30.

    1944 is a bit arbitrary. Szilard for one was thinking about it earlier:

    > […] He conceived the nuclear chain reaction in 1933, and patented the idea in 1936. In late 1939 he wrote the letter for Albert Einstein's signature that resulted in the Manhattan Project that built the atomic bomb….

    * https://en.wikipedia.org/wiki/Leo_Szilard

    Partly inspired in 1932 by reading Wells' book, published in 1914:

    * https://en.wikipedia.org/wiki/The_World_Set_Free

    How long had humanity been thinking about flight before the Wright brothers flew in 1903? We had Babbage's analytical engine (and Lovelace) in 1837, Zuse's Z2 and the British bombes both in 1940, and Zuse's Z3 in 1941.

  • The main point is that just as you can't ask for a tiny nuclear explosion, because nuclear physics just doesn't work that way, you also can't ask for a factoring of 21 with Shor's algorithm. Quantum computing just doesn't work that way, sorry.

    • The analogy between nuclear fission and quantum computing doesn't really work. Fission was a relatively new physical phenomenon that the Manhattan Project scientists were studying in order to turn it into a weapon of mass destruction on a scale that, likewise, had no precedent outside natural disasters. Quantum computing is a new technology that is supposed to make already effectively computable problems computable faster; ideally it provides an increase in capacity, not capability. It should definitely be able to make tiny computations work before going for the bigger problems. That's how all computing works: if it can't solve simple problems, it's never going to solve bigger ones. What you're saying here essentially sounds like "there will be a magical event one day when quantum computing solves the biggest computing problems and we'll all realize it works."

      I am not particularly invested either way in the likelihood of quantum computing being a major breakthrough, but this seems like yet another area of computing research which, like crypto and LLMs, has in recent years increasingly been flooded by people on a hype train.

I talked to a guy who did his doctoral degree on quantum computing and he was not worried at all. In fact he thought it was wildly overhyped, and like cold fusion, self driving cars, or string theory, always just around the corner. Just give us five more years and another grant, please.