
Comment by charlieyu1

7 hours ago

Afaik we still haven't factorised 35 using Shor's algorithm; I really don't understand the current hype.

There is some thought that we haven't really factored anything via Shor's yet. The previous "compiled" demonstrations baked knowledge of the result into their circuit construction.
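To make "baked the result in" concrete, here is a minimal Python sketch of the classical shell around Shor's algorithm, with the quantum order-finding step replaced by a brute-force stand-in (the function names are mine, purely for illustration). The point is that the compiled demonstrations effectively built knowledge of the order r, and hence of the factors, into the circuit, so only this classical shell was genuinely exercised.

```python
from math import gcd

def classical_order(a, N):
    # Stand-in for the quantum subroutine: brute-force the multiplicative
    # order r of a mod N. This is the only step a quantum computer is needed for.
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_postprocess(a, N, r):
    # Classical post-processing: given the order r of a mod N,
    # try to split N via gcd(a^(r/2) +/- 1, N).
    if r % 2 == 1:
        return None                # odd order: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None                # trivial square root: retry with another a
    p, q = gcd(y - 1, N), gcd(y + 1, N)
    return (p, q) if p * q == N else None

N, a = 35, 2
r = classical_order(a, N)            # a real device would get r from phase estimation
print(r, shor_postprocess(a, N, r))  # prints: 12 (7, 5)
```

For N = 35 and a = 2 this gives r = 12 and the factors 7 and 5; the hard part is getting r out of a quantum device for numbers where brute force is not an option.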

I don't think this is a useful distinction nowadays. If Willow has demonstrated RCS with ~100 qubits, I don't really think there are physical limitations stopping them from implementing Shor for 50-bit integers. It is just less impressive, so they are probably saving that work until ~1,000 qubits are reached in a few years.
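For context on the size claim, a rough resource estimate (assuming the commonly cited Beauregard-style circuit of roughly 2n+3 qubits and on the order of n^3 gates for an n-bit modulus; real estimates vary a lot by construction):

```python
# Ballpark Shor resources for an n-bit modulus, assuming ~2n+3 qubits
# and ~n^3 gates (Beauregard-style figures; illustrative only).
for n in (50, 1024, 2048):
    qubits = 2 * n + 3
    gates = n ** 3
    print(f"n={n:5d} bits:  ~{qubits:5d} qubits,  ~{gates:.1e} gates")
```

n = 50 lands right around 103 qubits, which is where the "a ~100-qubit chip should manage it" intuition comes from. The catch is that these have to be error-corrected logical qubits, which is the point of the next comment.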

  • Shor requires very good qubits to work, with a very low error rate (rough numbers below). When someone has a QC good enough to actually run it, even for small numbers, they will be screaming from the rooftops about it.
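A back-of-envelope way to see why (all numbers here are illustrative assumptions, not measurements): if a run uses G gates and each gate fails independently with probability p, the chance of an error-free run is roughly (1-p)^G ≈ e^(-pG).

```python
import math

# Probability that an uncorrected run finishes with no gate error,
# assuming G independent gates each failing with probability p.
def clean_run_probability(gates, p):
    return math.exp(-p * gates)    # (1 - p)^G ~ e^(-pG) for small p

gates_50bit = 50 ** 3              # ~1e5 gates for a 50-bit modulus (ballpark)
for p in (1e-2, 1e-3, 1e-4):
    print(f"p={p:.0e}:  P(clean run) ~ {clean_run_probability(gates_50bit, p):.2e}")
```

Even at a physical error rate of 1e-4 the odds of a clean run are a few in a million, which is why the bottleneck is error correction rather than raw qubit count.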

Yeah, that is the part missing from the argument. It is said that we should judge QC by the advances it has made and not by whether it can break the latest algos.

Well, what is the progress then? How has QC advanced in the past 20 years, and what is the latest a QC can do?

  • Qubit counts and error rates have both improved by a few orders of magnitude. Google's paper from August is the first demonstration of error correction below threshold, i.e. adding more qubits actually lowers the logical error rate (rough scaling sketch below). We're probably ~5 years away from doing toy examples of Shor's algorithm with error correction, and ~20 from being able to factor numbers as big as a classical computer can.
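Rough picture of what "below threshold" buys, assuming the standard surface-code scaling p_logical ≈ A·(p/p_th)^((d+1)/2), where d is the code distance; A, p_th, and p below are illustrative placeholders, not numbers from Google's paper:

```python
# Illustrative surface-code scaling: logical error per round ~ A * (p/p_th)^((d+1)/2).
A, p_th = 0.1, 1e-2      # assumed prefactor and threshold
p = 1e-3                 # assumed physical error rate, below threshold
for d in (3, 5, 7, 9):
    p_logical = A * (p / p_th) ** ((d + 1) / 2)
    print(f"distance {d}: logical error ~ {p_logical:.1e}")
```

Which is essentially what the below-threshold result demonstrates: once the physical error rate is under the threshold, each step up in code distance buys roughly another order of magnitude in logical error rate.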