Comment by __turbobrew__

9 hours ago

Yeah, that's the missing part of the argument. The claim is that we should judge QC by the advances it has made, not by whether it can break the latest algorithms.

Well, what is that progress, then? How has QC advanced in the past 20 years, and what is the latest it can actually do?

Qubit counts and error rates have both improved by a few orders of magnitude. Google's paper from August is the first demonstration of error correction. We're probably ~5 years away from running toy examples of Shor's algorithm with error correction, and ~20 from being able to factor numbers as big as a classical computer can.
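
For context on what a "toy example of Shor's algorithm" means: the algorithm's only quantum step is order finding; everything else is classical number theory. A minimal sketch below, where `find_order` is a classical brute-force stand-in for the quantum subroutine (the function names are illustrative, not from any library), shows how factoring 15 falls out of that structure:

    import math
    import random

    def find_order(a, n):
        # Classical brute-force stand-in for the quantum order-finding
        # subroutine: the smallest r > 0 with a^r = 1 (mod n).
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def shor_factor(n):
        # Classical post-processing of Shor's algorithm: pick a random
        # base, find its order, and derive a nontrivial factor from it.
        while True:
            a = random.randrange(2, n)
            g = math.gcd(a, n)
            if g > 1:
                return g  # lucky guess: a already shares a factor with n
            r = find_order(a, n)
            if r % 2 == 0:
                y = pow(a, r // 2, n)
                f = math.gcd(y - 1, n)
                if 1 < f < n:
                    return f

    print(shor_factor(15))  # prints 3 or 5

On a quantum computer, `find_order` is the part done with the quantum Fourier transform, and it's the part that needs thousands of error-corrected logical qubits for cryptographically sized numbers, which is why the gap between "toy demo" and "breaking RSA" is measured in decades.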