
Comment by japanuspus

2 days ago

> Well right now I am very skeptical, but I think we have somewhat given quantum computing plenty of time (we have given it decades) unless someone can convince me that it is not a scam.

Shor's paper on polynomial-time factoring is from 1997, and the first real demonstration of quantum hardware (Monroe et al.) is from 1995: yes, quantum has had decades -- but only barely, and it has certainly only now started to have generations.

To get a sense of what this progress means, take a look at some of the recent PhD spinouts from leading research groups (Oxford Ionics etc.): there are a lot of organisations with nothing but engineering left between them and fault tolerance.

When I came back to quantum three years ago, fault tolerance was still expected to be based on the surface-code ideas that were floating around when I did my PhD ('04). Today, after everyone has started looking harder, it turns out that a bit of long-range connectivity can cut the error-correction overhead by orders of magnitude (see recent public posts by IBM Quantum): the goalposts for fault tolerance are moving in the right direction.
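To put rough numbers on that overhead reduction, here is a back-of-envelope sketch. The [[144,12,12]] bivariate bicycle code parameters are the ones IBM has published; the surface-code qubit count is the standard ~2d² estimate. Both are illustrative assumptions, not a careful resource estimate.

```python
# Back-of-envelope comparison of physical qubits needed per logical qubit.
# Surface-code count: standard ~2*d^2 figure for a distance-d rotated surface code.
# qLDPC count: the [[144,12,12]] bivariate bicycle ("gross") code IBM has described.

def surface_code_qubits(d: int) -> int:
    """Physical qubits for one logical qubit in a distance-d rotated surface code."""
    return 2 * d * d - 1  # d^2 data qubits + (d^2 - 1) syndrome qubits

per_logical_surface = surface_code_qubits(12)  # distance 12 -> 287 qubits

# Bivariate bicycle code: 144 data + 144 check qubits encode 12 logical qubits
# at distance 12, i.e. 24 physical qubits per logical qubit.
per_logical_bicycle = (144 + 144) // 12

print(f"surface code (d=12):     {per_logical_surface} physical qubits per logical qubit")
print(f"bivariate bicycle code:  {per_logical_bicycle} physical qubits per logical qubit")
```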

And this is the key thing about quantum computing: you need error correction, and you need to do it with the same error-prone hardware that you are correcting for. There is a threshold hardware quality that lets you do this at reasonable overhead, and until you reach that threshold all you have is a fancy random number generator.
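A toy sketch of why that threshold matters: the scaling formula below is the standard surface-code heuristic, and the ~1% threshold and prefactor are illustrative assumptions, not measured values for any particular device.

```python
# Toy illustration of the fault-tolerance threshold: below a hardware error
# rate p_th, increasing the code distance d suppresses the logical error rate
# exponentially; above it, adding more correction only makes things worse.
# p_L ~ A * (p / p_th)^((d+1)/2) is the usual surface-code heuristic;
# p_th = 1% and A = 0.1 are assumptions chosen for illustration.

def logical_error_rate(p: float, d: int, p_th: float = 0.01, A: float = 0.1) -> float:
    return min(1.0, A * (p / p_th) ** ((d + 1) / 2))

for p in (0.002, 0.02):  # one physical error rate below the assumed threshold, one above
    rates = [logical_error_rate(p, d) for d in (3, 5, 7, 9)]
    trend = "improves" if rates[-1] < rates[0] else "gets worse"
    print(f"p = {p}: p_L at d=3,5,7,9 -> "
          + ", ".join(f"{r:.1e}" for r in rates) + f" ({trend})")
```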

But yes, feel free to be a pessimist -- just remember to own it when quantum happens in a few years.