Comment by prof-dr-ir

2 days ago

> Factoring becoming a reasonable benchmark is strongly related to quantum computing becoming useful.

Either this relation is not that strong, or factoring should "imminently" become a reasonable benchmark, or useful quantum computing cannot be "imminent". So which one is it?

I think you are the author of the blog post I linked to? Did I maybe interpret it too negatively? Was it not meant to suggest that the second option is still quite some time away?

Under a Moore's law-like scenario, factoring 21 happens only about ~5 years before factoring a 1024-bit number. With all the known optimizations, factoring an n-bit number requires only ~n logical qubits, but most of those optimizations only pay off at large n, so the machine that factors 21 is only ~5 doublings in logical-qubit count away from one that can factor a 1024-bit number (~2^1024).
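As a rough illustration of that doubling arithmetic, here is a minimal Python sketch. The ~1024 logical qubits for a 1024-bit number follows the ~n rule above; the qubit count assumed for factoring 21 is a hypothetical placeholder (a few dozen, since the small-n case doesn't get the optimizations), not a sourced figure:

```python
import math

# From the argument above: with the optimizations, factoring an n-bit
# number needs roughly n logical qubits, so ~1024 for a 1024-bit number.
QUBITS_1024_BIT = 1024

# Illustrative assumption: the optimizations don't help at small n, so
# factoring 21 is taken to need a few dozen logical qubits rather than ~5.
QUBITS_FOR_21 = 32  # hypothetical placeholder

doublings = math.log2(QUBITS_1024_BIT / QUBITS_FOR_21)
print(f"~{doublings:.1f} doublings of logical-qubit count")  # -> ~5.0
```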

The other problem is that factoring 21 is so easy that it is actually harder to prove you factored it with a functioning quantum computer. For big numbers, your program can fail 99% of the time, because a single success is classically verifiable (just multiply the factors) and proves the algorithm worked. 21, on the other hand, is small enough that it is hard not to factor, so demonstrating that a quantum computer, rather than some trivial shortcut, did the factoring is fairly hard. As a result, I wouldn't be surprised if the first number publicly factored by an error-corrected quantum computer were in the thousands instead of 21. Using a number that isn't absolutely tiny makes it a lot easier to show that the system actually works.
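A minimal sketch of why a low per-run success rate is fine for large numbers: a claimed factorization can be checked classically almost for free, so failed runs cost nothing and you keep sampling until one run verifies. The function and parameter names below are my own, and the "quantum" run is a stand-in with an assumed ~1% success rate:

```python
import random

def verifies(n: int, p: int, q: int) -> bool:
    """Classical check of a claimed factorization; cheap even for huge n."""
    return 1 < p < n and 1 < q < n and p * q == n

def demonstrate(n: int, run_factoring, max_runs: int = 1000):
    """Accept the first run whose output verifies. Failed runs are simply
    discarded, which is why a 1% per-run success rate is good enough."""
    for _ in range(max_runs):
        p, q = run_factoring(n)
        if verifies(n, p, q):
            return p, q
    return None

# Hypothetical stand-in for a noisy quantum factoring run (~1% success).
def noisy_factoring_run(n):
    return (3, 7) if random.random() < 0.01 else (random.randrange(n), 1)

print(demonstrate(21, noisy_factoring_run))
# The catch for n = 21: the answer (3, 7) is so easy to produce by classical
# means that a verified run says little about whether the hardware worked.
```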