Comment by cubefox
2 days ago
This sounds slightly alarming:
> I’m going to close this post with a warning. When Frisch and Peierls wrote their now-famous memo in March 1940, estimating the mass of Uranium-235 that would be needed for a fission bomb, they didn’t publish it in a journal, but communicated the result through military channels only. As recently as February 1939, Frisch and Meitner had published in Nature their theoretical explanation of recent experiments, showing that the uranium nucleus could fission when bombarded by neutrons. But by 1940, Frisch and Peierls realized that the time for open publication of these matters had passed.
> Similarly, at some point, the people doing detailed estimates of how many physical qubits and gates it’ll take to break actually deployed cryptosystems using Shor’s algorithm are going to stop publishing those estimates, if for no other reason than the risk of giving too much information to adversaries. Indeed, for all we know, that point may have been passed already. This is the clearest warning that I can offer in public right now about the urgency of migrating to post-quantum cryptosystems, a process that I’m grateful is already underway.
Does anyone know how far along it is? Do we need to worry that the switch away from RSA won't be broadly deployed before quantum decryption becomes available?
From analytical arguments considering a rather generic error type, we already know that for Shor's algorithm to produce a useful result, the error rate per logical qubit needs to decrease with the problem size as ~n^(-1/3), where `n` is the number of bits in the number being factored [1].
This estimate, however, assumes that an interaction can be turned on between any two qubits. In practice, we can only do nearest-neighbour interactions on a square lattice, so the interaction between two arbitrary qubits has to be simulated by repeatedly applying SWAP gates, shuffling the qubits around as in the 15-puzzle. This adds about `n` SWAP gates per two-qubit interaction, which multiplies the noise by roughly the same factor, so on a square lattice the required logical error rate becomes around ~n^(-4/3).
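To make the scaling concrete, here is a quick Python sketch. Only the exponents come from the argument above; the prefactor is a placeholder I've set to 1 for illustration:

```python
# Back-of-envelope sketch of the required logical error rate under the two
# connectivity assumptions above. The prefactor C is a placeholder, not
# taken from the reference; only the scaling with n is from the argument.

C = 1.0  # assumed prefactor

def max_error_all_to_all(n, c=C):
    """Required logical error rate with all-to-all connectivity, ~n^(-1/3)."""
    return c * n ** (-1 / 3)

def max_error_square_lattice(n, c=C):
    """Required logical error rate on a square lattice, including the
    extra ~n factor from SWAP overhead, i.e. ~n^(-4/3)."""
    return c * n ** (-4 / 3)

for n in (10, 256, 1024, 2048):
    print(f"n={n:5d}  all-to-all ~{max_error_all_to_all(n):.1e}  "
          f"square lattice ~{max_error_square_lattice(n):.1e}")
```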
Now comes error correction. The estimates here are somewhat hard to make, since they depend on the sensitivity of the readout mechanism, but let's say, for example, that a 10-bit number can be factored with a logical qubit error rate of 10^{-5}. Then we apply a surface code that suppresses errors exponentially, by a factor of 10 for every 10 physical qubits, i.e. ~10^{-m/10}, where m is the number of physical qubits per logical qubit (which is rather optimistic). Putting in the numbers, we would need about 40 physical qubits per logical qubit, and hence, for the roughly 10^4 logical qubits needed to run Shor's algorithm on a cryptographically relevant modulus, about 400k physical qubits in total.
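A rough sketch of that arithmetic, for anyone who wants to play with the inputs. The raw physical error rate of 10^{-4}, the 2048-bit modulus, and the ~5n logical-qubit register size are my own placeholder assumptions, not stated above or in [1], but they roughly reproduce the ~40 and ~400k figures:

```python
import math

# Reproducing the arithmetic above with placeholder assumptions.
error_at_10_bits = 1e-5            # benchmark assumed above
n_bits = 2048                      # assumed modulus size
required_error = error_at_10_bits * (n_bits / 10) ** (-4 / 3)

p_phys = 1e-4                      # assumed raw physical error rate
# Surface code suppression ~10^(-m/10): solve p_phys * 10^(-m/10) <= required_error.
m_per_logical = math.ceil(10 * math.log10(p_phys / required_error))

logical_qubits = 5 * n_bits        # rough register size for Shor's algorithm
total_physical = m_per_logical * logical_qubits

print(f"required logical error rate ~{required_error:.1e}")   # ~8e-9
print(f"physical qubits per logical qubit: {m_per_logical}")  # ~41
print(f"total physical qubits ~{total_physical:,}")           # ~420,000
```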
That may sound reasonable, but it assumes that while the individual physical qubits are being manipulated, the others do not decohere while waiting for their turn. This in fact scales poorly with the number of qubits on the chip: physical constraints limit the number of coaxial cables that can be attached, so multiplexing of control signals, and hence waiting, is unavoidable. The waiting is even more pronounced in the quantum computer cluster proposals that surface from time to time.
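A toy illustration of why this matters, with all numbers being placeholder assumptions rather than measurements of any real device:

```python
import math

# Idle decoherence accumulated under multiplexed control: each qubit on a
# shared control line waits while the others are addressed.
T2 = 100e-6          # assumed coherence time of a physical qubit (100 us)
gate_time = 100e-9   # assumed duration of one control operation (100 ns)

def idle_error(qubits_per_line):
    """Rough estimate of decoherence a qubit picks up while waiting for the
    other qubits sharing its control line to be addressed once each."""
    wait = (qubits_per_line - 1) * gate_time
    return 1 - math.exp(-wait / T2)

for k in (1, 10, 100, 1000):
    print(f"{k:5d} qubits per control line -> idle error ~{idle_error(k):.1e}")
```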
[1]: https://link.springer.com/article/10.1007/s11432-023-3961-3