
Comment by pclmulqdq

1 day ago

It's been "engineering challenges" for 30 years. At some point, "engineering challenges" stops being a good excuse, and that point was about 20 years ago.

At some point, someone may discover some new physics that shows that all of these "engineering challenges" were actually a physics problem, but quantum physics hasn't really advanced in the last 30 years so it's understandable that the physicists are confused about what's wrong.

You might be right that we'll never have quantum computers capable of cracking conventional cryptographic methods, but I'd rather err on the side of caution in this regard considering how easy it is to switch, and how disastrous it could be otherwise.

  • As others pointed out, it's not so easy to switch: the PQC versions require much more data to establish a connection, and consequently far more CPU time, so the connections per second (CPS) you can achieve with this kind of cryptography will be MUCH worse than with classical algorithms (rough numbers are sketched below).

    • It doesn't get much easier than that, and the downsides are far less of an inconvenience than having your data breached, depending on what that data is.
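
    For scale, here's a rough byte-count comparison of the key-exchange payloads (X25519 sizes per RFC 7748, ML-KEM-768 sizes per FIPS 203; TLS framing, certificates, and signatures are ignored):

    ```python
    # Rough per-handshake key-exchange payloads, in bytes.
    # X25519 sizes from RFC 7748; ML-KEM-768 sizes from FIPS 203.
    x25519 = {"client_share": 32, "server_share": 32}
    ml_kem_768 = {"client_share": 1184,  # encapsulation key
                  "server_share": 1088}  # ciphertext

    def total(shares):
        return sum(shares.values())

    print("X25519:             ", total(x25519), "bytes")      # 64
    print("ML-KEM-768:         ", total(ml_kem_768), "bytes")  # 2272
    # Deployed hybrids (e.g. X25519MLKEM768 in TLS 1.3) send both:
    print("X25519 + ML-KEM-768:", total(x25519) + total(ml_kem_768), "bytes")
    ```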

  • "A First Successful Factorization of RSA-2048 Integer by D-Wave Quantum Computer" (2025-06) https://ieeexplore.ieee.org/document/10817698

    • Yeah, except when your "2048-bit" numbers are guaranteed to have factors that differ by exactly two bits, you can factor them with any computer you want.

      The D-Wave machine also isn't capable of running Shor's algorithm or any other quantum-accelerated version of this problem.
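
      A minimal sketch of why close factors are fatal, using Fermat's textbook method (toy primes for illustration; 1019 and 1021 really do differ in exactly two bits, and the same trick works at 2048 bits):

      ```python
      from math import isqrt

      def fermat_factor(n):
          """Fermat's method: write n = a^2 - b^2 = (a - b)(a + b).
          It needs roughly (p - q)^2 / (8 * sqrt(n)) iterations, so
          it's essentially instant when the two factors are close."""
          a = isqrt(n)
          if a * a < n:
              a += 1
          while True:
              b2 = a * a - n
              b = isqrt(b2)
              if b * b == b2:
                  return a - b, a + b
              a += 1

      # 0b1111111011 (1019) and 0b1111111101 (1021): two bits apart.
      print(fermat_factor(1019 * 1021))  # (1019, 1021), one iteration
      ```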

    • D-Wave themselves do not emphasize this use case and have said many times that they don't expect annealing quantum computers to be used for this kind of decryption attack. Annealers are used for optimization problems where you're trying to find the lowest-energy solution to a constraint problem, not for Shor's algorithm (a toy example follows below).

      In that sense, they're more useful for normal folks today, and don't pose as many potential problems.
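
      To make "lowest-energy solution to a constraint problem" concrete, here is a toy QUBO (the input format annealers minimize), solved classically by brute force; the instance, which just encodes "pick exactly one of three options", is made up for illustration:

      ```python
      from itertools import product

      # QUBO: minimize E(x) = sum of Q[i, j] * x[i] * x[j] over binary x.
      # This instance is the penalty (x0 + x1 + x2 - 1)^2 expanded into
      # QUBO form (constant dropped): one-hot assignments are optimal.
      Q = {(0, 0): -1, (1, 1): -1, (2, 2): -1,  # linear (diagonal) terms
           (0, 1): 2, (0, 2): 2, (1, 2): 2}     # pairwise penalties

      def energy(x):
          return sum(c * x[i] * x[j] for (i, j), c in Q.items())

      # An annealer searches for the minimum physically; classically we
      # can just enumerate all 2^3 assignments at this size.
      best = min(product([0, 1], repeat=3), key=energy)
      print(best, energy(best))  # (0, 0, 1) -1 (any one-hot x is optimal)
      ```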

Some good ideas take a long time.

Nuclear energy got commercialized in 1957. The core technology was discovered nearly 50 years earlier.

Electricity was first discovered in ~1750 but commercialized in the late 1800s.

Faraday's experiments on electromagnetism ran from roughly 1830 to 1855, but commercialization took decades.

(The list goes on ...)

  • Your idea of "core technology" refers to the first time a theory was discovered that had the technology as a consequence; that's the only way nuclear energy's "core technology" was discovered in 1907. By the same token, quantum computing's "core technology" was discovered in 1926, during Erwin Schrödinger's work formalizing wave equations for quantum systems. In the cases where a technology took a long time to arrive, both the underlying physics and the engineering made steady advances in the interim. A hundred years after Schrödinger, we still have very little idea how or why quantum superposition works.

> quantum physics hasn't really advanced in the last 30 years so it's understandable that the physicists are confused about what's wrong.

I have my doubts about who’s the confused one. Quantum physics has advanced tremendously in the past 30 years. Do you realize we now have a scheme to break RSA-2048 with one million noisy qubits? (See Gidney 2025.)

  • Somehow, we have all these schemes to factor huge numbers, and yet the current record for an actual implementation of Shor's algorithm and similar algorithms is factoring the number 15, set in 2012. There was a recent paper about "factoring" 31, but it took a number of simplifying steps that assumed precisely that the number in question was a Mersenne number. People in this field keep showing "algorithm improvements" or "new devices" that are good enough to write a paper, and yet somehow there's always an implementation problem or a translation problem when someone comes asking about using them.

    If this algorithm exists and works, and there are chips with 1,000 noisy qubits, why has nobody used it to factor a 16-bit number? Why hasn't anyone used it to factor the number 63? Factoring 63 on a quantum computer with a generic algorithm would be a huge advance in capability, but there's always some reason why your fancy algorithm doesn't work with the other guy's fancy hardware. (The classical scaffolding of Shor's algorithm is sketched below; only the period-finding step needs the quantum computer.)

    At the same time, we continue to have no real understanding of the underlying physics of quantum superposition, the principle on which this whole thing relies. We know that it happens, we have lots of equations showing that it happens, and we have lots of algorithms that rely on it working, but we remain blissfully unaware of why it happens (other than that the math of our theory says so). In the year 3000, physicists will look back at these magical parts of quantum theory with the same ridicule we apply to the magical parts of Newton's gravity.
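
    Returning to the factoring question above: here is the classical scaffolding of Shor's algorithm, with the order-finding step brute-forced where a quantum computer would be needed at scale (a sketch for illustration, not any published implementation):

    ```python
    from math import gcd

    def shor_classical(n, a):
        """Classical half of Shor's algorithm. The quantum subroutine's
        only job is to find r, the multiplicative order of a mod n;
        everything else is elementary number theory."""
        r = 1
        while pow(a, r, n) != 1:  # brute-force stand-in for the QC step
            r += 1
        y = pow(a, r // 2, n)
        if r % 2 == 1 or y == n - 1:
            return None  # unlucky choice of a; pick another and retry
        return gcd(y - 1, n), gcd(y + 1, n)

    print(shor_classical(15, 7))  # (3, 5)
    print(shor_classical(63, 2))  # (7, 9): nontrivial factors of 63
    ```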

  • And that's not even a quantum physics advance; it's a purely algorithmic advance!

    There have also been massive advances in quantum engineering.