
Comment by fxwin

1 day ago

The page only talks about adopting PQC for key agreement for SSH connections, not encryption in general, so the overhead would be rather minimal here. Also, from the FAQ:

"Quantum computers don't exist yet, why go to all this trouble?"

Because of the "store now, decrypt later" attack mentioned above. Traffic sent today is at risk of decryption unless post-quantum key agreement is used.

"I don't believe we'll ever get quantum computers. This is a waste of time"

Some people consider the task of scaling existing quantum computers up to the point where they can tackle cryptographic problems to be practically insurmountable. This is a possibility. However, it appears that most of the barriers to a cryptographically-relevant quantum computer are engineering challenges rather than underlying physics. If we're right about quantum computers being practical, then we will have protected vast quantities of user data. If we're wrong about it, then all we'll have done is moved to cryptographic algorithms with stronger mathematical underpinnings.

Not sure I'd take the cited paper (while fun to read) too seriously as a way to inform my opinion on the risks of using quantum-insecure encryption, rather than as a cynical take on hype and window dressing in QC research.
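
As an aside, it's easy to check whether your own OpenSSH build already offers hybrid post-quantum key agreement. A minimal Python sketch, assuming `ssh` on PATH is a reasonably recent OpenSSH (`ssh -Q kex` lists the supported key-exchange algorithms):

    # Sketch: does the local OpenSSH offer hybrid post-quantum KEX?
    # "ssh -Q kex" prints one supported key-exchange algorithm per line.
    import subprocess

    # Hybrid PQ/classical KEX names shipped by recent OpenSSH releases:
    # the sntrup761 hybrid (default since OpenSSH 9.0) and the
    # ML-KEM-768 hybrid (added in OpenSSH 9.9).
    PQ_HYBRIDS = {
        "sntrup761x25519-sha512@openssh.com",
        "mlkem768x25519-sha256",
    }

    kex = set(subprocess.run(["ssh", "-Q", "kex"],
                             capture_output=True, text=True,
                             check=True).stdout.split())
    print("PQ key agreement available:", sorted(kex & PQ_HYBRIDS) or "none")

Since OpenSSH 9.0 the sntrup761 hybrid has been the default key exchange, so a lot of SSH traffic already gets post-quantum key agreement without any configuration.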

> it appears that most of the barriers to a cryptographically-relevant quantum computer are engineering challenges rather than underlying physics

I heard this 15 years ago when I started university. People claimed all the basics were done, that we "only" needed to scale. That we would see practical quantum computers in 5-10 years. Today I still see the same estimates: maybe 5 years from extreme optimists, 10-20 years from more reserved people. It's the same story as nuclear fusion. But who's prepping for unlimited energy today, even though it would make sense to build future industrial environments around it to stay competitive?

  • > People claimed all the basics were done, that we "only" needed to scale.

    This claim is fundamentally different from what you quoted.

    > But who's prepping for unlimited energy today?

    It's about trade-offs: it costs almost nothing to switch to PQC methods, but I can't see a way to "prep for unlimited energy" that doesn't come with huge costs or wasted time in the case where it doesn't happen.

    • > It's about trade-offs: it costs almost nothing to switch to PQC methods,

      It costs:

      - development time to switch things over

      - more computation, and thus more energy, because PQC algorithms aren't as efficient as classical ones

      - more bandwidth, because PQC algorithms require larger keys
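
      To put a rough number on the bandwidth point, here's a back-of-the-envelope sketch using published parameter sizes (X25519 public keys per RFC 7748, ML-KEM-768 sizes per FIPS 203):

          # Approximate key-agreement bytes on the wire per handshake.
          X25519_PK   = 32    # X25519 public key; each side sends one
          MLKEM768_EK = 1184  # ML-KEM-768 encapsulation key (one direction)
          MLKEM768_CT = 1088  # ML-KEM-768 ciphertext (the other direction)

          classical = 2 * X25519_PK                          # 64 bytes
          hybrid    = classical + MLKEM768_EK + MLKEM768_CT  # 2336 bytes

          print(f"X25519 only:        {classical} B")
          print(f"hybrid with ML-KEM: {hybrid} B")

      So a hybrid handshake adds roughly two kilobytes, once per connection; real, but small next to the traffic of a typical session.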

      4 replies →

    • Anyway, what does prepping for unlimited energy look like? I guess favoring electricity over fossil fuels. But for normal people and the vast majority of companies, that looks like preparing for mass renewable electricity anyway, which is already a good thing to do.

      2 replies →

  • I would just take this to mean that most people are bad at estimating timelines for complex engineering tasks. 15 years isn't a ton of time, and the progress that has been made was done with pretty limited resources (compared to, say, traditional microprocessors).

  • The comparison to fusion power doesn't hold.

    First, the costs to migrate to PQC continue to drop as the algorithms become mainstream. Second, the threat exists /now/ of organizations capturing encrypted data to decrypt later. There is no comparable current threat from "not preparing for fusion", whatever that would entail.

It's been "engineering challenges" for 30 years. At some point, "engineering challenges" stops being a good excuse, and that point was about 20 years ago.

At some point, someone may discover some new physics that shows that all of these "engineering challenges" were actually a physics problem, but quantum physics hasn't really advanced in the last 30 years, so it's understandable that the physicists are confused about what's wrong.

  • You might be right that we'll never have quantum computers capable of cracking conventional cryptographic methods, but I'd rather err on the side of caution in this regard considering how easy it is to switch, and how disastrous it could be otherwise.

  • Some good ideas take a long time.

    Nuclear energy got commercialized in 1957. The core technology was discovered nearly 50 years earlier.

    Electricity was first discovered in ~1750 but commercialized in the late 1800s.

    Faraday's experiments on electromagnetism were in 1830-1855 but commercialization took decades.

    (The list goes on ...)

    • Your idea of "core technology" is about the first time a theory was discovered that had a technology as a consequence. That's the only way nuclear energy's "core technology" was discovered in 1907. By the same token, quantum computing's "core technology" was discovered in 1926, during Erwin Schrödinger's work formalizing wave equations for quantum systems. During the periods when a technology takes a long time, both the underlying physics and the engineering make steady advances. 100 years later, we still have very little idea how or why quantum superposition works.

  • > quantum physics hasn't really advanced in the last 30 years so it's understandable that the physicists are confused about what's wrong.

    I have my doubts about who's the confused one. Quantum physics has advanced tremendously in the past 30 years. Do you realize we now have a scheme to break RSA-2048 with a million noisy qubits? (See Gidney 2025.)

    • Somehow, we have all these schemes to factor huge numbers, and yet the current record for an actual implementation of Shor's algorithm and similar algorithms came from factoring the number 15 in 2012. There was a recent paper about "factoring" 31, but that paper took a number of simplifying steps that assumed the number in use was a Mersenne number. People in this field keep showing "algorithm improvements" or "new devices" that are good enough to write a paper, and yet somehow there's always an implementation problem or a translation problem when someone comes asking about using it.

      If this algorithm exists and works, and there are chips with 1000 noisy qubits, why has nobody used this algorithm to factor a 16-bit number? Why haven't they used it to factor the number 63? Factoring 63 on a quantum computer using a generic algorithm would be a huge advancement in capability, but there's always some reason why your fancy algorithm doesn't work with another guy's fancy hardware.
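
      For context, only the period-finding step of Shor's algorithm is quantum; the classical wrapper around it fits in a few lines. A Python sketch with the quantum subroutine stubbed out by brute force, which is exactly the exponentially hard part:

          # Classical skeleton of Shor's algorithm. In the real algorithm,
          # find_period() is the QFT-based quantum subroutine; brute-forcing
          # it, as here, is the exponentially expensive step.
          from math import gcd
          from random import randrange

          def find_period(a, n):
              # Smallest r > 0 with a**r == 1 (mod n); needs gcd(a, n) == 1.
              r, x = 1, a % n
              while x != 1:
                  x = (x * a) % n
                  r += 1
              return r

          def shor(n):
              while True:
                  a = randrange(2, n)
                  d = gcd(a, n)
                  if d > 1:                # lucky: a shares a factor with n
                      return d, n // d
                  r = find_period(a, n)
                  if r % 2:                # need an even period; retry
                      continue
                  y = pow(a, r // 2, n)
                  if y == n - 1:           # trivial square root of 1; retry
                      continue
                  p = gcd(y - 1, n)        # nontrivial factor, since y^2 = 1 (mod n)
                  if 1 < p < n:
                      return p, n // p

          print(shor(15))  # e.g. (3, 5)
          print(shor(63))  # e.g. (7, 9) or (21, 3), depending on the run

      The wrapper is trivial to run for 15 or 63; everything hard lives in find_period, which is the part a quantum computer is supposed to supply.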

      At the same time, we continue to have no real understanding of the underlying physics of quantum superposition, which is the principle this whole thing relies on. We know that it happens, we have lots of equations that show it happens, and we have lots of algorithms that rely on it working, but we remain blissfully unaware of why it happens (other than that the math of our theory says so). In the year 3000, physicists will look back at these magical parts of quantum theory with the same ridicule we use when looking back at the magical parts of Newton's gravity.

    • And that's not even a quantum physics advance, that's a purely algorithmic advance!

      There have also been massive advances in quantum engineering.

Those are two odd questions to even ask (or answer): first, quantum computers do exist, and second, we have them at a certain scale. I assume what they mean is a scale at which they can do calculations that surpass what classical computers can do.