
Comment by pilif

1 day ago

In light of the recent hilarious paper around the current state of quantum cryptography[1], how big is the need for the current pace of post quantum crypto adoption?

As far as I understand, the key material for any post quantum algorithm is much, much larger compared to non-quantum algorithms which leads to huge overheads in network traffic and of course CPU time.

[1]: https://eprint.iacr.org/2025/1237

The page only talks about adopting PQC for key agreement for SSH connections, not encryption in general, so the overhead would be rather minimal here. Also, from the FAQ:

"Quantum computers don't exist yet, why go to all this trouble?"

Because of the "store now, decrypt later" attack mentioned above. Traffic sent today is at risk of decryption unless post-quantum key agreement is used.

"I don't believe we'll ever get quantum computers. This is a waste of time"

Some people consider the task of scaling existing quantum computers up to the point where they can tackle cryptographic problems to be practically insurmountable. This is a possibility. However, it appears that most of the barriers to a cryptographically-relevant quantum computer are engineering challenges rather than underlying physics. If we're right about quantum computers being practical, then we will have protected vast quantities of user data. If we're wrong about it, then all we'll have done is moved to cryptographic algorithms with stronger mathematical underpinnings.

I wouldn't take the cited paper (fun as it is to read) too seriously as input to my opinion on the risks of using quantum-insecure encryption; it works better as a cynical take on hype and window dressing in QC research.

  • >it appears that most of the barriers to a cryptographically-relevant quantum computer are engineering challenges rather than underlying physics

    I've heard this 15 years ago when I started university. People claimed all the basics were done, that we "only" needed to scale. That we would see practical quantum computers in 5-10 years. Today I still see the same estimates. Maybe 5 years by extreme optimists, 10-20 years by more reserved people. It's the same story as nuclear fusion. But who's prepping for unlimited energy today? Even though it would make sense to build future industrial environments around that if they want to be competitive.

    • > People claimed all the basics were done, that we "only" needed to scale.

      This claim is fundamentally different from what you quoted.

      > But who's prepping for unlimited energy today?

      It's about tradeoffs: it costs almost nothing to switch to PQC methods, but I can't see a way to "prep for unlimited energy" that doesn't come with a huge cost and waste of time in the case where fusion doesn't happen.


    • I would just take this to mean that most people are bad at estimating timelines for complex engineering tasks. 15 years isn't a ton of time, and the progress that has been made was done with pretty limited resources (compared to, say, traditional microprocessors).

    • The comparison to fusion power doesn't hold.

      First, the costs to migrate to PQC continue to drop as the algorithms become mainstream. Second, the threat of organizations capturing encrypted data to decrypt later exists /now/. There is no comparable present-day threat from "not preparing for fusion", whatever that would entail.

  • It's been "engineering challenges" for 30 years. At some point, "engineering challenges" stops being a good excuse, and that point was about 20 years ago.

    At some point, someone may discover some new physics that shows that all of these "engineering challenges" were actually a physics problem, but quantum physics hasn't really advanced in the last 30 years so it's understandable that the physicists are confused about what's wrong.

    • You might be right that we'll never have quantum computers capable of cracking conventional cryptographic methods, but I'd rather err on the side of caution in this regard considering how easy it is to switch, and how disastrous it could be otherwise.


    • Some good ideas take a long time.

      Nuclear energy got commercialized in 1957. The core technology was discovered nearly 50 years earlier.

      Electricity was first discovered in ~1750 but commercialized in the late 1800s.

      Faraday's experiments on electromagnetism were in 1830-1855 but commercialization took decades.

      (The list goes on ...)


    • > quantum physics hasn't really advanced in the last 30 years so it's understandable that the physicists are confused about what's wrong.

      I have my doubts about who’s the confused one. Quantum physics has advanced tremendously in the past 30 years. Do you realize we now have a scheme to break RSA-2048 with 1M noisy qubits? (See Gidney 2025)


  • Those are odd questions to even ask or answer: first, quantum computers do exist, and second, we have them at a certain scale. I assume what's meant is a scale at which they can perform calculations that surpass classical computers.

That paper is hilarious, and is correct that there's plenty of shit to make fun of... but there's also progress. I recommend watching Sam Jacques' talk from PQCrypto 2025 [0]. It would be silly to delay PQC adoption because of focusing on the irrelevant bad papers.

In the past ten years, on the theory side, the expected cost of cryptographically relevant quantum factoring has dropped by 1000x [1][2]. On the hardware side, fault tolerance demonstrations have gone from repetition code error rates of 1% error per round [3] to 0.00000001% error per round [fig3a of 4], with full quantum codes being demonstrated with an error rate of 0.2% [fig1d of 4] via a 2x reduction in error each time distance is increased by 2.

If you want to track progress in quantum computing, follow the gradual spinup of fault tolerance. Noise is the main thing blocking factoring of larger and larger numbers. Once the quality problem is turned into a quantity problem, then those benchmarks can start moving.

[0]: https://www.youtube.com/watch?v=nJxENYdsB6c

[1]: https://arxiv.org/abs/1208.0928

[2]: https://arxiv.org/abs/2505.15917

[3]: https://arxiv.org/abs/1411.7403

[4]: https://arxiv.org/abs/2408.13687
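To make the fault-tolerance point concrete, here is a toy Python extrapolation of the "2x reduction in error each time distance is increased by 2" behavior described above. The starting numbers are illustrative stand-ins, not measured values from the cited papers.

```python
# Toy model of surface-code error suppression: the logical error rate
# is assumed to halve each time the code distance d grows by 2.
# The starting point (0.2% at distance 5) is illustrative only.

def logical_error_rate(p0: float, d0: int, d: int, suppression: float = 2.0) -> float:
    """Logical error rate at distance d, given rate p0 measured at d0."""
    return p0 / suppression ** ((d - d0) / 2)

# Going from distance 5 to distance 25 applies ten doublings of suppression:
p = logical_error_rate(0.002, d0=5, d=25)   # 0.002 / 2**10 ≈ 2e-6
```

This is the sense in which the quality problem turns into a quantity problem: once each increase in distance reliably buys a constant suppression factor, reaching any target error rate is just a matter of spending more qubits.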

As a number of people have observed, what's happening now is mostly about key establishment, which tends to happen relatively infrequently, and so the overhead is mostly not excessive. With that said, a little more detail:

- Current PQ algorithms, for both signature and key establishment, have much larger key sizes than traditional algorithms. In terms of compute, they are comparably fast if not faster.

- Most protocols (e.g., TLS, SSH, etc.) do key establishment relatively infrequently (e.g., at the start of the connection) and so the key establishment size isn't a big deal, modulo some interoperability issues because the keys are big enough to push you over the TCP MTU, so you end up with the keys spanning two packets. One important exception here is double ratchet protocols like Signal or MLS which do very frequent key changes. What you sometimes see here is to rekey with PQ only occasionally (https://security.apple.com/blog/imessage-pq3/).

- In the particular case of TLS, message size for signatures is a much bigger deal, to a great extent because your typical TLS handshake involves a lot of signatures in the certificate chain. For this reason, there is a lot more concern about the viability of PQ signatures in TLS (https://dadrian.io/blog/posts/pqc-signatures-2024/). Possibly in other protocols too, but I don't know them as well.
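A rough byte-count sketch in Python makes the asymmetry between key establishment and signatures concrete. The sizes come from the public specs (RFC 7748/8032, FIPS 203/204), but treat them as ballpark figures; the four-signature handshake is an assumed example.

```python
# Rough wire sizes in bytes, from the public specs (RFC 7748/8032,
# FIPS 203/204); treat these as ballpark figures.
sizes = {
    "x25519_pubkey": 32,     # classical ECDH key share
    "mlkem768_ek":   1184,   # ML-KEM-768 encapsulation key
    "mlkem768_ct":   1088,   # ML-KEM-768 ciphertext
    "ed25519_sig":   64,     # classical signature
    "mldsa65_sig":   3309,   # ML-DSA-65 (Dilithium) signature
}

# Hybrid key agreement (X25519 + ML-KEM-768): each direction grows by
# roughly a kilobyte, paid once per connection.
client_share = sizes["x25519_pubkey"] + sizes["mlkem768_ek"]    # 1216 bytes
server_share = sizes["x25519_pubkey"] + sizes["mlkem768_ct"]    # 1120 bytes

# A handshake carrying, say, four signatures in and around the
# certificate chain grows far more when each one becomes ML-DSA:
sig_growth = 4 * (sizes["mldsa65_sig"] - sizes["ed25519_sig"])  # 12980 bytes
```

That roughly order-of-magnitude gap is why PQ key agreement shipped first while PQ signatures in TLS remain contentious.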

>In light of the recent hilarious paper around the current state of quantum cryptography

I assumed that paper was intended as a joke. If it's supposed to be serious criticism of the concept of quantum computing then it's pretty off-base, akin to complaining that transistors couldn't calculate Pi in 1951.

> how big is the need for the current pace of post quantum crypto adoption?

It comes down to:

1) whether you believe that a cryptographically-relevant quantum computer will be realised within your lifespan

2) how much you value the data that you are trusting to conventional cryptography

If you believe that no QC will arrive in a timeframe you care about or you don't care about currently-private data then you'd be justified in thinking PQC is a waste of time.

OTOH if you're a maintainer of a cryptographic application, then IMO you don't have the luxury of ignoring (2) on behalf of your users, irrespective of (1).

Besides what's public knowledge, I tend to put a bit of stock in our intelligence agency calling for PQ adoption for systems that need to remain confidential for 20 years or more.

edit: adding in some sources

2014: "between 2030 and 2040" according to https://www.aivd.nl/publicaties/publicaties/2014/11/20/infor... (404) via https://tweakers.net/reviews/5885/de-dreiging-van-quantumcom... (Dutch)

2021: "small chance it arrives by 2030" https://www.aivd.nl/documenten/publicaties/2021/09/23/bereid... (Dutch)

2025: "protect against ‘store now, decrypt later’ attacks by 2030", joint paper from 18 countries https://www.aivd.nl/binaries/aivd_nl/documenten/brochures/20... (English)

  • I don't want my government to keep secrets for 20 years. There is nothing I am OK with them doing that they can't be generally open about in time. Ex. the MLK files. No justification for the courts saying that the FBI files regarding MLK have to be kept under lock and key for 50 years.

    • I think that's a different discussion. Some people would like their chat messages to simply be secure until they die. So long as that's a valid desire, or one can think of another purpose for this, I think we can agree that it's worth considering whether PQC is worth implementing today

      Also, 2030 isn't 20 years away anymore, and that's the deadline I ended up finding in the sources, even if they think there's only a small chance.

That's just a fun joke paper deflating some of the more aggressive hype around QC. You shouldn't use it for making security and algorithm adoption decisions.

I don't think many cryptography engineers take Gutmann's paper seriously.

  • From the paper:

    > After our successful factorisation using a dog, we were delighted to learn that scientists have now discovered evidence of quantum entanglement in other species of mammals such as sheep [32]. This would open up an entirely new research field of mammal-based quantum factorisation. We hypothesise that the production of fully entangled sheep is easy, given how hard it can be to disentangle their coats in the first place. The logistics of assembling the tens of thousands of sheep necessary to factorise RSA-2048 numbers is left as an open problem.

  • The paper is a joke, but Gutmann does make some useful, non-joke suggestions in section 7. There's probably room for a serious, full-length paper on quantum factorization evaluation criteria.

> As far as I understand, the key material for any post quantum algorithm is much, much larger compared to non-quantum algorithms

This is somewhat correct, but needs some nuance.

First, the problem is bigger with signatures, which is why nobody is happy with the current post quantum signature schemes and people are working on better pq signature schemes for the future. But signatures aren't an urgent issue, as there is no "decrypt later" scenario for signatures.

For encryption, the overhead exists, but it isn't too bad. We are already deploying pqcrypto, and nobody seems to have an issue with it. Use a current OpenSSH and you use mlkem. Use a current browser with a server using modern libraries and you also use mlkem. I haven't heard anyone complaining that the Internet got so much slower in recent years due to pqcrypto key exchanges.

Compared to the overall traffic we use commonly these days, the few extra kb during the handshake (everything else is not affected) doesn't matter much.
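A quick back-of-envelope in Python illustrates the point; both numbers are assumed round figures, not measurements.

```python
# One-time hybrid key-exchange overhead versus a modest session.
kex_overhead_bytes = 2300    # assumed extra bytes for X25519+ML-KEM-768, both directions
session_bytes = 1_000_000    # a small 1 MB transfer

overhead_pct = 100 * kex_overhead_bytes / session_bytes
print(f"{overhead_pct:.2f}% overhead")   # prints "0.23% overhead"
```

And that fraction only shrinks as the session gets longer, since nothing after the handshake is affected.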

I imagine the key exchange is just once per connection, right? So the overhead seems not too bad.

Especially since I think a pretty large number of computers/hostnames that are ssh'able today will probably have the same root password if they're still connected to the internet 10-20 years from now.

  • So what kind of person is running an SSH server configured to use post-quantum crypto, but still using password auth? Priorities are out of whack.

    Not that this is a bad thing, but first start using keys, then start rotating them regularly and then worry about theoretical future attacks.

    • Those are completely disjoint threats.

      A captured SSH session should never be decryptable by an adversary, regardless of whether it uses passwords or keys, or how weak the password is.

  • root can't normally log in via ssh. Unless the default configuration is changed.

    • In OpenSSH root cannot login.

      In TinySSH, which also implements the ntru exchange, root is always allowed.

      I don't know what the behavior is in Dropbear, but the point is that OpenSSH is not the only implementation.

      TinySSH would also enable you to quiet the warning on RHEL 7 or other legacy platforms.

    • Fwiw some distros ask if you want root access enabled on install; I assume there's always some chance of it being enabled for install stuff and forgotten, or the user misreading and thinking it means any root access.

>... which leads to huge overheads in network traffic and of course CPU time.

This is just the key exchange. You're exchanging keys for the symmetric cipher you'll be using for traffic in the session. There's really no overhead to talk about.

  • Indeed, I'll expand a bit: asymmetric crypto has always been incredibly slow compared to symmetric crypto, which is either HW-accelerated (AES) or fast on the CPU (ChaCha20).

    But since the symmetric key is the same for both sides, you must either share it ahead of time or use asymmetric crypto to exchange the symmetric keys so the fast cipher can go brrrrr.
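A toy Python sketch of that pattern. Everything here is a stand-in: the "shared secret" is random bytes in place of an X25519/ML-KEM handshake output, and the SHA-256 counter-mode "keystream" only illustrates that the symmetric part is cheap; it is not a real cipher and must not be used as one.

```python
import hashlib
import hmac
import os

def derive_key(shared_secret: bytes, label: bytes) -> bytes:
    """HKDF-extract-style step: turn the handshake's shared secret
    into a fixed-length symmetric session key."""
    return hmac.new(label, shared_secret, hashlib.sha256).digest()

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Illustrative stream 'cipher': SHA-256 in counter mode.
    XOR twice with the same key/nonce to recover the plaintext."""
    out = bytearray()
    for block in range((len(data) + 31) // 32):
        pad = hashlib.sha256(key + nonce + block.to_bytes(4, "big")).digest()
        chunk = data[block * 32:(block + 1) * 32]
        out += bytes(a ^ b for a, b in zip(chunk, pad))
    return bytes(out)

shared = os.urandom(32)          # stand-in for the KEM/ECDH handshake output
key = derive_key(shared, b"session")
nonce = os.urandom(12)

msg = b"bulk traffic goes over the fast symmetric cipher"
ct = keystream_xor(key, nonce, msg)
assert keystream_xor(key, nonce, ct) == msg   # decryption round-trips
```

The expensive asymmetric step runs once to produce `shared`; everything after that is cheap symmetric work, which is why making the handshake post-quantum barely moves the overall cost.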

  • This still greatly affects connections/second, which is an important metric. Especially since servers don't always like very long lived connections, so you may get plenty of connections during an HTTP interaction.

    • It doesn't "greatly" affect it at all. The extra traffic and time required between curve25519 and ML-KEM768+X25519 is actually less than the jump from RSA2048 to RSA4096. Imagine how silly a person would appear if they had been this alarmist about RSA4096. When building for scales where it may eventually add up you should already be taking such scale into consideration.

>As far as I understand, the key material for any post quantum algorithm is much, much larger compared to non-quantum algorithms which leads to huge overheads in network traffic and of course CPU time.

Eh? Public-key (asymmetric) cryptography is already very expensive compared to symmetric crypto even in the classical setting; that's normal. Its vital but limited job is key exchange, after which a fast symmetric algorithm like AES takes over. My understanding (and serious people in the field, please correct me if I'm wrong!) is that the threat from a potential cryptographically relevant quantum computer applies almost entirely to key exchange, not symmetric encryption. The best known quantum search algorithm against symmetric ciphers is Grover's, which offers only a square-root speedup and is thus trivially countered, if necessary, by doubling the key size (i.e., a 256-bit key against Grover's offers a 128-bit classical equivalent, and 512 bits would offer 256 bits, which is already more than enough).

The vast majority of a given SSH session's traffic isn't handshakes unless something is quite odd, and you're likely going to have a pretty miserable experience in that case regardless. So even if the initial handshake gets significantly more expensive, it should be pretty irrelevant to network overhead; it still only happens during the initiation of a given session, right?
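The Grover arithmetic is easy to check with a tiny Python sketch:

```python
import math

# Grover's algorithm quadratically speeds up brute-force key search,
# so an n-bit symmetric key retains roughly n/2 bits of security
# against a quantum attacker.
def quantum_security_bits(key_bits: int) -> float:
    grover_iterations = math.sqrt(2 ** key_bits)  # ~2**(n/2) oracle calls
    return math.log2(grover_iterations)

print(quantum_security_bits(128))  # prints 64.0:  AES-128 gets uncomfortable
print(quantum_security_bits(256))  # prints 128.0: AES-256 stays comfortable
```

Hence the common advice: against a quantum adversary, just use 256-bit symmetric keys and focus the migration effort on the asymmetric key exchange.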