Comment by sigmoid10
1 day ago
>it appears that most of the barriers to a cryptographically-relevant quantum computer are engineering challenges rather than underlying physics
I heard this 15 years ago when I started university. People claimed all the basics were done, that we "only" needed to scale. That we would see practical quantum computers in 5-10 years. Today I still see the same estimates. Maybe 5 years by extreme optimists, 10-20 years by more reserved people. It's the same story as nuclear fusion. But who's prepping for unlimited energy today? Hardly anyone, even though it would make sense to build future industrial environments around it to stay competitive.
> People claimed all the basics were done, that we "only" needed to scale.
This claim is fundamentally different from what you quoted.
> But who's prepping for unlimited energy today?
It's about tradeoffs: it costs almost nothing to switch to PQC methods, but I can't see a way to "prep for unlimited energy" that doesn't come with huge costs and wasted time if that future never arrives.
> It's about tradeoffs: it costs almost nothing to switch to PQC methods,
It costs:
- development time to switch things over
- more computation, and thus more energy, because PQC algorithms aren't as efficient as classical ones
- more bandwidth, because PQC algorithms require larger keys
> It costs:
Not wrong, but given these algorithms are mostly used at connection setup, how much cost is actually incurred compared to the entire session? Certainly if your sessions are short-lived then the 'overhead' of PQC/hybrid is higher, but I'd be curious to know the actual byte and energy costs over and above non-PQC/hybrid, i.e., how many bytes/joules a non-PQC exchange takes and how many more adding PQC does. E.g.
> Unfortunately, many of the proposed post-quantum cryptographic primitives have significant drawbacks compared to existing mechanisms, in particular producing outputs that are much larger. For signatures, a state of the art classical signature scheme is Ed25519, which produces 64-byte signatures and 32-byte public keys, while for widely-used RSA-2048 the values are around 256 bytes for both. Compare this to the lowest security strength ML-DSA post-quantum signature scheme, which has signatures of 2,420 bytes (i.e., over 2kB!) and public keys that are also over a kB in size (1,312 bytes). For encryption, the equivalent would be comparing X25519 as a KEM (32-byte public keys and ciphertexts) with ML-KEM-512 (800-byte PK, 768-byte ciphertext).
* https://neilmadden.blog/2025/06/20/are-we-overthinking-post-...
"The impact of data-heavy, post-quantum TLS 1.3 on the Time-To-Last-Byte of real-world connections" (PDF):
* https://csrc.nist.gov/csrc/media/Events/2024/fifth-pqc-stand...
(And development time is also generally one-time.)
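For a rough sense of the byte overhead, here's a back-of-the-envelope sketch using the sizes quoted above. The handshake composition (one KEM public key plus ciphertext for key agreement, one signature plus signer public key for authentication) is a simplification I'm assuming rather than an exact TLS or SSH transcript, and I'm reading the quoted "lowest security strength ML-DSA" as ML-DSA-44:

```python
# Back-of-the-envelope handshake sizes, using the byte counts quoted above.
# Assumed (simplified) composition: one KEM public key + one KEM ciphertext
# for key agreement, plus one signature + one signer public key for
# authentication. Real TLS/SSH handshakes carry more than this, so treat
# the result as order-of-magnitude only.

classical = {
    "X25519 public key": 32,
    "X25519 ciphertext": 32,
    "Ed25519 public key": 32,
    "Ed25519 signature": 64,
}

post_quantum = {
    "ML-KEM-512 public key": 800,
    "ML-KEM-512 ciphertext": 768,
    "ML-DSA-44 public key": 1312,   # the "lowest security strength ML-DSA"
    "ML-DSA-44 signature": 2420,
}

classical_total = sum(classical.values())   # 160 bytes
pq_total = sum(post_quantum.values())       # 5,300 bytes
extra = pq_total - classical_total          # 5,140 bytes

print(f"classical handshake material : {classical_total:5d} bytes")
print(f"post-quantum equivalent      : {pq_total:5d} bytes")
print(f"one-time extra per handshake : {extra:5d} bytes "
      f"(~{extra / 1500:.1f} additional 1500-byte packets)")
```

That extra ~5 KB lands once per handshake, so whether it matters mostly comes down to how long-lived your sessions are, which is exactly the question above.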
> - development time to switch things over
This is a one-time cost, and generally the implementations we're switching to are better quality than the classical algorithms they replace. For instance, the implementation of ML-KEM we use in OpenSSH comes from Cryspen's libcrux[1], which is formally verified and quite fast.
[1] https://github.com/cryspen/libcrux
> - more computation, and thus more energy, because PQC algorithms aren't as efficient as classical ones
ML-KEM is very fast. In OpenSSH it's much faster than classic DH at the same security level and only slightly slower than ECDH/X25519.
> - more bandwidth, because PQC algorithms require larger keys
For key agreement, it's barely noticeable. ML-KEM public keys are slightly over 1 KB. Again, this is larger than ECDH but comparable to classic DH.
PQ signatures are larger, e.g. an ML-DSA signature is about 3 KB, but again this only happens once or twice per SSH connection and is totally lost in the noise.
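If you want to sanity-check the speed claim yourself, here's a rough timing sketch. It assumes the liboqs-python bindings (module `oqs`) and the `cryptography` package are installed, and that your liboqs build exposes the mechanism as "ML-KEM-768" (older builds call it "Kyber768"); this is not OpenSSH's code path (that uses libcrux), just a ballpark comparison:

```python
import timeit

from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
import oqs  # liboqs-python bindings, assumed installed

KEM_ALG = "ML-KEM-768"  # may be "Kyber768" on older liboqs builds

def x25519_exchange():
    # Full ephemeral exchange: two keypairs, one side derives the shared secret.
    a = X25519PrivateKey.generate()
    b = X25519PrivateKey.generate()
    a.exchange(b.public_key())

def mlkem_exchange():
    # Full KEM flow: keygen on the client, encapsulation on the server,
    # decapsulation back on the client.
    with oqs.KeyEncapsulation(KEM_ALG) as client, oqs.KeyEncapsulation(KEM_ALG) as server:
        public_key = client.generate_keypair()
        ciphertext, _shared_secret = server.encap_secret(public_key)
        client.decap_secret(ciphertext)

n = 1000
for name, fn in [("X25519", x25519_exchange), (KEM_ALG, mlkem_exchange)]:
    usec = timeit.timeit(fn, number=n) / n * 1e6
    print(f"{name:<12} ~{usec:8.1f} µs per exchange")
```

On typical hardware both come out well under a millisecond per exchange, which is consistent with the setup cost being lost in the noise of an SSH connection.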
All of which are costs that pale in comparison to having your data compromised, depending on what that data is.
Anyway, what does prepping for unlimited energy look like? I guess favoring electricity over fossil fuels. But for normal people and the vast majority of companies, that looks like preparing for mass renewable electricity anyway, which is already a good thing to do.
With limitless energy you can have "fossil fuel" synthesized from air and water [1] and use existing "fossil fuel" infrastructure.
[1] https://www.wired.com/2012/10/fuel-from-air/
It could also mean massively scaling up energy consumption with little concern for efficiency (since limitless energy would imply very low cost), which would probably be a bad idea with renewables, and also very expensive if the energy turns out not to be so cheap.
I would just take this to mean that most people are bad at estimating timelines for complex engineering tasks. 15 years isn't a ton of time, and the progress that has been made was done with pretty limited resources (compared to, say, traditional microprocessors).
The comparison to fusion power doesn't hold.
First, the costs of migrating to PQC continue to drop as the algorithms become mainstream. Second, the threat exists /now/ of organizations capturing encrypted data to decrypt later. There is no comparable current threat from "not preparing for fusion", whatever that would entail.