Comment by dhx
4 days ago
Amongst the numerous reasons why you _don't_ want to rush into implementing new algorithms is that even the _reference implementation_ of Kyber/ML-KEM (and most other early implementations) included multiple timing side-channel vulnerabilities that allowed for key recovery.[1][2]
djb has been consistent in his view for decades that cryptography standards need to consider how foolproof they are to implement, so that a minor implementation mistake specific to the timing of certain instructions on certain CPU architectures, or to particular compiler optimisations, etc., doesn't break the security of the implementation. See for example the many problems of the NIST P-224/P-256/P-384 ECC curves, which djb has been instrumental in fixing through widespread deployment of X25519.[3][4][5]
[1] https://cryspen.com/post/ml-kem-implementation/
[2] https://kyberslash.cr.yp.to/faq.html / https://kyberslash.cr.yp.to/libraries.html
[3] https://en.wikipedia.org/wiki/Elliptic_curve_point_multiplic...
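To make the KyberSlash issue in [2] concrete: the leak came from a C expression in the reference code's decryption path that divides by KYBER_Q, and many compilers lower that division to a variable-time instruction operating on a secret-derived value. Below is a rough Python sketch of the pattern, not the upstream C; the multiply-and-shift constants follow the published style of fix, so treat the whole thing as illustrative:

    KYBER_Q = 3329

    def bit_leaky(t):
        # Shape of the original reference code: in C, the "/ KYBER_Q" can be
        # compiled to a division whose timing depends on the secret value t
        # on some platforms. That data-dependent timing is the KyberSlash leak.
        return (((t << 1) + KYBER_Q // 2) // KYBER_Q) & 1

    def bit_constant_time(t):
        # Shape of the fix: replace the division with a multiply-and-shift by
        # a precomputed reciprocal, which compiles to constant-time arithmetic.
        return ((((t << 1) + 1665) * 80635) >> 28) & 1

    # Sanity check that the two agree for every possible coefficient value.
    assert all(bit_leaky(t) == bit_constant_time(t) for t in range(KYBER_Q))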
Given the emphasis on reliability of implementations of an algorithm, it's ironic that the Curve25519-based Ed25519 digital signature standard was itself specified and originally implemented in a way that led to implementations diverging on what counts as a valid or invalid signature. See https://hdevalence.ca/blog/2020-10-04-its-25519am/
Not a criticism; if anything it reinforces DJB's point. But it makes clear that ease of (proper) implementation also has to cover things like canonicalization of security-relevant values, and has to ensure that supporting multiple modes of verification doesn't lead to different answers to a security question that should have exactly one answer.
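To make that concrete, here is a small Python sketch (illustrative names, not any particular library) of two of the acceptance criteria on which Ed25519 verifiers have historically disagreed:

    # Ed25519 group order, from RFC 8032.
    L = 2**252 + 27742317777372353535851937790883648493

    def s_is_canonical(sig: bytes) -> bool:
        # RFC 8032 requires the scalar S (second half of the signature) to be
        # reduced, i.e. S < L. Verifiers that skip this check accept malleated
        # signatures (S and S + L both pass), and so disagree with verifiers
        # that enforce it.
        s = int.from_bytes(sig[32:64], "little")
        return s < L

    # The other big divergence is the verification equation itself:
    #   cofactorless:  [S]B == R + [k]A
    #   cofactored:    [8][S]B == [8]R + [8][k]A
    # For signatures built from small-order components, the two equations can
    # return different answers, so two "conforming" verifiers can disagree on
    # the exact same (message, signature, public key) triple.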
This logic does not follow. Your argument seems to be "the implementation has security bugs, so let's not ratify the standard." That's not how standards work though. Ensuring an implementation is secure is part of the certification process. As long as the scheme itself is shown to be provably secure, that is sufficient to ratify a standard.
If anything, standardization encourages more investment, which means more eyeballs to identify and plug those holes.
No, the argument is that the algorithm (as specified in the standard) is difficult to implement correctly, so we should tweak it/find another one. This is a property of the algorithm being specified, not just an individual implementation, and we’ve seen it play out over and over again in cryptography.
I’d actually like to see more (non-cryptographic) standards take this into account. Many web standards are so complicated and/or ill-specified that trillion dollar market cap companies have trouble implementing them correctly/consistently. Standards shouldn’t just be thrown over the wall and have any problems blamed on the implementations.
> No, the argument is that the algorithm (as specified in the standard) is difficult to implement correctly, so we should tweak it/find another one.
This argument is without merit. ML-KEM/Kyber has already been ratified as the PQC KEM standard by NIST. What you are proposing is that the NIST process was fundamentally flawed. This is a claim that requires serious evidence as backup.
14 replies →
It's more like "the standard makes it easier to create insecure implementations." Our standards shouldn't just be "sufficient"; they should be "robust."
This is like saying "just use C and don't write any memory bugs." Possible, but life could be a lot better if it weren't so easy to write them.
Great, you’ve just convinced every C programmer to use a hand rolled AES implementation on their next embedded device. Only slightly joking.
15 replies →
Yeah except there are certified versions of AES written in C. Which makes your point what exactly?
> See for example the many problems of NIST P-224/P-256/P-384 ECC curves
What are those problems exactly? The whitepaper from djb only makes vague claims about the NSA being a malicious actor, but after ~20 years no backdoors or intentional weaknesses have been reliably demonstrated?
As I understand it, a big issue is that they are really hard to implement correctly. This means that backdoors and weaknesses might not exist in the theoretical algorithm, but still be common in real-world implementations.
On the other hand, Curve25519 is designed from the ground up to be hard to implement incorrectly: there are very few footguns, gotchas, and edge cases. This means that real-world implementations are likely to be correct implementations of the theoretical algorithm.
This means that, even if P-224/P-256/P-384 are on paper exactly as secure as Curve25519, they could still end up being significantly weaker in practice.
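One concrete example of the "few footguns" design, as a short sketch (the bit-twiddling is the clamping rule from RFC 7748, but treat the code itself as illustrative):

    def clamp_x25519_scalar(k: bytes) -> bytes:
        # X25519 private keys are just 32 random bytes. Clamping clears the low
        # 3 bits (so the cofactor-8 subgroup can't contribute to the shared
        # secret) and pins the top bits (so the Montgomery ladder always runs a
        # fixed number of iterations). Not needing to validate the peer's point
        # is a separate property, coming from the x-only ladder plus twist
        # security, but the net effect is fewer checks an implementer can forget.
        b = bytearray(k)
        b[0] &= 248
        b[31] &= 127
        b[31] |= 64
        return bytes(b)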
I tried to defend a similar argument in a private forum today and basically got my ass handed to me. In practice, not only would modern P-curve implementations not be "significantly weaker" than Curve25519 (we've had good complete addition formulas for them for a long time, along with widespread hardware support), but Curve25519 causes as many (probably more) problems than it solves --- cofactor problems being more common in modern practice than point validation mistakes.
In TLS, Curve25519 vs. the P-curves are a total non-issue, because TLS isn't generally deployed anymore in ways that even admit point validation vulnerabilities (even if implementations still had them). That bit, I already knew, but I'd assumed ad-hoc non-TLS implementations, by random people who don't know what point validation is, might tip the scales. Turns out guess not.
Again, by way of bona fides: I woke up this morning in your camp, regarding Curve25519. But that won't be the camp I go to bed in.
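For readers who haven't run into the term: "point validation" is the check that a received ECDH public key actually lies on the intended curve; skipping it is what enables invalid-curve attacks. A minimal sketch for P-256 (parameters from SEC 2; the function name is made up):

    # NIST P-256 field prime and curve coefficients (y^2 = x^3 - 3x + b mod p).
    P256_P = 2**256 - 2**224 + 2**192 + 2**96 - 1
    P256_A = P256_P - 3
    P256_B = 0x5AC635D8AA3A93E7B3EBBD55769886BC651D06B0CC53B0F63BCE3C3E27D2604B

    def p256_point_is_valid(x: int, y: int) -> bool:
        # The check invalid-curve attacks rely on implementations forgetting:
        # the peer's (x, y) must be in range and satisfy the curve equation.
        if not (0 <= x < P256_P and 0 <= y < P256_P):
            return False
        return (y * y - (x * x * x + P256_A * x + P256_B)) % P256_P == 0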
1 reply →
> As I understand it, a big issue is that they are really hard to implement correctly.
Any reference for the "really hard" part? That is a very interesting subject and I can't imagine it's independent of the environment and development stack being used.
I'd welcome any standard that's "really hard to implement correctly" as a testbed for improving our compilers and other tools.
1 reply →
It would be wise for people to do basic sanity checks before making claims like "no backdoors from the NSA." Strong encryption has historically been restricted, which is how we ended up with things like DES, 3DES, and Crypto AG. In the modern internet age, Juniper had a bad time with this one: https://www.wired.com/2013/09/nsa-backdoor/.
Usually it’s really hard to distinguish intent, and so it’s possible to develop plausible deniability with committees. Their track record isn’t perfect.
With WPA3, cryptographers warned about the known pitfall of standardizing a timing-sensitive PAKE, and Harkins got it through anyway. Since it was a standard, the WiFi committee gladly selected it, which then resulted in Dragonbleed among other bugs. Hash-to-curve techniques have since patched that.
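For context, the timing issue is structural: the original "hunting and pecking" password-to-point conversion loops until a hash of the password lands on the curve, so the iteration count (and therefore the runtime) depends on the password. A heavily simplified Python sketch of that shape, not the real SAE algorithm, just the pattern those timing attacks exploit; RFC 9380-style hash-to-curve removes the data-dependent loop:

    import hashlib

    # Toy P-256-style parameters; illustration only.
    P = 2**256 - 2**224 + 2**192 + 2**96 - 1
    B = 0x5AC635D8AA3A93E7B3EBBD55769886BC651D06B0CC53B0F63BCE3C3E27D2604B

    def hunt_and_peck_x(password: bytes) -> int:
        counter = 0
        while True:  # iteration count depends on the password: a timing leak
            h = hashlib.sha256(password + counter.to_bytes(4, "big")).digest()
            x = int.from_bytes(h, "big")
            rhs = (x * x * x - 3 * x + B) % P
            # Euler's criterion: rhs is a quadratic residue iff a y exists.
            if x < P and pow(rhs, (P - 1) // 2, P) == 1:
                return x
            counter += 1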
It's "Dragonblood", not "Dragonbleed". I don't like Harkin's PAKE either, but I'm not sure what fundamental attribute of it enables the downgrade attack you're talking about.
When you're talking about the P-curves, I'm curious how you get your "sanity check" argument past things like the Koblitz/Menezes "Riddle Wrapped In An Enigma" paper. What part of their arguments did you not find persuasive?
6 replies →
The NSA changed the S-boxes in DES, which made people suspicious that it had planted a back door. But when differential cryptanalysis was later discovered publicly, people realized that the NSA's changes to the S-boxes actually made them more resistant to it.
20 replies →
They're vulnerable to "High-S" malleable signatures, while ed25519 isn't. No one is claiming they're backdoored (well, some people somewhere probably are), but they do have failure modes that ed25519 doesn't, which is the GP's point.
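For anyone wondering what "High-S" means here, a short sketch (using P-256's group order; the function names are illustrative): ECDSA accepts both (r, s) and (r, n - s) as valid signatures over the same message, so a third party can flip a signature into its high-S twin without knowing the key, and enforcing a canonical low-S form closes that off. Ed25519 as specified in RFC 8032 rejects non-canonical S values outright.

    # Group order n of NIST P-256 (other curves have different n).
    N = 0xFFFFFFFF00000000FFFFFFFFFFFFFFFFBCE6FAADA7179E84F3B9CAC2FC632551

    def malleate(r: int, s: int) -> tuple[int, int]:
        # The "other" valid signature anyone can derive without the private key.
        return (r, N - s)

    def normalize_low_s(r: int, s: int) -> tuple[int, int]:
        # Canonicalization rule many protocols enforce: keep s in the low half.
        return (r, s) if s <= N // 2 else (r, N - s)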
In the NIST curve arena, I think DJB's main concern is implementation engineering - from an online slide deck he published:
As to whether or not the NSA is a strategic adversary to some people using ECC curves, I think that's right in the mandate of the org, no? If a current standard is super hard to implement, and theoretically strong at the same time, that has to make someone happy on a red team. At least, it would make me happy, if I were on such a red team.
He does a motte-and-bailey thing with the P-curves. I don't know if it's intentional or not.
Curve25519 was a materially important engineering advance over the state of the art in P-curve implementations when it was introduced. There was a window of time within which Curve25519 foreclosed on Internet-exploitable vulnerabilities (and probably a somewhat longer period of time where it foreclosed on some embedded vulnerabilities). That window of time has pretty much closed now, but it was real at the time.
But he also does a handwavy thing about how the P-curves could have been backdoored. No practicing cryptography engineer I'm aware of takes these arguments seriously, and to buy them you have to take Bernstein's side over people like Neal Koblitz.
The P-curve backdoor argument is unserious, but the P-curve implementation stuff has enough of a solid kernel to it that he can keep both arguments alive.
16 replies →
Well, DJB also focused on "nothing up my sleeve" design methodology for curves. The implication was that any curves that were not designed in such a way might have something nefarious going on.
Dual_EC's backdoor can't be proven, but it's almost certainly real.
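The structural reason it's "almost certainly real" can be shown with a toy analogue. The sketch below uses exponentiation mod a prime instead of elliptic-curve points and skips the output truncation, so it is not Dual_EC_DRBG itself, but it shows why whoever chose the relationship between the two public constants can predict all future output from one observed value:

    # Toy analogue only: integers mod a Mersenne prime instead of P-256 points,
    # and no output truncation.
    p = 2**61 - 1
    Q = 5
    e = 1234567891            # "backdoor" relation, known only to the designer
    P = pow(Q, e, p)          # published constant; its relation to Q is never revealed

    def dual_ec_like_step(state: int):
        output = pow(Q, state, p)       # handed to the application
        next_state = pow(P, state, p)   # kept secret inside the generator
        return output, next_state

    out, real_next = dual_ec_like_step(123456789)
    predicted = pow(out, e, p)          # (Q^s)^e = (Q^e)^s = P^s = next state
    assert predicted == real_next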