Comment by glitchc

This logic does not follow. Your argument seems to be "the implementation has security bugs, so let's not ratify the standard." That's not how standards work, though. Ensuring an implementation is secure is part of the certification process. As long as the scheme itself is shown to be provably secure, that is sufficient to ratify a standard.

If anything, standardization encourages more investment, which means more eyeballs to identify and plug those holes.

No, the argument is that the algorithm (as specified in the standard) is difficult to implement correctly, so we should tweak it/find another one. This is a property of the algorithm being specified, not just an individual implementation, and we’ve seen it play out over and over again in cryptography.

I’d actually like to see more (non-cryptographic) standards take this into account. Many web standards are so complicated and/or ill-specified that trillion dollar market cap companies have trouble implementing them correctly/consistently. Standards shouldn’t just be thrown over the wall and have any problems blamed on the implementations.
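
To make that concrete: the "KyberSlash" timing leaks reported against a number of Kyber implementations came down to decoding a message bit with an integer division whose inputs depend on secret data, and hardware divide instructions take operand-dependent time on many CPUs. Here is a rough sketch of the pitfall and the usual style of fix; the function names and the reciprocal constant are mine for illustration, not lifted from any particular patched library:

    #include <stdint.h>

    #define KYBER_Q 3329

    /* The pitfall: decoding a message bit with an integer division whose
       numerator is derived from a secret coefficient. Compilers emit a
       hardware divide here, and its latency varies with the operands. */
    static uint16_t decode_bit_vartime(uint16_t u) {
        return (uint16_t)(((((uint32_t)u << 1) + KYBER_Q / 2) / KYBER_Q) & 1);
    }

    /* The usual style of fix: replace the division with a multiply-and-shift
       by a precomputed reciprocal. 645084 = ceil(2^31 / 3329), and
       floor(t / 3329) == (t * 645084) >> 31 holds for every t reachable
       here (0 <= u < 3329 implies t <= 8320). */
    static uint16_t decode_bit_consttime(uint16_t u) {
        uint64_t t = ((uint32_t)u << 1) + KYBER_Q / 2;
        return (uint16_t)(((t * 645084) >> 31) & 1);
    }

Both functions compute the same value; the difference only shows up on a stopwatch, which is exactly the kind of thing that's easy to miss when transcribing the spec's math directly.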

  • > No, the argument is that the algorithm (as specified in the standard) is difficult to implement correctly, so we should tweak it/find another one.

    This argument is without merit. ML-KEM/Kyber has already been ratified as the PQC KEM standard by NIST. What you are proposing is that the NIST process was fundamentally flawed, which is a claim that requires serious evidence to back it up.

    • You can't be serious. "The standard was adopted, therefore it must be implementable in any and all systems?"

      NIST can adopt and recommend whatever algorithms they like, using whatever criteria they choose. However, impressive as NIST's expertise and experience in identifying secure or potentially useful algorithms may be, no amount of expertise or experience guarantees that any given implementation is feasible.

      Indeed, this is precisely why elliptic curve algorithms are often not available, in spite of a NIST standard being adopted like 8+ years ago!

    • DJB has specific (technical and non-conspiratorial) bones to pick with the algorithm. He’s as much an expert in cryptographic implementation flaws and misuse resistance as anybody at NIST. Doesn’t mean he’s right all the time, but blowing him off as if he’s just some crackpot isn’t even correctly appealing to authority.

      I hate that his more tinfoil hat stuff (which is not totally unjustified, mind you) overshadows his sober technical contributions in these discussions.

It's more like "the standard makes it easier to create insecure implementations." Our standards shouldn't just be "sufficient"; they should be "robust."

This is like saying "just use C and don't write any memory bugs": possible, but life could be a lot better if it weren't so easy to write them.
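
To stretch the analogy, a minimal sketch of how little it takes (the function and its names are hypothetical):

    #include <string.h>

    /* Compiles without complaint and passes any test with short inputs,
       but silently corrupts the stack the first time `name` needs more
       than 15 bytes plus the terminator. */
    void greet(const char *name) {
        char buf[16];
        strcpy(buf, name);   /* no bounds check */
    }

Nothing in the language pushes back here, just as nothing in an under-specified standard pushes back on a naive transcription of it.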

  • Great, you’ve just convinced every C programmer to use a hand-rolled AES implementation on their next embedded device. Only slightly joking.

  • Yeah, except there are certified versions of AES written in C. Which makes your point what, exactly?