Comment by johncolanduoni

4 days ago

I mean: have a bunch of competent teams that (importantly) didn’t design the algorithm read the final draft and write their own implementations of it. Then they and others can perform practical analysis on each (empirically look for timing side channels on x86 and ARM, fuzz them, etc.).

> If instead you mean "figure out after some period of implementation whether the standard itself is good", I don't know how that's meant to be workable.

The forcing function can potentially be: this final draft is the heir apparent. If nothing serious comes up in the next 6 months, it will be summarily finalized.

It’s possible this won’t get any of the implementers off their ass on a reasonable timeframe - this happens with web standards all the time. It’s also possible that this is very unlikely to uncover anything not already uncovered. Like I said, I’m not totally convinced that it makes sense in this specific field. But your arguments against it are fully general: they’d rule out this kind of phased process entirely, and I think it has empirically improved recent W3C and IETF standards (including QUIC, HTTP/2, and HTTP/3) a lot compared to the previous method.

Again: that has now happened. What have we learned from it that we needed to know 3 years ago when NIST chose Kyber? That's an important question, because this is a whole giant thread about Bernstein's allegation that the IETF is in the pocket of the NSA (see "part 4" of this series for that charming claim).

Further, the people involved in the NIST PQ key establishment competition are a murderers’ row of serious cryptographers and cryptography engineers. All of them had the know-how and incentive to write implementations of their constructions and, if it was going to showcase some glaring problem, of their competitors’. What makes you think that we lacked implementation understanding during this process?

  • I don’t think IETF is in the pocket of the NSA. I really wish the US government hadn’t hassled Bernstein so much when he was a grad student; it would make his stuff way more focused on technical details and readable without rolling your eyes.

    > Further, the people involved in the NIST PQ key establishment competition are a murderers’ row of serious cryptographers and cryptography engineers.

    That’s actually my point! When you’re trying to figure out whether your standard is difficult to implement correctly, the fact that everyone who worked on the reference implementations is a genius who understands it perfectly is a disadvantage for finding certain problems. It’s classic expert blindness, like you see with C++, where the people working on the standard understand the language so completely they can’t even conceive of what will happen when it’s in the hands of someone who doesn’t sleep with the C++ standard under their pillow.

    Like, would anyone who developed ECC algorithms have forgotten to check for invalid curve points when writing an implementation? Meanwhile among mere mortals that’s happened over and over again.
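    To make the invalid-curve pitfall concrete, here is a minimal sketch (my own illustration, not from any implementation discussed above) of the check that has historically been omitted: before doing scalar multiplication with a peer-supplied point, verify it actually lies on the intended curve. The parameters below are NIST P-256, chosen purely as a familiar example.

    ```python
    # NIST P-256 parameters (illustrative choice of curve).
    P = 2**256 - 2**224 + 2**192 + 2**96 - 1  # field prime
    A = P - 3                                  # curve coefficient a
    B = 0x5AC635D8AA3A93E7B3EBBD55769886BC651D06B0CC53B0F63BCE3C3E27D2604B

    def on_curve(x: int, y: int) -> bool:
        """Return True iff (x, y) satisfies y^2 = x^3 + ax + b (mod p)."""
        # Reject coordinates outside the field outright.
        if not (0 <= x < P and 0 <= y < P):
            return False
        return (y * y - (x * x * x + A * x + B)) % P == 0

    # The standard P-256 generator passes the check; an arbitrary
    # attacker-chosen point on some weaker curve would not.
    GX = 0x6B17D1F2E12C4247F8BCE6E563A440F277037D812DEB33A0F4A13945D898C296
    GY = 0x4FE342E2FE1A7F9B8EE7EB4A7C0F9E162BCE33576B315ECECBB6406837BF51F5
    ```

    The point is how small the missing check is: a point on a *different* curve sails through the scalar-multiplication formulas just fine, which is exactly why implementers who know the math cold kept forgetting that someone might send one.
    
    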

    • I don't think this has much of anything to do with Bernstein's qualms with the US government. For all his concerns about NIST process, he himself had his name on a NIST PQC candidate. Moreover, he's gotten into similar spats elsewhere. This isn't even the first time he's gotten into a heap of shit at IETF/IRTF. This springs to mind:

      https://mailarchive.ietf.org/arch/msg/cfrg/qqrtZnjV1oTBHtvZ1...

      This wasn't about NSA or the USG! Note the date. Of course, had this happened in 2025, we'd all know about it, because he'd have blogged it.

      But I want to circle back to the point I just made: you've said that we'd all be better off if there was a burning-in period for implementors before standards were ratified. We've definitely burnt in ML-KEM now! What would we have done differently knowing what we now know?