
Comment by tptacek

3 years ago

No. I don't think we should rely on formal standards, like FIPS, NIST, and the IETF. Like Bernstein himself, I do think we should rely on peer-reviewed expert cryptography. I use Chapoly, not a stream cipher I concocted myself, or some bizarro cipher cascade posted to HN. This is what I'm talking about when I mentioned the Noise Protocol Framework.
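
Concretely, "use Chapoly" means calling a vetted AEAD implementation through a mainstream library rather than wiring primitives together yourself. A minimal sketch, assuming Python's third-party cryptography package (an illustration, not anything prescribed in this thread):

    # Minimal sketch: ChaCha20-Poly1305 AEAD via the "cryptography" package
    # (pip install cryptography). Illustrative only.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305

    key = ChaCha20Poly1305.generate_key()   # 256-bit key generated by the library
    aead = ChaCha20Poly1305(key)
    nonce = os.urandom(12)                  # 96-bit nonce; never reuse under the same key
    ct = aead.encrypt(nonce, b"attack at dawn", b"header")   # ciphertext + auth tag
    assert aead.decrypt(nonce, ct, b"header") == b"attack at dawn"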

If IETF standards happen to end up with good cryptography because they too adopt things like Noise or Ed25519, that's great. I don't distrust the IETF's ability to standardize something like HTTP/3. I do deeply distrust the process they use to arrive at cryptographic architectures. It's gotten markedly better, but there's every reason to believe it'll backslide a generation from now.

(There are very excellent people who contribute to things like CFRG and I wouldn't want to be read as disparaging any of them. It's the process I have an issue with, not anything happening there currently.)

Standards are for people who are not experts in the field, or who don't have the time and energy to research the existing crypto and sift through it to decide what to trust and what not to trust.

Lack of standardization might just make it harder for Joe to filter through Google search results and figure out which algorithm to use. He may just pick the first result, which could be an ad bought by the highest bidder on some keywords and may or may not be any good.

> No. I don't think we should rely on formal standards, like FIPS, NIST, and the IETF.

I assume your concerns are with the process of standardization, and not with the idea of standards themselves. After all, there is plenty of expert peer review going on at NIST and in the IRTF.

Noise is useful for building your own bespoke kit, but there does need to be agreement to use it in the same manner if you hope for interoperability. Things like public-key crypto are useful precisely because the other side can read the information back out at the end of the process, even if they aren't running, e.g., the same email client version.
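
To make that concrete: even with well-reviewed primitives, both sides still have to agree on every parameter (curve, KDF, hash, context string), or they won't derive the same keys and nothing interoperates. A rough sketch, assuming Python's third-party cryptography package (the parameter choices here are illustrative, not something from this thread):

    # Rough sketch: X25519 key agreement + HKDF via the "cryptography" package.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    def derive_key(shared_secret, context):
        # Both peers must use the same hash, output length, salt, and context.
        return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                    info=context).derive(shared_secret)

    alice = X25519PrivateKey.generate()
    bob = X25519PrivateKey.generate()

    # Each side combines its own private key with the peer's public key.
    alice_secret = alice.exchange(bob.public_key())
    bob_secret = bob.exchange(alice.public_key())

    # Same primitives + same agreed context -> the same session key on both ends.
    assert derive_key(alice_secret, b"demo-app v1") == derive_key(bob_secret, b"demo-app v1")
    # Disagree on the context ("use it in the same manner") and the keys differ.
    assert derive_key(alice_secret, b"demo-app v1") != derive_key(bob_secret, b"demo-app v2")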

  • NIST is procedurally the least objectionable of all of these standards bodies. Contests are better than collaborations. But NIST itself is a force for evil, not for the lurid message board reason of a shadowy cabal of lizard people trying to weaken PQC, but because "NIST standardization" keeps a lot of 1990s-era crypto in use and prevents a lot of modern crypto from being deployed in the industry.

I guess this is my point: If you have strong mathematicians and cryptographers, you don't end up using NIST.

There are lots of companies that need cryptography but don't know whom to trust. What should they do in a world where the standards bodies are adversarial?

Maybe this is just the future: if you don't know crypto, you're doomed either to do the research or to accept that you're probably backdoored? Seems like a rough place to be...

  • So use whatever crypto Signal uses, or that WireGuard uses. You're not working in a vacuum. You don't even trust NIST to begin with, and yet we still encrypt things, so I'm a little confuddled by the argument that NIST's role as a trusted arbiter of cryptography is vital to our industry. NIST is mostly a force for evil!

    • Signal’s crypto doesn’t solve all problems (neither does WireGuard’s).

      For example, we built private information recovery using the first production-grade open-source implementation of oblivious RAM (https://mobilecoin.com/overview/explain-like-i'm-five/fog; you’ll want to skip to the software engineer section) so that organizations could obliviously store and recover customer transactions without being able to observe them. The Signal protocol’s techniques might be part of a cryptographic solution, but they are not a silver bullet. (A toy sketch of the oblivious-access idea follows at the end of this comment.)

      I guess, notably, we never looked at NIST when designing it, so maybe that’s the end of the discussion there.
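
      Not MobileCoin’s actual implementation, just a toy sketch of the idea being gestured at: a lookup that touches every record in exactly the same way regardless of which one is wanted, so the observable access pattern leaks nothing about the query. Real schemes (e.g. Path ORAM) avoid this linear scan; every name below is made up for illustration.

        # Toy linear-scan "oblivious read": every query touches every slot,
        # so an observer of the access pattern learns nothing about the target.
        # Illustration only; production ORAM uses tree-based constructions.
        def oblivious_read(store, target_index):
            result = b"\x00" * len(store[0])        # all records share one fixed length
            for i, block in enumerate(store):       # touch EVERY block, every time
                mask = 0xFF if i == target_index else 0x00
                result = bytes(r | (b & mask) for r, b in zip(result, block))
            return result

        store = [b"rec-0000", b"rec-0001", b"rec-0002", b"rec-0003"]
        assert oblivious_read(store, 2) == b"rec-0002"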
