Comment by tptacek
2 years ago
An important detail you really want to understand before reading this is that NIST (and NSA) didn't come up with these algorithms; they refereed a competition, in which most of the analysis was done by competitors and other academics. The Kyber team was Roberto Avanzi, Joppe Bos, Léo Ducas, Eike Kiltz, Tancrède Lepoint, Vadim Lyubashevsky, John M. Schanck, Gregor Seiler, Damien Stehlé, and also Peter Schwabe, a collaborator of Bernstein's.
Absolutely, but NIST ultimately chose the winners, giving them the option to pick (non-obviously) weak or weaker algorithms. Historically, only the winners are adopted. Look at the AES competition: how often do you see Serpent mentioned, despite it having a larger security margin than Rijndael by most accounts?
> Historically only the winners are adopted. Look at the AES competition
Often, yes. But also consider the SHA-3 competition.
BLAKE2 seems more widely used than what was chosen for SHA-3 (Keccak). What was submitted to the SHA-3 competition was BLAKE1 (it didn't have a number back then, but I think this is clearer), so it's not as if NIST said Keccak is better than BLAKE2; they only said it's better than BLAKE1 (per their requirements, which are unlikely to align with your requirements because of the heavy weighting of speed-in-hardware). Still, this is an example of a widely used algorithm that was never standardized.
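As a footnote on adoption: both ended up in common libraries anyway. A quick sketch using Python's hashlib (these are the real stdlib names) shows them side by side, including BLAKE2's built-in keyed mode, which is one practical reason people reach for it:

```python
import hashlib

msg = b"hello, competition"

# SHA-3 (the standardized Keccak) and BLAKE2, side by side;
# both ship in Python's standard library.
sha3 = hashlib.sha3_256(msg).hexdigest()
b2 = hashlib.blake2b(msg, digest_size=32).hexdigest()

print(len(sha3), len(b2))  # both 256-bit digests: 64 hex chars each

# BLAKE2 also has a built-in keyed mode (a MAC without the HMAC wrapper):
mac = hashlib.blake2b(msg, key=b"secret", digest_size=32).hexdigest()
```
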
> how often do you see Serpent being mentioned, despite it having a larger security margin than Rijndael
The goal of an encryption algorithm is not only to be secure. Sure, that has to be a given: nobody is going to use a broken algorithm when given a choice. But when you have two secure options, the more efficient one is the one to choose. You could use a 32k RSA key just to be sure, or a 4k RSA key, which (to the best of my knowledge) everyone considers safe until practical quantum computers arrive. (Post-quantum, you would need something like a 1 TB key, as djb humorously proposed.)
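To put a rough number on that efficiency gap: assuming schoolbook modular exponentiation, whose cost grows roughly cubically in the modulus size (quadratic-cost multiplication times an exponent as long as the modulus), this back-of-the-envelope sketch shows why nobody uses 32k keys:

```python
# Rough cost comparison of RSA private-key operations, assuming
# modular exponentiation costs on the order of bits^3 (schoolbook
# multiplication times a full-length exponent). This is a
# back-of-the-envelope sketch, not a benchmark.
def relative_modexp_cost(bits, baseline=4096):
    return (bits / baseline) ** 3

print(relative_modexp_cost(32768))  # 512.0: ~512x slower than a 4k key
```
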
Wikipedia article on Serpent: "The 32 rounds mean that Serpent has a higher security margin than Rijndael; however, Rijndael with 10 rounds is faster and easier to implement for small blocks."
I don't know that nobody talks about Serpent solely because it was not chosen as winner. It may just be that Rijndael with 256-bit keys is universally considered secure and is more efficient at doing its job.
Re: BLAKE2, I'm not sure it's fair to say that BLAKE2 is more widely used overall. But I do agree BLAKE2 is a bit of an outlier in terms of adoption. I think part of the reason is that SHA2 remains the go-to option, else I'd expect the ecosystem to consolidate around SHA3.
Re: Serpent, there are many things to unpack here but, in summary:
- you don't know a priori how large a security margin you need (given the primary function of a cipher, you want to pick the conservative option);
- efficiency concerns become much less relevant with hardware-accelerated implementations and years of Moore's-law performance uplifts;
- low-power devices can take advantage of much lighter algorithms than Rijndael OR Serpent;
- ease of implementation does not equal ease of correct/secure implementation vis-a-vis side-channel attacks;
- and if Serpent had been chosen, you certainly wouldn't see Rijndael talked about much.
6 replies →
> BLAKE2 seems more widely used than what was chosen for SHA-3 (Keccak)
There's also BLAKE3 and it is amazing. How is its adoption going?
https://github.com/BLAKE3-team/BLAKE3
1 reply →
SHA-3 is vastly more widely used than BLAKE.
I fully admit to having a weak spot for Serpent - it is self-bitslicing (see the submission package or the linux kernel tree), which in hindsight makes constant time software easier to write, and it was faster in hardware even when measured at the time, which is where we have ended up putting AES anyway (e.g. AES-NI etc).
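For readers unfamiliar with bitslicing, here is a toy sketch of the idea (illustrative only, not Serpent; the 3-input function is made up): pack bit i of many independent blocks into one machine word, so a single boolean operation processes all blocks at once, with no secret-dependent branches or table lookups, which is why it lends itself to constant-time software:

```python
# Toy bitslicing sketch (illustrative only, not Serpent).
# Evaluate f(a, b, c) = a XOR (b AND c) on 32 independent inputs
# at once by packing each variable across one 32-bit word.
import random

def pack(bits):
    """Pack a list of 0/1 values into one integer; bit i = block i."""
    w = 0
    for i, b in enumerate(bits):
        w |= (b & 1) << i
    return w

def f_scalar(a, b, c):
    return a ^ (b & c)

def f_bitsliced(A, B, C):
    # One XOR and one AND cover all 32 blocks; the data never selects
    # a branch or a table index, so execution time does not depend on
    # the (secret) inputs.
    return A ^ (B & C)

random.seed(0)
a = [random.getrandbits(1) for _ in range(32)]
b = [random.getrandbits(1) for _ in range(32)]
c = [random.getrandbits(1) for _ in range(32)]

sliced = f_bitsliced(pack(a), pack(b), pack(c))
scalar = pack([f_scalar(*t) for t in zip(a, b, c)])
assert sliced == scalar  # same result, 32 blocks per instruction
```
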
BUT. On security margins, you could argue the Serpent designers were too conservative: https://eprint.iacr.org/2019/1492 It is also true that cryptanalytic attacks appear to fare slightly better against AES than against Serpent. What does this mean? A brute-force attack takes the same number of operations as the claimed security level, say 2^128 for 128-bit. An attack is anything better than this: fewer operations. All of the attacks we know about achieve slightly less than this security level (which is nonetheless still utterly infeasible), and they come at a cost: an infeasible amount of memory. In terms of numbers: 9000 TB to reduce 2^128 to 2^126 against full-round AES, according to a quick check of Wikipedia. For reference, the lightweight crypto competition considered 2^112 a sufficient margin. 2^126 is still impossible.
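To make those figures concrete, the arithmetic looks like this (same numbers as above; the ops/second rate is an assumed round figure for an exascale machine):

```python
# Security-margin arithmetic for the figures above.
claimed = 2**128      # brute-force work for AES-128
best_attack = 2**126  # best known full-round attack (per Wikipedia)

speedup = claimed / best_attack
print(speedup)  # 4.0: the "break" is only a 4x improvement

# Even 2^126 operations is far beyond anything physically feasible.
# Assuming 10^18 ops/second (a round exascale figure):
years = best_attack / 1e18 / (3600 * 24 * 365)
print(f"{years:.1e}")  # about 2.7e12 years
```
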
In practice, the difference between Serpent and AES in terms of cryptanalytic security is meaningless. It is not an example of NIST picking a weaker algorithm deliberately, or I would argue, even unintentionally. It (AES) was faster when implemented in software for the 32-bit world that seemed to be the PC market at the time.
Implemented correctly, I agree the difference in security margin may not be too important. But Serpent's design is also more naturally resistant to timing attacks, and weaknesses in implementation are as important as weaknesses in design.
Regardless, the comparison wasn't intended to argue for a meaningful difference in security margin, but to show that the winner of the competition, well, wins (in adoption).
> BUT. On security margins, you could argue the Serpent designers were too conservative: https://eprint.iacr.org/2019/1492
Thanks for digging that paper out again. It is really telling that AES only gets a bit of a bump (10-30%) while the other ones gain like 2x or more.
I was about to comment that the competitors to AES were definitely too conservative, and it bit them because of how much slower it made them in software and larger in hardware.
Blowfish has a continuing existence as the basis for bcrypt.
It works as a password hash for reasons having in part to do with why it isn’t a great general purpose cipher.
3 replies →
Correct me if I'm wrong, but everything is also being done out in the open for everyone to see. NIST isn't using some secret analysis to make any recommendations.
Teams of cryptographers submit several proposals (and break each other's proposals). These people are well respected, largely independent, and assumed honest. Some of the mailing lists provided by NIST where cryptographers collaborated to review each other's work are public.
NIST may or may not consort with your friendly local neighborhood NSA people, who are bright and talented contributors in their own right. That's simply in addition to reading the same mailing lists.
At the end, NIST gets to pick a winner and explain their reasoning. What influenced the decision is surely a combination of things, some of which may be internal or private discussions.
> NIST may or may not consort with your friendly local neighborhood NSA people
It is worth noting that while breaking codes is a big part of the NSA's job, they also have a massive organization (NSA Cybersecurity, but I prefer the old name Information Assurance) that works to protect US and allied systems and cryptographic applications.
On balance, weakening American standards does little to help with foreign collection. Their efforts would be much better spent injecting into the GOST process (Russia and friends) or the State Cryptography Administration (China and friends).
2 replies →
I was under the impression that only fools trust NIST after DUAL_EC_whatsit.
Is that not the case?
8 replies →
My rule of thumb in these situations is always: if they could, they would.
I've seen enough blatant disregard for humanity to assume any kind of honesty in the powers that were.
I'm sure the NSA 9-to-5ers justify weakening standards processes by the fact that the result is still secure enough to be useful for citizens and some government orgs, but flawed enough to help themselves when it matters at some point in the future.
No one can say they pushed some useless or overtly backdoored encryption. That's rarely how Intel agencies work. It's also not how they need to work to maintain their effectiveness indefinitely.
When the CIA is trying to recruit for HUMINT, they'll get their claws into anything. If a business conference has even a 0.1% chance of meeting some pliable young person who is likely to become an industry insider, and who may or may not turn into a valuable source, they'll show up to that conference every single year. It's a matter of working every angle you can get.
They aren't short of people, time, or money. And in security tiny holes in a dam turn into torrents of water all the time.
NIST has held non-public backroom meetings with the NSA, concealed NSA employees' authorship of papers, generated a long series of coincidental situations favoring one system, and stonewalled FOIAs from reputable individuals. I don't know; if I were a counterintelligence officer in charge of detecting foreign IC work, I'd be super suspicious of anything sold as safe and open coming from that org.
Where is your evidence other than your gut feeling from other unrelated news articles?
> everything is also being done out in the open for everyone to see
Well, everything apart from the secret stuff:
"I filed a FOIA request "NSA, NIST, and post-quantum cryptography" in March 2022. NIST stonewalled, in violation of the law. Civil-rights firm Loevy & Loevy filed a lawsuit on my behalf.
That lawsuit has been gradually revealing secret NIST documents, shedding some light on what was actually going on behind the scenes, including much heavier NSA involvement than indicated by NIST's public narrative"
Thing is... no lawsuit will ever reveal documents directly against the USA's national interest.
Every single document will be reviewed by a court before being opened up, and any which say "We did this so we can hoodwink the public and snoop on russia and china" won't be included.
1 reply →
There is a final standardization step where NIST selects constants, and this is not always done in consultation with the research team. Presumably these are usually random, but the ones chosen for the Dual EC DRBG algorithm appear to have been compromised. SHA-3 also had some suspicious constants/padding, but those haven't been shown to be vulnerable.
The problem with Dual EC isn't the sketchy "constants", but rather the structure of the construction, which is a random number generator that works by doing a public key transformation on its state. Imagine CTR-DRBG, but standardized with a constant AES key. You don't so much wonder about the provenance of the key so much as wonder why the fuck there's a key there at all.
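That analogy can be sketched with a deliberately trivial toy (this is NOT real Dual EC or CTR-DRBG, and perm() is an invertible stand-in for a block cipher, not one): when a fixed key is baked into the standard, anyone who knows it can recover the internal state from a single output and predict the entire future stream.

```python
# Toy model of "a DRBG standardized with a constant key baked in".
# perm() stands in for a block cipher; it is just an invertible
# affine map mod 2^64 and has zero cryptographic value.
MASK = (1 << 64) - 1
STANDARD_KEY = 0x5DEECE66D  # imagine this constant shipped in the spec

def perm(x, key):
    return (x * 6364136223846793005 + key) & MASK  # odd multiplier: invertible

def perm_inv(y, key):
    inv = pow(6364136223846793005, -1, 1 << 64)
    return ((y - key) * inv) & MASK

class ToyDRBG:
    def __init__(self, seed):
        self.counter = seed  # secret internal state
    def next(self):
        out = perm(self.counter, STANDARD_KEY)
        self.counter = (self.counter + 1) & MASK
        return out

# A user seeds the generator with a secret:
drbg = ToyDRBG(seed=0xDEADBEEF)
observed = drbg.next()

# Whoever knows STANDARD_KEY recovers the state from one output
# and predicts every future value:
state = perm_inv(observed, STANDARD_KEY)
predicted = perm((state + 1) & MASK, STANDARD_KEY)
assert predicted == drbg.next()  # the stream is an open book
```

The real constructions are far more involved (Dual EC's trapdoor is the discrete log relating its two curve points), but the structural complaint is the same: the state transition passes through a keyed public transformation whose key the user never chose.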
I don't know of any cryptographer or cryptography engineer that takes the SHA3 innuendo seriously. Do you?
Additional backstory that might be helpful here: about 10 years ago, Bernstein invested a pretty significant amount of time on a research project designed to illustrate that "nothing up my sleeves" numbers, like constants formed from digits of pi, e, etc, could be used to backdoor standards. When we're talking about people's ability to cast doubt on standards, we should keep in mind that the paragon of that idea believes it to be true of pi.
I'm fine with that, for what it's worth. Cryptography standards are a force for evil. You can just reject the whole enterprise of standardizing cryptography of any sort, and instead work directly from reference designs from cryptographers. That's more or less how Chapoly came to be, though it's standardized now.
8 replies →
You don't really know, but you can be reasonably sure that they didn't sabotage the submissions themselves.