Comment by xt00
3 years ago
Speaking of dual-EC -- two questions seem to come up for debate often, though it can't be neglected that some of the vocal debaters may be NSA shills:
1. does the use of standards actually help people, or make it easier for the NSA to determine which encryption method was used?
2. are there encryption methods that actually do not suffer from reductions in randomness or entropy when simply running the algorithm on the encrypted output multiple times?
It seems these questions often have piles of people ready to jump in saying "oh, don't roll your own encryption, ooh scary... fear, uncertainty, doubt... and whatever you do, don't encrypt something 3X, that will probably make it easier to decrypt!" But it would be great if some neutral 3rd party could basically say: ok, here is an algorithm that is ridiculously hard to break, and you can crank the number of bits up to a super crazy number... and you can also run the encryption N times, where just not knowing the number of times it was encrypted would dramatically increase the complexity of decryption. But yeah, how many minutes before somebody jumps in saying "yeah, don't do that, make sure you encrypt with a well-known algorithm exactly once... trust me"?
1. Formal, centralized crypto standards, be they NIST or IETF, are a force for evil.
2. All else equal, fewer dependencies on randomness are better. But all else is not equal, and you can easily lose security by adding determinism to designs willy-nilly in an effort to minimize randomness dependencies.
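To make that failure mode concrete, here's a minimal sketch (Python, assuming the `cryptography` package; the key, nonce, and messages are purely illustrative). If "adding determinism" ends up meaning a stream-cipher nonce gets reused, the keystream cancels out and the XOR of the plaintexts leaks:

    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms

    key = os.urandom(32)
    nonce = os.urandom(16)  # deterministically reused below -- that's the bug

    def encrypt(plaintext: bytes) -> bytes:
        # The ChaCha20 keystream is fixed by (key, nonce), so reuse is fatal.
        enc = Cipher(algorithms.ChaCha20(key, nonce), mode=None).encryptor()
        return enc.update(plaintext)

    ct1 = encrypt(b"attack at dawn!!")
    ct2 = encrypt(b"retreat at nine!")

    # XOR of the ciphertexts equals XOR of the plaintexts: keystream cancels.
    leak = bytes(a ^ b for a, b in zip(ct1, ct2))
    assert leak == bytes(a ^ b for a, b in zip(b"attack at dawn!!", b"retreat at nine!"))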
Nothing is, any time in the conceivable future, going to change to make a broken RNG not game-over. So the important thing remains ensuring that there's a sound design for your RNG.
None of our problems have anything to do with how "much" you encrypt something, or with "cranking up the number of bits". That should be good news for you; generally, you can run ChaPoly or AES-CTR and trust that a direct attack on the cipher isn't going to be an issue for you. Most of our problems are in the joinery, not the beams themselves.
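For instance, the cipher call itself is a couple of lines -- a hedged sketch with the `cryptography` package, where the names and values are illustrative -- and everything that goes wrong in practice is around it: nonce discipline, key storage, what gets authenticated:

    import os
    from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305

    key = ChaCha20Poly1305.generate_key()
    aead = ChaCha20Poly1305(key)

    nonce = os.urandom(12)  # the joinery: this must never repeat under one key
    aad = b"header-v1"      # authenticated but not encrypted

    ct = aead.encrypt(nonce, b"hello", aad)
    assert aead.decrypt(nonce, ct, aad) == b"hello"
    # Tampering with ct or aad makes decrypt() raise InvalidTag.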
The problem with formal centralized standards is that they tend to become ceilings rather than floors for quality, and it's hard to write them otherwise. They do however serve a function in keeping total snake oil crypto out of government and industry. Having some rubber stamp from people who at least know something keeps people with no knowledge of cryptography from buying the latest absolutely uncrackable post-quantum military grade AES-4096 cryptography product.
I'm also not sold on the idea that informal popularity contests or academic processes (which are often themselves opaque) are always superior to formalized cryptography standards. It's absolutely possible for modern intelligence agencies to infiltrate, steer, and subvert decentralized communities and private sector institutions. We see it all the time.
IMHO Internet culture is unbelievably naive about this. Everyone of course believes that they are hip and smart enough to spot astroturf and could never be conned. Everyone thinks only other people who are obviously less savvy and smart than them could be conned. "Wake up sheeple!" is never spoken to the mirror.
For all we know, the NIST curves and AES are stronger than the other stuff and there's an astroturf effort to get non-government entities not to use them! Get the hipsters using vulnerable stuff while NIST/NSA keep recommending the good stuff for classified government use. How do we know DJB doesn't work for the NSA? (I do not believe any of this!)
That way lies madness. So I stick with the rule of "solid evidence or go home" when it comes to allegations, and with the general consensus of people who seem to know more than I do when it comes to algorithms and constructions.
> We see it all the time.
Really? I'd like to hear about that.
And is astroturfing the most likely attack vector? That might work on big social media, where it's easy to feel like you've got a finger on the pulse of public opinion by scrolling down a long list of anonymous content, but it presumably wouldn't work in crypto (or crypto-adjacent) communities, which are much smaller and where individual reputations are quite important.
> 2. are there encryption methods that actually do not suffer from reductions in randomness or entropy when simply running the algorithm on the encrypted output multiple times?
I think all block ciphers (e.g. AES) meet that definition. For AES, for a specific key, there's a 1-to-1 mapping of plaintexts to ciphertexts. It's impossible that running a plaintext through AES produces a ciphertext with less entropy, because if the ciphertext had less entropy, it would be impossible to decrypt to get back the plaintext, but AES always allows decryption.
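A quick way to convince yourself, as a sketch using Python's `cryptography` package (raw single-block ECB here is only a demo of the permutation property, not a way to encrypt real messages): encrypt a block a thousand times, decrypt a thousand times, and you get the block back exactly, which is only possible for a bijection:

    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key = os.urandom(16)
    block = os.urandom(16)

    def aes_block(data: bytes, encrypt: bool = True) -> bytes:
        # Raw single-block AES; for a fixed key this is a permutation
        # of the 2^128 possible 16-byte blocks.
        c = Cipher(algorithms.AES(key), modes.ECB())
        op = c.encryptor() if encrypt else c.decryptor()
        return op.update(data) + op.finalize()

    x = block
    for _ in range(1000):
        x = aes_block(x, encrypt=True)   # encrypt 1000 times
    for _ in range(1000):
        x = aes_block(x, encrypt=False)  # then decrypt 1000 times
    assert x == block  # perfect round-trip: no entropy was lost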
> some neutral 3rd party
Unfortunately, this would appear to be the bit we've not yet solved, nor are we likely to.
> are there encryption methods that actually do not suffer from reductions in randomness or entropy when simply running the algorithm on the encrypted output multiple times?
Unless you can prove that all (e.g.) 2^256 possible 256-bit inputs map to 2^256 different 256-bit outputs (for every key, in the case of encryption), chances are you lose strength with every application, because multiple inputs map to the same output (and consequently some outputs are not reachable).
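That shrinkage is easy to see at toy scale for a non-bijective function (a sketch using SHA-256 truncated to 16 bits as a stand-in random function; the domain size is made up for the demo):

    import hashlib

    def f(x: int) -> int:
        # SHA-256 truncated to 2 bytes: a random-looking, non-bijective map
        # on a 65536-element domain.
        h = hashlib.sha256(x.to_bytes(2, "big")).digest()
        return int.from_bytes(h[:2], "big")

    image = set(range(65536))
    for i in range(5):
        image = {f(x) for x in image}
        print(f"after {i + 1} application(s): {len(image)} reachable values")

    # The reachable set drops to roughly 63% of the domain after one pass
    # and keeps shrinking -- exactly what cannot happen with a bijection.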
For encryption, as opposed to hashing, you can’t have multiple inputs map to the same output, because then you wouldn’t be able to decrypt the output.
It's very easy to prove that all encryption functions are 1-to-1. Otherwise, you couldn't decrypt the data.