Comment by nmitchko
2 years ago
Unfortunately, the NSA & NIST are most likely recommending a quantum-proof scheme that they've already developed cryptanalysis against, either through high q-bit proprietary technology or specialized de-latticing algorithms.
The NSA is very good at math, so I'd be thoroughly surprised if any error in this analysis were a mistake rather than intentional.
The NSA also has a mission-based interest in _breaking_ other people's crypto though, which is generally known.
That's generally known, so I'm surprised by your argument. Even if the NSA knows more than they're telling us, that doesn't leave most of us feeling less worried, as their ends may not be strengthening the public's cryptography!
Yes: https://en.wikipedia.org/wiki/Dual_EC_DRBG
Also, to this day we still do not know where the seeds for P-256 and P-384 came from, and we're using those curves everywhere. There is a non-zero chance that the NSA effectively has a backdoor for all the NIST ECC curves, and no one actually seems to care.
The NIST P-256 curve's seed came from the ANSI X9.62 specification, drafted in 1997. It was provided by an NSA employee, Jerry Solinas, as one example seed among many others, including seeds provided by Certicom. Read this for more details: https://eprint.iacr.org/2015/1018
Or you find it somewhat credible but still use the curves, because fending off the NSA is not something you want to spend energy on, and you are confident that the NSA believes no one else can find the backdoor.
Isn't that what the person you're replying to said?
It's clear to me now that it is! Either I misread it, or they edited it to make it clearer!
I just find it sad that things like this make it impossible for a layman to figure out what is going on with, for example, Mochizuki's new work.
I have no reason to doubt that a lot of math has been made more difficult than necessary just because it is known to give a subtle military advantage in some cases, but this isn't new.
[flagged]
what do you have against dodo birds
"High q-bit proprietary technology" and "specialized de-latticing algorithms" are made up terms that nobody uses.
I'm stuck trying to work out what it would even mean to de-lattice something. Would it transform a lattice basis into a standard vector-space basis of R^n or something, or, like the MOV attack, would it send the whole lattice to an element of some prime extension field?
In my mind's eye, it's cooler: it's like, you render the ciphertext as a raster image, and then "de-lattice" it to reveal the underlying plaintext, scanline by scanline.
I'm still working on understanding lattices better,
but I can imagine, based on my own ignorance, creativity, and lack of correct understanding, that it would be some kind of factorization.
As I try to get a better sense of what a lattice is, I picture one as a coordinate pair, but instead of each coordinate living on a line, each lives on a binary tree (or some other directed graph explored outward from a root, without cycles),
which means you have two such binary trees (not necessarily binary, but it seems easier to work with them that way),
and then you combine these into ONE lattice. So to de-lattice would mean recovering the binary trees.
But when I say binary tree, I'm really thinking of the rational numbers (because of Stern–Brocot trees).
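(Editor's note: for what it's worth, a lattice in the cryptographic sense is not tree-like; it is just the set of all integer combinations of some basis vectors, and the hardness comes from finding short vectors when you are handed a long, skewed "bad" basis. A minimal sketch in plain Python, using two dimensions where Lagrange–Gauss reduction solves the problem exactly; real lattice schemes use hundreds of dimensions, where no efficient analogue is known. The example basis vectors here are made up for illustration.)

```python
def dot(a, b):
    return a[0] * b[0] + a[1] * b[1]

def norm2(a):
    return dot(a, a)  # squared Euclidean length

def lagrange_reduce(u, v):
    """Lagrange-Gauss reduction: find a shortest basis of the 2-D
    lattice spanned by u and v (the 2-D analogue of LLL)."""
    u, v = list(u), list(v)
    if norm2(u) > norm2(v):
        u, v = v, u
    while True:
        # Subtract the nearest-integer multiple of u from v,
        # shortening v as much as possible.
        m = round(dot(u, v) / norm2(u))
        v = [v[0] - m * u[0], v[1] - m * u[1]]
        if norm2(v) >= norm2(u):
            return u, v
        u, v = v, u

# A "bad" (long, skewed) basis for the plain integer lattice Z^2,
# obtained from the standard basis by an invertible integer transform:
u, v = lagrange_reduce((3, 2), (5, 3))
# The reduced vectors both have length 1: we recovered (up to sign)
# the short standard basis hidden inside the bad one.
```

In this toy setting the "secret" short basis is recovered instantly; the conjecture underlying lattice cryptography is that nothing comparably efficient exists in high dimensions.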
Just bounce a graviton particle beam off the main deflector dish.
> through high q-bit proprietary technology
Somebody would leak or steal that, as it would be a GIGANTIC leap forward in our engineering skill at the quantum level.
Getting more than a handful of qubits to stay coherent and not collapse into noise is a huge research problem right now, and progress has been practically non-existent for almost a decade.
"Specialized de-latticing algorithms"?