If anyone is curious, the courtlistener link for the lawsuit is here: https://www.courtlistener.com/docket/64872195/bernstein-v-na...
(And somebody has already kindly uploaded the documents to RECAP, so it costs you nothing to access.)
Aside: I really wish people would link to court documents whenever they talk about an ongoing lawsuit.
> Aside: I really wish people would link to court documents whenever they talk about an ongoing lawsuit.
I just want to second that and thank you for the link. Most reporting is just horribly bad at covering legal stuff because all the stuff that makes headlines that people click on is mostly nonsense.
And a big thank you to the wonderful people at the Free Law Project for giving us the ability to find and link to this stuff. They're a non-profit and they accept donations. (hint hint)
It's just a vanilla FOIA lawsuit, of the kind hundreds of people file every month when public bodies fuck up FOIA.
If NIST puts up any kind of fight (I don't know why they would), it'll be fun to watch Matt and Wayne, you know, win a FOIA case. There's a lot of nerd utility in knowing more about how FOIA works!
But you're not going to get the secrets of the Kennedy assassination by reading this thing.
I will draw to your attention two interesting facts.
First, OpenSSH has disregarded the winning CRYSTALS variants and instead implemented a hybrid NTRU Prime scheme. The Bernstein blog post discusses hybrid designs.
"Use the hybrid Streamlined NTRU Prime + x25519 key
exchange method by default ("sntrup761x25519-sha512@openssh.com").
The NTRU algorithm is believed to resist attacks enabled by future
quantum computers and is paired with the X25519 ECDH key exchange
(the previous default) as a backstop against any weaknesses in
NTRU Prime that may be discovered in the future. The combination
ensures that the hybrid exchange offers at least as good security
as the status quo."
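The combiner idea is simple to sketch. Below is a minimal illustration in Python, where pq_encapsulate() is a hypothetical stand-in for a real sntrup761 implementation; only the X25519 calls are a real library API (from the `cryptography` package), and OpenSSH's actual construction also hashes in the session transcript:

    # Hybrid key exchange sketch: combine a classical X25519 shared
    # secret with a post-quantum KEM secret, so the session key stays
    # safe if EITHER component survives.
    import hashlib
    import os

    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

    def pq_encapsulate():
        # Hypothetical stand-in for a real PQ KEM such as sntrup761;
        # here it just returns random bytes.
        secret = os.urandom(32)
        return b"<kem-ciphertext>", secret

    # Classical half: ordinary X25519 ECDH.
    alice = X25519PrivateKey.generate()
    bob = X25519PrivateKey.generate()
    ecdh_secret = alice.exchange(bob.public_key())

    # Post-quantum half (stand-in).
    _kem_ct, kem_secret = pq_encapsulate()

    # Combiner: hash both secrets together. An attacker must break
    # BOTH X25519 and the KEM to learn the session key.
    session_key = hashlib.sha512(kem_secret + ecdh_secret).digest()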
Second, Daniel Bernstein has filed a public complaint against the NIST process, and the FOIA stonewalling adds further concern and doubt that the current results are fair.
https://groups.google.com/a/lis...
What are the aims of the lawsuit? Can the NIST decision on CRYSTALS be overturned by the court, and is that the goal?
I may believe almost all of this is overblown and silly, as like a matter of cryptographic research, but I'll say that Matt Topic and Merrick Wayne are the real deal, legit the lawyers you want working on something like this, and if they're involved, presumably some good will come out of the whole thing.
Matt Topic is probably best known as the FOIA attorney who got the Laquan McDonald videos released in Chicago; I've been peripherally involved in some work he and Merrick Wayne did for a friend, in a pretty technical case that got fierce resistance from CPD, and those two were on point. Whatever else you'd say about Bernstein here, he knows how to pick a FOIA lawyer.
A maybe more useful way to say the same thing is: if Matt Topic and Merrick Wayne are filing this complaint, you should probably put your money on them having NIST dead-to-rights with the FOIA process stuff.
> "I may believe almost all of this is overblown and silly, as like a matter of cryptographic research ..."
Am I misunderstanding you, or are you saying that you believe almost all of DJB's statements claiming that NIST/NSA is doctoring cryptography is overblown and silly? If that's the case, would you mind elaborating?
I don't think it's a bad thing to push back and demand transparency. At the very least the pressure helps keep NIST honest. Keep reminding them over and over and over again about dual-EC and they're less likely to try stupid stuff like that again.
Speaking of dual-EC -- it does seem like two questions are often debated, though it can't be neglected that some of the vocal debaters may be NSA shills:
1. does the use of standards actually help people, or make it easier for the NSA to determine which encryption method was used?
2. are there encryption methods that actually do not suffer from reductions in randomness or entropy when simply running the algorithm on the encrypted output multiple times?
It seems that these questions often have piles of people ready to jump in saying "oh, don't roll your own encryption, ooh scary... fear uncertainty doubt... and oh whatever you do, don't encrypt something 3X, that will probably make it easier to decrypt!!" .. but it would be great if some neutral 3rd party could basically say: ok, here is an algorithm that is ridiculously hard to break, and you can crank up the number of bits to a super crazy number.. and then also you can run the encryption N times, and just not knowing the number of times it was encrypted would dramatically increase the complexity of decryption... but yea, how many minutes before somebody jumps in saying -- yea, don't do that, make sure you encrypt with a well known algorithm exactly once.. "trust me"...
Transparency is good, and, as Bernstein's attorneys will ably establish, not optional.
I have no doubt that they are great at their job, but when it comes to lawsuits the judge(s) are equally important. You could get everything right, but a judge has extreme power to interpret the law, or even ignore it in select cases.
I wouldn't say they ignore the law, but legislation like FOIA leaves a lot of discretion to balance competing interests, and that's where a judge makes the most difference, despite all the great articulations of the most brilliant lawyers.
Near the end of the post – after 50 years of axe grinding – djb does eventually get to the point wrt pqcrypto. I find the below excerpt particularly damning. Why not wrap nascent pqcrypto in classical crypto? Suspect!
--
The general view today is that of course post-quantum cryptography should be an extra layer on top of well-established pre-quantum cryptography. As the French government cybersecurity agency (Agence nationale de la sécurité des systèmes d'information, ANSSI) put it at the end of 2021:
Acknowledging the immaturity of PQC is important: ANSSI will not endorse any direct drop-in replacement of currently used algorithms in the short/medium term. However, this immaturity should not serve as an argument for postponing the first deployments. ANSSI encourages all industries to progress towards an initiation of a gradual overlap transition in order to progressively increase trust on the post-quantum algorithms and their implementations while ensuring no security regression as far as classical (pre-quantum) security is concerned. ...
Given that most post-quantum algorithms involve message sizes much larger than the current pre-quantum schemes, the extra performance cost of an hybrid scheme remains low in comparison with the cost of the underlying post-quantum scheme. ANSSI believes that this is a reasonable price to pay for guaranteeing an additional pre-quantum security at least equivalent to the one provided by current pre-quantum standardized algorithms.
But NSA has a different position: it says that it "does not expect to approve" hybrids. Publicly, NSA justifies this by
- pointing to a fringe case where a careless effort to add an extra security layer damaged security, and
- expressing "confidence in the NIST PQC process".
Does that mean the original NISTPQC process, or the current NISTPQC process in which NIST, evidently surprised by attacks, announced plans to call for new submissions?
Of course, if NSA/IDA have secretly developed an attack that works for a particular type of post-quantum cryptosystem, then it makes sense that they'd want people to start using that type of cryptosystem and turn off the existing pre-quantum cryptosystem.
This is the least compelling argument Bernstein makes in the whole post, because it's simply not the job of the NIST PQC program to design or recommend hybrid classical/PQC schemes. Is it fucky and weird if NSA later decides to recommend against people using hybrid key establishment? Yes. Nobody should listen to NSA about that, or anything else. But NIST ran a PQC KEM and signature contest, not a secure transport standardization. Sir, this is a Wendy's.
It’s compelling in context. If the NSA influenced NIST standards 3x in the past — DES, DSA, Dual EC — then shouldn’t we be on high alert this 4th time around?
That NSA is already recommending against hybrid, instead of waiting for the contest results, might signal they’ve once again managed to game the standardization process itself.
At the very least — given the exhaustive history in this post — you’d like to know what interactions NSA and NIST have had this time around. Thus, djb’s FOIA. And thus the lawsuit when the FOIA went unanswered. It all seems very reasonable to me.
What’s that old saying, “fool me thrice…”?
An interesting thing happening on the Bitcoin mailing list: although it would be quite easy to add Lamport signatures as an extra safety feature for high-value transactions, they would be quite expensive and easy to misuse (they can be used only once, which is a problem if money is sent to the same address twice), so the current consensus among developers is to "just wait for NSA/NIST to be ready with the algorithm". I haven't seen any discussion of the possibility that they will never be ready, on purpose, because of sabotage.
Why not start that discussion yourself?
Indeed, as potato said, link this article in the ML for them to see that NIST cannot be fully trusted
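For reference, a minimal sketch of the Lamport one-time scheme mentioned above (illustrative only, assuming SHA-256 as the hash; a real deployment needs careful message encoding and strict state tracking so a key is never reused):

    # Minimal Lamport one-time signature sketch over SHA-256.
    # Security rests only on the hash function, which is why hash-based
    # signatures come up as a conservative post-quantum fallback.
    import hashlib
    import os

    def H(b):
        return hashlib.sha256(b).digest()

    def keygen():
        # 256 pairs of random preimages; the public key is their hashes.
        sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
        pk = [(H(a), H(b)) for a, b in sk]
        return sk, pk

    def sign(sk, msg):
        # Reveal one preimage per bit of H(msg). The key is now burned:
        # signing a second message leaks enough preimages for forgeries.
        d = int.from_bytes(H(msg), "big")
        return [sk[i][(d >> (255 - i)) & 1] for i in range(256)]

    def verify(pk, msg, sig):
        d = int.from_bytes(H(msg), "big")
        return all(H(sig[i]) == pk[i][(d >> (255 - i)) & 1] for i in range(256))

    sk, pk = keygen()
    sig = sign(sk, b"high-value transaction")
    assert verify(pk, b"high-value transaction", sig)

The signature weighs in around 8 KB and the key is strictly single-use, which is exactly the cost/misuse trade-off the thread is weighing.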
He's a prominent expert whom the whole cryptography community listens to, and he calls out the lies, crimes, and blatant hypocrisy of his own government.
I genuinely fear that he will be suicided one of these days.
I think the United States is more about charging people with crimes and ruining their lives that way rather than disappearing people. Russia might kill you with Polonium and make sure everyone knows it, but America will straight up "legally" torture you in prison via several means and then argue successfully that those methods were legal and convince the world you weren't tortured. Anyone who's a target for that treatment, though, knows that's a lie.
The FBI will just interview you over whatever and then charge you with lying to a federal agent, or dig up some other unrelated dirt, while the original investigation gets mysteriously dropped a year later.
McAfee and Epstein pop to mind. Maybe also Aaron Swartz.
I just want to say, the problem here is that worldwide standards bodies for encryption need to be trustworthy. It is incredibly hard to know what encryption is actually real without a deep mathematics background, and even then, a choir of peers must be able to present algorithms, and audits of those algorithms, with a straight face.
Presenting broken-by-design encryption undermines public confidence in what should be one of our most sacrosanct institutions: the National Institute of Standards and Technology (NIST). Many enterprises do not possess the capability to audit these standards and will simply use whatever NIST recommends. The danger is that we could be engineering embedded systems which will be in use for decades which are not only viewable by the NSA (which you might be ok with depending on your political allegiance) but also likely viewable by any capable organization on earth (which you are probably not ok with irrespective of your political allegiance).
In short, we must have trustworthy cryptography standards. If we do not, bedlam will follow.
Please recall, the last lawsuit that DJB filed was the one that resulted in essentially "Code is speech" in our world (https://en.wikipedia.org/wiki/Bernstein_v._United_States).
There's an easier problem here, which is that our reliance on formal standards bodies for the selection of cryptography constructions is bad and, not just at NIST, has been over the last 20 years mostly a force for evil. One of the most important "standards" in cryptography, the Noise Protocol Framework, will probably never be a formal standard. But on the flip side, no formal standards body is going to crud it up with nonsense.
So, no, I'd say that bedlam will not follow from a lack of trustworthy cryptography standards. We've trusted standards too much as it is.
Believing both "Don't roll your own crypto" and "Don't trust the standards" would seem to leave the average developer in something of a quandary, no?
How could NIST possibly be "one of our most sacrosanct institutions" after the NSA already fucked them with Dual_EC_DRBG? Whoever wants to recommend standards at any point since 2015 needs to be someone else. https://en.wikipedia.org/wiki/NIST_SP_800-90A for those who have forgotten.
Look, my point is that there are lots of companies around the world who can’t afford highly skilled mathematicians and cryptographers on staff. These institutions rely on NIST to help them determine what encryption systems may make sense. If NIST is truly adversarial, the public has a right to know and determine how to engage going forward.
Filippo Valsorda and Matthew Green aren't too happy: https://twitter.com/matthew_d_green/status/15556838562625208...
I think this is a sloppy take. If you read the full back-and-forth on the FOI request between D.J. Bernstein and NIST, it becomes readily apparent that there is _something_ rotten in the state of NIST.
Now of course that doesn't necessarily mean that NIST's work is completely compromised by the NSA (even though it has been in the past), but there are other problems that are similarly serious. For example, if NIST is unable to explain how certain key decisions were made along the way to standardisation, and those decisions appear to go against what would be considered by prominent experts in the field as "good practice", then NIST has a serious process problem. This is important work. It affects everyone in the world. And certain key parts of NIST's decision making process seem to be explained with not much more than a shrug. That's a problem.
All you're saying here is that NIST failed to comply with FOIA. That's not unusual. No public body does a reliably good job of complying with FOIA, and many public bodies seem to have a bad habit of pre-judging the "merits" of FOIA requests, when no merit threshold exists for their open records requirements.
NIST failing to comply with FOIA makes them an intransigent public body, like all the rest of them, from your local water reclamation board to the Department of Energy.
It emphatically does not lend support to any of this litigant's concerns about the PQC process. I don't know enough (really, anything) about the PQC "contest" to judge claims about its validity, but I do know enough --- like, the small amount of background information needed --- to say that it's risible to suggest that any of the participating teams were compromised by intelligence agencies; that claim having been made in this post saps its credibility.
So, two things I think a reasonable person would want to establish here: first, that NIST's behavior with respect to the FOIA request is hardly any kind of smoking gun, and second that the narrative being presented in this post about the PQC contest seems somewhere between "hand-wavy" and "embarrassing".
What's with the infighting here? Nothing about the post comes across as conspiracy theory level or reputation ruining. It makes me question the motives of those implying he's crazy, to be honest.
Post-quantum cryptography is essentially a full-employment program for elite academic public key cryptographers, which is largely what the "winning" PQC teams consist of. So, yeah, suggesting that one of those teams was compromised by an intelligence agency is "conspiracy theory level".
Nobody is denying the legitimacy of the suit itself. NIST is obligated to follow public records law, and public records law is important. Filippo's message, which we're all commenting on here, says that directly.
Yes, he appears to be unreasonably dismissive of the blindingly obvious history and the current situation.
As an aside, this tracks with his choice of employers - at least one of which was a known and documented NSA collaborator (as well as a victim, irony of ironies) before he took the job with them.
As Upton Sinclair remarked: “It is difficult to get a man to understand something when his salary depends upon his not understanding it.”
Joining Google after Snowden revealed PRISM and BULLRUN, as well as MUSCULAR, is almost too rich to believe. Meanwhile he dismisses Bernstein as a conspiracy theorist. It's a classic bad-faith ad-hominem coincidence theory.
> The same people tend to have trouble grasping that most of the vulnerabilities exploited and encouraged by NSA are also exploitable by the Chinese government. These people start with the assumption that Americans are the best at everything; ergo, we're also the best at espionage. If the Chinese government stole millions of personnel records from the U.S. government, records easily usable as a springboard for further attacks, this can't possibly be because the U.S. government made a policy decision to keep our computer systems "weak enough to still permit an attack of some nature using very sophisticated (and expensive) techniques".
I'm not sure if I understand this part. I was under the impression that the OPM hack was a result of poor authn and authz controls, unrelated to cryptography. Was there a cryptography component sourced somewhere?
If, rather than hoarding offensive tools & spying, the NSA had interpreted its mission as being to harden the security of government infrastructure (surely even more firmly within the remit of national security) and spent its considerable budget in that direction, would authn and authz controls have been used at the OPM?
This is my understanding as well. I asked this very same question less than a week ago[1], and now it's the first Google result when you search "OPM Dual_EC_DRBG."
The response to my comment covers some circumstantial evidence. But I'm not personally convinced; human factors are a much more parsimonious explanation.
Why don’t we require that all internal communications and records be public, available within 24 hours on the web, and provide a very painful mechanism involving significant personal effort of high level employees for every single communication or document that is to be redacted in some way? The key is requiring manual, personal (non-delegatable) effort on the part of senior bureaucrats, and to allow a private cause of action for citizens and waiver of immunity for bureaucrats.
We could carve out (or maybe not) specific things like allowing automatic redaction of employee PII and PII of citizens receiving government benefits.
After many decades, it’s clear that the current approach to FOIA and sunshine laws just isn’t working.
The carve-out you mention is a decent idea on paper, but in practice it's a difficult process. There's really no way to do it to any significant degree without basically bringing all of government to a complete halt. Consider that government is not staffed with the technical people, nor necessarily the critically minded people, needed to implement these systems.
There are ways to push for FOIA improvements that don't require this sort of drastic approach. Problem is, it takes a lot of effort on the parts of FOIA requesters, through litigation and change in the laws. Things get surprisingly nuanced when you really get down into what a "record" is, specifically for digital information. I definitely wouldn't want to have "data" open by default in this manner, because it would lead to privacy hell.
Another component of this all is to consider contractors and subcontractors. Would they fall under this? If so, to what degree? If not, how do we prevent laundering of information through contractors/subcontractors?
To a large degree, a lot of "positive" transparency movements like the one you suggest can ironically lead to reduced transparency on some of the more critical sides of transparency. A good example of that is "open data", which gives an appearance of providing complete data, but without the legal requirements to enforce it. It makes gov look good, but it disincentivizes transparency pushback, and there's little way to identify whether all relevant information is truly exposed. I would imagine similar would happen here.
A private right of action and waiver of immunity solves most of the “bad actor” problems.
The big issue is how to preserve what actually needs to be secret (in the interest of the USA, not the interests of the bureaucracy) while forcing everything else to be public.
A lot of things are secret that don’t need to be secret; that’s a side effect of mandatory data classification and normal bureaucratic incentives- you won’t get in trouble for over-classifying, and classified information is a source of bureaucratic power. So you have to introduce a really strong personal incentive to offset that or nothing will ever change.
Personally, I don’t think that information should be classified if it came from public sources. Or maybe only allow such information to be classified for a short period of time, eg one year.
The longer and/or higher the classification level, the more effort should be involved, to create disincentives to over-classification.
The old Abe rhetoric was powerful but it always felt like it was only hitting home on two of the three points. Obviously government, by definition really, is of the people. The much better parts were for the people and by the people.
Not sure if the US, with its torture base aka Guantanamo and torture safe-houses around the world, really has the right to call someone else "evil". I don't mean that as whataboutism, but human lives are not worth more in the US than in Mainland China.
I've only recently started to dig a bit deeper into crypto algorithms (looking into various types of curves etc.), and it gave me the uneasy feeling that the whole industry is relying on the expertise of only a handful of guys to actually ensure that the crypto schemes used today are really working.
Am I wrong? Are there actually thousands and thousands of people with the expertise to actually prove that the algorithms used today are really safe?
I don’t know if that’s easily quantifiable, but I had a cryptography professor (fairly well-known nowadays) several years ago tell us that she only trusted 7 people (or some other absurdly low number), one of them being djb, to be able to evaluate the security of cryptographic schemes.
Perhaps thousands of people in the world can show you proofs of security, but very few of them may be able to take into account all practical considerations like side channels and the like.
There may be thousands of people in the entire world who understand cryptanalysis well enough to accurately judge the security of modern ciphers. Most aren't living or working in the U.S.
It's very difficult to do better. The mathematics is complex and computer science hasn't achieved proofs of the hypotheses underlying cryptography. The best we can achieve is heuristic judgements about what the best possible attacks are, and P?=NP is an open question.
> The mathematics is complex and computer science hasn't achieved proofs of the hypotheses underlying cryptography.
No unconditional proofs (except for the OTP, of course), but there are quite a few conditional proofs. For example, it's possible to show that CBC mode is secure if the underlying block cipher is.
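As a hedged sketch of what such a conditional proof looks like (exact constants vary by formalization), the classic result for CBC with random IVs bounds an attacker's IND-CPA advantage by the block cipher's PRP advantage plus a birthday term:

    \mathbf{Adv}^{\text{ind-cpa}}_{\mathrm{CBC}[E]}(A) \;\le\; 2\,\mathbf{Adv}^{\mathrm{prp}}_{E}(B) \;+\; \frac{\sigma^{2}}{2^{n}}

where n is the block size and σ the total number of blocks encrypted. Once σ approaches 2^{n/2} the guarantee evaporates, which is one reason block size matters independently of key size.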
Proof! The entire field of cryptography can prove absolutely nothing other than that a single use of a one-time pad is secure. The rest is all hand-waving that boils down to: no one I know knows how to break this, and I can't do it myself, so I believe it's secure.
So the best we have in cryptography is trusting "human instincts/judgements" about various algorithms. Which then further reduces to trusting humans.
Most programmers don't need to prove crypto algorithms. There are many situations where you can just use TLS 1.3 and let it choose the ciphers. If you really need to build a custom protocol or file format, you can still use libsodium's secretbox, crypto_box, and crypto_kx functions which use the right algorithms.
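For instance, a minimal sketch of the secretbox path via the PyNaCl binding (assumes pip install pynacl; the C-level names above map onto these wrappers):

    # Authenticated secret-key encryption: libsodium's crypto_secretbox
    # (XSalsa20-Poly1305) through the PyNaCl wrapper.
    from nacl.secret import SecretBox
    from nacl.utils import random as nacl_random

    key = nacl_random(SecretBox.KEY_SIZE)        # 32 random bytes
    box = SecretBox(key)

    ciphertext = box.encrypt(b"attack at dawn")  # random nonce attached
    assert box.decrypt(ciphertext) == b"attack at dawn"

The point stands: the library picks the algorithms and handles the nonce, so there is nothing to "roll".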
This is completely unrelated to the question being asked by the parent. They aren't asking about the average programmer. They are asking how many people in the world can truly 'prove' (to some reasonable degree) that the cryptography in use and the algorithms that are implementing that cryptography are 'secure' (to some reasonable degree).
Put another way, they are asking how many people in the world could verify that the algorithms used by libsodium, crypto_box, etc. are secure.
Tangential question: while some FOIA requests do get stonewalled, I continue to be fascinated that they're honored in other cases. What exactly prevents the government from stonewalling practically every request that it doesn't like, until and unless it's ordered by a court to comply? Is there any sort of penalty for their noncompliance?
Tangential to the tangent: is there any reason to believe FOIA won't be on the chopping block in a future Congress? Do the majority of voters even know (let alone care enough) about it to hold their representatives accountable if they try to repeal it?
I know someone who works in gov (Australia, not US) who told me all about an FOI request that he was stonewalling. From memory, the request was open-ended and would have revealed more than the requester probably intended, including some proprietary trade secrets from a third-party contractor. That said, it was probably a case that would attract some public interest.
The biggest factors preventing governments from stonewalling every FOI case are generally time and money. Fighting FOI cases is time consuming and expensive and it's simply easier to hand over the information.
At least in Australia I gather it is somewhat common for FOI offices to work with an FOI applicant and ask them to narrow a request if it is so broad as to cost too much or take too long to process, or is likely just to be returned as hundreds of blacked-out pages.
Previous FOI responses show that more savvy FOI applicants have also, when they don't get the outcome they desired:
1. Formally requested review of decisions to withhold information from release. This almost always led to more information being released.
2. Waited and tried requesting the same or similar information again in a later year when different people are involved.
3. Sent a follow up FOIA request for correspondence relating to how a previous (or unanswered) request was or is being processed by the FOI office and other parties responding to the request. This has previously shown somewhat humorous interactions with FOI offices such as "We're not going to provide that information because {lame excuse}" vs FOI office "You have to. CC:Executives" vs "No" vs Executives "It's not your information" etc etc.
4. Sent a follow up FOIA request for documentation, policies, training material and the likes for how FOI requests are assessed as well as how and by whom decisions are made to release or withhold information.
5. Sent a follow up FOIA request for documentation, policies, staffing levels, budgets, training material and the likes for how a typical event that the original FOIA request referred to would be handled (if details of a specific event are not being provided).
Responses to (2), (3) and (4) are probably more interesting to applicants than responses to (1), (2) and original requests, particularly when it is clear the applicant currently or previously has knowledge of what they're requesting.
> The biggest factors preventing governments from stonewalling every FOI case are generally time and money.
Is there any backpressure in the system to make the employee(s) responsible for responding/signing off on the disclosure actually care about how expensive it is to fight a case? I would've thought they would think, "Well, the litigation cost doesn't affect me, I just approve/deny requests based on their merits."
If there's the suspicion that NIST's interests aren't aligned with the public's (at least wrt cryptography; I hope they're at least honest with the physical constants), why do we still allow them to dictate the standards?
I mean, there's plenty of standards bodies and experts in the cryptography community around the world that could probably do a better job. At this point NIST should be treated as a compromised certificate authority: just ignore them and move along.
Good god, this guy is a bad communicator. Bottom line up front:
> NIST has produced zero records in response to this [March 2022] FOIA request [to determine whether/how NSA may have influenced NIST's Post-Quantum Cryptography Standardization Project]. Civil-rights firm Loevy & Loevy has now filed suit on my behalf in federal court, the United States District Court for the District of Columbia, to force NIST to comply with the law.
So... common pattern: NSA, its representatives, or its affiliates make claims that longer key lengths are unnecessary or have too much of a performance cost.
So... I make the claim again. Let's multiply all key lengths by 10. I.e., 2048-bit RSA becomes 20480-bit RSA.
Who here thinks that's a bad idea? Previously on HN such ideas have been downvoted and comments have been made against them. I wonder, who has it been doing that, and what were their motives?
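One hedged note on the cost side, since it usually goes unstated (rough asymptotics only, not a benchmark): with schoolbook arithmetic, an RSA private-key operation on a k-bit modulus takes time on the order of

    T(k) = \Theta(k^{3}) \quad\Rightarrow\quad \frac{T(20480)}{T(2048)} \approx 10^{3}

so a 10x key length means roughly a 1000x slower decryption, and key generation degrades even more steeply. That, rather than any NSA motive, is the standard argument such proposals have to answer.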
Here's an interesting question. Even if post-quantum cryptography is securely implemented, doesn't the advent of neurotechnology (BCIs, etc.) make that method of security obsolete?
With read and write capability to the brain, assuming this comes to fruition at some point, encryption as we know it won't work anymore. But I don't know, maybe this isn't something we have to worry about just quite yet.
The thing you're missing is that BCIs and friends are, themselves, computers, and thus securable with post-quantum cryptography, or any cryptography for that matter, or any means of securing a computer. And thus, for somebody to read-write to your computers, they need to read-write to your brain(s), but to read-write to your brain(s), they need to read-write to the computers implanted in your brain(s). It's a security cycle whose overall power is determined by the least-secure element in the chain.
Any sane person will also not touch BCIs and similar technology with a 100 lightyear pole unless the designing company reveals every single fucking silicon atom in the hardware design and every single fucking bit in the software stack at every level of abstraction, and ships the device with several redundant watchdogs and deadmen timers around it that can safely kill or faraday-cage the implant on user-defined events or manually.
Alas, humans are very rarely sane, and I come to the era of bio hacking (in all senses of the word) with low expectations.
Cryptographic secrets stored in human brains are already vulnerable to an attack mechanism that requires $5 worth of interface hardware that can be procured and operated with very little training. Physical security controls do a decent job of preventing malicious actors from connecting said hardware to vulnerable brains. I assume the same would be true with the invention of BCIs more sophisticated than a crescent wrench.
Yeah, I’ve even had very personal dreams where my Linux root password was spoken in the dream. I’m glad I don’t talk in my sleep. There are also truth serums that could be weaponized in war scenarios to extract secrets from the enemy without resorting to torture.
I have the feeling that governments around the world are getting sued more and more over serious digital matters.
Here, once the heat wave is finally over, I will see my lawyer again about the interoperability of government sites with noscript/basic (X)HTML browsers.
So the TLDR is… you do roll your own crypto? I mean, you probably need to know how to create an RNG that passes PractRand and SMHasher first, and also a hash function that does the same, but cool.
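In that spirit, a hedged toy: the splitmix64 mixer below is the kind of thing that does well in PractRand/SMHasher-style statistical batteries, and it is still nowhere near cryptographic, which is rather the point of the advice being mocked:

    # splitmix64: a 64-bit mixing step. Statistically excellent for its
    # size, but NOT cryptographically secure - passing PractRand or
    # SMHasher says nothing about resisting a cryptanalyst.
    MASK = (1 << 64) - 1

    def splitmix64(state):
        # Advance by the golden-ratio increment, then bit-mix.
        state = (state + 0x9E3779B97F4A7C15) & MASK
        z = state
        z = ((z ^ (z >> 30)) * 0xBF58476D1CE4E5B9) & MASK
        z = ((z ^ (z >> 27)) * 0x94D049BB133111EB) & MASK
        return state, z ^ (z >> 31)

    state, outputs = 42, []
    for _ in range(3):
        state, x = splitmix64(state)
        outputs.append(x)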
Weirdly, any time I've suggested that maaaybe being too trusting of a known bad actor which has repeatedly published intentionally weak cryptography is a bad idea, I've received a whole lot of push-back and downvotes here on this site.
The related “just ignore NIST” crowd is intentionally or unintentionally dismissing serious issues of governance. Anyone who deploys this argument is questionable in my mind, essentially bad faith actors, especially when the topic is about the problems brought to the table by NIST and NSA.
It is a telling sign that those people actively ignore the areas where you have no choice and must have your data processed by a party required to deploy FIPS-certified software or hardware.
I'm working on a project that involves a customized version of some unclassified, non-intelligence software for a defense customer at my job (not my ideal choice of market, but it wasn't weapons so okay with it). Some of the people on the project come from the deeper end of that industry, with several TS/SCI contract and IC jobs on their resumes.
We were looking over some errors on the sshd log and it was saying it couldn't find the id_ed25519 server cert. I remarked that that line must have stayed even though the system was put in FIPS mode which probably only allowed the NIST-approved ECC curve and related this story, how everyone else has moved over to ed25519 and the government is the only one left using their broken algorithm.
One of the IC background guys (who is a very nice person, nothing against them) basically said, yeah, the NSA used to do all sorts of stuff that was a bad idea, mentioning the Clipper chip, etc. What blew my mind is that they seemed to have totally reasonable beliefs about government surveillance and powers, but then when it comes to someone like Snowden, they think he's a traitor and should have used the internal channels instead of leaking. I just don't understand how they think those same people who run NSA would have cared one bit, or didn't know about it already. I always assumed the people that worked in the IC would just think all this stuff was OK to begin with, I guess.
I don't know what the takeaway is from that, it just seems like a huge cognitive dissonance.
Many government or government affiliated organizations are required to comply with NIST approved algorithms by regulation or for interoperability. If NIST cannot be trusted as a reputable source it leaves those organizations in limbo. They are not equipped to roll their own crypto and even if they did, it would be a disaster.
"Other people have no choice but to trust NIST" is not a good argument for trusting NIST. Somehow I don't imagine the NSA is concerned about -- and is probably actively in favor of -- those organizations having backdoors.
Another upvote from someone with many friends and colleagues in NIST. I hope transparency prevails and NISTers side with that urge as well (I suspect many do).
They could and should leak more documents if they have evidence of malfeasance.
There are legally safe avenues via the IG process, and legally risky ones via the many journalists who are willing to work for major change. Sadly, legal doesn't mean safe in modern America, and some whistleblowers have suffered massive retribution even when they play by "the rules" laid out in public law.
The history in this blog post is excellently researched on the topic of NSA and NIST cryptographic sabotage. It presents some hard won truths that many are uncomfortable to discuss, let alone to actively resist.
The author of the blog post is also well known for designing and releasing many cryptographic systems as free software. There is a good chance that your TLS connections are secured by some of these designs.
Given his track record, and the actual meat of this suit, I think he has a good chance.
- He is an expert in the domain
- He made a lawful request
- He believes he's experiencing an obstruction of his rights
I don't see anything egregious here. Being critical of your government is a protected right in the USA. Everyone gets a moment to state their case if they'd like to make an accusation.
Suing sounds offensive, but that is the official process for submitting an issue that a government can understand and address. I'm seeing some comments here that seem aghast at the audacity to accuse the government at your own peril, and it shows an ignorance of history.
>Being critical of your government is a protected right for USA. Everyone gets a moment to state their case if they'd like to make an accusation.
Unless a kangaroo “FISA court” says you can’t - in which case you’re screwed, and can’t even tell anyone about the “sentence” if it included a gag order. Still better than getting droned I suppose.
The author was also part of the Linux kernel Speck cipher talks that broke down in 2013 due to the NSA's stonewalling and hand-waving when asked for technical data and explanations.
I remember reading about this in Steven Levy's Crypto and elsewhere; there was a lot of internal arguing about this stuff at the time and people had different opinions. I remember that some of the changes NSA suggested to IBM were actually stronger against a cryptanalysis attack on DES that was not yet publicly known (though at the time people suspected they were suggesting this because it was weaker; the attack only became publicly known later). I tried to find the specific info about this, but can't remember the details well enough. Edit: I think it was this: https://en.wikipedia.org/wiki/Differential_cryptanalysis
They also did intentionally weaken a standard separately from that, plus all the arguing about 'munitions export' intentionally requiring weak keys etc. - all the 90s cryptowar stuff that mostly ended after the Clipper chip failure. They also worked with IBM on DES, but some people internally at NSA were upset that this was shared after the fact. The history is a lot more mixed, with a lot of people arguing about what the right thing to do is and no general consensus on a lot of this stuff.
You are not accurately reflecting the history that is presented in the very blog post we are discussing.
NSA made DES weaker for everyone by reducing the key size. IBM happily went along. The history of IBM is dark. NSA credited tweaks to DES can be understood as ensuring that a weakened DES stayed deployed longer which was to their advantage. They clearly explain this in the history quoted by the author:
“Narrowing the encryption problem to a single, influential algorithm might drive out competitors, and that would reduce the field that NSA had to be concerned about. Could a public encryption standard be made secure enough to protect against everything but a massive brute force attack, but weak enough to still permit an attack of some nature using very sophisticated (and expensive) techniques?”
They’re not internally conflicted. They’re strategic saboteurs.
> I remember that some of the suggested changes from NSA shared with IBM were actually stronger against a cryptanalysis attack on DES that was not yet publicly known
So we have that and other examples of NSA apparently strengthening crypto, then we have the dual-EC debacle and some of the info in the Snowden leaks showing that they've tried to weaken it.
I feel like any talk about NSA influence on NIST PQ or other current algorithm development is just speculation unless someone can turn up actual evidence one way or another. I can think of reasons the NSA would try to strengthen it and reasons they might try to weaken it, and they've done both in the past. You can drive yourself nuts constructing infinitely recursive what-if theories.
Right, came here to make the same point. The first lawsuit alluded to in the blog post title resulted in an important holding that source code can be protected free expression.
Why? HTTP is simpler, less fragile, and not dependent on the good will of third parties; the content is public, and proving authenticity of text on the Internet is always hard, even when served via the https scheme. I bet Bernstein thinks there is little point in forcing people to use https to read his page.
If you think I went around looking to dig up dirt, I didn't. I just searched djb's name on Twitter to find more discussions about the subject, as post-quantum cryptography is an area I'm curious about.
Regarding asking for a disclosure, I thought that was widely accepted around here. If the CEO of some company criticised a competitor's product, we would generally expect them to disclose that fact upfront. I thought that was appropriate here given the dismissive tone of GP.
He won a case against the government representing himself, so I think he would be on good footing. He is a professor where I graduated, and even the faculty told me he was interesting to deal with. Post-quantum crypto is his main focus right now, and he also published Curve25519.
Yeah, terrible idea, except this is Daniel Bernstein, who already had an equally terrible idea years ago, and won. That victory was hugely important, it pretty much enabled much of what we use today (to be developed, exported, used without restrictions, etc etc etc)
Seems like they just need a judge to force NIST to comply with a Freedom of Information Act request; it's just part of the process.
I'm stonewalled on an equivalent Public Records Act request with a state, and am kind of annoyed that I have to use the state's own court system.
It doesn't feel super impartial, and a couple of law journals have written about how it's not impartial at all in this state and should be improved by the legislature.
This is part of a class division where we cannot practically exercise our rights which are clearly enumerated in public law. Only people with money or connections can even attempt to get many kinds of records.
It’s wrong and government employees involved should be fired, and perhaps seriously punished. If people at NIST had faced real public scrutiny and sanction for their last round of sabotage, perhaps we wouldn’t see delay and dismissal by NIST.
Delay of responding to these requests is yet another kind of sabotage of the public NIST standardization processes. Delay in standardization is delay in deployment. Delay means mass surveillance adversaries have more ciphertext that they can attack with a quantum computer. This isn’t a coincidence, though I am sure the coincidence theorists will come out in full force.
NIST should be responsive in a timely manner and they should be trustworthy, we rely on their standards for all kinds of mandatory data processing. It’s pathetic that Americans don’t have several IG investigations in parallel covering NIST and NSA behavior. Rather we have to rely on a professor to file lawsuits for the public (and cryptographers involved in the standardization process) to have even a glimpse of what is happening. Unbelievable but good that someone is doing it. He deserves our support.
Though 99% of the time I would agree with you, the public has to have faith in people who claim to be fighting in our best interests (with previously noted successes, e.g. Bernstein v. US).
> It could even generate an algorithm so complicated it would be close to impossible for a human mind to comprehend the depth of it.
Okay... then some nefarious actor's above-human-intelligence neural network instantly decodes the algorithm deemed too complicated for human understanding?
I don't see how opaque neural nets are suddenly going to make security-through-obscurity work.
So, a question then: isn't one of the differences between this time's selection, compared to previous selections, that some of the algorithms are open source with their code available?
Not really. For the same reason that "here's your github login" doesn't equate to you suddenly being able to be effective in a new company. You might be able to look things up in the code and understand how things are being done, but you don't know -why- things are being done that way.
A lot of the instances in the post even show the NSA giving a why. It's not a particular convincing why, but it was enough to sow doubt. The reason to make all discussions public is so that there isn't an after the fact "wait, why is that obviously odd choice being done?" but instead a before the fact "I think we should make a change". The burden of evidence is different for that. A "I think we should reduce the key length for performance" is a much harder sell when the spec already prescribes a longer key length, than an after the fact "the spec's key length seems too short" "Nah, it's good enough, and we need it that way for performance". The status quo always has inertia.
Thanks for the response, that's making sense. I've also tried following the PQC Google Groups but a lot of the language is beyond my grasp.
Also... I don't understand why I've been downvoted for asking a question. I'm trying to learn, but HN can certainly be unwelcoming to the 'curious' (which is why I thought we were here).
Who cares about a particular piece of source code? Cryptanalysis is about the mathematical structure of the ciphers. When we say the NSA backdoored an algorithm, we don't mean that they included hidden printf statements in "the source code". It means that mathematicians at the NSA have knowledge of weaknesses in the construction, that are not known publicly.
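Dual_EC is the canonical example of that kind of mathematical backdoor, and its shape fits in a few lines (a simplified sketch; the real attack also brute-forces bits dropped by truncation). The generator uses two public curve points P and Q, with nothing in the spec saying where they came from:

    s_i = x(s_{i-1} \cdot P), \qquad r_i = \operatorname{trunc}\big( x(s_i \cdot Q) \big)

If someone knows a secret d with P = dQ, then from one output they can reconstruct the point A = s_i·Q and compute

    x(d \cdot A) = x(s_i \cdot dQ) = x(s_i \cdot P) = s_{i+1}

i.e., the entire future state of the generator, with no visible flaw anywhere in the published algorithm or any implementation's source code.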
Worth noting DJB (the article author) was on two (losing) teams competing with Kyber[0] in Round 3, and has an open submission in Round 4 (still in progress). That's going to slightly complicate any FOIA until after the fact, or it should. Not that there's no merit in the request.
It is wrong to imply he is unreasonable here. NIST has been dismissive and unprofessional towards him and others in this process. They look terrible because they’re not doing their jobs.
Several of his students' proposals won the most recent round. He still has work in the next round. NIST should have answered in a timely manner.
On what basis do you think any of these matters can or may complicate the FOIA process?
This definitely has the sting of bitterness in it; I doubt djb would have filed this suit had NTRU Prime won the PQC NIST contest. It's hard to evaluate this objectively when there are strong emotions involved.
When it comes to the number of times DJB is right versus the number of times DJB is wrong, I'll fully back DJB. Simply put, NSA/NIST cannot and should not be trusted in this case.
If NTRU Prime had been declared the winner, would this suit have been filed? It's the same contest, same people, same suspicious behavior from NIST. I don't think this suit would have come up. djb is filing this suit because of alleged bad behavior, but I have doubts that it's the real reason.
Perhaps the old advice (“never roll your own crypto”) should be reevaluated? If you’re creative enough, you could combine and apply existing algorithms in such ways that it would be very difficult to decrypt? Think 500 programmatic combinations (steps) of encryption applying different algorithms. Content encrypted in this way would require knowledge of the encryption sequence in order to execute the required steps in reverse. No amount of brute force could help here…
> Would require knowledge of the encryption sequence...
This is security by obscurity. Reputable encryptions work under the assumption that you have full knowledge about the encryption/decryption process.
You could, however, argue that the sequence then becomes part of the key. However, this key [i.e. the sequence of encryptions] would then be at most as strong as the strongest encryption in this sequence, which kind of defeats the purpose.
No, an important property of a secure cryptographic cipher is that it should be as close to a random permutation of the input as possible.
A "randomly assembled" cipher that just chains together different primitives without much thought is very unlikely to have that, which will mean that it will probably have "interesting" statistical properties that can be observed given enough plaintext/ciphertext pairs, and those can then be exploited in order to break it.
No, not at all; that advice is still good. Even more important if you are talking about modifying algorithms. You're going to want proofs of resistance or immunity to certain classes of attacks. A subtle change can easily make a strong primitive useless.
If anyone is curious, the courtlistener link for the lawsuit is here: https://www.courtlistener.com/docket/64872195/bernstein-v-na...
(And somebody has already kindly uploaded the documents to RECAP, so it costs you nothing to access.)
Aside: I really wish people would link to court documents whenever they talk about an ongoing lawsuit.
> Aside: I really wish people would link to court documents whenever they talk about an ongoing lawsuit.
I just want to second that and thank you for the link. Most reporting is just horribly bad at covering legal stuff because all the stuff that makes headlines that people click on is mostly nonsense.
And a big thank you to the wonderful people at the Free Law Project for giving us the ability to find and link to this stuff. They're a non-profit and they accept donations. (hint hint)
It's just a vanilla FOIA lawsuit, of the kind hundreds of people file every month when public bodies fuck up FOIA.
If NIST puts up any kind of fight (I don't know why they would), it'll be fun to watch Matt and Wayne, you know, win a FOIA case. There's a lot of nerd utility in knowing more about how FOIA works!
But you're not going to get the secrets of the Kennedy assassination by reading this thing.
I will draw to your attention two interesting facts.
First, OpenSSH has disregarded the winning (crystals) variants, and implemented hybrid NTRU-Prime. The Bernstein blog post discusses hybrid designs.
"Use the hybrid Streamlined NTRU Prime + x25519 key exchange method by default ("sntrup761x25519-sha512@openssh.com"). The NTRU algorithm is believed to resist attacks enabled by future quantum computers and is paired with the X25519 ECDH key exchange (the previous default) as a backstop against any weaknesses in NTRU Prime that may be discovered in the future. The combination ensures that the hybrid exchange offers at least as good security as the status quo."
https://www.openssh.com/releasenotes.html
Second, Daniel Bernstein has filed a public complaint against the NIST process, and the FOIA stonewalling adds more concern and doubt that the current results are fair.
https://www.google.com/url?q=https://groups.google.com/a/lis...
What are the aims of the lawsuit? Can the NIST decision on crystals be overturned by the court, and is that the goal?
18 replies →
I may believe almost all of this is overblown and silly, as like a matter of cryptographic research, but I'll say that Matt Topic and Merrick Wayne are the real deal, legit the lawyers you want working on something like this, and if they're involved, presumably some good will come out of the whole thing.
Matt Topic is probably best known as the FOIA attorney who got the Laquan McDonald videos released in Chicago; I've been peripherally involved in some work he and Merrick Wayne did for a friend, in a pretty technical case that got fierce resistance from CPD, and those two were on point. Whatever else you'd say about Bernstein here, he knows how to pick a FOIA lawyer.
A maybe more useful way to say the same thing is: if Matt Topic and Merrick Wayne are filing this complaint, you should probably put your money on them having NIST dead-to-rights with the FOIA process stuff.
> "I may believe almost all of this is overblown and silly, as like a matter of cryptographic research ..."
Am I misunderstanding you, or are you saying that you believe almost all of DJB's statements claiming that NIST/NSA is doctoring cryptography is overblown and silly? If that's the case, would you mind elaborating?
I believe the implication that NIST or NSA somehow bribed one of the PQC researchers to weaken a submission is risible.
I believe that NIST is obligated to be responsive to FOIA requests, even if the motivation behind those requests is risible.
96 replies →
I don't think it's a bad thing to push back and demand transparency. At the very least the pressure helps keep NIST honest. Keep reminding them over and over and over again about dual-EC and they're less likely to try stupid stuff like that again.
Speaking of dual-EC -- it does seem like 2 questions seem to be often debated, but it can't be neglected that some of the vocal debaters may be NSA shills:
1. does the use of standards actually help people, or make it easier for the NSA to determine which encryption method was used?
2. are there encryption methods that actually do not suffer from reductions in randomness or entropy etc when just simply running the algorithm on the encrypted output multiple times?
It seems that these question often have piles of people ready to jump in saying "oh, don't roll your own encryption, ooh scary... fear uncertainty doubt... and oh whatever you do, don't encrypt something 3X that will probably make it easier to decrypt!!" .. but it would be great if some neutral 3rd party could basically say, ok here is an algorithm that is ridiculously hard to break, and you can crank up the number of bits to a super crazy number.. and then also you can run the encryption N times and just not knowing the number of times it was encrypted would dramatically increase the complexity of decryption... but yea how many minutes before somebody jumps in saying -- yea, don't do that, make sure you encrypt with a well known algorithm exactly once.. "trust me"...
10 replies →
Transparency is good, and, as Bernstein's attorneys will ably establish, not optional.
1 reply →
I have no doubt that they are great at their job, but when it comes to lawsuits the judge(s) are equally as important. You could get everything right but a judge has extreme power to interpret the law or even ignore it in select cases.
I wouldn't say they ignore the law, but legislation like FOIA has a lot of discretion to balance competing interests and that's where a judge would make the most different despite all the great articulations of the most brilliant lawyers.
1 reply →
Near the end of the post – after 50 years of axe grinding – djb does eventually get to the point wrt pqcrypto. I find the below excerpt particularly damning. Why not wrap nascent pqcrypto in classical crypto? Suspect!
--
The general view today is that of course post-quantum cryptography should be an extra layer on top of well-established pre-quantum cryptography. As the French government cybersecurity agency (Agence nationale de la sécurité des systèmes d'information, ANSSI) put it at the end of 2021:
Acknowledging the immaturity of PQC is important: ANSSI will not endorse any direct drop-in replacement of currently used algorithms in the short/medium term. However, this immaturity should not serve as an argument for postponing the first deployments. ANSSI encourages all industries to progress towards an initiation of a gradual overlap transition in order to progressively increase trust on the post-quantum algorithms and their implementations while ensuring no security regression as far as classical (pre-quantum) security is concerned. ...
Given that most post-quantum algorithms involve message sizes much larger than the current pre-quantum schemes, the extra performance cost of an hybrid scheme remains low in comparison with the cost of the underlying post-quantum scheme. ANSSI believes that this is a reasonable price to pay for guaranteeing an additional pre-quantum security at least equivalent to the one provided by current pre-quantum standardized algorithms.
But NSA has a different position: it says that it "does not expect to approve" hybrids. Publicly, NSA justifies this by
- pointing to a fringe case where a careless effort to add an extra security layer damaged security, and
- expressing "confidence in the NIST PQC process".
Does that mean the original NISTPQC process, or the current NISTPQC process in which NIST, evidently surprised by attacks, announced plans to call for new submissions?
Of course, if NSA/IDA have secretly developed an attack that works for a particular type of post-quantum cryptosystem, then it makes sense that they'd want people to start using that type of cryptosystem and turn off the existing pre-quantum cryptosystem.
This is the least compelling argument Bernstein makes in the whole post, because it's simply not the job of the NIST PQC program to design or recommend hybrid classical/PQC schemes. Is it fucky and weird if NSA later decides to recommend against people using hybrid key establishment? Yes. Nobody should listen to NSA about that, or anything else. But NIST ran a PQC KEM and signature contest, not a secure transport standardization. Sir, this is a Wendy's.
It’s compelling in context. If the NSA influenced NIST standards 3x in the past — DES, DSA, Dual EC — then shouldn’t we be on high alert this 4th time around?
That NSA is already recommending against hybrid, instead of waiting for the contest results, might signal they’ve once again managed to game the standardization process itself.
At the very least — given the exhaustive history in this post — you’d like to know what interactions NSA and NIST have had this time around. Thus, djb’s FOIA. And thus the lawsuit when the FOIA went unanswered. It all seems very reasonable to me.
What’s that old saying, “fool me thrice…”?
6 replies →
An interesting thing that is happening on Bitcoin mailing list is that although it would be quite easy to add Lamport signatures as an extra safety feature for high value transactions, as they would be quite expensive and easy to misuse (they can be used only once, which is a problem if money is sent to the same address twice), the current concensus between developers is to ,,just wait for NSA/NIST to be ready with the algorithm''. I haven't seen any discussion on the possibility of never being ready on purpose because of a sabotage.
Why not start that discussion yourself?
Indeed as potato said, link this article in the ML for them to see that NIST can not be fully trusted
An expert, prominent, and someone who the whole cryptography community listens to, and he calls out the lies, crimes, and blatant hypocrisy of his own government.
I genuinely fear that he will be suicided one of these days.
I think the United States is more about charging people with crimes and ruining their lives that way rather than disappearing people. Russia might kill you with Polonium and make sure everyone knows it, but America will straight up “legally“ torture you in prison via several means and then argue successfully that those methods were legal and convince the world you weren’t tortured. Anyone who’s a target for that treatment, though, knows that’s a lie.
The FBI will just interview you over whatever and then charge you for lying to a federal agent or dig up some other unrelated dirt. While the original investigation gets mysteriously dropped a year later.
McAfee and Epstein pop to mind. Maybe also Aaron Swartz.
4 replies →
I just want to say, the problem here is worldwide standards bodies for encryption need to be trustworthy. It is incredibly hard to know what encryption is actually real without a deep mathematics background and even then, a choir of peers must be able to present algorithms, and audits of those algorithms with a straight face.
Presenting broken-by-design encryption undermines public confidence in what should be one of our most sacrosanct institutions: the National Institute of Standards and Technology (NIST). Many enterprises do not possess the capability to audit these standards and will simply use whatever NIST recommends. The danger is that we could be engineering embedded systems which will be in use for decades which are not only viewable by the NSA (which you might be ok with depending on your political allegiance) but also likely viewable by any capable organization on earth (which you are probably not ok with irrespective of your political allegiance).
In short, we must have trustworthy cryptography standards. If we do not, bedlam will follow.
Please recall, the last lawsuit that DJB filed was the one that resulted in essentially "Code is speech" in our world (https://en.wikipedia.org/wiki/Bernstein_v._United_States).
There's an easier problem here, which is that our reliance on formal standards bodies for the selection of cryptographic constructions is bad, and, hardly just at NIST, has been mostly a force for evil over the last 20 years. One of the most important "standards" in cryptography, the Noise Protocol Framework, will probably never be a formal standard. But on the flip side, no formal standards body is going to crud it up with nonsense.
So, no, I'd say that bedlam will not follow from a lack of trustworthy cryptography standards. We've trusted standards too much as it is.
Believing both "Don't roll your own crypto" and "Don't trust the standards" would seem to leave the average developer in something of a quandary, no?
11 replies →
how could NIST possibly be "one of our most sacrosanct institutions" after the NSA already fucked them with Dual_EC_DRBG?
whoever wants to recommend standards at any point since 2015 needs to be someone else
https://en.wikipedia.org/wiki/NIST_SP_800-90A for those who have forgotten.
Look, my point is that there are lots of companies around the world who can’t afford highly skilled mathematicians and cryptographers on staff. These institutions rely on NIST to help them determine what encryption systems may make sense. If NIST is truly adversarial, the public has a right to know and determine how to engage going forward.
12 replies →
Filippo Valsorda and Matthew Green aren't too happy.
https://twitter.com/matthew_d_green/status/15556838562625208...
I think this is a sloppy take. If you read the full back-and-forth on the FOI request between D.J. Bernstein and NIST, it becomes readily apparent that there is _something_ rotten in the state of NIST.
Now of course that doesn't necessarily mean that NIST's work is completely compromised by the NSA (even though it has been in the past), but there are other problems that are similarly serious. For example, if NIST is unable to explain how certain key decisions were made along the way to standardisation, and those decisions appear to go against what would be considered by prominent experts in the field as "good practice", then NIST has a serious process problem. This is important work. It affects everyone in the world. And certain key parts of NIST's decision making process seem to be explained with not much more than a shrug. That's a problem.
All you're saying here is that NIST failed to comply with FOIA. That's not unusual. No public body does a reliably good job of complying with FOIA, and many public bodies seem to have a bad habit of pre-judging the "merits" of FOIA requests, when no merit threshold exists for their open records requirements.
NIST failing to comply with FOIA makes them an intransigent public body, like all the rest of them, from your local water reclamation board to the Department of Energy.
It emphatically does not lend support to any of this litigant's concerns about the PQC process. I don't know enough (really, anything) about the PQC "contest" to judge claims about its validity, but I do know enough --- like, the small amount of background information needed --- to say that it's risible to suggest that any of the participating teams were compromised by intelligence agencies; that claim having been made in this post saps its credibility.
So, two things I think a reasonable person would want to establish here: first, that NIST's behavior with respect to the FOIA request is hardly any kind of smoking gun, and second that the narrative being presented in this post about the PQC contest seems somewhere between "hand-wavy" and "embarrassing".
10 replies →
What's with the infighting here? Nothing about the post comes across as conspiracy theory level or reputation ruining. It makes me question the motives of those implying he's crazy, to be honest.
Post-quantum cryptography is essentially a full-employment program for elite academic public key cryptographers, which is largely what the "winning" PQC teams consist of. So, yeah, suggesting that one of those teams was compromised by an intelligence agency is "conspiracy theory level".
Nobody is denying the legitimacy of the suit itself. NIST is obligated to follow public records law, and public records law is important. Filippo's message, which we're all commenting on here, says that directly.
3 replies →
Dismissing this lawsuit as a conspiracy theory is embarrassing for both of them.
There is ample evidence to document malfeasance by the involved parties, and it’s reasonable to ask NIST to follow public law.
> Dismissing this lawsuit as a conspiracy theory is embarrassing for both of them.
They are not dismissing the lawsuit.
17 replies →
Filippo Valsorda seems to be happy to ignore the fact that NIST already let an NSA backdoor in, as recently as 2014:
https://wikipedia.org/wiki/Dual_EC_DRBG
is he really just going to ignore something from 8 years ago?
Yes, he appears to be unreasonably dismissive of the blindingly obvious history and the current situation.
As an aside, this tracks with his choice of employers - at least one of which was a known and documented NSA collaborator (as well as a victim, irony of ironies) before he took the job with them.
As Upton Sinclair remarked: “It is difficult to get a man to understand something when his salary depends upon his not understanding it.”
Joining Google after Snowden revealed PRISM and BULLRUN, as well as MUSCULAR, is almost too rich to believe. Meanwhile, he dismisses Bernstein as a conspiracy theorist. It's a classic bad-faith ad-hominem coincidence theory.
4 replies →
Thanks for letting me know. I think I'll consider both of them compromised.
Man, mobile typos suck.
> The same people tend to have trouble grasping that most of the vulnerabilities exploited and encouraged by NSA are also exploitable by the Chinese government. These people start with the assumption that Americans are the best at everything; ergo, we're also the best at espionage. If the Chinese government stole millions of personnel records from the U.S. government, records easily usable as a springboard for further attacks, this can't possibly be because the U.S. government made a policy decision to keep our computer systems "weak enough to still permit an attack of some nature using very sophisticated (and expensive) techniques".
I'm not sure if I understand this part. I was under the impression that the OPM hack was a result of poor authn and authz controls, unrelated to cryptography. Is there a source tying a cryptography component to it?
If, rather than hoarding offensive tools & spying, the NSA had interpreted its mission as being to harden the security of government infrastructure (surely even more firmly within the remit of national security) and spent its considerable budget in that direction, would authn and authz controls have been used at the OPM?
This is my understanding as well. I asked this very same question less than a week ago[1], and now it's the first Google result when you search "OPM Dual_EC_DRBG."
The response to my comment covers some circumstantial evidence. But I'm not personally convinced; human factors are a much more parsimonious explanation.
[1]: https://news.ycombinator.com/item?id=32286528
holy crap, i wondered why the post didn't mention work by dj bernstein outing flaws in curves submitted by nsa...
Well, didn't expect the post to actually be written by him.
Why don’t we invert FOIA?
Why don’t we require that all internal communications and records be public, available within 24 hours on the web, and provide a very painful mechanism involving significant personal effort of high level employees for every single communication or document that is to be redacted in some way? The key is requiring manual, personal (non-delegatable) effort on the part of senior bureaucrats, and to allow a private cause of action for citizens and waiver of immunity for bureaucrats.
We could carve out (or maybe not) specific things like allowing automatic redaction of employee PII and PII of citizens receiving government benefits.
After many decades, it’s clear that the current approach to FOIA and sunshine laws just isn’t working.
[ed] fixed autocorrect error
The carve-out you mention is a decent idea on paper, but in practice would be a difficult process. There's really no way to do it to any significant degree without basically bringing all of government to a complete halt. Consider that government is not staffed with technical people, nor necessarily with the critically minded people needed to implement these systems.
There are ways to push for FOIA improvements that don't require this sort of drastic approach. Problem is, it takes a lot of effort on the parts of FOIA requesters, through litigation and change in the laws. Things get surprisingly nuanced when you really get down into what a "record" is, specifically for digital information. I definitely wouldn't want to have "data" open by default in this manner, because it would lead to privacy hell.
Another component of this all is to consider contractors and subcontractors. Would they fall under this? If so, to what degree? If not, how do we prevent laundering of information through contractors/subcontractors?
To a large degree, a lot of "positive" transparency movements like the one you suggest can ironically lead to reduced transparency in some of the more critical sides of transparency. A good example of that is "open data", which gives an appearance of providing complete data, but without the legal requirements to enforce it. Makes gov look good but it de-incentivizes transparency pushback and there's little way to identify whether all relevant information is truly exposed. I would imagine similar would happen here.
A private right of action and waiver of immunity solves most of the “bad actor” problems.
The big issue is how to preserve what actually needs to be secret (in the interest of the USA, not the interests of the bureaucracy) while forcing everything else to be public.
A lot of things are secret that don’t need to be secret; that’s a side effect of mandatory data classification and normal bureaucratic incentives- you won’t get in trouble for over-classifying, and classified information is a source of bureaucratic power. So you have to introduce a really strong personal incentive to offset that or nothing will ever change.
Personally, I don’t think that information should be classified if it came from public sources. Or maybe only allow such information to be classified for a short period of time, eg one year.
The longer and/or higher the classification level, the more effort should be involved, to create disincentives to over-classification.
1 reply →
The old Abe rhetoric was powerful but it always felt like it was only hitting home on two of the three points. Obviously government, by definition really, is of the people. The much better parts were for the people and by the people.
Qualifiers such as evil aren't really useful when there hasn't been a country acting honorably on that stage for a long time, if ever.
Here's a phrasing that might be more appropriate:
"Since we're backstabbers and scoundrels, we should exercise caution around each other."
Do you think it's tough for those regimes to pay someone to do FOIA requests for them? Or to get jobs at government agencies?
We should rethink the concept of a “secret”. If it’s really a secret, it will still be worth the effort to protect.
1 reply →
Surely "keeping things a little more hidden" depends on reliable cryptography.
Not sure the US, with its torture base (aka Guantanamo) and torture safe houses around the world, really has the right to call someone else "evil". I don't mean that as whataboutism, but human lives are not worth more in the US than in mainland China.
Side question:
I've only recently started to dig a bit deeper into crypto algorithms (looking into various types of curves, etc.), and it gave me the uneasy feeling that the whole industry relies on the expertise of only a handful of people to ensure that the crypto schemes used today really work.
Am I wrong? Are there actually thousands and thousands of people with the expertise to prove that the algorithms used today are really safe?
I don’t know if that’s easily quantifiable, but I had a cryptography professor (fairly well-known nowadays) several years ago tell us that she only trusted 7 people (or some other absurdly low number), one of them being djb, to be able to evaluate the security of cryptographic schemes.
Perhaps thousands of people in the world can show you proofs of security, but very few of them may be able to take into account all practical considerations like side channels and the like.
There may be thousands of people in the entire world who understand cryptanalysis well enough to accurately judge the security of modern ciphers. Most aren't living or working in the U.S.
It's very difficult to do better. The mathematics is complex, and computer science hasn't achieved proofs of the hypotheses underlying cryptography. The best we can achieve is heuristic judgements about what the best possible attacks are, and whether P = NP is still an open question.
> The mathematics is complex and computer science hasn't achieved proofs of the hypotheses underlying cryptography.
No unconditional proofs (except for the OTP, of course), but there are quite a few conditional proofs. For example, it's possible to show that CBC mode is secure if the underlying block cipher is.
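(Concretely, the textbook reduction for CBC with random IVs has this shape: any IND-CPA adversary A against CBC[E] yields a PRP distinguisher B against E, up to a birthday term. The exact constants vary by treatment, but roughly:

    \mathrm{Adv}^{\text{ind-cpa}}_{\mathrm{CBC}[E]}(A) \;\le\; 2\cdot \mathrm{Adv}^{\mathrm{prp}}_{E}(B) \;+\; \frac{\sigma^{2}}{2^{\,n-1}}

where sigma is the total number of blocks encrypted and n is the block size. So the proof is conditional: break CBC well below the birthday bound and you've distinguished the block cipher from a random permutation.)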
Proof! The entire field of cryptography can prove absolutely nothing other than that a single use of a one-time pad is secure. The rest is all hand-waving that boils down to "no one I know knows how to break this, and I can't break it myself, so I believe it's secure."
So the best we have in cryptography is trusting "human instincts/judgements" about various algorithms. Which then further reduces to trusting humans.
This "monoculture" post raised this point several years ago.
https://lwn.net/Articles/681616/
Most programmers don't need to prove crypto algorithms. There are many situations where you can just use TLS 1.3 and let it choose the ciphers. If you really need to build a custom protocol or file format, you can still use libsodium's secretbox, crypto_box, and crypto_kx functions, which use the right algorithms.
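(A minimal sketch of that approach using PyNaCl, the Python binding for libsodium; assumes "pip install pynacl":)

    from nacl.secret import SecretBox
    from nacl.utils import random

    key = random(SecretBox.KEY_SIZE)             # 32-byte symmetric key
    box = SecretBox(key)                         # XSalsa20-Poly1305 under the hood
    ct = box.encrypt(b"attack at dawn")          # random nonce generated and prepended
    assert box.decrypt(ct) == b"attack at dawn"  # decryption verifies the MAC too

No cipher choices, no nonce bookkeeping, no padding: the point is that the library makes the decisions a non-cryptographer shouldn't.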
This is completely unrelated to the question being asked by the parent. They aren't asking about the average programmer. They are asking how many people in the world can truly 'prove' (to some reasonable degree) that the cryptography in use and the algorithms that are implementing that cryptography are 'secure' (to some reasonable degree).
Put another way, they are asking how many people in the world could verify that the algorithms used by libsodium, crypto_box, etc. are secure.
2 replies →
The grandparent post is asking about the people who need to know enough to program TLS to
> let it choose
This guy is the best kind of curmudgeon. I love it.
Tangential question: while some FOIA requests do get stonewalled, I continue to be fascinated that they're honored in other cases. What exactly prevents the government from stonewalling practically every request that it doesn't like, until and unless it's ordered by a court to comply? Is there any sort of penalty for their noncompliance?
Tangential to the tangent: is there any reason to believe FOIA won't be on the chopping block in a future Congress? Do the majority of voters even know (let alone care enough) about it to hold their representatives accountable if they try to repeal it?
I know someone who works in gov (Australia, not US) who told me all about an FOI request that he was stonewalling. From memory, the request was open-ended and would have revealed more than it was probably intended to, and would have revealed some proprietary trade secrets from a third-party contractor. That said, it was probably a case that would attract some public interest.
The biggest factors preventing governments from stonewalling every FOI case are generally time and money. Fighting FOI cases is time consuming and expensive and it's simply easier to hand over the information.
At least in Australia, I gather it is somewhat common for FOI offices to work with an FOI applicant to narrow the request if it is so broad as to cost too much or take too long to process, or is likely just to be returned as hundreds of blacked-out pages.
Previous FOI responses show more savvy FOI applicants in the past have also (when they don't get the outcome they desired):
1. Formally requested review of decisions to withhold information from release. This almost always led to more information being released.
2. Waited and tried requesting the same or similar information again in a later year when different people are involved.
3. Sent a follow up FOIA request for correspondence relating to how a previous (or unanswered) request was or is being processed by the FOI office and other parties responding to the request. This has previously shown somewhat humorous interactions with FOI offices such as "We're not going to provide that information because {lame excuse}" vs FOI office "You have to. CC:Executives" vs "No" vs Executives "It's not your information" etc etc.
4. Sent a follow up FOIA request for documentation, policies, training material and the likes for how FOI requests are assessed as well as how and by whom decisions are made to release or withhold information.
5. Sent a follow up FOIA request for documentation, policies, staffing levels, budgets, training material and the likes for how a typical event that the original FOIA request referred to would be handled (if details of a specific event are not being provided).
Responses to (2), (3) and (4) are probably more interesting to applicants than responses to (1) and the original request, particularly when it is clear the applicant currently or previously has knowledge of what they're requesting.
Interesting, thanks for the anecdote!
> The biggest factors preventing governments from stonewalling every FOI case are generally time and money.
Is there any backpressure in the system to make the employee(s) responsible for responding/signing off on the disclosure actually care about how expensive it is to fight a case? I would've thought they would think, "Well, the litigation cost doesn't affect me, I just approve/deny requests based on their merits."
Presumably most government employees are acting in good faith - why wouldn’t they fulfil a reasonable FOIA request?
This is likely the result of some actors not acting in good faith, who then have no choice but to stonewall lest their intransigence be revealed.
All execs have to do is not staff the FOIA department, and requests get ignored. People generally prefer free time to doing paperwork, if boss allows.
If there's a suspicion that NIST's interests aren't aligned with the public's (at least wrt cryptography; I hope they're at least honest with the physical constants), why do we still allow them to dictate the standards?
I mean, there are plenty of standards bodies and experts in the cryptography community around the world that could probably do a better job. At this point NIST should be treated as a compromised certificate authority: just ignore them and move along.
Good god, this guy is a bad communicator. Bottom line up front:
> NIST has produced zero records in response to this [March 2022] FOIA request [to determine whether/how NSA may have influenced NIST's Post-Quantum Cryptography Standardization Project]. Civil-rights firm Loevy & Loevy has now filed suit on my behalf in federal court, the United States District Court for the District of Columbia, to force NIST to comply with the law.
Edit: Yes, I know who DJB is.
Well, he is an expert in cryptic communication
That is truly burying the lede...
I spent most of the post asking myself "okay, I'm guessing this is something about post-quantum crypto, but what are you actually suing about?"
djb has got to be the single biggest pain in the ass for the NSA and I love it.
Yippee! DJB for the win for the rest of us!
My background is in normal, enterprise-SaaS-style software development projects, and the whole notion of post-quantum crypto kind of baffles me.
Funnily enough, this post coincides with the release of a newsletter issue[0] by a friend of mine - unzip.dev - about lattice-based cryptography.
A bit of a shameless plug, but it really is a great bit of intro for noobs in the area like myself.
[0] https://unzip.dev/0x00a-lattice-based-cryptography/
So... common pattern: NSA, its representatives, or its affiliates make claims that longer key lengths are unnecessary or have too much of a performance cost.
So... I make the claim again. Let's multiply all key lengths by 10, i.e. 2048-bit RSA becomes 20480-bit RSA.
Who here thinks that's a bad idea? Previously on HN such ideas have been downvoted and argued against. I wonder who has been doing that, and what their motives were.
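(For a rough sense of the cost being debated: RSA private-key operations scale roughly cubically with modulus length, so 10x the bits means on the order of 1000x the work, and key generation is steeper still. A hypothetical timing harness using the Python "cryptography" package:)

    import time
    from cryptography.hazmat.primitives.asymmetric import rsa

    # Keygen time grows steeply with modulus size; 20480-bit keys are
    # left as an overnight exercise for the reader.
    for bits in (2048, 4096, 8192):
        t0 = time.perf_counter()
        rsa.generate_private_key(public_exponent=65537, key_size=bits)
        print(f"{bits}-bit keygen: {time.perf_counter() - t0:.2f}s")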
Here's an interesting question. Even if post-quantum cryptography is securely implemented, doesn't the advent of neurotechnology (BCIs, etc.) make that method of security obsolete?
With read and write capability to the brain, assuming this comes to fruition at some point, encryption as we know it won't work anymore. But I don't know, maybe this isn't something we have to worry about just quite yet.
The thing you're missing is that BCIs and friends are, themselves, computers, and thus securable with post-quantum cryptography, or any cryptography for that matter, or any means of securing a computer. And thus, for somebody to read-write to your computers, they need to read-write to your brain(s), but to read-write to your brain(s), they need to read-write to the computers implanted in your brain(s). It's a security cycle whose overall power is determined by the least-secure element in the chain.
Any sane person will also not touch BCIs and similar technology with a 100-lightyear pole unless the designing company reveals every single fucking silicon atom in the hardware design and every single fucking bit in the software stack at every level of abstraction, and ships the device with several redundant watchdogs and dead-man timers around it that can safely kill or Faraday-cage the implant on user-defined events or manually.
Alas, humans are very rarely sane, and I come to the era of bio hacking (in all senses of the word) with low expectations.
Cryptographic secrets stored in human brains are already vulnerable to an attack mechanism that requires $5 worth of interface hardware that can be procured and operated with very little training. Physical security controls do a decent job of preventing malicious actors from connecting said hardware to vulnerable brains. I assume the same would be true with the invention of BCIs more sophisticated than a crescent wrench.
The encryption is fine, that's just a way to avoid it. Much like how tire-iron attacks don't break passwords so much as bypass them.
Ok that's actually a great point. To make the comparison:
Tire-irons require physical proximity. And torture generally doesn't work, at least in the case of getting a private key.
Reading/writing to the brain, on the other hand, requires no physical proximity if wireless. And the person(s) won't even know it's happening.
These seem like totally different paradigms to me.
10 replies →
Yeah, I've even had very personal dreams where my Linux root password was spoken in the dream. I'm glad I don't talk in my sleep. There are also truth serums that can be weaponized in war scenarios to extract secrets from the enemy without resorting to torture.
So this lawsuit is to try and force the release of some documents that might be embarrassing.
However, if the lawsuit is won, I would think it very unlikely the documents aren't heavily redacted on 'national security' grounds before release.
So nothing will be learned either way.
This is one hell of a well written argument.
I have the feeling that governments around the world are getting sued more and more over serious digital matters. Here, once the heat wave is finally over, I will see my lawyer again about the interoperability of government sites with noscript/basic (X)HTML browsers.
Ironically, when I visit the site Chrome says my connection is not secured by TLS.
I was hoping for chacha20+Poly1305
You can see for yourself if you visit the HTTPS version.
>Connection Encrypted (TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256, 256 bit keys, TLS 1.2)
Are you logging into the site?
dig @1.1.1.1 blog.cr.yp.to is failing for me, but 8.8.8.8 works. Annoying!
Seems odd to me a crypto blog isn't using https these days.
There are ways of writing that make one look less like a paranoid conspiracy theorist.
yeah, but where do all these big primes come from?
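(Short answer: guess and check. Candidates of the right size are sampled at random and filtered through a probabilistic primality test such as Miller-Rabin; a self-contained Python sketch:)

    import secrets

    def is_probable_prime(n, rounds=40):
        # Miller-Rabin: each round catches a composite with probability >= 3/4.
        if n < 2:
            return False
        for p in (2, 3, 5, 7, 11, 13):
            if n % p == 0:
                return n == p
        d, r = n - 1, 0
        while d % 2 == 0:
            d, r = d // 2, r + 1
        for _ in range(rounds):
            a = secrets.randbelow(n - 3) + 2
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(r - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False
        return True

    def random_prime(bits=1024):
        while True:
            c = secrets.randbits(bits) | (1 << (bits - 1)) | 1  # right size, odd
            if is_probable_prime(c):
                return c

(By the prime number theorem, roughly one in every ln(2^1024) ≈ 710 integers of that size is prime, so the loop terminates quickly.)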
So the TL;DR is… you do roll your own crypto? I mean, you probably need to know how to create an RNG that passes PractRand and SMHasher first, and also a hash function that does the same, but cool.
Why is the link in the URL http: not https: ? Irony?
Well https uses the NIST standards so.... ;)
This is just due to the way the OP posted it, not how it was originally published. This website forces HTTPS, using the ChaCha20-Poly1305 cipher.
If you spend all day making bagels do you go home and make bagels for dinner?
It's a static text blog, not a bank
> It's a static text blog, not a bank
I want those delivered by https most, because http leaks the exact page I've visited, rather than just the domain.
1 reply →
See: "Here's Why Your Static Website Needs HTTPS" by Troy Hunt
https://www.troyhunt.com/heres-why-your-static-website-needs...
The NSA has recorded your receipt of this message.
Weirdly, any time I've suggested that maaaybe being too trusting of a known bad actor which has repeatedly published intentionally weak cryptography is a bad idea, I've received a whole lot of push-back and downvotes here on this site.
Indeed. Have my upvote stranger.
The related “just ignore NIST” crowd is intentionally or unintentionally dismissing serious issues of governance. Anyone who deploys this argument is questionable in my mind, essentially bad faith actors, especially when the topic is about the problems brought to the table by NIST and NSA.
It is a telling sign that those people actively ignore the areas where you have no choice and must have your data processed by a party required to deploy FIPS-certified software or hardware.
I'm working on a project that involves a customized version of some unclassified, non-intelligence software for a defense customer at my job (not my ideal choice of market, but it wasn't weapons so I'm okay with it). Some of the people on the project come from the deeper end of that industry, with several TS/SCI contract and IC jobs on their resumes.
We were looking over some errors in the sshd log, and it said it couldn't find the id_ed25519 server key. I remarked that that line must have stayed even though the system was put in FIPS mode, which probably only allows the NIST-approved ECC curves, and related this story: how everyone else has moved over to Ed25519 and the government is the only one left using its broken algorithm.
One of the IC background guys (who is a very nice person, nothing against them) basically said, yeah, the NSA used to do all sorts of stuff that was a bad idea, mentioning the Clipper chip, etc. What blew my mind is that they seemed to have totally reasonable beliefs about government surveillance and powers, but then when it comes to someone like Snowden, they think he's a traitor and should have used the internal channels instead of leaking. I just don't understand how they think the same people who run NSA would have cared one bit, or didn't know about it already. I guess I always assumed the people who worked in the IC would just think all this stuff was OK to begin with.
I don't know what the takeaway is from that, it just seems like a huge cognitive dissonance.
I think the term "doublethink" was invented specifically for government functionaries like the IC guy you describe.
Being consistently and perfectly dogmatic requires holding two contradictory beliefs in your head at once. It's a skill.
3 replies →
While I am skeptical of US domestic surveillance, Snowden leaked this information in the worst possible way.
Try internal whistleblower channels first. Not being heard? Write to members of Congress? Contact the media?
Instead he fled to an adversary with classified material. That's not good faith behavior imo. Traitor
6 replies →
Many government or government affiliated organizations are required to comply with NIST approved algorithms by regulation or for interoperability. If NIST cannot be trusted as a reputable source it leaves those organizations in limbo. They are not equipped to roll their own crypto and even if they did, it would be a disaster.
"Other people have no choice but to trust NIST" is not a good argument for trusting NIST. Somehow I don't imagine the NSA is concerned about -- and is probably actively in favor of -- those organizations having backdoors.
3 replies →
"Roll your own crypto" typically refers to making your own algorithm or implementation of an algorithm not choosing the algorithm.
9 replies →
Another upvote from someone with many friends and colleagues in NIST. I hope transparency prevails and NISTers side with that urge as well (I suspect many do).
They could and should leak more documents if they have evidence of malfeasance.
There are both legally safe avenues, via the IG process, and legally risky ones, via the many journalists who are willing to work for major change. Sadly, legal doesn't mean safe in modern America, and some whistleblowers have suffered massive retribution even when they play by "the rules" laid out in public law.
As Ellsberg said: Courage is contagious!
The history in this blog post is excellently researched on the topic of NSA and NIST cryptographic sabotage. It presents some hard won truths that many are uncomfortable to discuss, let alone to actively resist.
The author of the blog post is also well known for designing and releasing many cryptographic systems as free software. There is a good chance that your TLS connections are secured by some of these designs.
One of his previous lawsuits was critical to practically protecting free speech during the First Crypto War: https://en.m.wikipedia.org/wiki/Bernstein_v._United_States
I hope he wins.
Given his track record, and the actual meat of this suit, I think he has a good chance.
- He is an expert in the domain
- He made a lawful request
- He believes he's experiencing an obstruction of his rights
I don't see anything egregious here. Being critical of your government is a protected right in the USA. Everyone gets a moment to state their case if they'd like to make an accusation.
Suing sounds offensive, but that is the official process for submitting an issue in a form a government can understand and address. I'm seeing some comments here that seem aghast at the audacity of accusing the government at your own peril, and that shows an ignorance of history.
I'd add
* and it's been 20 yrs since the 9/11 attacks, which precipitated a lot of the more recent dragnets
21 replies →
>Being critical of your government is a protected right for USA. Everyone gets a moment to state their case if they'd like to make an accusation.
Unless a kangaroo “FISA court” says you can’t - in which case you’re screwed, and can’t even tell anyone about the “sentence” if it included a gag order. Still better than getting droned I suppose.
Trump Card: National Security
3 replies →
The author was also part of the Linux kernel Speck cipher talks, which broke down in 2018 due to the NSA's stonewalling and hand-waving when asked for technical data and explanations.
NSA's Speck was never adopted.
https://en.m.wikipedia.org/wiki/Speck_(cipher)
Interesting read!
I remember reading about this in Steven Levy's Crypto and elsewhere; there was a lot of internal arguing about this stuff at the time, and people had different opinions. I remember that some of the changes NSA suggested to IBM were actually stronger against a cryptanalytic attack on DES that was not yet publicly known (though at the time people suspected they were suggesting this because it was weaker; the attack only became publicly known later). I tried to find the specific info about this, but can't remember the details well enough. Edit: I think it was this: https://en.wikipedia.org/wiki/Differential_cryptanalysis
They also did intentionally weaken a standard separately from that and all the arguing about 'munitions export' intentionally requiring weak keys etc. - all the 90s cryptowar stuff that mostly ended after the clipper chip failure. They also worked with IBM on DES, but some people internally at NSA were upset that they shared this after the fact. The history is a lot more mixed with a lot of people arguing about what the right thing to do is and no general consensus on a lot of this stuff.
You are not accurately reflecting the history that is presented in the very blog post we are discussing.
NSA made DES weaker for everyone by reducing the key size. IBM happily went along. The history of IBM is dark. The NSA-credited tweaks to DES can be understood as ensuring that a weakened DES stayed deployed longer, which was to their advantage. They clearly explain this in the history quoted by the author:
“Narrowing the encryption problem to a single, influential algorithm might drive out competitors, and that would reduce the field that NSA had to be concerned about. Could a public encryption standard be made secure enough to protect against everything but a massive brute force attack, but weak enough to still permit an attack of some nature using very sophisticated (and expensive) techniques?”
They’re not internally conflicted. They’re strategic saboteurs.
9 replies →
> I remember that some of the suggested changes from NSA shared with IBM were actually stronger against a cryptanalysis attack on DES that was not yet publicly known
So we have that and other examples of NSA apparently strengthening crypto, then we have the dual-EC debacle and some of the info in the Snowden leaks showing that they've tried to weaken it.
I feel like any talk about NSA influence on NIST PQ or other current algorithm development is just speculation unless someone can turn up actual evidence one way or another. I can think of reasons the NSA would try to strengthen it and reasons they might try to weaken it, and they've done both in the past. You can drive yourself nuts constructing infinitely recursive what-if theories.
9 replies →
Right, came here to make the same point. The first lawsuit alluded to in the blog post title resulted in an important holding that source code can be protected free expression.
Why is the submission URL using http instead of https? That just seems... bizarre.
https://blog.cr.yp.to/20220805-nsa.html works too.
Cryptography experts know when to care about security. Cryptography enthusiasts try to slap encryption on everything.
Why? HTTP is simpler, less fragile, and not dependent on the goodwill of third parties; the content is public, and proving the authenticity of text on the Internet is always hard, even when it's served via the https scheme. I bet Bernstein thinks there is little point in forcing people to use https to read his page.
That's just wrong on so many levels. Troy Hunt has an excellent explanation: https://www.troyhunt.com/heres-why-your-static-website-needs...
3 replies →
MITM could change what the client receives, right?
1 reply →
Just FYI, on my Firefox it's saying "Connection Secure (upgraded to https)"; it's actually using ECDHE CHACHA20 SHA256.
Note: I have "Enable HTTPS-Only Mode in all windows" on by default.
We ban accounts that post like this, so please don't.
https://news.ycombinator.com/newsguidelines.html
This isn't the sort of shit you can start here, take a look at
https://news.ycombinator.com/newsguidelines.html
If you think I went around looking to dig up dirt, I didn't. I just searched djb's name on Twitter to find more discussions about the subject, as post-quantum cryptography is an area I'm curious about.
Regarding asking for a disclosure, I thought that was widely accepted around here. If the CEO of some company criticised a competitor's product, we would generally expect them to disclose that fact upfront. I thought that was appropriate here given the dismissive tone of GP.
4 replies →
We detached this subthread from https://news.ycombinator.com/item?id=32363982.
Not sure about the disclosure, having a grudge with djb is not particularly a minority thing.
Whatever "grudge" I have with Bernstein is, to say the least, grudging.
You could not have less of an idea of what you're talking about here.
Seems like a baaad idea lol.
He won a case against the government representing himself, so I think he would be on good footing. He is a professor at the school I graduated from, and even the faculty told me he was interesting to deal with. Post-quantum crypto is his main focus right now, and he also published Curve25519.
He was represented by the EFF during the first, successful case. They declined to represent him in the second case, which ended in a stalemate.
6 replies →
Yeah, terrible idea, except this is Daniel Bernstein, who already had an equally terrible idea years ago, and won. That victory was hugely important, it pretty much enabled much of what we use today (to be developed, exported, used without restrictions, etc etc etc)
Seems like they just need a judge to force NIST to comply with a Freedom of Information Act request; it's just part of the process.
I'm stonewalled on an equivalent Public Records Act request with a state, and am kind of annoyed that I have to use the state's own court system.
It doesn't feel super impartial, and a couple of law journals have written about how it's not impartial at all in this state and should be improved by the legislature.
This is part of a class division where we cannot practically exercise our rights which are clearly enumerated in public law. Only people with money or connections can even attempt to get many kinds of records.
It’s wrong and government employees involved should be fired, and perhaps seriously punished. If people at NIST had faced real public scrutiny and sanction for their last round of sabotage, perhaps we wouldn’t see delay and dismissal by NIST.
Delaying responses to these requests is yet another kind of sabotage of the public NIST standardization processes. Delay in standardization is delay in deployment. Delay means mass-surveillance adversaries have more ciphertext that they can attack with a quantum computer. This isn't a coincidence, though I am sure the coincidence theorists will come out in full force.
NIST should be responsive in a timely manner and they should be trustworthy, we rely on their standards for all kinds of mandatory data processing. It’s pathetic that Americans don’t have several IG investigations in parallel covering NIST and NSA behavior. Rather we have to rely on a professor to file lawsuits for the public (and cryptographers involved in the standardization process) to have even a glimpse of what is happening. Unbelievable but good that someone is doing it. He deserves our support.
5 replies →
Please include links with https://
NSA employees downvoted this?
Seriously! Tons of people ranting about crypto visiting a non-TLS website!
Maybe this is too much tinfoil hattery, but are we sure DJB isn't a government asset? He'd be the perfect deep-cover agent.
Please don’t do the JTRIG thing. Dan is a national treasure and we would be lucky to have more people like him fighting for all of us.
Between the two, material evidence shows that NIST is the deep-cover agent sabotaging our cryptography.
Though 99% of the time I would agree with you, the public has to have faith in people who claim to be fighting (with previously noted successes in Bernstein v. US) in our best interests.
Perhaps the best way to build trust in a cryptographic algorithm is to have it devised by a certifiably neutral, general-purpose mathematical neural net.
It could even generate an algorithm so complicated it would be close to impossible for a human mind to comprehend its depth.
> It could even generate an algorithm so complicated it would be close to impossible for a human mind to comprehend the depth of it.
Okay... then some nefarious actor's above-human-intelligence neural network instantly decodes the algorithm deemed too complicated for human understanding?
I don't see how opaque neural nets are suddenly going to make security-through-obscurity work.
"Certifiably neutral"
So, by a process that hasn't been designed yet. Especially when one considers how opaque most neural nets are to human scrutiny.
I mean, if the source, training data, and query interface are public, it would be insanely difficult to hide a backdoor
There, I "designed" your impossible criterion in just a few obvious steps you could have inferred.
2 replies →
So, a question then: isn't one of the differences between this selection and previous ones that some of the algorithms are open source, with their code available?
For example, Kyber, one of the finalists, is here: https://github.com/pq-crystals/kyber
And where it's not open source, I believe everyone included reference implementations in the first-round submissions.
Does the code being available make it easy to verify whether there are some shady/shenanigans going on, even without NIST's cooperation?
Not really, for the same reason that "here's your GitHub login" doesn't equate to you suddenly being able to be effective at a new company. You might be able to look things up in the code and understand how things are being done, but you don't know -why- things are being done that way.
A lot of the instances in the post even show the NSA giving a why. It's not a particularly convincing why, but it was enough to sow doubt. The reason to make all discussions public is so that there isn't an after-the-fact "wait, why is that obviously odd choice being made?" but instead a before-the-fact "I think we should make a change". The burden of evidence is different for those. An "I think we should reduce the key length for performance" is a much harder sell when the spec already prescribes a longer key length, than an after-the-fact "the spec's key length seems too short" / "Nah, it's good enough, and we need it that way for performance". The status quo always has inertia.
Thanks for the response, that makes sense. I've also tried following the PQC Google Groups, but a lot of the language is beyond my grasp.
Also... I don't understand why I've been downvoted for asking a question. I'm trying to learn, but HN can certainly be unwelcoming to the 'curious' (which is why I thought we were here).
What? :D
Who cares about a particular piece of source code? Cryptanalysis is about the mathematical structure of the ciphers. When we say the NSA backdoored an algorithm, we don't mean that they included hidden printf statements in "the source code". It means that mathematicians at the NSA have knowledge of weaknesses in the construction, that are not known publicly.
Well, that was why I asked the question. I didn't think asking a question deserved downvotes and ridicule.
Worth noting DJB (the article author) was on two competing (losing) teams to Kyber[0] in Round 3. And has an open submission in round 4 (still in progress). That's going to slightly complicate any FOIA until after the fact, or it should. Not that there's no merit in the request.
[0]: https://csrc.nist.gov/Projects/post-quantum-cryptography/pos...
> the Supreme Court has observed that a FOIA requester's identity generally "has no bearing on the merits of his or her FOIA request."
https://www.justice.gov/archives/oip/foia-guide-2004-edition...
It is wrong to imply he is unreasonable here. NIST has been dismissive and unprofessional towards him and others in this process. They look terrible because they’re not doing their jobs.
Several of his students' proposals won the most recent round. He still has work in the next round. NIST should have answered in a timely manner.
On what basis do you think any of these matters can or may complicate the FOIA process?
This definitely has the sting of bitterness in it. I doubt djb would have filed this suit if NTRU Prime had won the NIST PQC contest. It's hard to evaluate this objectively when there are strong emotions involved.
When it comes to the number of times DJB is right versus the number of times DJB is wrong, I'll fully back DJB. Simply put, NSA/NIST cannot and should not be trusted in this case.
You misread. I'm saying his reasons for filing are in question. NIST probably was being dishonest. That's not the reason there is a lawsuit though.
1 reply →
It's funny how often the bitterness of a post is used as an excuse to dismiss the long and well documented case being made.
If NTRU Prime had been declared the winner, would this suit have been filed? It's the same contest, same people, same suspicious behavior from NIST. I don't think this suit would have come up. djb is filing this suit because of alleged bad behavior, but I have doubts that it's the real reason.
1 reply →
Perhaps the old advice ("never roll your own crypto") should be reevaluated? If you're creative enough, you could combine and apply existing algorithms in ways that would make the result very difficult to decrypt. Think 500 programmatic combinations (steps) of encryption applying different algorithms. Content encrypted this way would require knowledge of the encryption sequence in order to execute the required steps in reverse. No amount of brute force could help here…
> Would require knowledge of the encryption sequence...
This is security by obscurity. Reputable encryption schemes work under the assumption (Kerckhoffs's principle) that the attacker has full knowledge of the encryption/decryption process.
You could, however, argue that the sequence then becomes part of the key. However, this key [i.e. the sequence of encryptions] would then be at most as strong as the strongest encryption in the sequence, which kind of defeats the purpose.
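(A toy illustration of that point, with XOR standing in for the stacked "algorithms"; hypothetical example: 500 chained layers compose into a single equivalent layer, so the long secret sequence buys nothing.)

    import secrets
    from functools import reduce

    def xor(a, b):
        return bytes(x ^ y for x, y in zip(a, b))

    msg = b"hello"
    layers = [secrets.token_bytes(len(msg)) for _ in range(500)]

    ct = reduce(xor, layers, msg)    # "encrypt" through all 500 layers
    one_key = reduce(xor, layers)    # ...but the layers collapse to one key
    assert ct == xor(msg, one_key)

Real ciphers don't collapse this neatly, but the lesson generalizes: composing primitives without an argument for why the composition is stronger usually isn't stronger.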
No, an important property of a secure cryptographic cipher is that it should be as close to a random permutation of the input as possible.
A "randomly assembled" cipher that just chains together different primitives without much thought is very unlikely to have that property, which means it will probably have "interesting" statistical properties that can be observed given enough plaintext/ciphertext pairs, and those can then be exploited to break it.
No, not at all; that advice is still good. It's even more important if you are talking about modifying algorithms. You're going to want proofs of resistance or immunity to certain classes of attacks. A subtle change can easily make a strong primitive useless.