Comment by lmm
2 years ago
This is a "Seinfeld is unfunny" situation. People don't remember what development practices were like prior to this essay.
> ESR's argument suggests OpenSSL should not hire security researchers to look for bugs, since all bugs are shallow and people will just quickly find them - the Bazaar approach.
It suggests they should release early and often to allow outside researchers a chance to find bugs, rather than relying on internal contributors to find them. Which seems to have worked in this case.
> Especially in security, it's clear that some bugs are deep. Any project that cares about security actually has to follow a cathedral-like approach to looking for them.
No it isn't. I still haven't seen examples of bugs like that.
> Releasing security critical code early is only making the problem worse, not better.
How?
> Which seems to have worked in this case.
How is missing a critical security issue that endangers all encryption offered by OpenSSL for more than a year (giving attackers access to your private keys via a basic network request) "working in this case"?
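(For anyone who hasn't looked at the bug itself: Heartbleed came down to trusting an attacker-supplied length field when echoing a heartbeat payload. The real flaw was a `memcpy` in OpenSSL's C code; this is just an illustrative Python sketch with made-up names, not the actual implementation.)

```python
# Illustrative sketch of the Heartbleed flaw (NOT the real OpenSSL code).
# The responder echoes back `claimed_len` bytes of payload, trusting the
# attacker-supplied length instead of the payload's actual size.

def heartbeat_response(memory: bytes, payload_start: int,
                       payload_len: int, claimed_len: int) -> bytes:
    """Echo back the heartbeat payload (vulnerable version)."""
    # BUG: no check that claimed_len <= payload_len, so the copy
    # runs past the payload into adjacent process memory.
    return memory[payload_start:payload_start + claimed_len]

# Adjacent memory holds a secret right next to the 4-byte payload "ping".
memory = b"ping" + b"SECRET-PRIVATE-KEY"
leaked = heartbeat_response(memory, 0, payload_len=4, claimed_len=22)
# `leaked` now contains the secret, not just the payload.
```

The fix was exactly the missing comparison: refuse to respond when the claimed length exceeds what was actually received.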
> No it isn't. I still haven't seen examples of bugs like that.
If you don't think finding Heartbleed after a year in OpenSSL was the process working, how about finding Shellshock was hiding in Bash for more than 20 years? Was that still a shallow bug, or is bash a project that just doesn't get enough eyes on it?
> How?
By letting people expose their data for years with a false sense of security. By encouraging projects to think security is someone else's problem, and they don't need to really worry about it.
Rather than releasing support for TLS heartbeats that steal your private keys for a whole year, it would have obviously and indisputably been better if the feature had been delayed until a proper security audit had been performed.
That the bug was eventually found is in no way a merit of the model. People find security bugs in closed source software all the time. The NSA and the Chinese and who knows who else usually find them even earlier, and profit handsomely from it.
> How is missing a critical security issue that endangers all encryption offered by OpenSSL for more than a year (giving attackers access to your private keys via a basic network request) "working in this case"?
The fact that it was found by people outside the project is the system working.
> If you don't think finding Heartbleed after a year in OpenSSL was the process working, how about finding Shellshock was hiding in Bash for more than 20 years? Was that still a shallow bug, or is bash a project that just doesn't get enough eyes on it?
Yes it's a shallow bug. I mean look at it. And look at who found it.
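(To spell out "look at it": pre-patch bash imported exported shell functions by feeding the whole environment-variable value to its parser, which kept executing whatever followed the function body. A rough Python sketch of the parsing mistake, purely illustrative, not bash's actual C source:)

```python
# Illustrative sketch of the Shellshock parsing flaw (NOT bash's real code).
executed = []

def run(command: str) -> None:
    """Stand-in for bash actually executing a shell command."""
    executed.append(command)

def import_env_function(value: str) -> None:
    """Import an exported function from an environment variable's value."""
    if not value.startswith("() {"):
        return  # not an exported function definition, ignore
    body, _, trailing = value.partition("}")
    # ... define the function from `body` (omitted) ...
    # BUG: anything after the closing brace is executed as shell code.
    trailing = trailing.strip().lstrip(";").strip()
    if trailing:
        run(trailing)

# The classic probe string: a function definition with a command smuggled
# in after the closing brace.
import_env_function("() { :;}; echo vulnerable")
```

The point of the sketch is that once you see the probe string, the bug is one visible parsing step; "shallow" here is a claim about how hard it is to understand once found, not about how long it went unnoticed.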
> Rather than releasing support for TLS heartbeats that steal your private keys for a whole year, it would have obviously and indisputably been better if the feature had been delayed until a proper security audit had been performed.
How much auditing do you realistically think a project with a grand total of one (1) full-time contributor would've managed?
If the code hadn't been publicly released we'd still be waiting for the bug to be found today.
> The fact that it was found by people outside the project is the system working.
This happens all the time to Windows, to Intel's hardware architecture, even to remote services that people don't even have the binaries for. There is nothing special about people outside the team finding security bugs in your code. After all, that's also what attackers are.
> Yes it's a shallow bug. I mean look at it. And look at who found it.
If a bug that hid from almost every developer on the planet for 20 years (that's how popular bash is) is still shallow, then I have no idea how you define a non-shallow bug.
> How much auditing do you realistically think a project with a grand total of one (1) full-time contributor would've managed?
That's irrelevant to this discussion. Per the essay, even a company as large as Microsoft would be better off releasing anything they do immediately, instead of "wasting time" on in-house security audits.
> If the code hadn't been publicly released we'd still be waiting for the bug to be found today.
I'm not saying they shouldn't have released the code along with the binary, I'm saying they shouldn't have released anything. It would have been better for everyone if OpenSSL did not support heartbeats at all, for a few more years, rather than it supporting heartbeats that leak everyone's private keys if you just ask them nicely.
This is the point of the Cathedral model: you don't release software at all until you're reasonably sure it's secure. The Bazaar model is that you release software as soon as it even seems to work sometimes, and pass the responsibility for finding that it doesn't on to "the community". And the essay has the audacity to claim that the second model would actually produce better quality.
I remember what development practices were like prior to this essay, which is part of why I feel so strongly that it's overrated. Several other people on this thread were working in the mid-late '90s too.