Comment by lmm
2 years ago
> The fact that it was only found 2 years after being introduced (by non-attackers at least), in one of the most used pieces of software in the world, suggests that it wasn't actually shallow by any definition.
Or that few people were looking.
> I don't think it's relevant that it could have been found by anyone. We know empirically that it just wasn't. It was found by security auditors, which is just about as far as you can be from a random eyeball.
It was found by security people with security skills. But those were not people closely associated with the OpenSSL project; in fact as far as I can see they weren't prior contributors or project members at all. That very much supports ESR's argument.
> It was found by security people with security skills. But those were not people closely associated with the OpenSSL project; in fact as far as I can see they weren't prior contributors or project members at all. That very much supports ESR's argument.
It doesn't. ESR's argument suggests OpenSSL should not hire security researchers to look for bugs, since all bugs are shallow and people will just quickly find them - the Bazaar approach.
What Heartbleed has shown is that the OpenSSL project would be much higher quality if it took a more Cathedral-like approach: actively seeking out security researchers to work on it and making them part of its release process. Because it didn't, it shipped a critical security vulnerability for more than a year (and there are very possibly many others).
Especially in security, it's clear that some bugs are deep. Any project that cares about security actually has to follow a Cathedral-like approach to looking for them. Releasing security-critical code early only makes the problem worse, not better.
This is a "Seinfeld is unfunny" situation. People don't remember what development practices were like prior to this essay.
> ESR's argument suggests OpenSSL should not hire security researchers to look for bugs, since all bugs are shallow and people will just quickly find them - the Bazaar approach.
It suggests they should release early and often to allow outside researchers a chance to find bugs, rather than relying on internal contributors to find them. Which seems to have worked in this case.
> Especially in security, it's clear that some bugs are deep. Any project that cares about security actually has to follow a cathedral-like approach to looking for them.
No it isn't. I still haven't seen examples of bugs like that.
> Releasing security critical code early is only making the problem worse, not better.
How?
> Which seems to have worked in this case.
How is missing a critical security issue that endangers all encryption offered by OpenSSL for more than a year (giving attackers access to your private keys via a basic network request) "working in this case"?
> No it isn't. I still haven't seen examples of bugs like that.
If you don't think finding Heartbleed after a year in OpenSSL counts as the process working, how about Shellshock, which was hiding in Bash for more than 20 years? Was that still a shallow bug, or is Bash a project that just doesn't get enough eyes on it?
> How?
By letting people expose their data for years with a false sense of security. By encouraging projects to think security is someone else's problem and that they don't really need to worry about it.
Rather than shipping TLS heartbeat support that let attackers steal your private keys for a whole year, it would obviously and indisputably have been better to delay the feature until a proper security audit had been performed.
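To be concrete about what an audit would have had to catch, here is a rough sketch of the flaw (not the actual OpenSSL code; the struct and function names are made up for illustration). The whole bug is that the responder trusts the attacker-supplied payload length instead of checking it against how many bytes actually arrived:

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical record layout: 1-byte type, 2-byte claimed payload length,
 * then the payload itself. */
struct heartbeat_record {
    unsigned char *data;   /* raw bytes received from the peer */
    size_t length;         /* number of bytes actually received */
};

unsigned char *build_heartbeat_response(const struct heartbeat_record *rec,
                                        size_t *out_len)
{
    const unsigned char *p = rec->data;
    unsigned int payload_len = (p[1] << 8) | p[2];  /* attacker-controlled */

    /* The missing bounds check would have been roughly:
     *   if (1 + 2 + payload_len > rec->length) return NULL;
     */

    unsigned char *resp = malloc(3 + payload_len);
    if (resp == NULL)
        return NULL;
    resp[0] = 2;        /* heartbeat response type */
    resp[1] = p[1];     /* echo the claimed length back */
    resp[2] = p[2];
    /* Copies payload_len bytes regardless of how many the peer actually
     * sent, so whatever follows the request in memory is leaked back. */
    memcpy(resp + 3, p + 3, payload_len);
    *out_len = 3 + payload_len;
    return resp;
}
```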
That the bug was eventually found is in no way a merit of the model. People find security bugs in closed source software all the time. The NSA and the Chinese and who knows who else usually find them even earlier, and profit handsomely from it.
I remember what development practices were like prior to this essay, which is part of why I feel so strongly that it's overrated. Several other people on this thread were working in the mid-to-late '90s too.
If "few people were looking" at OpenSSL, one of the most widely-used pieces of open source software in the entire industry, Eric Raymond's point is refuted.
That's just one possibility. There are many ways for a development process to go wrong.
The whole thesis is that the open-source userbase forms the army of eyeballs that will surface all bugs; those users are part of the development process. So no, this dodge doesn't work either; it doesn't cohere with what Raymond said.