Comment by tsimionescu

2 years ago

The fact that it was only found 2 years after being introduced (by non-attackers at least), in one of the most used pieces of software in the world, suggests that it wasn't actually shallow by any definition.

I don't think it's relevant that it could have been found by anyone. We know empirically that it just wasn't. It was found by security auditors, which is just about as far as you can be from a random eyeball.

Edit: An even more egregious example is, of course, the ~20-year-old Shellshock family of bugs in bash.
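
For reference, the Heartbleed flaw itself was a single missing bounds check in OpenSSL's TLS heartbeat handler. Here is a minimal sketch of the pattern in C (illustrative only, not OpenSSL's actual code; the struct and names are made up):

    /* Simplified illustration of the Heartbleed pattern -- NOT
       OpenSSL's actual code. The handler trusts the length field
       supplied by the peer and copies that many bytes, even when
       the payload actually received is shorter, so adjacent heap
       memory gets echoed back over the wire. */
    #include <stdlib.h>
    #include <string.h>

    struct heartbeat_req {
        unsigned short claimed_len;   /* length field from the wire */
        const unsigned char *payload; /* bytes actually received */
        size_t actual_len;            /* how many bytes really arrived */
    };

    unsigned char *build_response(const struct heartbeat_req *req)
    {
        unsigned char *resp = malloc(req->claimed_len);
        if (resp == NULL)
            return NULL;
        /* BUG: no check that claimed_len <= actual_len. A peer that
           lies about the length makes memcpy read past the end of
           the received data, leaking whatever happens to sit there. */
        memcpy(resp, req->payload, req->claimed_len);
        return resp;
    }

The actual fix was essentially that one missing check: discard heartbeat requests whose claimed length exceeds the received payload.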

> The fact that it was only found 2 years after being introduced (by non-attackers at least), in one of the most used pieces of software in the world, suggests that it wasn't actually shallow by any definition.

Or that few people were looking.

> I don't think it's relevant that it could have been found by anyone. We know empirically that it just wasn't. It was found by security auditors, which is just about as far as you can be from a random eyeball.

It was found by security people with security skills. But those were not people closely associated with the OpenSSL project; in fact, as far as I can see, they weren't prior contributors or project members at all. That very much supports ESR's argument.

  • > It was found by security people with security skills. But those were not people closely associated with the OpenSSL project; in fact, as far as I can see, they weren't prior contributors or project members at all. That very much supports ESR's argument.

    It doesn't. ESR's argument suggests OpenSSL should not hire security researchers to look for bugs, since all bugs are shallow and people will just quickly find them - the Bazaar approach.

    What Heartbleed has shown is that the OpenSSL project would be of much higher quality if it took a more Cathedral-like approach: actively seeking out security researchers to work on it and making them part of its release process. Because it didn't, it shipped with a critical security vulnerability for more than a year (and there are very possibly many others).

    Especially in security, it's clear that some bugs are deep. Any project that cares about security has to follow a Cathedral-like approach to looking for them. Releasing security-critical code early only makes the problem worse, not better.

    • This is a "Seinfeld is unfunny" situation. People don't remember what development practices were like prior to this essay.

      > ESR's argument suggests OpenSSL should not hire security researchers to look for bugs, since all bugs are shallow and people will just quickly find them - the Bazaar approach.

      It suggests they should release early and often to give outside researchers a chance to find bugs, rather than relying on internal contributors to find them, which seems to have worked in this case.

      > Especially in security, it's clear that some bugs are deep. Any project that cares about security has to follow a Cathedral-like approach to looking for them.

      No, it isn't. I still haven't seen examples of bugs like that.

      > Releasing security-critical code early only makes the problem worse, not better.

      How?

  • If "few people were looking" at OpenSSL, one of the most widely-used pieces of open source software in the entire industry, Eric Raymond's point is refuted.