
Comment by krisoft

5 years ago

> My suggestion is just learn how to write secure code in C.

That is a good suggestion to an individual developer. What is your suggestion to a lead developer of a big organisation? Let’s say to the CTO of Apple.

You can see that at that level of abstraction, “make sure every one of your developers knows how to write secure code in C and never slips up” manifestly doesn't work.

You can fault individuals for bugs up to a certain point, but if we want to make secure systems we have to change how we make them, so that the whole process is resistant to oopsies.

I don't think this is a case of "both sides have a point" when it comes to blaming individuals for vulnerabilities or suggesting they learn to write secure C. We're way past these things.

> What is your suggestion to a lead developer of a big organisation? Let’s say to the CTO of Apple.

He can pay for my advice if he really wants to hear it, but this isn't really about programming at this point because we're talking about business goals which can have all sorts of priorities besides making quality software.

Do we want software quality? Then we want better programmers.

> if we want to make secure systems we have to change how we are making them.

On this we agree, but thicker training wheels just get more people on bikes; they don't make the roads any safer.

  • Better programmers on C can't effectively eliminate whole classes of the most common fatal security vulnerabilities.

    Of course, after we do eliminate this low-hanging fruit, we will be left with a pie of the remaining classes of vulnerabilities that looks different; it's like Amdahl's law. But that's no excuse to skip past "step 1".
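To make the "whole classes" point concrete, here is a minimal hypothetical sketch (the function names and buffer size are illustrative, not from any real codebase) of the unbounded-copy bug that memory-safe languages rule out by construction, next to the bounded version a disciplined C programmer must remember at every single call site:

```c
#include <stdio.h>
#include <string.h>

#define BUF_LEN 8  /* illustrative fixed-size destination buffer */

/* Vulnerable pattern (left as a comment, not compiled):
 *
 *   char buf[BUF_LEN];
 *   strcpy(buf, input);   // overflows whenever input is >= BUF_LEN bytes
 */

/* The disciplined version: bound the copy and guarantee NUL termination.
 * snprintf writes at most dst_len bytes into dst, including the NUL. */
static void bounded_copy(char *dst, size_t dst_len, const char *src)
{
    snprintf(dst, dst_len, "%s", src);
}

/* Helper so the truncation behaviour is easy to observe. */
static size_t copy_len(const char *src)
{
    char buf[BUF_LEN];
    bounded_copy(buf, sizeof buf, src);
    return strlen(buf);
}
```

An oversized input like "0123456789ABCDEF" is truncated to BUF_LEN - 1 characters instead of overrunning the stack; forgetting that bound even once, anywhere, is exactly the whole-class failure mode at issue.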

    • > Better programmers on C can't effectively eliminate whole classes of the most common fatal security vulnerabilities.

      Sure they can, it just requires discipline. Most of djb's code (in C) has a lower defect count than most other implementations you'll find in any language, and the mistakes he does make come from relaxing his discipline when he thinks it doesn't matter: because of privilege isolation (something he later admitted was a mistake[1]), or because nobody would put that much memory in a machine (times change!).

      [1]: https://cr.yp.to/qmail/qmailsec-20071101.pdf

      > But that's no excuse to skip past "step 1".

      Zeno would like a word. I'm arguing a different metaphor, not "try harder".

      If it is true that programs get too big to maintain the level of discipline the language requires, and that regardless of language you will be confronted with defects, then the solution (in my mind) is smaller programs, because only a small program has a chance of being correct in the first place.
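The "nobody puts that much memory in a machine" class of slip mentioned above is typically an unchecked size computation; a minimal sketch, assuming a hypothetical `checked_alloc` wrapper, of what the discipline looks like when written out:

```c
#include <stdint.h>
#include <stdlib.h>

/* Hypothetical sketch: count * size silently wraps around on overflow,
 * so an allocation wrapper has to refuse oversized requests explicitly
 * rather than assume "nobody puts that much memory in a machine". */
static void *checked_alloc(size_t count, size_t size)
{
    if (size != 0 && count > SIZE_MAX / size)
        return NULL;  /* count * size would wrap; refuse to allocate */
    return malloc(count * size);
}
```

The check costs one division, but forgetting it turns a size assumption that was true when the code was written into an exploitable allocation once machines grow.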

I agree with your main point, just a nitpick about "at that level of abstraction the “make sure every one of your developers knows how to write secure code in C and they never slip up” manifestly doesn't work": it kinda worked with Windows post-XP SP2; the number of security holes fell pretty dramatically in subsequent releases.

When a company puts security first, they can get results. Unfortunately, security doesn't really sell software like features do, so a true hardened-by-default mindset is impossible in practice. Hence, we need better tools and processes to build features, as you say.

Hire more security auditors?

Apple employees would have access to these symbols and much more debugging info than the OP, who "got lucky due to an accidental share", so the process would be a lot easier for them.

EDIT: also, required reading on how vulnerabilities are discovered and the important types of bugs is not unreasonable, nor is it a long read.

> every one of your developers

You don't need "every one of your developers" to write C, either.