Comment by Animats
4 years ago
"the fact that iMessage will gleefully parse all sorts of complex data received from random strangers, and will do that parsing using crappy libraries written in memory unsafe languages."
C. 30 years of buffer overflows.
Yep, making companies liable for damages would incentivize them to stop relying on C for a lot of things. Apple knows full well that iMessage has security-relevant bugs it just hasn't found yet; hence its attempts to shield it and mitigate those issues with layers of security. The appropriate action, however, would be to reimplement it in something less likely to get exploited. That's expensive, and liability would justify the cost. Companies like Google, MS, Apple, etc. rely on large amounts of legacy C code, and quite a few of them are repeat offenders when it comes to security vulnerabilities being exploited in the wild.
Basically, my reasoning here is that Apple knows it is exposing users to hacks because of quality issues with this and other components. That it tries to fix bugs as fast as it finds them is nice, but not good enough: people still get hacked. When the damage is mostly PR, that's manageable (to a point). But when users sue and start claiming damages, it becomes a different matter: that gets costly and annoying real quick.
Recently we have seen several companies, Apple among them, embrace Rust for OS development. Apple and Google have also introduced languages of their own, Swift and Go, that are likewise far less prone to buffer overflows. Switching languages won't solve every problem, but it should make buffer overflows largely a thing of the past. So we should encourage them to speed that process up.
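To make the bug class concrete, here's a hypothetical sketch (the wire format, names, and buffer sizes are all made up; this is not actual iMessage code): a parser that trusts a length field from untrusted input, next to the manually checked version that C forces you to write by hand at every parse site.

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Hypothetical wire format, invented for illustration:
       [1-byte payload length][payload bytes...] */

    /* The memory-unsafe pattern: the attacker controls len, so the
       copy is bounded by the attacker's value, not by the size of
       the destination buffer. Any len > 15 smashes the stack. */
    void parse_unsafe(const uint8_t *msg, size_t msg_len) {
        char name[16];
        uint8_t len = msg[0];
        memcpy(name, msg + 1, len);   /* overflow when len > 15 */
        name[len] = '\0';
        printf("hello, %s\n", name);
        (void)msg_len;                /* tellingly unused */
    }

    /* The checked version, which has to be written correctly at
       every single parse site -- exactly the step that keeps
       getting forgotten. */
    int parse_checked(const uint8_t *msg, size_t msg_len) {
        char name[16];
        if (msg_len < 1) return -1;
        uint8_t len = msg[0];
        if (len >= sizeof name) return -1;         /* fits in name? */
        if ((size_t)len + 1 > msg_len) return -1;  /* fits in msg? */
        memcpy(name, msg + 1, len);
        name[len] = '\0';
        printf("hello, %s\n", name);
        return 0;
    }

    int main(void) {
        const uint8_t msg[] = { 5, 'w', 'o', 'r', 'l', 'd' };
        return parse_checked(msg, sizeof msg);
    }

In Rust or Swift the equivalent copy or index is bounds-checked for you, so an out-of-range length becomes a deterministic panic or trap instead of silent memory corruption. That's the entire argument for moving parsers of untrusted input off of C.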
50 years!
49
C is a programming language born at AT&T's Bell Laboratories in the USA in 1972. It was created by Dennis Ritchie.
Buffer overflows are older than C.
One of the reasons for the decline of the British computer industry was that Tony Hoare, at one of the big companies (Elliott Brothers, later part of ICL), implemented Fortran by compiling it to Algol, and compiled the Algol with bounds checks. This would have been around 1965, according to his Turing Award lecture. They failed to win customers away from the IBM 7090 (according to https://www.infoq.com/presentations/Null-References-The-Bill...) because the customers' Fortran programs were all full of buffer overflows ("subscript errors", in Hoare's terminology), and so the pesky Algol runtime system was causing them to abort!
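To make "subscript error" concrete, here is a hypothetical sketch (names and sizes made up) of the trade-off Hoare's customers were rejecting: the same bad index is silent undefined behavior without the check, and a loud abort with it.

    #include <stdio.h>
    #include <stdlib.h>

    #define N 10
    static double table[N];

    /* 7090-Fortran style: no check. An out-of-range subscript quietly
       reads or clobbers whatever happens to sit next to `table`;
       in C this is undefined behavior. */
    double get_unchecked(int i) {
        return table[i];
    }

    /* Elliott-Algol style: every subscript is validated, and a bad
       one aborts the run instead of corrupting memory. This is the
       "pesky" behavior that exposed the customers' existing bugs. */
    double get_checked(int i) {
        if (i < 0 || i >= N) {
            fprintf(stderr, "subscript error: %d not in [0, %d)\n", i, N);
            abort();
        }
        return table[i];
    }

    int main(void) {
        printf("%g\n", get_checked(3));    /* fine */
        printf("%g\n", get_checked(12));   /* aborts, loudly */
        return 0;
    }

The Algol runtime wasn't introducing failures; it was surfacing ones the Fortran programs already had.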