Comment by AnimalMuppet
6 hours ago
Repeating a previous comment of mine (https://news.ycombinator.com/item?id=32784959) about an article in Byte Magazine (August 1983) on the C programming language:
From page 52:
> Operating systems have to deal with some very unusual objects and events: interrupts; memory maps; apparent locations in memory that really represent devices; hardware traps and faults; and I/O controllers. It is unlikely that even a low-level model can adequately support all of these notions or new ones that come along in the future. So a key idea in C is that the language model be flexible, with escape hatches to allow the programmer to do the right thing, even if the language designer didn't think of it first.
This. This is the difference between C and Pascal. This is why C won and Pascal lost - because Pascal prohibited everything but what Wirth thought should be allowed, and Wirth had far too limited a vision of what people might need to do. Ritchie, in contrast, knew he wasn't smart enough to play that game, so he didn't try. As a result, in practice C was considerably more usable than Pascal. The closer you were to the metal, the greater C's advantage. And in those days, you were often pretty close to the metal...
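To make that concrete, here's roughly what the "escape hatch" looks like in C: you just declare that an address is memory and store to it. The address and register name below are invented for illustration; standard Pascal had no sanctioned way to express this at all.

    /* Sketch of the escape-hatch idea: poke a hardware register by treating
       a raw address as memory. The address 0x4000C000 and the name UART_DATA
       are made up for illustration, not taken from the article. */
    #include <stdint.h>

    #define UART_DATA ((volatile uint8_t *)0x4000C000u)  /* hypothetical device register */

    void putc_raw(uint8_t c)
    {
        *UART_DATA = c;  /* a plain store; the compiler can't know this "memory" is a device */
    }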
Later, on page 60:
> Much of the C model relies on the programmer always being right, so the task of the language is to make it easy to say what is necessary... The converse model, which is the basis of Pascal and Ada, is that the programmer is often wrong, so the language should make it hard to say anything incorrect... Finally, the large amount of freedom provided in the language means that you can make truly spectacular errors, far exceeding the relatively trivial difficulties you encounter misusing, say, BASIC.
Also true. And the "Pascal model" of the programmer does have quite a bit of truth to it. But programmers collectively chose freedom over restrictions, even restrictions that were intended for their own good.
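For anyone who hasn't been bitten: the "spectacular error" class is stuff like this, which compiles without complaint under most compilers' default settings. The names are mine, not from the article.

    #include <stdio.h>

    int main(void)
    {
        int limit = 42;
        int table[4];

        for (int i = 0; i <= 4; i++)    /* off-by-one: the last pass writes table[4] */
            table[i] = 0;

        printf("limit = %d\n", limit);  /* undefined behavior: may print 0, 42, or crash */
        return 0;
    }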
The irony is that all wannabe C and C++ replacements are exactly the "Pascal model" brought back into the 21st century, go figure.
"A consequence of this principle is that every occurrence of every subscript of every subscripted variable was on every occasion checked at run time against both the upper and the lower declared bounds of the array. Many years later we asked our customers whether they wished us to provide an option to switch off these checks in the interests of efficiency on production runs. Unanimously, they urged us not to they already knew how frequently subscript errors occur on production runs where failure to detect them could be disastrous. I note with fear and horror that even in 1980 language designers and users have not learned this lesson. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law."
-- C.A.R. Hoare's "The 1980 ACM Turing Award Lecture"
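The check Hoare is describing is cheap enough to write by hand in C; Algol and Pascal compilers emitted the equivalent of this automatically on every subscript. This is my own sketch for comparison, not Hoare's code, and the names are invented.

    #include <stdio.h>
    #include <stdlib.h>

    #define TABLE_LEN 4

    static int table[TABLE_LEN];

    /* Return a pointer to a[i], aborting with a diagnostic if i is out of range.
       The lower bound is 0 because C arrays always start at 0. */
    static int *checked_at(int *a, size_t len, size_t i)
    {
        if (i >= len) {
            fprintf(stderr, "subscript %zu out of range [0, %zu)\n", i, len);
            abort();  /* fail loudly instead of silently corrupting memory */
        }
        return &a[i];
    }

    int main(void)
    {
        *checked_at(table, TABLE_LEN, 3) = 7;  /* fine */
        *checked_at(table, TABLE_LEN, 4) = 7;  /* aborts with a diagnostic */
        return 0;
    }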