Comment by leoc

10 hours ago

While it's narrowly true that CPU instruction sets generally don't have a null-pointer concept, I'm not sure how important that is: the null pointer seems to have been (I don't know enough to be sure) a well-established idiom in assembly programming which carried across naturally to BCPL and C. (In much the same way that record types were, apparently, a common assembly idiom long before they became particularly normal to have in HLLs.) Programmers like being able to null out a pointer field, 0 is an obvious "joker" value, and jump-if-0 instructions tend to be convenient and fast. Whether or not you'd want to say it's "how the hardware works", it does seem to have a certain character of inevitability. Even if the Bell Research guys had disapproved of the idiom, they would likely have had difficulty keeping it out of other people's C programs once C became popular. The Hoare ALGOL W thing seems to be more relevant to null pointers in Java and the like.

> Programmers like being able to null out a pointer field, 0 is an obvious "joker" value, and jump-if-0 instructions tend to be convenient and fast.

And there's nothing wrong with that! But you should write it

  union {
    char *ptr;
    uintptr_t scalar;  /* uintptr_t (<stdint.h>) can hold a pointer; size_t isn't guaranteed to */
  } my_nullable_pointer;
  if (my_nullable_pointer.scalar) {  /* the null check is visibly a scalar test */
    printf("%s", my_nullable_pointer.ptr);
  }

not:

  char *my_nullable_pointer;
  if (my_nullable_pointer) {
    printf("%s", my_nullable_pointer);
  }

Yes, this takes up more space, but it also makes the meaning of the code clearer. A typedef in a header can bring this down to four extra lines per pointer type in the entire program. Add a macro, and it's five extra lines in total, plus one line per pointer type. Put this in the standard library, and the programmer only has to type a few extra characters – in exchange for it becoming extremely obvious (to an experienced programmer, or a quick-and-dirty linter) when someone's introduced a null pointer dereference, and when a flawed design makes null pointer dereferences inevitable.
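
A minimal sketch of what that might look like, assuming C99 or later – the NULLABLE macro and the nullable_str name are hypothetical, invented here for illustration, and the whole scheme assumes a platform where the null pointer is all-bits-zero (true on every mainstream platform, though not guaranteed by the standard):

  #include <stdint.h>  /* uintptr_t */
  #include <stdio.h>

  /* Hypothetical convenience macro: one line here, plus one
     line per pointer type anywhere in the program. */
  #define NULLABLE(T, name) typedef union { T ptr; uintptr_t scalar; } name

  NULLABLE(char *, nullable_str);

  int main(void) {
    nullable_str s = { .ptr = "hello" };
    if (s.scalar) {      /* the null check is visibly a scalar test */
      printf("%s\n", s.ptr);
    }
    s.scalar = 0;        /* nulling out the field, explicitly */
    return 0;
  }

The payoff for a linter is that it only has to flag any .ptr read that isn't guarded by a .scalar test – a purely local, syntactic check – instead of doing whole-program null analysis.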

> The Hoare ALGOL W thing seems to be more relevant to null pointers in Java and the like.

I believe you are correct; but I like blaming Tony Hoare for things. He keeps scooping me: I come up with something cool, and then Tony Hoare goes and takes credit for it 50 years in the past. Who does he think he is, Euler?