Comment by tdeck

4 years ago

This just makes me think that null-terminated strings are the bad gift that keeps on giving. If we were to design an OS, language, or standard library in 2021 (or even 1999) we probably wouldn't use them, but we're stuck with this relic of a former era.

The thing is, they are even worse for performance than string implementations that store the length: those few extra bytes of memory are much cheaper than having to scan for the terminator everywhere. Copying a string of known length is the obvious example.
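To make the copying point concrete, here's a minimal sketch in C. The `lstr` type and function names are made up for illustration, not from any real library; the point is that a length-carrying string copies with one `memcpy`, while a null-terminated one needs an extra pass just to find its own end.

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

/* A hypothetical length-carrying string; the names are illustrative. */
typedef struct {
    size_t len;
    char  *data;
} lstr;

/* Copy is a single memcpy -- no O(n) scan for a terminator first. */
lstr lstr_copy(lstr s) {
    lstr out = { s.len, malloc(s.len) };
    memcpy(out.data, s.data, s.len);
    return out;
}

/* The null-terminated equivalent must walk the bytes once just to
   learn the length (strlen) before it can allocate and copy. */
char *cstr_copy(const char *s) {
    size_t n = strlen(s) + 1;   /* extra pass over the data */
    char *out = malloc(n);
    memcpy(out, s, n);
    return out;
}
```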

Also, C++'s strings even do some clever hacking where, for short strings, they store the text inline in the object itself (overlapping the pointer fields), sparing a pointer dereference. And this is only possible because of the abstraction.
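The small-string optimization being described can be sketched in plain C too. This is a toy model, not the actual libstdc++/libc++ layout (real implementations differ in how they tag the two cases); all names and the 15-byte capacity are made up for illustration.

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

#define SSO_CAP 15  /* illustrative inline capacity */

/* Short strings live inside the object; only long ones hit the heap. */
typedef struct {
    size_t len;
    union {
        char  inline_buf[SSO_CAP + 1]; /* short: text stored here      */
        char *heap;                    /* long: pointer to heap buffer */
    } u;
} sso_str;

sso_str sso_make(const char *s) {
    sso_str r;
    r.len = strlen(s);
    if (r.len <= SSO_CAP) {
        memcpy(r.u.inline_buf, s, r.len + 1); /* fits inline, no malloc */
    } else {
        r.u.heap = malloc(r.len + 1);
        memcpy(r.u.heap, s, r.len + 1);
    }
    return r;
}

/* Reading short strings never chases a pointer into the heap. */
const char *sso_data(const sso_str *s) {
    return s->len <= SSO_CAP ? s->u.inline_buf : s->u.heap;
}
```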

  • They were designed when an extra byte or so per string cost you a lot of money. Nowadays, when 99% of the systems anyone will program start at 1MB RAM and 90% probably start at 512MB, they're a liability for almost no benefit.

    • You’ve got an extra byte either way: the \0 at the end. And in many cases it forces you to copy a string, because you can’t just “point” into a string literal and say “take n chars from there”. Of course I am not that old, so I don’t have enough expertise here; but seeing that nearly every other language, even at the time, decided against null termination is pretty telling.

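The "point into a string literal" remark above can be shown in a few lines. With (pointer, length) strings a substring is just a view into the original bytes, with no copy and no terminator; the `strview` and `slice` names here are illustrative, not from any real library.

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* A non-owning view: a pointer plus a length. */
typedef struct {
    const char *data;
    size_t      len;
} strview;

/* "Take n chars from there": nothing is allocated or copied,
   which a '\0'-terminated substring cannot do without writing
   a terminator into (or copying out of) the original buffer. */
strview slice(strview s, size_t off, size_t n) {
    strview v = { s.data + off, n };
    return v;
}
```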
Ok, let’s assume that a 10 MB JSON source was loaded into a non-null-terminated opaque struct str_t {size_t; pchar;}. You have to parse a number at a position `i` and you have (double parse_number(str_t)). What’s the next obvious step?
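One common answer to this challenge, sketched below under assumptions: the member names `len` and `data` for `str_t`, the helper name `parse_number_at`, and the 64-byte buffer size are all made up for illustration. Since `strtod()` insists on a '\0'-terminated input and a JSON number has bounded textual length, you can copy the candidate bytes into a small stack buffer, terminate them there, and parse that. (The alternative is to hand-roll the number parsing, which is what many JSON libraries do.)

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

/* Illustrative layout for the opaque struct from the comment above. */
typedef struct {
    size_t      len;
    const char *data;
} str_t;

/* Parse a double starting at byte i of a non-terminated buffer by
   bouncing through a small, '\0'-terminated stack buffer. */
double parse_number_at(str_t s, size_t i) {
    char buf[64];                      /* plenty for any sane number */
    size_t n = s.len - i;
    if (n > sizeof buf - 1)
        n = sizeof buf - 1;
    memcpy(buf, s.data + i, n);
    buf[n] = '\0';                     /* strtod can stop safely now */
    return strtod(buf, NULL);          /* stops at the first non-number char */
}
```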