Comment by stevenjgarner
4 hours ago
Human-readability is the ultimate error correction for the most expensive link in the system: the human-in-the-loop.
The information-theoretic justification is that binary's efficiency assumes a perfectly known codec, but the entropy of time destroys codecs (bit rot/obsolescence). Text sacrifices transmission efficiency for semantic recovery: it remains decodable even when the specific tooling is lost, which makes it the most robust encoding for long-term information survival.
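A minimal sketch of that tradeoff in Python (the struct format string "<if" and the field names are invented for illustration; the format string stands in for the codec that has to survive alongside the data):

```python
import json
import struct

# The same record, encoded two ways.
record = {"id": 7, "temp_c": 21.5}

# Binary: compact, but only decodable if you still know the "codec"
# (here, the struct format string "<if" plus the field order and meaning).
packed = struct.pack("<if", record["id"], record["temp_c"])  # 8 bytes

# Text: larger, but the field names travel with the data, so a human
# (or a future tool) can recover the semantics without the spec.
text = json.dumps(record)  # 25 bytes

# Decoding the binary requires the lost knowledge:
ident, temp = struct.unpack("<if", packed)

# Decoding the text does not:
recovered = json.loads(text)
assert recovered["id"] == ident
```

Lose "<if" and the 8 bytes are opaque; the 25 bytes of JSON still name their own fields.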
Human-readability isn't a feature of ASCII though. It's a feature of any encoding for which the user has sufficient tooling. Sure, that's an easier bar to clear for ASCII than for binary formats in general. But as I said, as long as you have the tooling, binary is no less readable. (Also, many binary formats will store strings as ASCII or UTF-8, so you can use the strings utility or whatever you want against them.)
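(For what it's worth, a bare-bones version of strings is only a few lines; this is a sketch of the idea, not the real tool, which also handles wide encodings, offsets, and minimum-length flags:)

```python
import re
import sys

# Sketch of what the strings utility does: pull runs of printable
# ASCII (default minimum length 4) out of an arbitrary binary blob.
def extract_strings(data: bytes, min_len: int = 4):
    pattern = rb"[\x20-\x7e]{%d,}" % min_len
    return [m.group().decode("ascii") for m in re.finditer(pattern, data)]

if __name__ == "__main__":
    with open(sys.argv[1], "rb") as f:
        for s in extract_strings(f.read()):
            print(s)
```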
> the entropy of time destroys codecs (bit rot/obsolescence)
Okay, so you don't mean "entropy" in an information-theoretic sense. You're just talking about decay over time. That's a much more specific claim than your original one, and I grant that it may be true for some use cases. But you don't need semantic recovery if you don't need to do recovery at all, i.e. if your data format and/or storage medium transparently provide redundancy and/or versioning.
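A toy illustration of that last point in Python (the file layout and names here are invented, not any particular system's; real storage would use RAID, erasure coding, or a content-addressed versioned store):

```python
import hashlib
from pathlib import Path

# Store N identical copies plus a checksum. "Recovery" then never
# involves interpreting the bytes: any copy that still verifies wins.
def store(data: bytes, base: Path, copies: int = 3) -> None:
    base.mkdir(parents=True, exist_ok=True)
    for i in range(copies):
        (base / f"copy{i}.bin").write_bytes(data)
    (base / "checksum").write_text(hashlib.sha256(data).hexdigest())

def recover(base: Path) -> bytes:
    digest = (base / "checksum").read_text()
    for path in sorted(base.glob("copy*.bin")):
        data = path.read_bytes()
        if hashlib.sha256(data).hexdigest() == digest:
            return data  # an intact copy needs no semantic recovery
    raise IOError("all copies corrupted")
```

Whether the payload is text or binary is irrelevant to this kind of recovery; it operates below the level of meaning.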