Comment by stevenjgarner
6 hours ago
> From an information theory perspective, "Always bet on text" is a plea for symbolic efficiency. It argues that while binary or visual formats might have higher bandwidth, they often have lower meaning-per-bit for the complex, abstract logic that runs civilization. Text is the most entropy-resistant, highly-compressible, and universally-decodable format we have ever invented.
This doesn’t track for me. How can text simultaneously have lower bandwidth and higher meaning-per-bit? And how does that jibe with entropy resistance (in an information-theoretic sense)?
Text seems worse to me on these terms. First of all, binary encodings are a superset of text encodings: every text encoding is just one particular binary encoding. Less abstractly, binary enables content-transparent compression and error correction; see the sketch below.
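To make the content-transparency point concrete, here’s a minimal Python sketch (my own illustration, not something from the article): zlib compression plus a CRC32 integrity check from the standard library, applied identically to a UTF-8 text payload and to arbitrary bytes. CRC32 only detects errors rather than correcting them, but the point stands either way: the binary layer never needs to know what the payload means.

    import zlib

    def pack(payload: bytes) -> bytes:
        # Compress, then append a 4-byte CRC32 of the original payload.
        crc = zlib.crc32(payload).to_bytes(4, "big")
        return zlib.compress(payload) + crc

    def unpack(blob: bytes) -> bytes:
        # Split off the CRC, decompress, and verify integrity.
        compressed, crc = blob[:-4], blob[-4:]
        payload = zlib.decompress(compressed)
        if zlib.crc32(payload).to_bytes(4, "big") != crc:
            raise ValueError("CRC mismatch: payload corrupted")
        return payload

    # The same pipeline handles "text" and "not text" without caring which is which.
    text_payload = "Always bet on text".encode("utf-8")  # text is just one kind of bytes
    binary_payload = bytes(range(256))                   # arbitrary binary data

    for p in (text_payload, binary_payload):
        assert unpack(pack(p)) == p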
As other commenters have pointed out, the downside of binary is the need for sufficient tooling. Depending on the domain, that can indeed be a real cost. But where that critique doesn’t apply, it’s extremely unlikely that plaintext (ASCII?) is superior.
Text seems more like the answer to a plea for the lowest common denominator of tooling.