Comment by techbrovanguard

3 months ago

> 3) The overreliance on dbus turns the “the unix philosophy” ;) away. Text as a universal communication medium, everything is a file, etc.

Have you considered the reality that the "unix philosophy" results in incredibly brittle systems? Byte streams ("""plain text""") are untyped and prone to mishaps.

Some of the most reliable systems in the world were unix ones.

SunOS was famous for being incredibly reliable, and it's a purer Unix than the current Linux environment.

And even if we ignore that, the majority of the web was functioning, and functioning reliably, atop Linux with these text-stream systems.

Bytestreams are less debuggable. That feels silly to say openly, since we are all aware that higher-level interpreted languages are easier to write, debug, test, and iterate on, yet we seem not to be bothered that the same is not true for the underlying systems.

Systemd is clearly working, though; I'm just levying a criticism of its opacity.

  • > bytestreams are less debuggable

    Text streams are considered "better" because the standard UNIX userland (unlike e.g. DOS) provided excellent tools for dealing with text streams: grep, sed, awk, find, tr, etc., and of course the shell itself.

    But once you get your hands on equally excellent tools (like jq) for dealing with other kinds of data (like JSON), it turns out everything is even more powerful: you can now work with JSON as easily as with text, and it plugs straight into the existing ecosystem. Even though JSON has a human-readable text representation, it is no longer just text: it is dynamically structured but strongly typed data. A JSON array is a JSON array; you can't just awk it.

    There are byte-stream formats (e.g. msgpack) that have feature parity with JSON. jq can't accept msgpack-encoded byte streams, but suppose a hypothetical tool, msgpack2json, is widely available: just plug it into the pipeline ahead of jq. You're still working at the same level of abstraction (shell pipes), but easily dealing with complex byte streams.

    And of course, what we understand as "text" in the modern era is UTF-8-encoded byte streams. If your "text" kit deals with ASCII rather than Unicode runes, it's that much less powerful, and likely full of painful edge cases that you now have to debug. (Note that UTF-8 is a 1992 invention: it arrived when UNIX was already twenty-something years old, and it has been around for 30+ years.)

    Debuggability of anything is entirely up to your toolkit; the quality and comprehensiveness of that toolkit is what decides the battle.

  • I don't think the most reliable systems in the world were Unix ones. At least, if you compare systems of that era, you should compare with what the telephone operators were using: they had legal requirements you would not find in the computing world.
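The "structured data, not just text" point from the thread above can be sketched in Python, using the stdlib `json` module as a stand-in for jq (the interface records and field names here are invented for illustration):

```python
import json

# A JSON document is typed data, not just a byte stream: parsing it
# yields real lists, dicts, booleans and strings.
doc = '[{"name": "eth0", "up": true}, {"name": "lo", "up": false}]'
interfaces = json.loads(doc)

# Roughly what `jq '.[] | select(.up) | .name'` would do in a pipeline:
up_names = [iface["name"] for iface in interfaces if iface["up"]]
print(up_names)  # → ['eth0']

# And back to a byte stream for the next stage of the pipe:
print(json.dumps(up_names))  # → ["eth0"]
```

The structure (an array of strings) survives the round trip intact, whereas a line-oriented tool like awk only ever sees undifferentiated text.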
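The ASCII-versus-Unicode edge cases mentioned above are easy to demonstrate; a minimal Python sketch (the sample string is arbitrary):

```python
# Modern "text" is UTF-8-encoded bytes: one character is not one byte.
s = "naïve"
print(len(s))                   # 5 characters (Unicode code points)
print(len(s.encode("utf-8")))   # 6 bytes: 'ï' encodes as 0xC3 0xAF

# A byte-oriented tool that cuts at byte offset 3 splits 'ï' in half,
# leaving an invalid UTF-8 sequence:
chunk = s.encode("utf-8")[:3]   # b'na\xc3'
try:
    chunk.decode("utf-8")
except UnicodeDecodeError:
    print("split a character mid-encoding")  # the edge case you now debug
```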