Comment by IshKebab

4 months ago

Exactly this, but not just Python. The traditional way most Linux apps work is that they are splayed across your filesystem with hard-coded references to absolute paths, and they expect you to provide all of their dependencies for them.

Basically the Linux world was actively designed to make apps difficult to distribute.

It wasn't about making apps difficult to distribute at all; that's a later side effect. Originally, distros were built around a coherent, unified system of package management that made a machine easier to administer because everything was built on the same base. Back then, Linux users were sysadmins and/or C programmers managing (very few) code dependencies via tarballs, with some CPAN around too.

For a sysadmin, distros like Debian were an innovative godsend for installing and patching stuff. Especially compared to the hell that was Windows server sysadmin back in the 90s.

The developer-oriented language-ecosystem dependency explosion was a more recent thing. When the core distros started, apps were distributed as tarballs of source code. The distros were the next step in distribution - hence the name.

  • Right, but those things are not unrelated. Back in the day, if you suggested to the average FOSS developer that it should just be possible to download a zip of binaries, unzip it anywhere, and run it with no extra effort (like on Windows), they would say that was actively bad.

    You should be installing it from a distro package!!

    What about security updates of dependencies??

    And so on. Docker basically overrules these impractical ideas.

    • It’s still actively bad. And security updates for dependencies are easy to do when the dependency's developers are not bundling them with feature changes and actively breaking the API.

    • I would say those are good points, not impractical ideas.

      You make software harder to distribute (so it's inconvenient for developers and distributors) but gain better security updates and lower resource usage.

> Basically the Linux world was actively designed to make apps difficult to distribute.

It has "too many experts", meaning that everyone has too much decision making power to force their own tiny variations into existing tools. So you end up needing 5+ different Python versions spread all over the file system just to run basic programs.

It was more like: library writers forgot how to provide stable APIs for their software, and applications decided they just wanted to bundle all the dependencies they needed together and damn the consequences for the rest of the system. Hence we got statically linked binaries and then containers.
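To make the "bundle everything" approach concrete, here is a minimal sketch using plain Go and its standard toolchain flags (the program itself is just an illustration): you build one self-contained file with no shared-library lookups or hard-coded paths, and that same file is typically what gets copied into a container image.

```go
// A minimal sketch of the self-contained-binary approach described above.
// Go statically links its runtime by default, and disabling cgo removes the
// remaining libc dependency, so the result can be unzipped anywhere on a
// Linux box and run without any distro packages.
//
// Build (standard Go tooling; "hello" is an arbitrary output name):
//   CGO_ENABLED=0 go build -o hello .
// The same binary can be copied into a FROM scratch container image, which
// is essentially what many Docker deployments do.
package main

import (
	"fmt"
	"os"
)

func main() {
	host, err := os.Hostname()
	if err != nil {
		host = "unknown"
	}
	// No shared-library lookups, no hard-coded /usr paths: everything the
	// program needs is baked into the one file.
	fmt.Printf("hello from %s, running with no external dependencies\n", host)
}
```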

  • even if you have a stable interface... the user might not want to install it, only to forget to remove it down the line