Comment by qcnguy
1 day ago
The model of patching+recompiling the world for every OS release is a terrible hack that devs hate and that users hate. 99% of all people hate it because it's a crap model. Devs hate middlemen who silently fuck up their software and leave upstream with the mess; users hate being restricted to whatever software was cool and current two years ago. If they use a rolling distro, they hate the constant brokenness that comes with it. Of the 1% of people who don't hate this situation, 99% merely tolerate it, and the rest are Debian developers who are blinded by ideology and sunk costs.
Good operating systems should:
1. Allow users to obtain software from anywhere.
2. Execute all programs that were written for previous versions reliably.
3. Not insert themselves as middlemen into user/developer transactions.
Judged from this perspective, Windows is a good OS. It doesn't nail all three all the time, but it gets the closest. Linux is a bad OS.
The answers to your questions are:
(1) It isn't backwards compatible for sophisticated GUI apps. Core APIs like the widget toolkits change their APIs all the time (GTK 1->2->3->4; Qt does this too). It's also not forwards compatible: compiling the same program on a new release may yield binaries that don't run on an old release. Linux library authors don't consider this a problem; Microsoft/Apple/everyone else does. This is the origin of the glibc symbol versioning errors everyone runs into sooner or later.
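To make the glibc point concrete: a binary built against a newer glibc records versioned symbol references such as memcpy@GLIBC_2.14, and an older system that only exports memcpy@GLIBC_2.2.5 refuses to load it with a "version `GLIBC_2.14' not found" error. A common but fragile workaround is to pin the reference to an older symbol version at build time. A minimal C sketch, assuming an x86-64 target (where GLIBC_2.2.5 is the oldest memcpy version) and a typical gcc/glibc toolchain; the exact flags and version strings will vary:

    /* pin_memcpy.c -- sketch: bind memcpy to an old glibc symbol version
     * so the resulting binary also loads on older systems.
     * Version strings are architecture-specific; GLIBC_2.2.5 is the
     * x86-64 baseline. Assumed build line:
     *   gcc -fno-builtin-memcpy -o pin_memcpy pin_memcpy.c
     */
    #include <stdio.h>
    #include <string.h>

    /* Redirect every memcpy reference in this translation unit to the
     * GLIBC_2.2.5 version instead of the default (newest) one. */
    __asm__(".symver memcpy, memcpy@GLIBC_2.2.5");

    int main(void) {
        char src[] = "hello", dst[sizeof src];
        memcpy(dst, src, sizeof src);
        puts(dst);
        return 0;
    }

Checking the result with objdump -T should show the memcpy reference bound to GLIBC_2.2.5; without the .symver line it would normally bind to the newest version the build machine offers, which is exactly the forwards-compatibility trap described above.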
(2) Maintaining a stable API/ABI is not fun and requires a capitalist who says "keep app X working or else I'll fire you". The capitalist Fights For The User. Linux is a socialist/collectivist project with nobody playing this role. Distros like Red Hat clone the software ecosystem into a private space that's semi-capitalist again, and do offer stable ABIs, but their releases are just ecosystem forks and the wider issue remains.
(3) It hasn't changed and it's still bad.
(4) Docker: it "solves" the problem on servers by shipping the entire userspace with every app, and it is itself developed by a for-profit company. It only works because servers don't need any shared services from the computer beyond opening sockets and reading/writing files, so the kernel is enough, and the kernel does maintain a stable ABI. Docker obviously doesn't help the moment you move outside the server space and the coordination requirements grow.
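As a toy illustration of that "only the kernel ABI is stable" point: a program can bypass the distro's userspace entirely and talk to the kernel through raw syscalls, whose numbers and register conventions stay put across releases. A minimal sketch, assuming x86-64 Linux and a typical gcc setup (the build flags below are an assumption, not gospel):

    /* raw_write.c -- writes a line using nothing but the kernel's
     * stable syscall ABI (x86-64 Linux only, no libc).
     * Assumed build line:
     *   gcc -static -nostdlib -fno-stack-protector -O2 -o raw_write raw_write.c
     */
    static long sys_call3(long n, long a, long b, long c) {
        long ret;
        __asm__ volatile ("syscall"
                          : "=a"(ret)
                          : "a"(n), "D"(a), "S"(b), "d"(c)
                          : "rcx", "r11", "memory");  /* syscall clobbers rcx/r11 */
        return ret;
    }

    void _start(void) {
        static const char msg[] = "hello from the stable kernel ABI\n";
        sys_call3(1, 1, (long)msg, sizeof msg - 1);   /* SYS_write(stdout, ...) */
        sys_call3(60, 0, 0, 0);                       /* SYS_exit(0) */
    }

A container image is the same idea taken further: ship whatever userspace the app wants and rely only on the syscall table staying put. That's also why the approach stops helping once apps need shared desktop services rather than just sockets and files.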
It seems like Linux's ethos is also its biggest problem. It's a bunch of free software people reinventing, not just the wheel, but every part of the bus. When someone shows up and wants to install a standard cup holder, it's hard when none of your bus is standard.
> If they use a rolling distro, they hate the constant brokenness that comes with it.
Never happens for me on Arch, which I've run as my primary desktop for 15 years.
Maybe you are running a desktop environment that never changes, but GNOME has been constantly broken in many different ways for the last 5+ years. At times it felt more like a developer playground than a usable desktop environment. KDE is more stable nowadays, but it still breaks in mysterious ways from time to time. I also had major issues for some time when Qt6 started rolling out.
And Arch itself also needs manual intervention on package updates every so often; just a few weeks ago there was a major change to the NVidia driver packaging.
I've been running GNOME. I've never had breakage from upgrading. Of course there's the fact that GNOME neutered itself, removing many of its own features, but that's a different story and has nothing to do with ABIs or upgrading.
> And Arch itself also needs manual intervention on package updates every so often; just a few weeks ago there was a major change to the NVidia driver packaging.
If you're running a proprietary driver on a 12-year-old GPU architecture incapable of modern games or AI, yeah... but I actually haven't needed to care about many of these interventions. Maybe 2 or 3 ever...