Comment by api (3 days ago)

Is there a reason glibc can't just do a better job at keeping some legacy symbols around? It's not like it's big stuff. They're things like legacy string functions. We're talking a few kilobytes of code in most cases.
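
For context, keeping legacy symbols around is exactly what GNU symbol versioning is for, and glibc has used it since the late 1990s: a shared library can carry a legacy definition and a current one side by side. A minimal sketch of the mechanism (every name and version string here is made up for illustration):

```c
/* frob.c -- a sketch of GNU symbol versioning, the mechanism glibc uses to
 * keep legacy definitions alive. Everything here (frob, LIB_1.0, LIB_2.0,
 * frob.map) is hypothetical.
 *
 * Build: gcc -shared -fPIC frob.c -o libfrob.so -Wl,--version-script=frob.map
 * where frob.map declares the version nodes:
 *     LIB_1.0 { };
 *     LIB_2.0 { } LIB_1.0;
 */

int frob_old(int x) { return x + 1; }     /* legacy behavior, kept forever */
int frob_new(int x) { return 2 * x + 1; } /* current behavior              */

/* Binaries linked long ago keep resolving frob@LIB_1.0 to frob_old;
 * anything linked from now on binds to the default version, frob@@LIB_2.0. */
__asm__(".symver frob_old, frob@LIB_1.0");
__asm__(".symver frob_new, frob@@LIB_2.0");
```

This is why decades-old binaries still resolve their symbols against a current glibc; as the replies below point out, the breakage usually runs in the other direction.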

The Linux kernel goes to a lot of effort to not break user space, at least for non-exotic core features and syscalls. It seems like a lot of user-space in Linux-land does not make the same effort.

It's particularly bad when it's the C library doing this, since that's at the center of the dependency graph for almost everything.

The problem is the opposite: they are trying to run executables built using a newer glibc on a system that has an older glibc. glibc has kept all the old function definitions around practically forever.

Frankly, I do not understand why anyone would think glibc symbols themselves would be the challenge in this case. Even if you statically link glibc, there's zero guarantee the syscalls will be present in the older Linux kernel (cue .ABI-tag failures). Or even damn ELF format changes (e.g. GNU-style hashes). The simple solution is to build on the older Linux (and glibc).
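
To make the syscall point concrete, here is a hedged sketch of the kind of guard a binary needs when it may run on kernels older than the one it was built for. statx(2) appeared in Linux 4.11 (the glibc wrapper in glibc 2.28), so on an old enough kernel the call can fail with ENOSYS and the program has to fall back, statically linked or not:

```c
#define _GNU_SOURCE
#include <errno.h>
#include <fcntl.h>     /* AT_FDCWD */
#include <stdio.h>
#include <sys/stat.h>  /* statx(), STATX_SIZE, stat() */

int main(void) {
    struct statx stx;
    /* On a kernel without statx(2) (and a glibc that doesn't emulate it),
     * the call fails with errno set to ENOSYS. */
    if (statx(AT_FDCWD, "/etc/hostname", 0, STATX_SIZE, &stx) == 0) {
        printf("size via statx: %llu\n", (unsigned long long)stx.stx_size);
    } else if (errno == ENOSYS) {
        struct stat st;  /* fall back to the classic stat(2) */
        if (stat("/etc/hostname", &st) == 0)
            printf("size via stat: %lld\n", (long long)st.st_size);
    }
    return 0;
}
```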

In my long experience with ancient binaries, glibc has almost never been the problem, and its ability to _run_ ancient binaries is nothing short of excellent; even Linux itself is more of a problem than glibc (for starters, the paths to everything in /proc and /sys change every half-decade or so).

  • > executables built using a newer glibc

    It’s an abomination that building on Linux means linking against whatever system libraries happen to be installed. A catastrophically terrible and stupid decision.

    It should be trivial for any program to compile while specifying any arbitrary previous version of glibc as the target (the closest thing we have today is sketched at the end of this comment).

    Linux got this so incredibly wildly wrong. It’s a shame.
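
    For what it's worth, the closest thing available today is pinning individual symbols at build time. A minimal sketch, assuming x86-64, where the glibc baseline version is GLIBC_2.2.5 (and where memcpy's default version moved to GLIBC_2.14):

    ```c
    /* Pin memcpy to the old x86-64 baseline so a binary built on a new
     * system still loads against an older glibc. Version strings are
     * architecture-specific; `objdump -T libc.so.6` lists the available ones.
     * Compile with -fno-builtin-memcpy so the call isn't inlined away. */
    #include <stdio.h>
    #include <string.h>

    __asm__(".symver memcpy, memcpy@GLIBC_2.2.5");

    int main(void) {
        char dst[6];
        memcpy(dst, "hello", 6);  /* resolves to memcpy@GLIBC_2.2.5 */
        puts(dst);
        return 0;
    }
    ```

    Doing that for every versioned symbol in every translation unit is exactly the tedium that makes "build on the oldest system you support" the standard advice.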

> a lot of user-space in Linux-land does not make the same effort

I believe that. What the article misses is that glibc is maintained and extended under an entirely different community and development model. Windows remains compatible over decades because Microsoft (a) is the sole distributor, and (b) puts an immense effort into backwards compat. In Linux userspace, that's simply a non-goal across distributions. If you want to ship a binary for a particular distro, you need to build the binary on / for that distro; even within a distro, a major release bump (or especially a major release downgrade) may break a binary.

Ultimately, it's a consequence of Conway’s Law. Microsoft is the sole distributor of Windows, so it can enforce compatibility with an iron fist, and there are people working for Microsoft whose pay depends on said compatibility. With "Linux" in general, there is no common authority to appeal to, and (again) most vendors don't even promise a seamless userspace upgrade path from one major release to the next.

This is unfixable; it will never change -- as long as independent parties are permitted to distribute different operating systems yet call them all "Linux".

Ship multiple binaries, or distribute the source code (and let users build it).

EDIT: you'll notice that "ship multiple binaries" is what distros (especially commercial distros) do. They maintain separate branches, backport fixes to old branches, and employ software maintenance engineers to focus on this kind of work. If you want to target multiple major releases, this is what you have to do, too.

If you (as a commercial ISV) target a commercial distro with long-term support, and can convince your users to use / license the same distro, you'll have a good, stable development experience. You only need to port like once every decade, when you jump major releases.

The Linux user base / the Linux market is fragmented; that's the whole point. The technical proliferation / inconsistency is just a consequence. Unless you take away the freedom of users to run their own flavors of "Linux", there won't be a uniform Linux target.

In a way, it's funny to even expect otherwise. Why do you expect to ship the same binaries when the foundations are diverse, with no standardization authority that all Linux distributors recognize as such? And even POSIX is an API spec, not an ABI spec.

And, any authority that controls binary aspects will immediately accrue political capital. This is exactly what shouldn't happen in Linux. The fact that anyone can fork (or start) a distro, and contribute to the chaos, is good for freedom.

  • > If you (as a commercial ISV) target a commercial distro with long-term support, and can convince your users to use / license the same distro, you'll have a good, stable development experience. You only need to port like once every decade, when you jump major releases.

    If things go well, it's even better than that: if you target, say, RHEL 8, there's a very good chance your binaries will work on RHEL 9 and a decent shot they'll work on RHEL 10 with zero changes (though of course you should test every version you want to support). The same goes for Ubuntu 20.04/22.04/24.04/... and Debian/SUSE/whatever. Backwards incompatibilities can happen, but within a single stable distro they're not common, so a lazy ISV can probably get away with porting forward once in a decade or more.

    (Incidentally, this isn't a hypothetical: I once had the joy of working on software that targeted RHEL 5, and those binaries ran on RHEL/CentOS 7 without any problems.)

  • The big Linux distros (EL, Ubuntu) in fact have the opposite incentive: to get proprietary vendors to target their distribution specifically.

    • This theory would check out if a proprietary vendor could easily get away with shipping a single binary package for all supported versions of, say, Ubuntu.

      Having to build and maintain a binary package separately for each version of the same distro probably isn't that appealing to them.


  • > Ship multiple binaries, or distribute the source code (and let users build it).

    And that's why we have package managers and distro maintainers/packagers. You'll get no help from the community if your stuff is proprietary; that's just the way it is. Ship the code, and distros will pick it up and do the packaging for you to make it available in their distro. It's part of the free-software culture that surrounds the Linux ecosystem.

    If you absolutely must ship proprietary software, then target an enterprise distro. Ship it for RHEL or Ubuntu LTS and you get, at least, 10 years of a stable base.