
Comment by sylware

2 days ago

This article missed a critical point, which is "the right way" to select a glibc ABI version: see the binutils ld documentation, the second part of the page on VERSION support. This must include glibc internal symbols.

This allows you to craft ELF binaries on a modern distro that will run on "older" distros, which is critical for games and game engines. There is significant upfront, one-time work involved in selecting an "old" glibc ABI.
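A minimal sketch of one such incantation, the GAS .symver directive, which is part of the same binutils symbol-versioning machinery (the symbol and version tag below, memcpy@GLIBC_2.2.5 on x86-64, are only illustrative; which symbols you actually need to pin depends on what your binary references, and objdump -T your_binary will show which versions it currently requires):

    /* pin_memcpy.c -- illustrative only.
     * Bind this translation unit's memcpy references to an old symbol
     * version so the binary does not require e.g. memcpy@GLIBC_2.14.
     * The version tag is an x86-64 example, not a universal value. */
    #include <stdio.h>
    #include <string.h>

    __asm__(".symver memcpy, memcpy@GLIBC_2.2.5");

    int main(void) {
        char dst[16];
        memcpy(dst, "hello", 6);   /* resolves against GLIBC_2.2.5 */
        puts(dst);
        return 0;
    }

A full solution repeats this (or collects the .symver lines in a shared header) for every versioned symbol the program pulls in, which is the significant upfront, one-time work mentioned above.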

The quick-and-dirty alternative is to keep, on the side, a toolchain configured to link against an "old" glibc.

The article also missed the critical -static-libstdc++ option for C++ applications (the C++ ABI is hell on earth), but it did not miss -static-libgcc and the dynamic loading of system interface shared libs.
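As a rough illustration of that combination (compiler runtimes linked statically, system interface libraries loaded at run time), here is a sketch; libGL.so.1 stands in for any driver- or system-provided library that must come from the user's machine rather than be bundled, and the link flags in the comment are the ones discussed above:

    /* loadgl.c -- illustrative only.
     * Build side (roughly): g++ -static-libgcc -static-libstdc++ ...
     * (add -ldl on glibc older than 2.34).
     * Resolving the library with dlopen() means the binary carries no
     * DT_NEEDED entry for it, so the user's own system libraries get
     * used no matter which distro produced the binary. */
    #include <dlfcn.h>
    #include <stdio.h>

    int main(void) {
        void *gl = dlopen("libGL.so.1", RTLD_NOW | RTLD_LOCAL);
        if (!gl) {
            fprintf(stderr, "could not load libGL: %s\n", dlerror());
            return 1;
        }
        /* look up entry points with dlsym() instead of importing them */
        void *sym = dlsym(gl, "glGetString");
        printf("glGetString %s\n", sym ? "found" : "not found");
        dlclose(gl);
        return 0;
    }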

One of the features Zig provides is the ability to target any glibc version. See https://github.com/ziglang/glibc-abi-tool/ for more details on how this is solved.

  • You’re my hero for this.

    One of my side projects is building a toolchain to enable C++ cross-compilation using the Zig header/source libs.

    I didn’t love Zig as a Clang++ replacement because it has a bit too much magic, and it might change out from under me. But the underlying library code is a godsend.

Is there a reason glibc can't just do a better job at keeping some legacy symbols around? It's not like it's big stuff. They're things like legacy string functions. We're talking a few kilobytes of code in most cases.

The Linux kernel goes to a lot of effort to not break user space, at least for non-exotic core features and syscalls. It seems like a lot of user-space in Linux-land does not make the same effort.

It's particularly bad when it's the C library doing this, since that's at the center of the dependency graph for almost everything.

  • The problem is the opposite: they are trying to run executables built using a newer glibc on a system that has an older glibc. glibc has kept all the old function definitions around since practically forever.

    Frankly, I do not understand who would think glibc symbols themselves would be the challenge in this case. Even if you statically link glibc, there's zero guarantee the syscalls will be present in the older Linux (cue .ABI-tag failures), or even that the ELF format details match (e.g. GNU-style hashes). The simple solution is to build on the older Linux (and glibc).

    In my long experience with ancient binaries, glibc has almost never been the problem, and its ability to _run_ ancient binaries is nothing short of excellent; even Linux itself is more of a problem than glibc is (for starters, paths to everything in /proc and /sys change every half-decade or so).

    • > executables built using a newer glibc

      It’s an abomination that Linux uses system libraries when building. Catastrophically terrible and stupid decision.

      It should be trivial for any program to compile and specify any arbitrary previous version of glibc as the target.

      Linux got this so incredibly wildly wrong. It’s a shame.


  • > a lot of user-space in Linux-land does not make the same effort

    I believe what the article misses is that glibc is maintained and extended under an entirely different community and development model. Windows remains compatible over decades because Microsoft (a) is the sole distributor, and (b) puts an immense effort into backwards compat. In Linux userspace, that's simply a non-goal across distributions. If you want to ship a binary for a particular distro, you need to build the binary on / for that distro; even within a distro, a major release bump (or especially a major release downgrade) may break a binary.

    Ultimately, it's a consequence of Conway’s Law. Microsoft is the sole distributor of Windows, so they can enforce compatibility with an iron fist, and there are people working for Microsoft whose pay depends on said compatibility. With "Linux" in general, there is no common authority to appeal to, and (again) most vendors don't even promise a seamless userspace upgrade path from one major release to another.

    This is unfixable; it will never change -- as long as independent parties are permitted to distribute different operating systems yet call them all "Linux".

    Ship multiple binaries, or distribute the source code (and let users build it).

    EDIT: you'll notice that "ship multiple binaries" is what distros (especially commercial distros) do. They maintain separate branches, backport fixes to old branches, and employ software maintenance engineers to focus on this kind of work. If you want to target multiple major releases, this is what you have to do, too.

    If you (as a commercial ISV) target a commercial distro with long-term support, and can convince your users to use / license the same distro, you'll have a good, stable development experience. You only need to port like once every decade, when you jump major releases.

    The Linux user base / the Linux market is fragmented; that's the whole goal. The technical proliferation / inconsistency is just a consequence. Unless you take away the freedom of users to run their own flavors of "Linux", there won't be a uniform Linux target.

    In a way, it's funny to even expect otherwise. Why do you expect to ship the same binaries when the foundations are diverse, with no standardization authority that all Linux distributors recognize as such? And even POSIX is an API spec, not an ABI spec.

    And, any authority that controls binary aspects will immediately accrue political capital. This is exactly what shouldn't happen in Linux. The fact that anyone can fork (or start) a distro, and contribute to the chaos, is good for freedom.

    • > If you (as a commercial ISV) target a commercial distro with long-term support, and can convince your users to use / license the same distro, you'll have a good, stable development experience. You only need to port like once every decade, when you jump major releases.

      If things go well, it's even better than that: if you target, e.g., RHEL 8, there's a very good chance that your binaries will work on RHEL 9, and a decent shot that they'll work on RHEL 10, with zero changes (though of course you should test every version you want to support). The same goes for Ubuntu 20.04/22.04/24.04/... and Debian/SUSE/whatever. Backwards incompatibilities can happen, but within a single stable distro they're not very common, so a lazy ISV can probably get away with only really porting forward after more than a decade, if they want.

      (Incidentally, this isn't a hypothetical: I once had the joy of working on software that targeted RHEL 5, and those binaries ran on RHEL/CentOS 7 without any problems.)

    • The big Linux distros (EL, Ubuntu) in fact have the opposite incentive: to get proprietary vendors to target their distribution specifically.


    • > Ship multiple binaries, or distribute the source code (and let users build it).

      And that's why we have package managers and distro maintainers/packagers. You'll get no help from the community if your stuff is proprietary; that's just the way it is. Ship the code, and distros will pick it up and do the packaging for you to make it available in their distro. It's part of the free software culture that surrounds the Linux ecosystem.

      If you absolutely must ship proprietary software, then target an enterprise distro. Ship it for RHEL or Ubuntu LTS and you get at least 10 years of a stable base.

Could you point to an example of a project doing it the right way? I'm a little unclear on the type of incantation needed for the "version script".

Which works if you use binutils ld. Does it work with mold or gold? And then, how do you use this with languages other than C/C++, like Go or Rust?

  • I thought that Go invoked syscalls directly instead of going through libc. (There's a rough C sketch of what that means at the end of this thread.)

    • It's Go's best feature: a simple CGO_ENABLED=0 gives you freedom from the tyranny of libc.

  • The Rust toolchain has had an open bug since 2015: it cannot statically link libgcc, i.e. it has no equivalent of the "-static-libgcc" option.

    I hit this with the "TinyGlade" video game (extremely good, BTW), which is written in Rust; the dev and I ran into that bug. Namely: you'd better have a libgcc with the right ABI... and I can tell you this has been a HUGE issue since Valve started distributing games more than a decade ago.
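For reference on the Go sub-thread above, a minimal sketch of what "invoking syscalls directly instead of going through libc" means in practice. This is x86-64 Linux only and only a rough analogue of what Go's runtime does internally, not its actual code:

    /* raw_write.c -- illustrative only.
     * Issue write(2) through the syscall instruction, bypassing libc
     * for this call; no glibc symbol versions are involved in it. */
    #include <stddef.h>

    static long raw_write(int fd, const void *buf, size_t len) {
        long ret;
        __asm__ volatile ("syscall"
                          : "=a"(ret)                 /* return value in rax */
                          : "a"(1L),                  /* __NR_write on x86-64 */
                            "D"((long)fd), "S"(buf), "d"(len)
                          : "rcx", "r11", "memory");  /* clobbered by syscall */
        return ret;
    }

    int main(void) {
        const char msg[] = "written without libc\n";
        raw_write(1, msg, sizeof msg - 1);
        return 0;
    }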