Comment by adastra22
9 hours ago
A lot of these things can be made to work. That isn't being contested. But if you take a random piece of code off the internet from 20 years ago, it very likely won't compile out of the box on a modern system.
For example, I just took the oldest version of OpenSSL I could find with a quick search (2015, so only 10 years old), and it fails to compile on my Mac. It detects macOS/darwin, and then proceeds to compile for 32-bit Intel, which obviously doesn't work. OpenSSL has fallbacks to a plain C implementation to support platforms it hasn't been customized for, but its build scripts assume that macOS = Intel.
OK, sure: changing the whole freaking CPU architecture will bork a build script. So to prove a point I just downloaded v2.6.11 of the Linux kernel (released in 2005), unpacked it (this time on Ubuntu 24.04 on real Intel), and did a `make menuconfig && make`. Of course I don't expect a 20-year-old kernel to run on modern hardware, but could I at least compile it? No, I could not: modern GCC (at least as packaged by Ubuntu) builds position-independent code by default, which parts of the Linux kernel do not support. I was able to fix that by editing the Makefile to pass `-fno-pic` in CFLAGS. Then I got hit with another error, "multiple definition" of functions that are declared slightly differently in different places: old GCC let this slide, but modern GCC handles these declarations differently. This was after pages upon pages of warnings, btw, with only a few source files compiled so far.
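For the curious, here is a minimal sketch of the kind of thing that bites here, assuming the culprit is the gnu89-vs-C99 change in `extern inline` semantics (a common one for kernels of that vintage; I didn't dig far enough to confirm that's the exact error). The file names and function below are made up for illustration:

```c
/* util.h (hypothetical): header-style definition, the way old kernel code often wrote it */
#ifndef UTIL_H
#define UTIL_H
/* gnu89 semantics (old GCC default, or -fgnu89-inline): `extern inline`
 * never emits an out-of-line definition, so any number of .c files can
 * include this header.  C99/gnu11 semantics (modern GCC default):
 * `extern inline` means this translation unit provides THE external
 * definition, so every includer emits one. */
extern inline int add_one(int x) { return x + 1; }
#endif

/* a.c */
#include "util.h"
int from_a(void) { return add_one(1); }

/* b.c */
#include "util.h"
int from_b(void) { return add_one(2); }

/* main.c */
#include <stdio.h>
int from_a(void);
int from_b(void);
int main(void) { printf("%d %d\n", from_a(), from_b()); return 0; }

/*
 * gcc -O2 -std=gnu89 a.c b.c main.c   -> links fine (calls are inlined,
 *                                         no external definition emitted)
 * gcc -O2 a.c b.c main.c              -> "multiple definition of `add_one'"
 *                                         at link time
 */
```

As far as I know, newer kernels sidestep this with explicit inline attributes and compiler flags, but a 2005 tree predates all of that.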
I gave up. This is what is meant by "archeology required": for anything nontrivial you often have to reconstruct the environment in which the code was originally compiled in order to get it to compile again.