Comment by adastra22
1 day ago
With respect, I doubt it. Have you tried pulling out a 20 year old tarball and compiling it, without modification, on a modern distro?
I recently unearthed something that I thought was 20 years old when someone asked me about it. I checked, and it was only 14 years old based on mtime (though I suspect I started the project nearly 20 years ago). Another project I unearthed for a different reason was only 13 years old by mtime (again, it was started before that). I must concede that I haven't actually recently compiled and used anything that had been untouched for 20 years.
I should note that the first program I wrote that was actually used for a purpose (it calculates energy from an internal stopwatch plus hand-typed readings from a voltmeter and ammeter, for a science project in 1992) still works in QB64 today.
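The arithmetic is trivial, which is probably why it has aged so well: energy is just volts × amps × elapsed seconds. A rough sketch in C of what that kind of program does (the original was BASIC, and this is my reconstruction, not the actual code):

```c
/* Hypothetical reconstruction of a 1992-style energy logger:
 * E = V * I * t. The original was BASIC; this is just a sketch. */
#include <stdio.h>
#include <time.h>

int main(void)
{
    double volts, amps;
    time_t start = time(NULL);           /* the "internal stopwatch" */

    printf("Voltmeter reading (V): ");
    if (scanf("%lf", &volts) != 1) return 1;
    printf("Ammeter reading (A): ");
    if (scanf("%lf", &amps) != 1) return 1;

    double seconds = difftime(time(NULL), start);
    printf("Power: %.2f W, elapsed: %.0f s, energy: %.2f J\n",
           volts * amps, seconds, volts * amps * seconds);
    return 0;
}
```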
The second program I wrote that was actually used for a purpose assumes a parallel-port printer on DOS speaking a fairly old version of PCL, and was written in 16-bit C, so it probably won't work today.
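For context on why: DOS-era C talked to the printer by opening the parallel port as a file and writing raw PCL escape codes to it. A sketch from memory (not the original source; this compiles with a 16-bit compiler like Turbo C, but on a modern OS there is simply no `LPT1` device to open):

```c
/* DOS-era printing sketch: open the parallel port as the special
 * file "LPT1" and write raw PCL escape sequences to it. */
#include <stdio.h>

int main(void)
{
    FILE *prn = fopen("LPT1", "wb");     /* no such device today      */
    if (!prn) return 1;

    fputs("\x1B" "E", prn);              /* PCL: reset printer        */
    fputs("\x1B&l0O", prn);              /* PCL: portrait orientation */
    fputs("Hello from 16-bit land\r\n", prn);
    fputc('\f', prn);                    /* form feed: eject the page */
    fclose(prn);
    return 0;
}
```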
A lot of these things can be made to work. That isn't being contested. But if you take a random piece of code off the internet from 20 years ago, it very likely won't compile out of the box on a modern system.
For example, I just took the oldest version of OpenSSL I could find with a quick search (from 2015, so only 10 years old), and it fails to compile on my Mac. It detects macOS/darwin, and then proceeds to try to build for 32-bit Intel, which obviously doesn't work on an ARM machine. OpenSSL has plain-C fallback implementations to support platforms it hasn't been hand-tuned for, but its build scripts assume that macOS = Intel.
Ok sure, changing the whole freaking CPU architecture will bork a build script. So to prove a point, I just downloaded v2.6.11 of the Linux kernel (released in 2005), unpacked it (this time on Ubuntu 24.04 on real Intel), and did a `make menuconfig && make`. Of course I don't expect a 20-year-old kernel to run on modern hardware, but could I at least compile it?

No, I could not: modern GCC emits position-independent code by default, which parts of the Linux kernel do not support. I was able to fix that by editing the Makefile to pass `-fno-pic` in CFLAGS. Then I got hit with another error: "multiple definition" failures for functions that are declared slightly differently in different places. It turns out old GCC didn't complain about this, but modern GCC handles these declarations differently. This is after pages upon pages of warnings, btw, with only a few source files compiled so far.
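If you want that second failure in isolation: my best guess is the gnu89-vs-C99 `inline` semantics change (GCC's default moved from gnu89 to gnu11 around GCC 5). Old kernel headers are full of `extern inline` functions; under gnu89 those never emit a symbol, but under the modern default every translation unit that includes the header emits an external definition, and the link blows up. A minimal reproduction (my own example, not actual kernel code):

```c
/* util.h -- a shared header, old-kernel style */
extern inline int add_one(int x) { return x + 1; }
/* gnu89 (the 2005 default): "extern inline" never emits a symbol;
 * calls are inlined, or resolved against an out-of-line copy
 * provided elsewhere.
 * gnu11 (the modern default): "extern inline" emits an external
 * definition in EVERY .c file that includes this header. */

/* a.c */
#include "util.h"
int a(void) { return add_one(1); }

/* b.c */
#include "util.h"
int main(void) { return add_one(2); }

/* Linking both:
 *   gcc -O2 a.c b.c            -> multiple definition of `add_one'
 *   gcc -O2 -std=gnu89 a.c b.c -> builds, as it did in 2005 */
```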
I gave up. This is what is meant by "archeology required": for anything nontrivial, you often have to rebuild the environment in which the code was originally compiled just to get it compiling again.