Comment by dhosek
14 hours ago
One of the comments:
> Us, ten years after generating the certificate: "Who could have possibly foreseen that a computer science department would still be here ten years later."
This was why there was a Y2K bug. Most of that code was written in the 80s, during the Reagan era. Nobody expected civilization to make it to the year 2000.
No, people thought that storing a year as two digits was fine because computers were advancing so fast that it was unlikely they'd still be used in the year 2000 - or if they were it was someone else's problem.
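To make the failure mode concrete, here is a minimal Python sketch (not from the thread; the function names and the pivot value are illustrative) of how a two-digit "YYMMDD" date comparison silently breaks at the century rollover, plus the "windowing" patch that was a common Y2K remediation:

```python
def is_expired(expiry_yymmdd: str, today_yymmdd: str) -> bool:
    # Naive lexicographic comparison of two-digit-year dates: correct
    # only while both dates fall in the same century.
    return expiry_yymmdd < today_yymmdd

def is_expired_windowed(expiry_yymmdd: str, today_yymmdd: str,
                        pivot: int = 70) -> bool:
    # Common "windowing" fix: two-digit years below the pivot are read
    # as 20xx, the rest as 19xx, restoring a correct ordering.
    def expand(yymmdd: str) -> str:
        yy = int(yymmdd[:2])
        return ("20" if yy < pivot else "19") + yymmdd
    return expand(expiry_yymmdd) < expand(today_yymmdd)

# Pre-2000 the naive check works, but a 2005 expiry reads as 1905:
print(is_expired("981231", "990601"))           # True  (really expired)
print(is_expired("051231", "990601"))           # True  (wrong: 2005 is not past)
print(is_expired_windowed("051231", "990601"))  # False (windowed fix)
```

The bug is invisible as long as every date lives in the same century, which is exactly why the code shipped and kept shipping for two decades.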
And they were mostly right! Not many 80s machines were still in use in 1999, but lots of software with roots in that era was. Data formats and such have a tendency to stick around.
Software has incredible inertia compared to hardware.
Buying millions of dollars of new hardware is effectively trivial compared with paying for existing software to be rewritten for a new platform.
Funnily enough I worked at a company with a codebase written in the 1980s - no idea what it originally ran on but someone decided in the mid 2000s to update it to run on modern hardware. Unfortunately they chose Itanium... so 20 years later they're paying lots of money for Itanium hardware.
This is a very SWE-centric perspective. The very names of software/hardware would imply the exact opposite.