Comment by N_Lens
5 hours ago
I think most of us learned this from an early age - computer systems often degrade as they keep running and need to be reset from time to time.
I remember when I had my first desktop PC at home (Windows 95) and it would need a fresh install of Windows every so often as things went off the rails.
This has got to be a failure of early Windows versions -- I've had systems online for 5+ years without needing a reboot, updating and restarting the software running on them without service interruption. RAID storage makes hot-swapping failing drives easy, and drives are the most common component that needs periodic replacement.
Yes. With Windows 3.x there wasn't a lot that could go wrong that couldn't be fixed in a single .ini file. Windows 95 through ME was a complete shitshow where many, many things could go wrong, and the fastest path to fixing them was a fresh install.
Windows XP largely made that irrelevant, and Windows 7 made it almost entirely so.
This only applies to Windows, and I think you're referring to desktops.
Ten years ago, I think the rule of thumb was an uptime of no more than 6 months, but for different reasons. (Windows Server...)
On Solaris, Linux, the BSDs, etc., restarts are only necessary for maintenance. Literally. I think my longest-uptime production system was a SPARC Postgres system under sustained high load, with an uptime of around 6 years.
With cloud infra, people have forgotten just how stable the Unixen are.