
Comment by akdev1l

4 days ago

>Windows 95 worked around this by keeping a backup copy of commonly-overwritten files in a hidden C:\Windows\SYSBCKUP directory. Whenever an installer finished, Windows went and checked whether any of these commonly-overwritten files had indeed been overwritten.

This is truly unhinged. I wonder if running an installer under wine in win95 mode will do this.
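Roughly, the mechanism in the quoted passage amounts to the following (a toy sketch in Python; the file names, the side-table version lookup, and the function names are made up for illustration, not Windows's actual code, which read versions from each file's PE version resource):

```python
import shutil
from pathlib import Path

# Hypothetical version lookup: real Windows read the version resource
# embedded in each DLL; here we fake it with a side table keyed by path.
VERSIONS = {}

def version_of(path: Path):
    """Return a comparable version tuple for a file, e.g. (4, 72, 0, 0)."""
    return VERSIONS.get(str(path), (0, 0, 0, 0))

def post_install_check(system_dir: Path, sysbckup: Path, watched: list[str]):
    """After an installer exits, check each commonly-overwritten file and,
    if it was downgraded, copy the backed-up copy over it again."""
    for name in watched:
        live, backup = system_dir / name, sysbckup / name
        if not backup.exists():
            continue
        if version_of(live) < version_of(backup):
            shutil.copy2(backup, live)   # quietly undo the downgrade
```

The installer still believes it succeeded; the cleanup happens behind its back after it exits.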

> This is truly unhinged

Granted, but at the same time it's also resolutely pragmatic.

Apparently there was already lots of software out there which expected to be able to write new versions of system components. As well as buggy software that incidentally expected to be able to write old versions, because its developers ignored Microsoft's published best practices (not to mention common sense) and didn't bother to do a version comparison first.

The choice was to break the old software, or let it think it succeeded and then clean up the mess it made. I'd bet they considered other alternatives (e.g. sandboxing each piece of software with its own set of system libraries, or intercepting and overriding DLL calls, ignoring the written files altogether), but those introduce more complexity and indirection for arguably little benefit. (I do wonder whether the cleanup still happens if something like an unexpected reboot or power loss hits at exactly the wrong time.)

Could the OS have been architected in a more robust fashion from the get-go? Of course.

Could they have simply forbidden software from downgrading system components? Sure, but it'd break installers and degrade the user experience.

Since the OS historically tolerated the broken behavior, they were kind of stuck continuing to tolerate it. One thing I learned leading groups of people is if you make a rule but don't enforce it, then it isn't much of a rule (at least not one you can rely on).

I would argue the deeper mistake was not providing more suitable tooling for developers to ensure the presence of compatible versions of shared libraries. This requires a bit of game theory up front: you want to make the incorrect path full of friction and the correct one seamless.

  • There was (and still is) VerInstallFile, but it was introduced in Windows 3.1, and installers may have wanted to also support Windows 3.0 (since there wasn't much of a time gap between the two, many programs tried to support both), so they didn't use it.

  • It is important to remember that Microsoft created some of this chaos to begin with. Other aspects can be attributed to "the industry didn't understand the value of $x or the right way to do $y at the time". And some of this is "nonsense you deal with when the internet and automatic updates aren't yet a thing".

    Why did programs overwrite system components? Because Microsoft regularly pushed updates with VC++ or Visual Studio, and if you built your program with Microsoft's tools you often had to distribute the updated components for your program to work - especially the Visual C runtime and the Common Controls. This even started in the Win3.11 days, when you had to update the common controls to get the fancy new "3d" look. And sometimes a newer update broke older programs, so installers would try to force the "correct" version to be installed... but there was no better option here. Don't do that and the program the user just installed is busted. Do it and you break something else. There was no auto-update or internet access, so you had to make a guess at what the best option was and hope. Mix in a general lack of knowledge, no forums or Stack Overflow to ask for help, and general incompetence, and you end up with a lot of badly made installers doing absolute nonsense.

    Why force everyone to share everything? Early on, primarily for disk space and memory reasons. Early PCs could barely run a GUI, so spending a few hundred kilobytes to let each program have its own copy of the common controls was a non-starter. There was no such thing as "just wait for everyone to upgrade" or "wait for Windows Update to roll this feature out to everyone". By the early 2000s the biggest reason was that we hadn't yet realized that sharing is great in theory but often terrible in practice, and that a system to manage who gets what version of each library is critical. And by then we also had the disk space and RAM to allow it.

    But the biggest issue was probably Microsoft's refusal to provide a system installer. Later, I assume antitrust concerns prevented them from doing more in this area. Installers did whatever they wanted because there were a bunch of little companies making installers, and every developer just picked one and built all their packages with it, often without updating their installer for years (possibly because it cost a lot of money).

    Note: When I say "we" here that's doing a lot of heavy lifting. I think the Unix world understood the need for package managers and control of library versions earlier but even then the list of problems and the solutions to them in these areas varied a lot. Dependency management was far from a solved problem.
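The version check that VerInstallFile (mentioned above) was meant to standardize boils down to something like this. A toy sketch in Python, with versions passed in as plain tuples; the real API is more involved (temp files, VIF_* result flags, reading version resources), and `careful_install` is a made-up name for illustration:

```python
import shutil
from pathlib import Path

def careful_install(src: Path, src_ver, dest: Path, dest_ver):
    """The core rule badly behaved installers skipped: never replace a
    shared file with an older version. Versions are comparable tuples
    like (4, 72, 3110, 0); a real installer would read them from each
    file's version resource rather than take them as parameters."""
    if dest.exists() and dest_ver > src_ver:
        return False            # existing file is newer: leave it alone
    shutil.copy2(src, dest)     # new or same version: safe to write
    return True
```

A well-behaved setup program calls this for every shared component and treats a `False` as success, since a newer copy is already in place.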

> This is truly unhinged.

This is bog-standard boring stuff (when presented with a similar problem, Linux invented containers, lol). Read some of his other posts to see the lengths Microsoft went to in order to maintain backwards compatibility: some are insane, some no doubt led to security issues, but you have to respect the drive.

  • It’s not bog-standard. Containers are not equivalent to doing what is described in the article.

    Containers in fact redirect writes, so an installer script could not replace system libraries.

    The equivalent would be a Linux distro assuming that installer scripts will overwrite /usr/lib/libopenssl.so.1 with their own version, and just keeping a backup somewhere and copying it back after the script executes.

    No OS that I know of does that, because it's unhinged, and on Linux it would probably break the system due to ABI incompatibility.

    If they had taken essentially the same approach as wine and functionally created a WINEPREFIX per application then it would not be unhinged.

    edit: also to be clear, I respect their commitment to backwards compatibility which is what leads to these unhinged decisions. I thoroughly enjoy Raymond Chen’s dev blog because of how unhinged early windows was.

    • Man, after looking at the veritable pile of stinking matter that is Claude Code, compare it with the NT 4 source leak.

      Windows may have suffered its share of bad architectural decisions, but unhinged is a word that I wouldn't apply to their work on Windows.


    • It's easy to forget in these discussions that Microsoft didn't have infinite resources available when writing Windows, and often the dodgy things apps were doing only became clear quite late in the project as app compatibility testing ramped up. Additionally, they had to work with the apps and history they had; they couldn't make apps work differently.

      You say, oh, obviously you should just redirect writes to a shadow layer or something (and later versions of Windows can do that), but at the time they faced the rather large problem that there was no formal concept of an installer or package in Windows. An installer is just an ordinary program, and the OS has no notion of app identity. So how do you know when to activate this redirection? What key identifies the layer the writes are redirected to? How do you handle the case where some writes are upgrades and others are downgrades? And how do you do all that in the short time before shipping (meant literally in those days) starts in just a few months?


    • Windows 95 was not Windows NT, and it still used the FAT file system, which had no way to enforce access rights.

      As TFA says:

      > You even had installers that took even more extreme measures and said, “Okay, fine, I can’t overwrite the file, so I’m going to reboot the system and then overwrite the file from a batch file, see if you can stop me.”


    • You are right that it’s not equivalent, but the article explains why redirecting the writes wasn’t a viable option.

    • > If they had taken essentially the same approach as wine and functionally created a WINEPREFIX per application then it would not be unhinged.

      Man, wouldn't it have been nice if everyone had enough hard drive space in those days in order to do something like that...

    • Two words: proprietary installers.

      If an installer expects to be able to overwrite a file and fails to do so, it might crash, leaving the user with a borked installation.

      Of course you can blame the installer, but resolution of the problem might take a long time, or might never happen, depending on the willingness of the vendor to fix it.

> If . . . the replacement has a higher version number than the one in the SYSBCKUP directory, then the replacement was copied into the SYSBCKUP directory for safekeeping.

This as well. I know there are a million ways for a malicious installer to brick Win95, but a particularly funny one is hijacking the OS to perpetually rewrite its own system components back to compromised version number ∞ whenever another installer tries to clean things up.
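The rule from the quoted passage, and why a bogus version number poisons it, can be sketched in a couple of lines (a toy illustration; the function name and the specific "∞" stand-in value are made up):

```python
def backup_should_update(replacement_ver, backup_ver):
    """The quoted rule: a replacement with a higher version than the
    copy in SYSBCKUP becomes the new backup. Versions are comparable
    tuples like (4, 72, 3110, 0)."""
    return replacement_ver > backup_ver

# A malicious installer that stamps its DLL with an absurdly high
# version wins this comparison once and then never loses it: every
# legitimate component carries a lower version, so the poisoned copy
# stays in SYSBCKUP and is what keeps getting "restored" ever after.
POISONED = (65535, 65535, 65535, 65535)
```

Once `POISONED` is in the backup directory, every later integrity check works against the user instead of for them.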

What's unhinged about a periodic integrity check? It doesn't seem much different from a startup/boot check. If you're talking about security, you've come to the wrong OS.

And blindly overwriting shared libraries despite the guidance the OS vendor provides is somehow hinged, yes?

I agree, it's unhinged for applications to overwrite newer versions of system files with older ones.

You'd have to track down some 16-bit Win3.x software to install. Probably on floppy disks, since CD-ROMs weren't common.