Comment by jmward01
9 days ago
I'm just surprised that it feels like very little deep innovation in the OS world has happened since Windows 2k. 3.11 brought networking in, 95 brought true multitasking to the masses, and 2k brought multi-processing/multi-user (yes, NT 3.1 had it, but 2k is where most normal users jumped in). And yes, I know these things existed in other OSes out there, but I think of these as the mass-market kickoffs for them. In general I just don't see anything but evolutionary improvements (and back-sliding) in the OS world beyond this point. I had really hoped that true cloud OSes would have become the norm by now (I really want my whole house connected as a seamless collection of stuff), or other major advances like killing filesystems (I think of these as backdoor, undocumented APIs). Have we really figured out what an OS is supposed to be, or are we just stuck in a rut?
[edit] 3.1 should have been Windows for Workgroups 3.11
"Normal users" did not jump into Windows 2000 Workstation. That was still an 'enterprise only' OS. Normal users either suffered with WinMe shipping on their desktop computer or jumped from 98SE to XP, given their computer could handle it (aka they bought a new computer).
I think the major change has been that computers are very stable and secure these days. It's night and day compared to the 2000s.
There's a lot working against fundamental change in the PC desktop OSes that corporations use, and therefore in the OSes Microsoft can make money from.
- Big software vendors (Autodesk, Adobe, etc.) making it difficult for Microsoft to deprecate or evolve APIs and/or design approaches to the OS.
- Cybersecurity/IT security groups strongly discouraging anything new as potentially dangerous (which is not incorrect).
- Non-tech people generally not caring about desktop PCs anymore - phones have that crown now.
- Non-tech people caring much more about interface than the actual underpinnings that make things work.
Outside of the PC there's some innovation happening, at least in the OS itself if not the user interface. Check out Fuchsia sometime.
> 2k brought multi-processing/multi-user
Sorta. It was a real pain in the ass to run 2000 as a regular (non-administrator) user, assuming your software worked at all that way; even Office 2000 had some issues. UAC was necessary.
It required attention to detail, from a sysadmin / desktop admin perspective, but it was definitely possible and paid dividends in users being unable to completely destroy machines like they could on the DOS-based Windows versions. I put out a ton of Windows NT Workstation 4.0 and Windows 2000 Pro w/ least-privilege users. It was so convenient to be able to blow away a user's profile and start w/ a clean slate, for the user, w/o having to reload the machine.
Yes, I ran that way on principle, and you could mostly make it work, but not really out of the box. Registry ACL templates and the like qualify as a real PITA.
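If memory serves, making legacy apps usable for non-admin users meant applying a security template with secedit, something like:

    secedit /configure /db compat.sdb /cfg compatws.inf /overwrite

(compatws.inf was the 'Compatible' template that loosened file and registry ACLs so badly behaved apps could run without Power User rights; the exact flags here are from memory, so treat this as a sketch.)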
UAC and the other magic on Vista/7 mollified that by a lot.
Looks like there are some negative feelings towards this comment. So if we aren't in a rut, what are the big revolutionary OS advancements that have happened since then?
This is a forum populated almost entirely by people whose day-to-day existence depends upon building the new stuff that sucks :) (mine too!)
Android (all apps sandboxed). Desktop OSes are still barely catching up to this one.
Desktops have been in a rut for a decade. Windows has sucked post-Win7 in ways that are either conspiracy or the most stinging indictment of managerial incompetence possible. OS X is good, except its key bindings are alien, the hardware is closed, Apple hasn't really improved it at all in ten years, and it has loads of inconsistencies with the Linux CLI. Linux has been in a huge rewrite of the desktop and graphics stack for no real end-user benefit, flubbing the opportunity to gain ground on Windows while Windows tried to commit market-share suicide.
3D compositing, SSDs, mega displays, massive multi-core: all completely wasted.
You know what I should be able to do? Hot-execute Windows, Linux, and OS X on the same desktop without containerization that treats 3D as an afterthought, or worse, a never-thought.
Virtualization. Full-disk encryption (FDE). Hot patching. I/O rings (io_uring), etc.
Virtualization is old. I used VMware on Windows 2000.
Windows 2k already had an io_uring equivalent (I/O completion ports). That's more an example of Linux being out of date due to being based off UNIX.
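For anyone who hasn't touched either API, the shared submit/complete model looks roughly like this on the Linux side. This is a minimal sketch using liburing, with error handling mostly omitted and the file path chosen arbitrarily:

    #include <fcntl.h>
    #include <stdio.h>
    #include <liburing.h>   /* link with -luring */

    int main(void) {
        struct io_uring ring;
        char buf[4096];

        io_uring_queue_init(8, &ring, 0);          /* set up the rings */
        int fd = open("/etc/hostname", O_RDONLY);  /* arbitrary example file */

        /* queue one async read, submit it, then wait for its completion */
        struct io_uring_sqe *sqe = io_uring_get_sqe(&ring);
        io_uring_prep_read(sqe, fd, buf, sizeof(buf), 0);
        io_uring_submit(&ring);

        struct io_uring_cqe *cqe;
        io_uring_wait_cqe(&ring, &cqe);
        if (cqe->res > 0)
            fwrite(buf, 1, cqe->res, stdout);      /* cqe->res = bytes read */
        io_uring_cqe_seen(&ring, cqe);

        io_uring_queue_exit(&ring);
        return 0;
    }

IOCP is structurally similar: associate handles with a completion port, start overlapped operations, and block on GetQueuedCompletionStatus to reap completions.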
Windows has moved everything to SharePoint now, so documents are stored "somewhere" and can be edited by many users, which often causes strange bugs.
Another big degradation is the whole "hidden" %appdata% folder, which grows and grows with no tools to deal with it.
Isn't SharePoint an enterprise document-management tool? I've never interacted with it once.
As for appdata... there are many faults to find in modern Windows changes, but I'm not willing to pin this one on MS. Microsoft's own stuff tends to use %appdata% fairly sensibly, in most cases. The behavior of third-party developers, on the other hand, has been really frustrating. What was initially intended as a universal storage location for some program data has become some kind of program container. Now, whenever you download some giant 300-500 MB Electron app or whatever, you can be sure that it will force its entirety into appdata with no way to change the location. Every one of these developers has decided that their program is so valuable and Important that it's inconceivable the user might want to install it on anything but the system drive. No, our program is unique and deserves nothing but the best!
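For what it's worth, respecting a user-chosen location is nearly free. A hypothetical app could resolve its data directory like this (a minimal C sketch; MYAPP_DATA_DIR is an invented override name, not any real convention):

    #include <stdio.h>
    #include <stdlib.h>

    /* Hypothetical resolution order for an app's data directory,
     * instead of hardcoding %APPDATA%:
     *   1. a user-set override (MYAPP_DATA_DIR is an invented name)
     *   2. the platform default (%LOCALAPPDATA% on Windows)
     *   3. the current directory as a last resort */
    static const char *resolve_data_dir(void) {
        const char *dir = getenv("MYAPP_DATA_DIR");
        if (dir && *dir) return dir;
        dir = getenv("LOCALAPPDATA");
        if (dir && *dir) return dir;
        return ".";
    }

    int main(void) {
        printf("data dir: %s\n", resolve_data_dir());
        return 0;
    }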
>Now, whenever you download some giant 300-500mb Electron app
There's the mistake right there. Electron is to be avoided like the plague; if all I want is the same dumb touchscreen-focused web UI as the creator's website, there's no need to wrap it in a Chrome instance and call it a 'desktop' app when it doesn't follow a single desktop UI convention.
Definitely stuck. We found a pretty strong local optimum that no one has been willing to venture outside of; it's strong enough to keep selling, and that seems to be all that matters these days.
That was during an era when there was actual competition among operating systems. OS/2 definitely pushed Microsoft hard. BeOS woke everyone up even if it wasn't on popular hardware. Bell Labs was still experimenting with Plan 9. There were several commercial Unix vendors.
Monopolies. They ruin markets. Destroy products. Reduce wages. Induce nostalgia.
I am 50/50 on this particular argument for why OSes are in a rut. I think there is actual competition in the form of the various distros out there, and they have passionate advocates with real skill trying new things, but those things don't really take off in a more mainstream way and rarely feel revolutionary.

I think this is more of a track-gauge problem. It is hard to ship a truly novel OS without building all the infrastructure around it so that people can actually use it, and that takes resources at a scale few can muster. What if I wanted to build something that kills off the idea of a filesystem? All the apps out there are written with this concept at their very core, so even if it is a better idea, it is incredibly hard to bring to market, and only huge companies can do it. At the same time, those huge companies are pushing their own versions of things, which makes it hard to compete and have even small innovations in the space.

I don't have a solution here. I had hoped things like WebAssembly would have led to an OS breakaway by now, but it hasn't really happened. Maybe it still will.