Comment by Supermancho

15 hours ago

> Anyone writing applications for users was targeting Windows and using Microsoft.

Developers as users, sure; MSFT was common. But for developers responsible for infrastructure, anything MSFT was considered a huge risk and unreliable in the 90s.

Granted, my memory retains only a general narrative... I remember a shift by 2002ish, when I started to see Windows servers as perfectly fine machines for closet/under-the-table infra you didn't care too much about anyway. By 2004 they were moving out of the closet, so to speak. Then those machines became more important because more was being done with them, and they came to be considered "just as good" as any other OS. Developers with experience, MSFT certs in hand, were cheaper too. It was a slow progression of eating into the corporate market share. By 2006 virtual machines were ubiquitous and you could run MSFT virtualized; many companies do that by default today for workspace controls.

I have never chosen, and would never choose, MSFT products (including Azure) for business-critical infra. MSFT acquiring GitHub was great for them, and the death of it for me. I'm probably an old outlier, but I 'member.

I think the first shift was the reckoning with Windows NT actually being decent software. Windows 2000 (AKA NT 5.0) included Active Directory, WebDAV support, and a host of other features that were genuinely useful in a sysadmin setting [0]. It also shipped with IE5, which introduced XMLHttpRequest and was the best web browser by a mile. Between their pushy sales reps and so much stuff being included by default, I think it got kind of hard to push for anything else for a while.
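For anyone who missed that era: "introduced XMLHttpRequest" meant the MSXML ActiveX control, not the standard constructor browsers expose today. Here's a minimal sketch (TypeScript, not any particular library's code) of the kind of feature detection pages did for years afterward; the MSXML ProgIDs are real, while the helper name and request URL are just placeholders.

```ts
// Ambient declaration so the sketch compiles; IE provided this global natively.
declare const ActiveXObject: new (progId: string) => any;

function createXhr(): any {
  if (typeof XMLHttpRequest !== "undefined") {
    // Native constructor: IE7+ and every other browser since.
    return new XMLHttpRequest();
  }
  // IE5/IE6: XMLHttpRequest lived inside the MSXML ActiveX control.
  for (const progId of ["Msxml2.XMLHTTP", "Microsoft.XMLHTTP"]) {
    try {
      return new ActiveXObject(progId);
    } catch {
      // Try the next ProgID.
    }
  }
  throw new Error("No XMLHttpRequest implementation available");
}

// Classic readyState polling, long before fetch() and promises.
const xhr = createXhr();
xhr.onreadystatechange = () => {
  if (xhr.readyState === 4 && xhr.status === 200) {
    console.log(xhr.responseText);
  }
};
xhr.open("GET", "/some/resource", true); // placeholder URL
xhr.send(null);
```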

[0] https://en.wikipedia.org/wiki/Windows_2000