Back in the day I couldn't even dream of a PC. They were way too expensive. It took my extended family chipping in (~15 people) to buy me a C64 with tape storage. Still, it was great fun. It made me learn programming in BASIC and English at the same time (the Polish-language book included was so badly translated and full of errors it was hopeless).
It was pre-internet obviously, so obtaining software was very difficult. For years while I was learning assembler I used a so-called "monitor cartridge" that did simple assembly/disassembly, but it didn't support labels and such. I could read about software like "Meta Assembler" that let you use labels and variables and think "wow, I could do so much stuff with that..."
My first PC came sometime in the late 90s: a Celeron 233MHz with Windows 95. I wasn't a huge fan of Windows back then. I remember when one of the PC magazines I got came with Red Hat Linux install CDs. I liked it from the start. The fact that my software-only modem and Lexmark printer didn't work got me into kernel programming :-)
Fun to think of it now, but I prefer 2026 a 100x :-)
My first computer was a 486SX 25MHz [1]. The rig (tower, monitor, etc.) cost around $3,000. We got the SX instead of the DX because it was $500 cheaper, and I wanted a 16-bit sound card. (Note that this is in 1992 dollars. Today it would cost over $7,000.)
My parents didn't have a lot of money, but my great-grand father passed and they used some of the inheritance to buy the computer. I was instantly hooked. In hindsight I see how much of a gift my family gave me.
The announcement reminded me of an article John Dvorak [2] wrote around the same time. 1GB hard drives had just come out, and he asked what all the extra space would be used for. Even as a young teenager, I remember thinking how short-sighted that comment was. That was before I realized how the tech press tends to get stuck in local optimizations and can't understand the bigger picture.
It's all a good reminder that cutting edge today doesn't stay cutting edge very long, and the world figures out how to squeeze every ounce of power out of hardware. (Also, yes, that leads to bloat...)
[1] https://en.wikipedia.org/wiki/I486SX
[2] https://en.wikipedia.org/wiki/John_C._Dvorak
> In hindsight I see how much of a gift my family gave me.
True for many, many of us, I suspect. My family bought a 286 in the early 90s and it cost something like $2000 CAD then, which is nearly $4000 now; but salaries were lower then, so it would have been something like 5-6% of my single-income family's yearly post-tax earnings, and as a share of "disposable" income it was probably more like 60% for the year.
Obviously it paid off in that it set me on the path for my career, hard to make any other investment as good as that, but who would have known that at the time? I'm glad that there were so many ads positioning computers as being educational and not just game machines; even though in reality I think it was learning about the computer to make the games work that taught me way more than any educational software ever did.
Ha! Same for me: 286 in 9th grade (1990) for about $2k CAD. 286 was a bad call though as I think it was harder to expand compared with 386. I remember 1MB RAM but really only 640k usable. Had to change some BIOS settings to get to about ~700 kB?
Similar, but I got the 486 DX2-66.
I’ve been thinking a lot about these inflation-adjusted prices due to the big Apple Computer anniversary — an Apple // cost $5000 in 2026 dollars, meanwhile a $600 Macbook Neo cost $150 in 1980 cash!
What helped me reconcile this was an observation that we’ve inverted the prices of necessities and luxury goods. Rent and mortgage in particular were a much smaller slice of income back then, but luxury goods were very expensive, so one would save up for a year or two to buy a new TV or a computer for the kids.
Now the necessities take a much larger slice of our income, but TVs and computers are incredibly cheap. It takes very little money to get a nice computer, and not-buying it barely makes a dent in the bills. This isn’t a good thing.
I do disagree a little with your observation regarding the industry “squeezing every ounce of power out of hardware”. Beyond local LLM stuff, there’s basically nothing a modern computer can comfortably do that any laptop since the mainstreaming of SSDs can’t.
Audio, video, and 3D animation are still extremely processor intensive. You need something beefy if you're serious/professional about those.

Office tools and web browsing are less demanding.

John Dvorak has tons of short-sighted articles.
I wanted to link his columns "Microsoft Dot Nyet" and "New Architecture Needed" from circa 2000-2001 but it turns out they have been memory-holed. They should be somewhere in the wayback machine.
EDIT: At least one of them has not been deleted, just his name has been removed
https://www.pcmag.com/archive/new-architecture-needed-32570
Yikes, you're not wrong. And I guess he's never heard of security issues, what with his ROM idea. Neat for a console (where the ROMs are game cartridges, as they used to be) or an appliance not connected to the internet, not a general-purpose OS...
Pretty much the only thing I agree with is that computer architecture could use a complete rework (both from a software as well as hardware side, though primarily the former); as well as said rework being basically impossible in practice.
> In hindsight I see how much of a gift my family gave me.
Gotta tack on to this thread showing appreciation for parents. We could never afford new computers in the 90s, but luckily my dad could bring home obsolete equipment from work. We were thus always at least a generation behind. I remember my friend's Pentium feeling like sci-fi compared to our 386, but my goodness it completely molded my life!
Later, towards the end of the 90s, those sci-fi Pentiums were obsolete, so I got a few to run "that weird Linux stuff" on. Since it was considered junk, nobody cared what I did with it. To this day, if I happen to hear Metallica play and there's early winter's first smell of snow in the air, my mind will be transported back to that school night I secretly stayed up wayyy too late and discovered SSH for the first time. Haven't looked back.
Thank you, dad! I just hope general computing devices owned by regular people are still normal by the time my children come of age.
My grade school friend got a Nintendo and I wanted one so badly. My parents got me an Apple IIGS instead. I was a little disappointed about the Nintendo, but saw there were plenty of games on the thing, and of course it could do so much more than play games. That turned out to be a very good move on their part.
My mother was a stenographer. She used a 286 for processing docs. That baby wasssss alll mine during the day!!! All my friends had hacks for sys/bat/exe files to get wolfenstein at least to load. Best days of my life.
> My parents didn't have a lot of money ...
Mine neither although the grandparents were moderately wealthy but my mom understood very early on that it was a match for me and that computers would really take off.
Fun story: the first BASIC I ever got was an Atari 2600 cartridge that came with some kind of "keyboard" in two parts you'd plug into the joystick ports. When my parents bought that Atari 2600 they tried it and spent the entire night playing "Tank Attack" on the TV in their bedroom. My mom only told me that years later.
Then, as I was writing tiny BASIC programs on the Atari 2600 gaming console, she realized I needed a "real" computer, so she bought me an Atari 600 XL a bit later. Then I began salivating over the neighbours' Commodore 64, which I could see through a window. And she thought: "If I buy the exact same computer as the neighbours, maybe my son and the neighbours will become friends!". 42 years later, one of those neighbours just went to visit my brother in another country, and his brother and I exchange Telegram messages nearly daily.
Then the Amiga. Then the 386, 486, etc.

What a mom. RIP.
The 486 killer app was DOOM. It was butter-smooth at 20 fps if you also had a VLB graphics card.
The 486 DX2 66MHz was the target platform for gaming for almost two years (1992-1994). That was a huge achievement back in the day, staying on top that long.
The DX/2 66 is a true legend of a chip. It was so good. The final nail in the coffin for the Amiga and for 68k. I love the Amiga, but it just didn’t Doom.
Before it, you could claim that a 68040 was kinda-sorta keeping up with the 486 and that the nicer design and better operating systems of other computers made up for the delta in raw performance, but the DX/2 66 running Doom was the final piece of proof that the worse-is-better approach of using raw CPU grunt to blast pixels at screen memory instead of relying on clever custom circuitry was winning.
Faced with overwhelming evidence, everyone sold their Amiga 1200s and jumped ship to that hated Wintel platform.
I remember arguments (and benchmarks) around all the variations of the 486, since bus speed and clock speed were decoupled (the /2 means clock doubling). For some applications, a 50MHz 486 with a 50MHz bus would beat a DX/2 66MHz with a 33MHz bus.
And sometimes the DX/4 100MHz would be the slowest of them all, on a 25MHz bus.
As I noted in my other comment (1), in 1985 Amiga OCS bitplane graphics (separating each bit of a pixel's index into a separate memory area) was a huge boon for 2D capability, since it lowered bandwidth needs to 6/8ths, but it made 3D rendering a major pain in the ass.
The AGA chipset of the 1200/4000 stupidly only added 2 more bitplanes. The CD32's Akiko chip actually did chunky-to-planar conversion in hardware, but its omission from the 1200 was fatal.
Reading in hindsight, there were probably too many structural issues for Commodore to remain competitive anyhow, but an alt-history where they'd seen the need for 3D rendering coming is tantalizing.
Doom was released at the end of '93. In 1992 most of us were in the 286 -> 386 upgrade wave, and a 486-33 was easily $2.5k+ ($5.5k in today's terms). The 486 DX2 66 was a good choice even in 1994-1996.
My boss then - who's still a very dear friend - purchased a work computer to play Doom. He was already mentally checked out of that job and was looking for his next opportunity. Spent a lot of time at work playing Doom and got quite good at it.
I think it was 1994. It was a loaded 486 with the best 17" CRT monitor money could buy at the time. I think he spent over $7000.
Yes, the latest chips were very expensive back then, and out of reach for most people who would continue buying new computers with older chips. (As opposed to how most people today buy an iPhone or a Mac or whatever with the latest semiconductor technology.) I got my 25MHz 386 in 1991, over two years after the 486 was announced, and I had one of the fastest computers of anybody in school... for a short time.
My first Intel-based PC was actually a 486DX/2-66 "Houdini" card for my PowerMac 6100/60 in late 1994. It had an SB16 daughtercard and could either share RAM with the host Mac or use a dedicated 32MB SIMM. I added the dedicated SIMM when its price dropped to $300.
It's hard to convey to today's generation, who think Ivy Bridge to Haswell was a big jump or whatever, how awesome the 286 -> 386 -> 486 changes were to personal computing. It felt almost like what going from a NES to a Super Nintendo to a N64 felt like. The improvements were astounding.
It wasn't a big jump, but it was a jump. Ivy Bridge lacks the instruction set required to run RHEL 10 [1].
The minimum supported microarchitecture level is x86-64-v3 and Ivy Bridge lacks AVX2 instructions.
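For the curious, the gap can be illustrated by checking a CPU's flag set against a level's requirements. A minimal Python sketch (the flag sets below are abridged from the x86-64 psABI levels, and the Ivy Bridge set is an assumption for illustration; a real check would parse /proc/cpuinfo):

```python
# Abridged feature requirements for the x86-64 microarchitecture levels
# (the full lists live in the x86-64 psABI document).
X86_64_V2 = {"sse3", "ssse3", "sse4_1", "sse4_2", "popcnt", "cx16"}
X86_64_V3 = X86_64_V2 | {"avx", "avx2", "bmi1", "bmi2", "fma", "movbe"}

def max_level(flags: set[str]) -> str:
    """Return the highest (abridged) x86-64 level the flag set satisfies."""
    if X86_64_V3 <= flags:
        return "x86-64-v3"
    if X86_64_V2 <= flags:
        return "x86-64-v2"
    return "x86-64-v1"

# Ivy Bridge has AVX but not AVX2, so it tops out at v2 and can't
# run an x86-64-v3-only build like RHEL 10's.
ivy_bridge = X86_64_V2 | {"avx"}
print(max_level(ivy_bridge))  # x86-64-v2
```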
I remember trying to run a game, Rise of the Triad, which was built with an improved Wolfenstein engine iirc, and having it struggle on my 386 unless I made the viewport as small as possible. At which point it told me to buy a 486... well I did eventually, I guess it worked.
Had the same experience with Doom II. Got it to run surprisingly well on a brand new Tandy 486DX2 + 4MB RAM, though I seem to recall having issues with SoundBlaster compatibility.
We ran a 3-line BBS (Renegade and then Wildcat) on OS/2 on a 486-33 with 12 MB RAM. This was in 1994 or so. Great way to multitask several dos applications!
486 was my dream. Unfortunately, my parents didn't have money for it. I bought my first PC in 1999 - a Pentium 2. I invested a lot of money in the monitor; computers become obsolete very quickly, while a monitor can serve for many years. Surprisingly, flat monitors appeared soon after...
Funny, I'm working with an Intel 686 right now. It's brutal to get stuff to build, e.g. anything rust/cargo-related (missing deps, but mostly the hardware being slow). Recently I've been trying to fix a maturin problem I ran into. But the backwards compatibility of Python 3.11 with 32-bit on Debian 12 is cool.
The CPU I'm working with is a Celeron M 900MHz, single core, no HT, struggling to build wheels for Python (several hours).
It's great Python is/was well supported on i686. Node, on the other hand, almost immediately started requiring SSE2, even in the earliest versions. I haven't had success with Node + Pentium III yet; maybe I need to build an earlier version myself.
I remember getting my first 486 33MHz computer and being able to play Ultima 7: The Black Gate, and later Ultima 7 Part 2. This was a turning point for me, as the game was way ahead of anything on the console side of things. DOS 6!
I think the turning point was that flat framebuffers and plenty of CPU power for the first time eclipsed specialized 2D hardware (Amiga, Megadrive, SNES, etc).
Flat framebuffers and "powerful" CPUs also enabled easier software rendering of 3D (Doom/Duke), compared to the Amiga, where writing a textured renderer is a PITA due to the video memory layout: separated bitplanes spread the bits of each pixel across different memory locations (the memory-bandwidth reduction that was a win in 1985, with 5 or 6 bitplanes, became a fatal bottleneck at this point).
It wasn't always full framerate though, and the 2D chipsets did help in "classic" action games that were still all the rage.
The Pentium further widened the gap, but at the same time consoles gained hardware 3D acceleration (PSX/Saturn/Jaguar); yet the Pentium could still do graphics better in some respects (as shown with Quake).
Once 3D accelerators landed, PCs have more or less constantly been ahead, apart from price (and comfort/ease).
I can understand running an old 486 machine for nostalgia reasons, or because you have some old industrial equipment that relies on it and even one second spent replacing it is a second too many, but I struggle to imagine why you'd want or need to run a modern Linux kernel on it.
• Ran my first Linux at home on an i486-DX2 (33 MHz, 4 MB RAM), which delivered decent X11R6 performance in color in 1992, with a 14" CRT.
• Ran my first real UNIX at home on a PA-RISC (HP 9000-715/75 with HP-UX 9.03 and 96 MB RAM) in 1997, 20" color CRT.
• Today, Linux is still here, but on a 2-CPU, 140-core AMD server with 2 TB RAM, hundreds of TB NAS and a 40" TFT... (and it still takes too long to open the bloated Web browser! - keenly awaiting Ladybird to the rescue in August.)
Hard to imagine now, but this was a huge turning point. A genuinely powerful CPU in a "Pee-Cee", available for less than RISC-workstation money. I had to wait a while; mine was an AMD DX2-66 since I didn't have the budget for Intel... add Slackware... and countless hours messing with XF86Config, and I had a poor man's Sun workstation.
- tinkered for HOURS to get enough EMM/XMM memory by tweaking Config.Sys & Co to get whatever game running
(and having dedicated boot options configured, because you could unload some drivers from mem and could then run other games)
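For anyone who never had the pleasure: those dedicated boot options were MS-DOS 6 multi-configuration blocks in CONFIG.SYS. A sketch from memory (the menu names and driver paths are made up for illustration):

```
[MENU]
MENUITEM=CLEAN, Plain boot, maximum conventional memory
MENUITEM=GAMES, EMS for games
MENUDEFAULT=CLEAN, 10

[COMMON]
DEVICE=C:\DOS\HIMEM.SYS
DOS=HIGH,UMB
FILES=30

[CLEAN]
DEVICE=C:\DOS\EMM386.EXE NOEMS

[GAMES]
DEVICE=C:\DOS\EMM386.EXE RAM
DEVICEHIGH=C:\DOS\MOUSE.SYS
```

DOS=HIGH,UMB plus DEVICEHIGH/LOADHIGH was the usual trick for clawing back conventional memory below 640k.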
I've got an AMD-branded 286 chip, from my first owned-by-me PC, Blu-Tacked to the case of my home desktop PC, which is powered by a Ryzen something-or-other from a few years ago (with a 1060 6GB card from a few years before that, because I wasn't gaming enough to justify a new graphics card along with the other updates at the time).
I too have one sitting on my desk, a 486DX2 66MHz. I've had it for probably 25 years now, bringing it from job to job like the magical lost artifact it is. I remember how much more capable it was for playing Doom and Descent than the 33MHz, or heaven forbid the SX. Of course, shortly after, the Pentium came out and blew everything away. The good ol' days of giant Gateway 2000 towers.
I loved my 486DX2 66MHz based IBM PS/1 (2168), which had a whopping 8MB of RAM. Not only did it really enable me to experience the fullness of PC gaming of the era, but it was the first computer I was able to install an internal modem into, and the computer I used to get SLIP dial-in access to the state university mainframe and thus to the Internet (prior, I was limited to the Prodigy walled garden). It was this computer that let me play early MUDs via telnet, let me play my first graphical MMORPG (Ultima Online), and introduced me to real visual programming (Visual Basic).
To a significant degree, the 486DX2 was the primary computing platform that created the foundation I needed to learn computing at depth, enabled my later career, and really set many of the formative moments in my life. Thanks, Intel: even though you're a shadow of your former self now, you were a beast in the 90s.
While the speed increases weren't as dramatic, do note that even in single-core speed, unlike what the clocks would suggest, the Ryzen 7 is much, much more than 1.23x faster than the P4. The P4 was a particularly fragile architecture, and its achieved IPC on real code was typically well below 1, often closer to 0.5. The X3D variants of Ryzen have been measured running above 3 average IPC on real, complex loads. So the single-core uplift from that P4 to a modern AMD core is about the same as from that 300MHz Pentium to the 3.8GHz P4; it just took 20 years, not 8. Of course, now we also have 8 times the cores.
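The arithmetic behind that claim, with illustrative numbers (the IPC and clock figures below are assumptions for the sake of the comparison, not measurements):

```python
# Rough single-thread throughput ~ clock (GHz) x average achieved IPC.
def giga_ops(ghz: float, ipc: float) -> float:
    return ghz * ipc  # billions of instructions retired per second, roughly

pentium_300 = giga_ops(0.3, 0.9)  # ~1997 Pentium-class core (assumed IPC)
p4_3800 = giga_ops(3.8, 0.5)      # Pentium 4: huge clock, poor achieved IPC
ryzen = giga_ops(5.0, 3.0)        # modern X3D Ryzen-class core (assumed)

print(round(p4_3800 / pentium_300, 1))  # 7.0 -> uplift over ~8 years
print(round(ryzen / p4_3800, 1))        # 7.9 -> similar uplift over ~20 years
```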
> How was the person incorrect that speed increases won't continue forever?
Through the magic of them actually saying something different, which really did end up being proven incorrect. From the blog post above, verbatim, italicizing the relevant bits:
> Writing in the May 8, 1989 issue of Infoworld, Michael Slater warned that the sixfold speed increase seen from 1981 to 1989, going from 5 MHz to 33 MHz, would not be repeated.
Clock speeds used to go up in a straight line (the normal "interpretation" of Moore's law), but once the P4 hit a (kind of useless) 3.8GHz, we leveled off for decades.
A switch from the exponential regime to something immensely slower was a qualitative change. The difference is so vast that it's completely reasonable to say that clock speeds haven't changed a single bit since 2006 or so (and even for raw ops/s, which has improved much more, it's debatable).
> But when Word 97 arrived with real-time spelling and grammar checking and Clippy, the 486 couldn’t keep up. You really needed a Pentium or equivalent to do all three at once without noticeable lag as you typed.
In other words, faster hardware was needed because the quality and performance of the software dropped. I was doing spell-checking with WordStar on a CP/M Apple II with zero lag, and WordStar fit on one side of a 5.25" floppy.
WordStar originally didn't have a spell checker; it was an add-on product. And even after SpellStar was integrated (a response to the NewStar clone's built-in spell checker), it was never as-you-type spell checking, which is what we got in Word 97 and what consumed the cycles on a 486.
Word 97 also had as-you-type grammar checking, which WordStar never had. WordStar did have an add-on, extra-cost grammar checker whose name escapes me at the moment. But again, it was never real-time.
Yes, programs have become bloated, but it is worth it to compare apples to apples.
One might argue that real time isn’t necessary, and one might be right. But that’s different from poorly written.
Fair points. I'd argue that realtime spellcheck doesn't provide a lot of value -- when you're writing you want to focus on the writing and go back and fix the spelling when you do the editing.
I'd argue it was a combination of "now we have more processing power, let's see how we can use it up" and "we don't have to make so many hard design and programming decisions thanks to the extra power", with the result being that you "had" to get the new chips in order to run the new software that was replacing the old software.
Repeat that a number of cycles and we wound up with Windows Vista ;)
Since we're discussing word processors, I would say that WordPerfect 5 for DOS was the best word processor I've used to date (Pages on Mac comes in second). It did almost everything Word does today in terms of word processing (not page layout, but Word is terrible at that anyway; you really need InDesign to do that properly), was fast and easy to work with (keyboard shortcuts for operations are much faster than a mouse/GUI), and didn't require nearly as much processing power.
For me, the 486 was right between my (actually my Dad's) first computer, a 386, and my first personal computer (Pentium MMX). During those couple of years my friends had 486s and I was always jealous. I used to drool at the Best Buy catalog that came every Sunday in the mail.
Nowadays, 486 computers are getting rare and relatively expensive. CPUs themselves are 25, 30, 40, sometimes 50 bucks on eBay. Whole working systems are in the low hundreds, and fully working 486 laptops can fetch 400 or 500 bucks.
For a time, systems with a 386SX were significantly cheaper than those with a 386DX, because the 16-bit data bus meant cheaper motherboards could be used.
If you were running 16-bit software they were little slower than a 386DX at the same clock, and significantly faster than a 286: partly because of higher clocks (286s usually topped out at 12MHz, though there were some 16MHz options, while the slowest 386s ran at 16MHz and some as fast as 40MHz), and partly, when not blocked by instruction-ordering issues, because of the (albeit small by modern standards) instruction pipeline that the 286 lacked.
32-bit software was a lot slower than on a DX because 32-bit data reads and writes took two trips over the 16-bit data bus, but you could at least run the code as it was a full 386 core otherwise (full enhanced protected mode, page based virtual memory, v8086 mode, etc).
The SX also only used 24 bits of the address bus, limiting it to 16MB of RAM compared to the original's 4GB range, though this was not a big issue for most at the time.
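The address-bus arithmetic is simple: addressable memory is 2 to the power of the number of address bits, in bytes:

```python
# Addressable memory is 2**address_bits bytes; report it in MiB.
def addressable_mib(address_bits: int) -> int:
    return 2 ** address_bits // (1024 * 1024)

print(addressable_mib(24))  # 16    -> 386SX: 16 MB ceiling
print(addressable_mib(32))  # 4096  -> 386DX: full 4 GB address space
```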
Ahhh, but it gave me the opportunity to run real programs, coming from an XT!
*Edited to add an example: I could for the first time use AutoCAD.
The price difference between a 286 and a 386SX was negligible, but the software I could use was in another league.
Great throwback... they were awesome procs. With a few SIMMs (4-16 MB) it could do multimedia madness never seen before (play a CD-ROM game with MPEG-1 video). The 486DX4 100 was the last Intel I had before going to Pentium clones (the AMD K series and the shitty Cyrix 6x86).
Heh, I remember using my first machine, a 486, for a long time after it was obsolete, and reading system requirements like: what do you mean "Pentium recommended", and why the hell do you need 16MB of RAM? It's interesting to reflect that old games like Settlers, HoMM 2 or Warcraft 2, which are no worse than modern ones gameplay-wise, used to run on something so vastly underpowered by modern standards that the numbers don't even feel like a real spec.
It comes across as so incredibly insane to me that people from the late 80s (people working with computers! Reporting on them!) would look at their current technology stack and basically go: "I have no idea whatsoever what else we can do with these things, we've reached the end".
On the other end, you have people who have no idea how insanely fast computers are today, and how little computing power is "really" needed for most things that computer users do - or how much you can do with one average machine ("Oh no, 1000 requests per second - let's erect another rube goldberg machine to handle that!").
The 80s and 90s were filled with new things computers could do - spreadsheets, wysiwyg word processors, games - things that simply were impossible before (or not done).
In the 2000s through now we've mostly had improvements - 4k Youtube is much better than realplayer, but it's still just "online video". AI is definitely a "new" thing and it's somewhat awoken a similar spirit to the 80s/90s - but not the same breadth. Dad bringing home a computer because he wants to do spreadsheets and you finding it can run DooM or even play music.
It's easy to mock in hindsight, but the failure mode isn't lack of imagination. It's extrapolating linearly from physical limits that were real at the time. In 1989, DRAM refresh cycles and bus bandwidth genuinely were bottlenecks that seemed fundamental. What nobody predicted was that the industry would sidestep those walls entirely (caches, pipelines, out-of-order execution, then multicore). Architectural innovation tends to appear orthogonally to wherever the current wall is.
All we really have to look forward to in the future of increasing-performance personal computing is doing the same things as yesterday, but doing them faster.
The future after today will probably turn out more interesting than that, of course, but we can't know that until it happens.
And the future after 1988 certainly turned out to be a very interesting time in computing -- but they had no idea what was in store. Perhaps you can use your time machine to go back and let them know?
The first 80286-based system (IBM PC AT), 80386 (Compaq Deskpro 386), and 80486 all had people writing about their suitability as servers, with the consensus's implication being that normal people didn't need them.
The Pentium is the first one, I think, that this didn't happen, because by then it turned out that people need a computer that can do what they are currently doing—but faster—much more often than they need servers.
> My parents didn't have a lot of money ...
Mine neither although the grandparents were moderately wealthy but my mom understood very early on that it was a match for me and that computers would really take off.
Fun story: the first BASIC I ever got was an Atari 2600 cartridge that came with some kind of "keyboard" in two parts you'd plug into the joystick ports. When my parents bought that Atari 2600 they tried it and spent the entire night playing "Tank Attack" on the TV in their bedroom. My mom only told me that years later.
Then, as I was writing tiny BASIC programs on the Atari 2600 gaming console, she realized I needed a "real" computer, so she bought me an Atari 600 XL a bit later. Then I began salivating over the neighbours' Commodore 64, which I could see through a window. And she thought: "If I buy the exact same computer as the neighbours, maybe my son and the neighbours' kids shall become friends!". 42 years later, one of those neighbours just went to visit my brother in another country, and his brother and I exchange Telegram messages nearly daily.
Then the Amiga. Then the 386, 486, etc.
What a mom. RIP.
The 486 killer app was DOOM. It was butter-smooth at 20 fps if you also had a VLB graphics card.
The 486 DX2 66 MHz was the target platform for gaming for almost two years (1992-1994). That was a huge achievement back in the day: staying at the top that long.
The DX/2 66 is a true legend of a chip. It was so good. The final nail in the coffin for the Amiga and for 68k. I love the Amiga, but it just didn’t Doom.
Before it, you could claim that a 68040 was kinda-sorta keeping up with the 486 and that the nicer design and better operating systems of other computers made up for the delta in raw performance, but the DX/2 66 running Doom was the final piece of proof that the worse-is-better approach of using raw CPU grunt to blast pixels at screen memory instead of relying on clever custom circuitry was winning.
Faced with overwhelming evidence, everyone sold their Amiga 1200s and jumped ship to that hated Wintel platform.
I remember arguments (and benchmarks) around all the variations of the 486, since bus speed and clock speed were decoupled (the /2 means clock doubling). For some applications, a 50 MHz 486 with a 50 MHz bus would beat a DX/2 66 MHz with a 33 MHz bus.
And sometimes the DX/4 100 MHz, with its 25 MHz bus, would be the slowest of them all.
8 replies →
As I noted in my other comment (1), in 1985 the Amiga OCS's bitplane graphics (each bit of a pixel's colour index stored in a separate memory area) were a huge boon for 2D capability, since they cut bandwidth to 6/8ths of a byte-per-pixel layout, but they made 3D rendering a major pain in the ass.
The AGA chipset of the 1200/4000 stupidly only added 2 more bitplanes. The CD32's Akiko chip actually did hardware chunky-to-planar conversion, but the omission of a chunky (byte-per-pixel) mode from the 1200 was fatal.
Reading about it in hindsight, there were probably too many structural issues for Commodore to remain competitive anyhow, but an alt-history where they'd seen the need for 3D rendering coming is tantalizing.
1: https://news.ycombinator.com/item?id=47717334
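To put rough numbers on that bandwidth trade-off, here's a back-of-envelope sketch (illustrative arithmetic only, not chipset-accurate; the resolution and plane counts are just the common ones):

```python
# Bytes per 320x200 frame: planar (n bitplanes) vs chunky (one byte per pixel).
W, H = 320, 200

def planar_bytes(bitplanes):
    # Each bitplane holds 1 bit per pixel, packed 8 pixels per byte.
    return (W * H // 8) * bitplanes

def chunky_bytes():
    # Chunky keeps all 8 bits of a pixel together: one byte per pixel.
    return W * H

print(planar_bytes(6), chunky_bytes())    # 48000 64000
print(planar_bytes(6) / chunky_bytes())   # 0.75 -- the "6/8ths"
```

With only 5 bitplanes (32 colours) the saving is even bigger, 5/8ths, which is why the layout looked like a win in 1985.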
16 replies →
At that point in time I would not have called it Wintel yet. That started after Windows 95, IIRC.
Yep. 486DX/2 was when I started seriously looking at moving on from the Amiga. I wound up with a DX/4 100 sometime in 1994.
My classmate kept his Amiga 1200 a bit longer! ...eventually he got a PC with Pentium 60 MHz.
14 replies →
Slightly before DOOM came out, the killer 486 app for me was Fractint (https://en.wikipedia.org/wiki/Fractint)
I distinctly remember having a Strike Commander poster in my bedroom saying “Strike really flies on a 486 DX/2”. Fond memories indeed.
Doom was released at the end of '93. In 1992 most of us were in the 286 -> 386 upgrade wave, and a 486-33 was easily $2.5k+ ($5.5k in today's terms). The 486 DX2 66 remained a good choice even in 1994-1996.
My boss then - who's still a very dear friend - purchased a work computer to play Doom. He was already mentally checked out of that job and was looking for his next opportunity. Spent a lot of time at work playing Doom and got quite good at it.
I think it was 1994. It was a loaded 486 with the best 17" CRT monitor money could buy at the time. I think he spent over $7000.
Yes, the latest chips were very expensive back then, and out of reach for most people who would continue buying new computers with older chips. (As opposed to how most people today buy an iPhone or a Mac or whatever with the latest semiconductor technology.) I got my 25MHz 386 in 1991, over two years after the 486 was announced, and I had one of the fastest computers of anybody in school... for a short time.
My first Intel-based PC was actually a 486DX/2-66 "Houdini" card for my PowerMac 6100/60 in late 1994. It had an SB16 daughtercard and could either share RAM with the host Mac or use a 32MB dedicated SIMM. I added the dedicated SIMM when its price dropped to $300.
I wonder, I wonder where one could find a good book about the software architecture of that game… oh, well
They need to bring back the turbo button.
You’re in luck!
https://www.silverstonetek.com/en/product/info/computer-chas...
1 reply →
++1
https://en.wikipedia.org/wiki/VESA_Local_Bus for the younger crowd.
...and with 8 MB (-eight- for the youngsters ;-) of RAM you were absolutely the king :-D
The 486 and https://www.delorie.com/djgpp/history.html changed everything.
Suddenly, it was possible to imagine running advanced software on a PC, and not have to spend 25,000 USD on a workstation.
It's hard to convey to today's generation, who think Ivy Bridge to Haswell was a big jump or whatever, how awesome the 286 -> 386 -> 486 changes were to personal computing. It felt almost like what going from a NES to a Super Nintendo to a N64 felt like. The improvements were astounding.
It wasn't a big jump, but it was a jump. Ivy Bridge lacks the instruction set required to run RHEL 10 [1]. The minimum supported microarchitecture level is x86-64-v3 and Ivy Bridge lacks AVX2 instructions.
[1]: https://docs.redhat.com/en/documentation/red_hat_enterprise_...
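To make the microarchitecture levels concrete, here's a toy check. The flag names follow /proc/cpuinfo conventions and the required-feature list is abridged and illustrative; see the x86-64 psABI for the authoritative table:

```python
# Abridged x86-64-v3 feature set (the level RHEL 10 requires).
V3_FLAGS = {"avx", "avx2", "bmi1", "bmi2", "f16c", "fma", "lzcnt", "movbe"}

def meets_v3(cpu_flags):
    # True if every required v3 feature appears in the CPU's flag list.
    return V3_FLAGS <= set(cpu_flags)

# Ivy Bridge has AVX but lacks AVX2/BMI/FMA, so it falls short;
# Haswell added the missing pieces. (Flag sets here are illustrative.)
ivy_bridge = {"sse4_2", "popcnt", "aes", "avx"}
haswell = ivy_bridge | {"avx2", "bmi1", "bmi2", "f16c", "fma", "lzcnt", "movbe"}
print(meets_v3(ivy_bridge))   # False
print(meets_v3(haswell))      # True
```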
2 replies →
I remember trying to run a game, Rise of the Triad, which was built with an improved Wolfenstein engine iirc, and having it struggle on my 386 unless I made the viewport as small as possible. At which point it told me to buy a 486... well I did eventually, I guess it worked.
Had the same experience with Doom II. Got it to run surprisingly well on a brand new Tandy 486DX2 + 4MB RAM, though I seem to recall having issues with SoundBlaster compatibility.
Amazing to see a webpage "Updated Dec 1998" still up, running and displaying correctly.
Without fancy JS or CSS, sites can last decades easily
4 replies →
and don't forget the _legendary_ RHIDE dev environment!
https://ftp.gwdg.de/pub/gnu/www/directory/all/rhide.html
:-)
And you could use VESA linear framebuffer above 256KB - this was a breakthrough back then :-))
It was really the 386 that was the beginning of modern computing, since it had an MMU.
Several operating systems on 286 (eg Xenix, Coherent, OS/2) used its MMU for multitasking and memory protection. See https://en.wikipedia.org/wiki/Intel_80286#Protected_mode
4 replies →
Except the 486 had hardware floating point, essential for technical work.
5 replies →
We ran a 3-line BBS (Renegade and then Wildcat) on OS/2 on a 486-33 with 12 MB RAM. This was in 1994 or so. Great way to multitask several DOS applications!
486 was my dream. Unfortunately, my parents didn't have money for it. I bought my first PC in 1999 - a Pentium 2. I invested a lot of money in the monitor; computers become obsolete very quickly, while a monitor can serve for many years. Surprisingly, flat monitors appeared soon after...
Yeah but the first LCD screens sucked. Poor color rendition and not usable for gaming. In the early 2000s you were better off sticking with your CRT.
I didn't have access to a 486 until around 1999. I was making do with a hand-me-down 8088 and then a 386SX.
Back then, 10 years of technological advancement made a huge difference. Today, you can get by just fine with a 2016-era laptop.
Friend of mine is still rocking a 1st gen retina MacBook Pro (from 2012) for music production!
Funny - I'm working with an Intel i686 right now. It's brutal to get stuff to build, e.g. anything Rust/Cargo-related (missing deps, but mostly the hardware being slow). Recently I was trying to fix a maturin problem I ran into. But the backwards compatibility of Python 3.11 on 32-bit Debian 12 is cool.
The CPU I'm working with is a Celeron M 900 MHz, single core, no HT, struggling to build wheels for Python (several hours).
It's great Python is/was well supported on i686. Node, on the other hand, started requiring SSE2 almost immediately, even in the earliest versions. I haven't had success with Node + Pentium III yet; maybe I need to build an earlier version myself.
I got it to work on an Intel 270, but that has HT and 1.6 GHz, and it's still slow (hours to build wheels) - specifically temporalio and cryptography.
Yeah, Node is usually my go-to; love JS.
I remember getting my first 486 33 MHz computer and being able to play Ultima VII: The Black Gate, and later Ultima VII Part Two. This was a turning point for me, as the game was way ahead of anything on the console side of things. DOS 6!
I think the turning point was that flat framebuffers and plenty of CPU power for the first time eclipsed specialized 2D hardware (Amiga, Mega Drive, SNES, etc).
Flat framebuffers and "powerful" CPUs also enabled easier software rendering of 3D (Doom/Duke), compared to the Amiga, where writing textured rendering is a PITA due to the video memory layout: separate bitplanes spread the bits of each pixel across different memory locations (the total memory-bandwidth reduction from using 5 or 6 bitplanes, a win in 1985, became a fatal bottleneck at this point).
It wasn't always full framerate though, and the 2D chipsets did still help in the "classic" action games that were still all the rage.
The Pentium further widened the gap, but at the same time consoles gained hardware 3D acceleration (PSX/Saturn/Jaguar) - yet the Pentium could still do graphics better in some respects (as shown with Quake).
Once 3D accelerators landed, PCs have more or less constantly been ahead, apart from price (and comfort/ease).
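A toy sketch of the planar pain: writing one pixel is a single store in a chunky framebuffer, but a read-modify-write in every bitplane on planar hardware. This is a pure Python simulation of the two memory layouts (sizes and names are made up, not actual Amiga chipset code):

```python
WIDTH, HEIGHT, PLANES = 320, 200, 5   # 5 bitplanes = 32 colours

def put_chunky(fb, x, y, colour):
    fb[y * WIDTH + x] = colour        # one byte store, done

def put_planar(planes, x, y, colour):
    byte, bit = x // 8, 7 - (x % 8)   # pixels packed 8 per byte
    idx = y * (WIDTH // 8) + byte
    for p in range(PLANES):           # touch all five bitplanes
        if (colour >> p) & 1:
            planes[p][idx] |= 1 << bit
        else:
            planes[p][idx] &= ~(1 << bit) & 0xFF

chunky = bytearray(WIDTH * HEIGHT)
planar = [bytearray((WIDTH // 8) * HEIGHT) for _ in range(PLANES)]
put_chunky(chunky, 10, 3, 21)
put_planar(planar, 10, 3, 21)
```

Five read-modify-writes scattered across memory per textured pixel, versus one store: that's the bottleneck the comment describes.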
Linux kernel version 7.1 will drop support for 486: "Linux devs think even one second spent on 486 support is a second too many." https://arstechnica.com/gadgets/2026/04/linux-kernel-maintai...
>This chip was originally introduced in 1989, was replaced by the first Intel Pentium in 1993, and was fully discontinued in 2007
That's really long compared to the one-year refresh cycles we have today with phones etc.
I can understand running an old 486 machine for nostalgia reasons, or because you have some old industrial equipment that relies on it and even one second spent replacing it is a second too many, but I struggle to imagine why you'd want or need to run a modern Linux kernel on it.
• Ran my first Linux at home on an i486-DX2 (33 MHz, 4 MB RAM), which delivered decent X11R6 performance in color in 1992, with a 14" CRT.
• Ran my first real UNIX at home on a PA-RISC (HP 9000-715/75 with HP-UX 9.03 and 96 MB RAM) in 1997, 20" color CRT.
• Today, Linux is still here, but on a 2-CPU, 140-core AMD server with 2 TB RAM, hundreds of TB NAS and a 40" TFT... (and it still takes too long to open the bloated Web browser! - keenly awaiting Ladybird to the rescue in August.)
I remember compiling the Linux kernel on SuSE 6.3 on an AMD 486 DX5 133 MHz... good times, and don't forget to do "make mrproper"
> and it still takes too long to open the bloated Web browser! - keenly awaiting Ladybird to the rescue in August
chromium browsers launch pretty fast. If you're talking about memory usage, Ladybird isn't aimed at minimal memory usage from what I've seen.
Wouldn't the DX2 be 66 MHz? Or did you intentionally run it at 33 MHz?
Hard to imagine now, but this was a huge turning point. A genuinely powerful CPU in a "Pee-Cee" available for less than RISC workstation money. I had to wait a while - mine was an AMD DX2-66, since I didn't have the budget for Intel... add Slackware... and countless hours messing with XF86Config, and I had a poor man's Sun workstation.
Raise your hand if you have been there and:
- tinkered for HOURS tweaking CONFIG.SYS & co. to get enough EMS/XMS memory to run whatever game (and had dedicated boot options configured, because you could unload some drivers from memory and then run other games)
:-D
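For the youngsters: a DOS 6 multi-config CONFIG.SYS looked roughly like this (paths, driver names, and menu entries here are made-up examples, not a real machine's config):

```
[MENU]
MENUITEM=NORMAL, Normal (EMM386 + CD-ROM driver)
MENUITEM=GAMES,  Games (maximum free conventional memory)

[COMMON]
DEVICE=C:\DOS\HIMEM.SYS
DOS=HIGH,UMB

[NORMAL]
DEVICE=C:\DOS\EMM386.EXE RAM
DEVICEHIGH=C:\DRIVERS\CDROM.SYS /D:MSCD001

[GAMES]
REM No EMM386, no CD-ROM driver: every last KB below 640 goes to the game
```

Picking "Games" at boot skipped the drivers entirely, which is exactly the "dedicated boot options" trick the comment describes.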
I still have a 486 Linux system from those days - it has not been turned on in this century, but I'll try it some day, together with a glass of whisky :-)
And of course, support for this venerable processor will be dropped in Linux kernel 7.1 in a couple of months time.
Microsoft was and still is the reason why average people needed more powerful chips lol, maybe with the exception of browser bloat.
I've got one sitting on the shelf above my desk, a 33 MHz DX; I don't even remember what machine it came out of.
I've got an AMD-branded 286 chip, from my first owned-by-me PC, Blu-Tacked to the case of my home desktop PC, powered by a Ryzen something-or-other from a few years ago (with a 1060/6GB card from a few years before that, because I wasn't gaming enough to justify a new graphics card along with the other updates at the time).
I too have one sitting on my desk, a 486DX2 66 MHz. I've had it for probably 25 years now, bringing it from job to job like the magical lost artifact it is. I remember how much more capable it was at playing Doom and Descent than the 33 MHz, or heaven forbid the SX. Of course, shortly after, the Pentium came out and blew everything away. The good ol' days of giant Gateway 2000 towers.
I got a paper route just to get a hold of the dx2.
It was a life-changing machine.
Ordered, I believe, from the depths of a Computer Shopper magazine.
I loved my 486DX2 66 MHz based IBM PS/1 (2168), which had a whopping 8 MB of RAM. Not only did it really enable me to experience the fullness of PC gaming of the era, but it was the first computer I was able to install an internal modem into, and the computer I used to get SLIP dial-in access to the state university mainframe and thus to the Internet (prior, I was limited to the Prodigy walled garden). It was this computer that let me play early MUDs via telnet, let me play my first graphical MMORPG (Ultima Online), and introduced me to real visual programming (Visual Basic).
To a significant degree, the 486DX2 was the primary computing platform that created the foundation I needed to learn computing in depth and enabled my later career, and it set many of the formative moments in my life. Thanks, Intel - even though you're now a shadow of your former self, you were a beast in the 90s.
How was the person incorrect that speed increases wouldn't continue forever? The Pentium 4 hit 3.8 GHz, and a Ryzen 7 runs at 4.7 GHz some 20-odd years later.
While the speed increases weren't as dramatic, do note that even in single-core speed, the Ryzen 7 is much, much more than 1.23x faster than the P4, unlike what the clocks would suggest. The P4 was a particularly fragile architecture, and its achieved IPC on real code was typically well below 1, often closer to 0.5. The X3D variants of Ryzen have been measured running above 3 average IPC on real, complex loads. So the single-core uplift from that P4 to a modern AMD core is about the same as from that 300 MHz Pentium to the 3.8 GHz P4 - it just took 20 years, not 8. Of course, now we also have 8 times the cores.
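Spelled out as arithmetic (using the rough IPC figures quoted above, not measured values):

```python
# Effective single-core throughput ~ clock (GHz) x average IPC.
p4_ghz, p4_ipc = 3.8, 0.5        # rough figure for the Pentium 4
ryzen_ghz, ryzen_ipc = 4.7, 3.0  # rough figure for a modern Ryzen core

speedup = (ryzen_ghz * ryzen_ipc) / (p4_ghz * p4_ipc)
print(round(speedup, 1))   # ~7.4x single-core, despite only ~1.24x the clock
```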
> How was the person incorrect that speed increases won't continue forever?
Through the magic of the person having actually said something different - which really did end up being proven incorrect. From the blog post above, verbatim, italicizing the relevant bits:
> Writing in the May 8, 1989 issue of Infoworld, Michael Slater warned that the sixfold speed increase seen from 1981 to 1989, going from 5 MHz to 33 MHz, would not be repeated.
Clock speeds used to go up in a straight line (the normal "interpretation" of Moore's law) - but once the P4 hit a (kind of useless) 3.8 GHz, we leveled off for decades.
More specifically, it was the end of Dennard scaling [0] that killed off the growth in clock speeds in the mid-2000s.
[0] https://en.wikipedia.org/wiki/Dennard_scaling
(To make it clear, straight line on a log scale. Exponential on a linear scale.)
A switch from the exponential regime to something immensely slower was a qualitative change. The difference is so vast that it's completely reasonable to say that clock speeds haven't changed a single bit since 2006 or so (and even for raw ops/s speeds, which have improved much more, it's debatable).
> But when Word 97 arrived with real-time spelling and grammar checking and Clippy, the 486 couldn’t keep up. You really needed a Pentium or equivalent to do all three at once without noticeable lag as you typed.
In other words, faster hardware was needed because the quality and performance of the software dropped. I was doing spell-checking with WordStar on a CP/M Apple II with zero lag -- and WordStar fit on one side of a 5.25" floppy.
WordStar originally didn't have a spell checker; it was an add-on product. And even after SpellStar was integrated (a response to the NewStar clone's built-in spell checker), it was never as-you-type spell checking, which is what we got in Word 97, and what consumed the cycles on a 486.
Word 97 also had as-you-type grammar checking, which WordStar never had. WordStar did have an add-on, extra-cost grammar checker whose name escapes me at the moment. But again, it was never real-time.
Yes, programs have become bloated, but it is worth it to compare apples to apples.
One might argue that real time isn’t necessary, and one might be right. But that’s different from poorly written.
Fair points. I'd argue that realtime spellcheck doesn't provide a lot of value -- when you're writing you want to focus on the writing and go back and fix the spelling when you do the editing.
I'd argue it was a combination of "now we have more processing power lets see how we can use it up" and "we don't have to make so many hard design and programming decisions thanks to the extra power", with the result being that you "had" to get the new chips in order to run the new software that was replacing the old software
Repeat that a number of cycles and we wound up with Windows Vista ;)
Since we're discussing word processors, I would say that WordPerfect 5 for DOS was the best word processor I've used to date (Pages on Mac comes in second). It did almost everything that Word does today in terms of word processing (not page layout, but Word is terrible at that anyway; you really need InDesign to do that properly), was fast and easy to work with (keyboard shortcuts are much faster than a mouse/GUI), and didn't require nearly as much processing power.
Apples to apples? More like Windows to Windows. LOL
For me, the 486 was right between my (actually my Dad's) first computer, a 386, and my first personal computer (Pentium MMX). During those couple of years my friends had 486s and I was always jealous. I used to drool at the Best Buy catalog that came every Sunday in the mail.
Nowadays, 486 computers are getting rare and relatively expensive. CPUs themselves are 25, 30, 40, sometimes 50 bucks on eBay. Whole working systems are in the low hundreds, and fully working 486 laptops can fetch 400 or 500 bucks.
sigh
dx2 gang
486 SX 33 MHz - could not afford the DX.
My experience too, as I dimly remember it.
The 486 SX was a fine chip, just no math coprocessor.
The 386 SX was crap - 16-bit wide bus, IIRC.
For a time, systems with a 386SX were significantly cheaper than those with a 386DX, because the 16-bit data bus meant cheaper motherboards could be used.
If you were running 16-bit software, they were little slower than a 386DX at the same clock, and significantly faster than a 286. That was partly because of higher clocks (286s usually topped out at 12 MHz, though there were some 16 MHz options, while the slowest 386s ran at 16 MHz and some as fast as 40 MHz), but also partly, when not blocked by instruction-ordering issues, because of the (albeit small by modern standards) instruction pipeline, which the 286 lacked.
32-bit software was a lot slower than on a DX, because 32-bit reads and writes took two trips over the 16-bit data bus, but you could at least run the code, as it was a full 386 core otherwise (full enhanced protected mode, page-based virtual memory, v8086 mode, etc).
The SX also only exposed 24 bits of the address bus, limiting it to 16 MB of RAM compared to the original's 4 GB range, though this was not a big issue for most people at the time.
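The address-space and bus arithmetic, for the curious (simple illustrative math, nothing more):

```python
# Address-space ceiling: 24 address lines vs the full 32.
print(2**24 // 2**20)   # 16 -> the 386SX's 16 MB limit
print(2**32 // 2**30)   # 4  -> the 386DX's 4 GB range

# Bus cost: how many bus cycles to move one access of a given width?
def bus_cycles(access_bits, bus_bits):
    return -(-access_bits // bus_bits)   # ceiling division

print(bus_cycles(32, 16))   # 2 -> a 32-bit access takes two trips on the SX
print(bus_cycles(32, 32))   # 1 -> one trip on the DX
```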
Ahhh, but it gave me the opportunity to run real programs, coming from an XT! *Edited to add an example: I could use AutoCAD for the first time. The price difference between a 286 and a 386SX was negligible, but the software I could run was in another league.
1 reply →
I can't remember - could you buy a math coprocessor for it?
I know you could pair my 286 with a 287 next to it... not sure if it made a difference you could discern outside of hyper-specific uses though.
2 replies →
Great throwback... they were awesome procs. With a few SIMMs (4-16 MB) it could do multimedia madness never seen before (play a CD-ROM game with MPEG-1 video). The 486DX4 100 was the last Intel I had before going to Pentium clones (the AMD K series and the shitty Cyrix 6x86).
Heh, I remember using my first machine, a 486, for a long time after it was obsolete, and reading system requirements like: what do you mean "Pentium recommended", and why the hell do you need 16 MB of RAM? It's interesting to reflect that old games like Settlers, HoMM 2, or Warcraft 2, which are no worse than modern ones gameplay-wise, used to run on something so vastly underpowered by modern standards that the numbers don't even feel like a real spec.
don't forget the original Command & Conquer
Hard to convey these days how the 486 felt like an absolute quantum leap in computing power.
I built a 486 Compaq Novell server for the company I worked for and named it Godzilla - gives a sense of how the 486 was seen.
Ooh! I recall I had this setup - not in '89, but sometime in the early 90s.
Played some awesome games, like Doom and Wolfenstein. Later, Duke3D was the shit. But I can't remember if I ran them on the same setup or something newer.
It comes across as so incredibly insane to me that people from the late 80s (people working with computers! Reporting on them!) would look at their current technology stack and basically go: "I have no idea whatsoever what else we can do with these things; we've reached the end."
The lack of imagination is just disturbing.
On the other end, you have people who have no idea how insanely fast computers are today, and how little computing power is "really" needed for most things that computer users do - or how much you can do with one average machine ("Oh no, 1000 requests per second - let's erect another rube goldberg machine to handle that!").
The 80s and 90s were filled with new things computers could do - spreadsheets, wysiwyg word processors, games - things that simply were impossible before (or not done).
In the 2000s through now we've mostly had improvements - 4k Youtube is much better than realplayer, but it's still just "online video". AI is definitely a "new" thing and it's somewhat awoken a similar spirit to the 80s/90s - but not the same breadth. Dad bringing home a computer because he wants to do spreadsheets and you finding it can run DooM or even play music.
It's easy to mock in hindsight, but the failure mode isn't lack of imagination. It's extrapolating linearly from physical limits that were real at the time. In 1989, DRAM refresh cycles and bus bandwidth genuinely were bottlenecks that seemed fundamental. What nobody predicted was that the industry would sidestep those walls entirely (caches, pipelines, out-of-order execution, then multicore). Architectural innovation tends to appear orthogonally to wherever the current wall is.
That's not so different than today, wherein:
All we really have to look forward to in the future of increasing-performance personal computing is doing the same things as yesterday, but doing them faster.
The future after today will probably turn out more interesting than that, of course, but we can't know that until it happens.
And the future after 1988 certainly turned out to be a very interesting time in computing -- but they had no idea what was in store. Perhaps you can use your time machine to go back and let them know?
The first 80286-based system (IBM PC AT), 80386 (Compaq Deskpro 386), and 80486 all had people writing about their suitability as servers, with the consensus's implication being that normal people didn't need them.
The Pentium is the first one, I think, where this didn't happen, because by then it had turned out that people need a computer that can do what they are currently doing - but faster - much more often than they need servers.
Um. That never happened. No-one ever felt that. Not a soul.
Everyone - everyone knew it was the start of a revolution.