Comment by skerit
1 day ago
It comes across as so incredibly insane to me that people from the late 80s (people working with computers! Reporting on them!) would look at their current technology stack and basically go: "I have no idea what else we could do with these things; we've reached the end."
The lack of imagination is just disturbing.
On the other end, you have people who have no idea how insanely fast computers are today, how little computing power is "really" needed for most things computer users do, or how much you can do with one average machine ("Oh no, 1000 requests per second - let's erect another Rube Goldberg machine to handle that!").
I was born in 1981 and my first computer was an Amstrad PC1512, an IBM XT clone. I then had a 386SX-16 in 1991 and a 486DX2-66 in 1994.
Anyway, a while ago I read an article by a guy who lived through the same era I grew up in, laughing at modern developers he had asked to size a machine for adding all the integers from 1 to 100. Setting aside that seven-year-old Gauss found the closed form of that sum (the triangular number formula) in about ten minutes and got the correct result of 5050 without any of the arithmetic busywork, it's totally insane what some of the answers involved - some invoking "Big Data" (yes, it was that era of hype, before "Crypto" and "AI"), and some even (allegedly) mentioning "clusters". I really wish I could find you a link.
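For anyone who wants to see just how little machine that task needs, here's a minimal sketch (mine, not from the article) comparing the brute-force loop with Gauss's closed form n(n+1)/2:

```python
def sum_loop(n: int) -> int:
    """Add 1..n the brute-force way: one addition per term."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_gauss(n: int) -> int:
    """Gauss's closed form: the nth triangular number n*(n+1)/2."""
    return n * (n + 1) // 2

print(sum_loop(100))   # 5050
print(sum_gauss(100))  # 5050, no loop at all
```

Even the loop version is a hundred additions - microseconds on a 386, let alone anything you'd call "Big Data".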
The 80s and 90s were filled with new things computers could do - spreadsheets, wysiwyg word processors, games - things that simply were impossible before (or not done).
From the 2000s through now we've mostly had improvements - 4K YouTube is much better than RealPlayer, but it's still just "online video". AI is definitely a "new" thing, and it's somewhat reawakened the spirit of the 80s/90s - but not with the same breadth. Dad bringing home a computer because he wants to do spreadsheets, and you discovering it can run Doom or even play music.
It's easy to mock in hindsight, but the failure mode isn't lack of imagination. It's extrapolating linearly from physical limits that were real at the time. In 1989, DRAM refresh cycles and bus bandwidth genuinely were bottlenecks that seemed fundamental. What nobody predicted was that the industry would sidestep those walls entirely (caches, pipelines, out-of-order execution, then multicore). Architectural innovation tends to appear orthogonally to wherever the current wall is.
That's not so different from today, wherein:
All we really have to look forward to in the future of increasing-performance personal computing is doing the same things as yesterday, but doing them faster.
The future after today will probably turn out more interesting than that, of course, but we can't know that until it happens.
And the future after 1988 certainly turned out to be a very interesting time in computing -- but they had no idea what was in store. Perhaps you can use your time machine to go back and let them know?
The first 80286-based system (the IBM PC AT), the first 80386 system (the Compaq Deskpro 386), and the 80486 all had people writing about their suitability as servers, the implied consensus being that normal people didn't need them.
The Pentium was the first, I think, where this didn't happen, because by then it had become clear that people need a computer that can do what they're currently doing - but faster - far more often than they need servers.
Um. That never happened. No-one ever felt that. Not a soul.
Everyone - everyone knew it was the start of a revolution.