Comment by ahartmetz

1 day ago

On the other end, you have people who have no idea how insanely fast computers are today, and how little computing power is "really" needed for most things that computer users do - or how much you can do with one average machine ("Oh no, 1000 requests per second - let's erect another Rube Goldberg machine to handle that!").

I was born in 1981 and my first computer was an Amstrad 1512 IBM XT clone. I then had a 386SX-16 in 1991 and a 486DX2-66 in 1994.

Anyway, a while ago I was reading an article by a guy who lived through the same era I grew up in, laughing at modern developers he had asked to size a machine to add all the integers from 1 to 100. Setting aside that 7-year-old Gauss found the closed form of that sum (the triangular number formula) in about ten minutes and got the correct result of 5050 without any of the arithmetic busywork, it's totally insane what some of the answers involved... some invoked the term "Big Data" (yes, it was that era of hype, before "Crypto" and "AI") and some even (allegedly) mentioned 'clusters'. I really wish I could find you a link.
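For anyone who wants to see just how little machine that task needs, here's a minimal Python sketch of both approaches - the brute-force loop and Gauss's closed form:

```python
# Summing the integers 1..100: the task the developers were asked to "size a machine" for.

# Brute force: a 100-iteration loop. Any home computer from the 1980s
# finishes this faster than you can blink - no cluster required.
brute_force = sum(range(1, 101))

# Gauss's closed form: the triangular number n*(n+1)/2. No loop at all.
n = 100
closed_form = n * (n + 1) // 2

print(brute_force, closed_form)  # both print 5050
```

Either way, it's a handful of machine instructions, which is rather the point.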

The 80s and 90s were filled with new things computers could do - spreadsheets, WYSIWYG word processors, games - things that simply were impossible before (or not done).

From the 2000s through now we've mostly had improvements - 4K YouTube is much better than RealPlayer, but it's still just "online video". AI is definitely a "new" thing and it's somewhat awoken a similar spirit to the 80s/90s - but not the same breadth. Dad bringing home a computer because he wanted to do spreadsheets, and you discovering it could also run Doom or even play music.