Comment by somenameforme

5 days ago

Advances in video cards and graphics tech were overwhelmingly driven by video games. John Carmack, for instance, was directly involved in those advances, and 'back in the day' it wasn't uncommon for games, particularly his, to be developed in collaboration with the hardware guys to run on tech that did not yet exist. Your desktop was outdated after a year and obsolete after two, so it was a very different time from today, where your example is not only completely accurate but really understating it - a good computer from 10 years ago can still do 99.9% of what people need, even things like high end gaming are perfectly viable with well dated cards.

> a good computer from 10 years ago can still do 99.9% of what people need, even things like high end gaming are perfectly viable with well dated cards.

HN is strange. I have an old gaming build from 7-8 years ago, and while it can run high-end games at low settings and resolution, it doesn’t hold a candle to even a mid-range modern build.

“Viable” is doing a lot of work in that claim. You can tolerate it at low resolution and settings if you’re okay with a lot of frame-rate dips, but nobody is going to mistake it for a modern build.

You’re also exaggerating how fast video cards became obsolete in the past. Many of us gamed just fine on systems that weren’t upgraded for 5-6 years at a time.

  • I'll take the absurd extreme end of my claim. Here [1] is a video of somebody running modern games on a GeForce GTX 1080 Ti, a card that was high end... 8 years ago. And he's running them at high-to-ultra settings in 4K, and it's still doing fine. Spend a couple hundred on a "new" video card and he'd be rocking a stable 60+ FPS on everything - with some games he's hitting that even with his current card!

    And back in the early 2000s, even bleeding edge current-year rigs would struggle with new games like Doom 3, Far Cry, Crysis, and so on. Hardware was advancing so rapidly that games were being built in anticipation of upcoming hardware, so you had this scenario where high end systems bought in one year would struggle with games released that year, let alone systems from 5-6 years prior.

    Obviously if you're referencing CRPGs and the like, then yeah - absolutely anything could run them. That's even more true today. Baldur's Gate 3's minimum requirement is a GTX 970, a card more than 11 years old. Imagine a 1989 computer trying to run Baldur's Gate 2!

    [1] - https://www.youtube.com/watch?v=PRHjzDg_VHw

yes. a good reason to upgrade was PCIe 4.0 for I/O. GPU and SSD needs caused PCIe 5.0 to follow soon after.

  • I'm still on PCIe 3.0 on my main machine, and the RX580 works fine for my needs. Outside the scope of OP, I recently picked up a (new) 5060 - not due to the impending memory production apocalypse, but because I wanted to extend my current setup with something I recently read about, LSFG, previously posted here but which garnered no interest/comments.

    - https://news.ycombinator.com/item?id=44499245

  • I wonder about this... I had thought I would be on PCIe 5.0 by now, but I'm still on my AM4 PCIe 4.0 board since AM5 and PCIe 5.0 seem... glitchy and heat-prone. And apparently I'm still not saturating PCIe 4.0...
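    For anyone curious why saturating PCIe 4.0 is hard: each generation roughly doubles the per-lane transfer rate (Gen 3 ≈ 8 GT/s, Gen 4 ≈ 16 GT/s, Gen 5 ≈ 32 GT/s), and usable bandwidth is the raw rate minus line-code overhead (8b/10b for Gen 1-2, 128b/130b for Gen 3+). A minimal sketch of the arithmetic, assuming an x16 slot (the rate table and encoding efficiencies come from the PCIe specs; the function name is mine):

    ```python
    # Approximate usable PCIe bandwidth per generation for an x16 slot.
    # Per-lane rates (GT/s) by generation, from the PCIe base specs.
    GEN_BY_RATE = {2.5: 1, 5.0: 2, 8.0: 3, 16.0: 4, 32.0: 5}

    def x16_bandwidth_gbps(rate_gts: float, lanes: int = 16) -> float:
        """Usable bandwidth in GB/s after line-code overhead."""
        # Gen 1/2 use 8b/10b encoding (80% efficient); Gen 3+ use 128b/130b.
        eff = 0.8 if rate_gts <= 5.0 else 128 / 130
        return rate_gts * eff * lanes / 8  # GT/s -> GB/s per lane, x lanes

    for rate in (8.0, 16.0, 32.0):
        print(f"Gen {GEN_BY_RATE[rate]}: ~{x16_bandwidth_gbps(rate):.1f} GB/s")
    ```

    So a Gen 4 x16 slot gives roughly 31.5 GB/s; few single GPUs or SSD workloads push that continuously, which matches the "still not saturating PCIe 4.0" observation. (On Linux, the negotiated link speed is visible in sysfs under the device's `current_link_speed` / `max_link_speed` files.)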