
Comment by forrestthewoods

1 day ago

> I think it would be a good thing.

This is an insane thing to say.

> Game and engine devs simply don't bother anymore to optimize for the low end

All games carefully consider the total addressable market. You can build a low-end game that runs great on total ass garbage onboard GPUs. Suffice it to say, these gamers are not an audience that spends a lot of money on games.

It’s totally fine and good to build premium content that requires premium hardware.

It’s also good to run on low-end hardware to increase the TAM. But there are limits. Building a modern game and targeting a 486 is a wee bit silly.

If Nvidia gamer GPUs disappeared and devs were forced to build games capable of running on shit ass hardware, the net benefit to gamers would be very minimal.

What would actually benefit gamers is making good hardware available at an affordable price!

Everything about your comment screams “tall poppy syndrome”. </rant>

> This is an insane thing to say.

I don't think it's insane. In that hypothetical case, it would be a slightly painful experience for some people that the top end is a bit curtailed for a few years while game developers learn to target other cards, hopefully in some more portable way. But feeling hard done by because your graphics hardware is stuck at 2025 levels for a while is not that much of a hardship, really, is it? In fact, if more time is spent optimising for non-premium cards, perhaps the premium card you already have will work better than the next upgrade would have.

It's not inconceivable that the overall result is a better computing ecosystem in the long run, especially in the open source space, where Nvidia has long been problematic. Or maybe it'll be a multi-decade gaming winter, but unless gamers stop being willing to throw large amounts of money at chasing the top end, someone will want that money even if Nvidia doesn't.

  • There is a full order of magnitude of difference between a modern integrated GPU and a high-end card. Almost two orders of magnitude (100x) compared to an older (~2019) integrated GPU.

    > In fact, if more time is spent optimising for non-premium cards, perhaps the premium card that you already have will work better then the next upgrade would have.

    Nah. The stone doesn’t have nearly that much blood to squeeze. And optimizations for the ultra-low-end may or may not have any benefit for the high end. This isn’t like optimizing CPU instruction count, which benefits everyone.

I wonder what Balatro does that wouldn’t be possible on a 486.

  • The swirly background (especially on the main screen), shiny card effects, and the CRT distortion effect would be genuinely difficult to implement on a system from that era. Balatro does all three with a couple hundred lines of GLSL shaders.

    (The third would, of course, be redundant if you were actually developing for a period 486. But I digress.)

  • Some fans ported Balatro to the VBA if you want a comparison of what is possible.

    But the framebuffer for a full-HD screen would fill most of the memory of a typical 486 computer, I think.

  • Virtually all the graphics? Modern computers are very fast.

    • Sure, the effects etc. aren’t possible, nor would the resolutions of modern phones be.

      But Solitaire ran on a 486, and I don’t see what part of the gameplay requires massive CPU power.
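The framebuffer claim above checks out on the back of an envelope. A quick sketch, assuming a 24-bit full-HD frame and a common consumer 486 RAM configuration (the "typical" RAM figure is an assumption, not from the thread):

```python
# Rough check: how big is one full-HD frame versus typical 486-era RAM?
width, height = 1920, 1080
bytes_per_pixel = 3  # 24-bit color, no alpha

frame_bytes = width * height * bytes_per_pixel
frame_mib = frame_bytes / (1024 * 1024)

typical_486_ram_mib = 4  # common consumer configuration circa 1993 (assumption)

print(f"One full-HD frame: {frame_mib:.1f} MiB")  # roughly 5.9 MiB
print(f"Typical 486 RAM:   {typical_486_ram_mib} MiB")
```

So a single uncompressed frame alone exceeds a 4 MiB machine, before counting the OS, the game, or a second buffer for double buffering.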

I always chuckle when I see an entitled online rant from a gamer. Nothing against them, it's just humorous. In this one, we have hard-nosed defense of free market principles in the first part worthy of Reagan himself, followed by a Marxist appeal for someone (who?) to "make hardware available at an affordable price!".