Comment by sombragris

4 days ago

I doubt that this would ever happen. But...

If it does, I think it would be a good thing.

The reason is that it would finally motivate game developers to be more realistic in their minimum hardware requirements, enabling games to be playable on onboard GPUs.

Right now, most recent games (for example, many games built on Unreal Engine 5) are unplayable on onboard GPUs. Game and engine devs simply don't bother to optimize for the low end anymore, and as a result they end up gatekeeping games and excluding millions of devices: for recent games, a discrete GPU is required even at the lowest settings.

They're not targeting high-end PCs. They're targeting current-generation consoles, specifically the PS5 at 1080p. It just turns out that when you take those system requirements and put them on a PC, especially one with a 1440p or 2160p ultrawide, they translate into pretty top-of-the-line hardware, particularly if, as a PC gamer, you expect to run the game at 90 fps rather than the 30-40 that is typical for consoles.

  • Without disagreeing with the broad strokes of your comment, it feels like 4K should be considered standard for consoles nowadays - a very usable 4K HDR TV can be had for $150-500.

    • That's a waste of image quality for most people. You have to sit very close to a 4K display to be able to perceive the full resolution. On PC you could be 2 feet from a huge gaming monitor, but an extremely small percentage of console players have the TV size and distance ratio where they would get much out of full 4K. Much better to spend the compute on higher framerate or higher detail settings.

      6 replies →

  • 1440p and 2160p are a total waste of pixels when 1080p is already at the level of human visual acuity. You can argue that 1440p is a genuine (slight) improvement for super-crisp text, but not for a game. HDR and more ray tracing/path tracing, etc. are more sensible ways of pushing quality higher.

    • You sound like someone who doesn't have 1440p or 2160p.

      I have a 77" S95D, and my 1080p Switch looked horrible. Try it with a 1080p screen bigger than 27 inches, too.

      1 reply →

    • Text rendering alone makes it worthwhile. 1080p densities are not high enough to render text accurately without artefacts. If you double the pixel density, it becomes (mostly) possible to render text weight accurately, and things like "rhythm" and "density", which real typographers concerned themselves with, start to become apparent.

    • > 1440p and 2160p is a total waste of pixels, when 1080p is already at the level of human visual acuity.

      Wow, what a load of bullshit. I bet you also think the human eye can't see more than 30 fps?

      If you're sitting 15+ feet away from your screen, yeah, you can't tell the difference. But for most people, with their eyes only being 2-3 feet away from their monitor, the difference is absolutely noticeable.

      > HDR and more ray tracing/path tracing, etc. are more sensible ways of pushing quality higher.

      HDR is an absolute game-changer, for sure. Ray-tracing is as well, especially once you learn to notice the artifacts created by shortcuts required to get reflections in raster-based rendering. It's like bad kerning. Something you never noticed before will suddenly stick out like a sore thumb and will bother the hell out of you.

    • I'm sorry, you need to go to an optician. I can see the pixels at a comfortable distance at 1440p.

      Alternatively, you play modern games with incredibly blurry AA solutions. Try looking at something older from when AA actually worked.

      1 reply →

    • You are absolutely wrong on this subject. Importantly, what matters is PPI, not resolution. 1080p would look like crap in a movie theater or on a 55" TV, for example, while it'll look amazing on a 7" monitor.

You wish. Games will just be published cloud-only, and you'll only be able to play them via thin clients.

  • It's pretty consistently been shown that this just can't provide low enough latency for gamers to be comfortable with it. Every attempt at providing this experience has failed. There are few games where it can even theoretically be viable.

    The economics also have issues: now you have to run a bunch more datacenters full of GPUs, with an inconsistent usage curve leaving a bunch of them idle at any given time. You'd have to charge a subscription to justify that, which the market would not accept.

    • I'm pretty sure the current demand for GPUs driven by the AI craze could absorb the idle-time issue at major datacenters.

      Not that it's good or bad, but we could probably have something akin to spot instances of GPUs being offered for gaming purposes.

      I do see a lot of companies offering GPU access billed per second with instant shutdown/restart, I suppose, but overall I agree.

      My brother recently came for the holidays, and I played PS5 for the first time on his Mac, connected to his room 70-100 km away. Honestly, the biggest latency factor was the Wi-Fi connection (his phone's carrier), and overall it was a good enough experience, but I only played Mortal Kombat for a few minutes :)

      3 replies →

  • More like they wish. That would mean a globally good internet infrastructure, which is absolutely never happening.

True. Optimization is completely dead. Long gone are the days of a game being amazing because the devs managed to pull off crazy graphics on current hardware.

Nowadays a game is only considered poorly optimized if it's literally unplayable or laggy; otherwise you're simply forced to constantly upgrade your hardware with no discernible performance gain.

  • Crazy take. In the late 90s/early 00s your GPU could be obsolete 9 months after buying it. The “optimisation” you talk about was that the CPU in the PS4 generation was so weak, and tech was moving so fast, that any PC bought from 2015 onwards could easily brute-force its way past anything built for that generation.

    • > Crazy take, in the late 90s/early 00s your GPU could be obsolete 9 months after buying.

      Not because the developers were lazy, but because newer GPUs were that much better.

      12 replies →

    • Obsolete in that you'd probably not BUY it if building new, and in that you'd probably be able to get a noticeably better one, but even then games were made to run on a wide range of hardware.

      For a while there you did have noticeable gameplay differences: those with GLQuake could play better, that kind of thing.

      4 replies →

    • > your GPU could be obsolete 9 months after buying

      Or even before hitting the shelves, cue Trio3D and Mystique, but that's another story.

  • A lot of people who were good at optimizing games have aged out, or 'got theirs' and retired early, or simply left the demanding job for a better-paying one in a sector with more economic upside and less churn. On the other side, there's an unending, almost exponential stream of newcomers into the industry who believe the hype from engine makers, who hide the true cost of optimal game making and sell on 'ease'.

  • That's not how it ACTUALLY worked. How it actually worked is that top video card manufacturers would make multi-million-dollar bids to the devs of the three or four AAA games predicted to be best-sellers, in order to get them to optimize their rendering for whatever that year's top video adapter was going to be. And nobody really cared if it didn't run on your crappy old last-year's card, because everybody understood that the vast majority of games revenue comes from people who have just bought expensive new systems. (Inside experience, I lived it.)

    I don't think it has ever been the case that this year's AAA games play well on last year's video cards.

  • > Long gone are the days of a game being amazing because the devs managed to pull crazy graphics for the current hardware.

    DOOM and Battlefield 6 are praised for being surprisingly well optimized for the graphics they offer, and some people bought these games for that reason alone. But I guess in the good old days good optimization was the norm, not the exception.

  • I feel like Steam Deck support is making developers optimize again.

    • Not to mention one of the "big three" console manufacturers building their business on older mobile hardware.

  • This is such a funny take. I remember that all throughout the 90s and 00s (and maybe even the 10s, I wasn't playing much by then) you could often only run new games on acceptable settings with a 1-2 year old high-spec machine; in fact, to play at the highest settings you often needed ridiculously spec'ed machines. Now you can play the biggest titles (CP77, BG3 ...) on 5-10 year old hardware (not even top spec), with no or minimal performance/quality impact. I mean, I've been playing BG3 and CP77 on highest settings on a PC that I bought used 2 years ago for $600 (and I was playing BG3 when it had just come out).

One wonders what would happen in an SHTF situation, or if someone stubbed their toe on the demolition-charge switch at TSMC and all the TwinScans got minced.

Would there be a huge drive towards debloating software to run again on random old computers people find in cupboards?

  • Until we end up spending trillions recreating the fab capacity of TSMC, they don't have a full monopoly (yet).

Consoles and their install base set the target performance envelope. If your machine can't keep up with a 5-year-old console, then you should lower your expectations.

And like, when have onboard GPUs ever been good? The fact that they're even feasible these days should be praised but you're imagining some past where devs left them behind.

> The reason is that it would finally motivate game developers to be more realistic in their minimum hardware requirements, enabling games to be playable on onboard GPUs.

They'll just move to remote rendering you'll have to subscribe to. Computers will stagnate as they are, and all new improvements will be reserved for the cloud providers. All hail our gracious overlords "donating" their compute time to the unwashed masses.

Hopefully AMD and Intel would still try. But I fear they'd probably follow Nvidia's lead.

  • Is remote rendering a thing? I would have imagined the lag would make something like that impractical.

    • The lag is high. Google was doing this with Stadia. A huge amount of money comes from online multiplayer games, and almost all of them require minimal latency to play well. So I doubt EA, Microsoft, or Activision are going to effectively kill those cash cows.

      Game streaming works well for puzzle and story-driven games where latency isn't an issue.

      2 replies →

    • GeForce NOW is supposedly decent for a lot of games (depending on connection and distance to server), although if Nvidia totally left gaming they'd probably drop the service too.

    • It will be if personal computing becomes unaffordable. The lag is simply mitigated by having PoPs (points of presence) everywhere.

> I think it would be a good thing.

This is an insane thing to say.

> Game and engine devs simply don't bother anymore to optimize for the low end

All games carefully consider the total addressable market. You can build a low-end game that runs great on a total ass-garbage onboard GPU. Suffice it to say, these gamers are not an audience that spends a lot of money on games.

It’s totally fine and good to build premium content that requires premium hardware.

It’s also good to run on low-end hardware to increase the TAM. But there are limits. Building a modern game and targeting a 486 is a wee bit silly.

If Nvidia gamer GPUs disappeared and devs were forced to build games capable of running on shit-ass hardware, the net benefit to gamers would be very minimal.

What would actually benefit gamers is making good hardware available at an affordable price!

Everything about your comment screams “tall poppy syndrome”. </rant>

  • > This is an insane thing to say.

    I don't think it's insane. In that hypothetical case, it would be a slightly painful experience for some people that the top end is a bit curtailed for a few years while game developers learn to target other cards, hopefully in some more portable way. But feeling hard done by because your graphics hardware is stuck at 2025 levels for a bit is not that much of a hardship really, is it? In fact, if more time is spent optimising for non-premium cards, perhaps the premium card that you already have will work better than the next upgrade would have.

    It's not inconceivable that the overall result is a better computing ecosystem in the long run, the open-source space in particular, where Nvidia has long been problematic. Or maybe it'll be a multi-decade gaming winter, but unless gamers stop being willing to throw large amounts of money at chasing the top end, someone will want that money even if Nvidia doesn't.

    • There is a full order of magnitude of difference between a modern integrated GPU and a high-end discrete card, and almost two orders of magnitude (100x) compared to an older (~2019) integrated GPU.

      > In fact, if more time is spent optimising for non-premium cards, perhaps the premium card that you already have will work better then the next upgrade would have.

      Nah. The stone doesn't have nearly that much blood to squeeze. And optimizations for the ultra-low end may or may not have any benefit for the high end. This isn't like optimizing CPU instruction count, which benefits everyone.

  • I wonder what Balatro does that wouldn't be possible on a 486.

    • The swirly background (especially on the main screen), shiny card effects, and the CRT distortion effect would be genuinely difficult to implement on a system from that era. Balatro does all three with a couple hundred lines of GLSL shaders.

      (The third would, of course, be redundant if you were actually developing for a period 486. But I digress.)
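
      For a sense of the math involved, here's a tiny CPU-side sketch of the barrel-distortion idea behind a CRT-style effect. It's written in Python/numpy purely for illustration, not GLSL; the function name and parameters are made up, and it's not Balatro's actual shader, which does this per pixel on the GPU.

        # Toy sketch of a CRT-style barrel distortion, done on the CPU with
        # numpy. Real games do this per pixel in a fragment shader instead.
        import numpy as np

        def crt_warp(image, strength=0.15):
            """Remap pixels outward from the centre so straight lines bow like a CRT."""
            h, w = image.shape[:2]
            # Normalised coordinates in [-1, 1], screen centre at (0, 0).
            ys, xs = np.meshgrid(np.linspace(-1, 1, h), np.linspace(-1, 1, w), indexing="ij")
            r2 = xs**2 + ys**2               # squared distance from the centre
            scale = 1 + strength * r2        # push edge pixels further out
            src_x = np.clip((xs * scale + 1) / 2 * (w - 1), 0, w - 1).astype(int)
            src_y = np.clip((ys * scale + 1) / 2 * (h - 1), 0, h - 1).astype(int)
            return image[src_y, src_x]       # sample the source at the warped positions

        # Example: warp a test pattern with horizontal stripes.
        frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
        frame[::40, :] = 255
        warped = crt_warp(frame)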

    • Some fans ported Balatro to the VBA if you want a comparison of what is possible.

      But a full-HD framebuffer (1920 × 1080 at 24-bit color is about 6 MB) would fill most of the memory of a typical 486, I think.

  • I always chuckle when I see an entitled online rant from a gamer. Nothing against them, it's just humorous. In this one, we have a hard-nosed defense of free-market principles worthy of Reagan himself in the first part, followed by a Marxist appeal for someone (who?) to "make hardware available at an affordable price!"

I agree re "optimizations", but I don't think there should be compromises on quality (if set to max/ultra settings).

Gaming performance is about so much more than hardware specs. Thinking that game devs optimizing their games on their own could fundamentally change the gaming experience is delusional.

And anyone who knows even a tiny bit of Nvidia's history would know how much they have invested in gaming and the technology they pioneered.

I haven't been on HN even 60 seconds this morning and I've already found a pro-monopoly take. Delightful.

  • I fail to see how my comment could be construed as being pro-monopoly.

    There are a huge number of onboard GPUs being left out of even the minimum requirements of most recent games. I'm just saying that maybe this situation could lead game devs to finally consider such devices legitimate targets and thus make their games playable on them. This is by no means a pro-monopolistic take.

I have a 9-year-old gaming PC with an RX 480, and it is only now starting to be unable to run certain games at all (recent ones that require ray tracing). It can play Cyberpunk and Starfield on low settings very acceptably.