
Comment by batiudrami

4 days ago

Crazy take, in the late 90s/early 00s your GPU could be obsolete 9 months after buying. The “optimisation” you talk about was that the CPU in the PS4 generation was so weak, and tech was moving so fast, that any PC bought from 2015 onwards would easily brute-force its way past anything that had been built for that generation.

> Crazy take, in the late 90s/early 00s your GPU could be obsolete 9 months after buying.

Not because the developers were lazy, but because newer GPUs were that much better.

  • There were lazy devs back then too, but I feel lazy devs have become the norm now.

    • I work in gamedev, historically AAA gamedev.

      If you think that the programmers are unmotivated (lazy) or incompetent, you’re wrong on both counts.

      The amount of care and talent is unmatched in my professional career, and they are often working from incomplete (and changing) specifications towards a fixed deadline across multiple hardware targets.

      The issue is that games now face expectations they never had before.

      There are very few “yearly titles” that let you nail down the software in a nicer way over time; it’s always a mad dash to get it done, on a huge 1000+ person project that has to be permanently playable from MAIN and where unit/integration tests would be completely useless the minute they were built.

      The industry will end, but not because of “lazy devs”; it’s the ballooned expectations, stagnant revenue opportunity, increased team sizes, and a pathological contingent of people using games as a (bad) political vehicle without regard for the fact that they will be laid off if they can’t eventually generate revenue.

      ---

      Finally, back in the early days of games, if the game didn’t work, you assumed you needed better hardware, and you would put the work into fixing drivers and settings, or even upgrading to something that worked. Now, if it doesn’t work on something from before COVID, the consensus is that the game is not optimised enough. I’m not casting aspersions at the mindset, but it’s a different mentality.


  • Because GPU developers started to be less lazy :)

    No hardware T&L meant everything was culled, clipped, transformed, lit and perspective-divided per vertex on the CPU.

    Then you have the brute-force approach. Voodoo 1/2/3 doesn’t employ any obvious speedup tricks in its pipeline. Every single triangle pushed into it is going to get textured (bilinear filtering, per-pixel divide), shaded (lighting, blending, fog applied), and only in the last step does the card finally check the Z-buffer to decide between writing all this computed data to the buffer or simply throwing it away.

    It took a while before GPU devs started implementing low-hanging-fruit optimizations: https://therealmjp.github.io/posts/to-earlyz-or-not-to-early...

    Hierarchical Z, fast Z clearing, compressed Z buffers, compressed textures, tiled shading. It all got added slowly, one step at a time, in the early 2000s.
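    To make the cost concrete, here is a toy sketch in C (hypothetical code, not actual Voodoo or driver behaviour; `shade` stands in for the expensive bilinear filtering / per-pixel divide / lighting / fog work): with late Z the full per-fragment cost is paid before the Z-buffer check, while early Z rejects hidden fragments before any of that work happens.

        /* Toy per-fragment write paths contrasting late-Z and early-Z.
           Illustrative sketch only, not real hardware or driver code. */
        #include <stdint.h>

        typedef struct { float depth; uint32_t color; } Fragment;

        /* Placeholder for the expensive work: bilinear texture fetch,
           per-pixel divide, lighting, fog, blending. */
        uint32_t shade(const Fragment *f) { return f->color; }

        /* Late-Z (Voodoo-era style): every fragment is fully shaded,
           and only then does the Z-buffer decide whether to keep it. */
        void write_late_z(Fragment f, float *zbuf, uint32_t *cbuf, int i) {
            uint32_t c = shade(&f);      /* cost paid even for hidden pixels */
            if (f.depth < zbuf[i]) {     /* depth test comes last */
                zbuf[i] = f.depth;
                cbuf[i] = c;
            }
        }

        /* Early-Z: test against the depth buffer first, so occluded
           fragments never reach the expensive shading stage. */
        void write_early_z(Fragment f, float *zbuf, uint32_t *cbuf, int i) {
            if (f.depth >= zbuf[i]) return;  /* cheap reject up front */
            zbuf[i] = f.depth;
            cbuf[i] = shade(&f);             /* only visible fragments pay */
        }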

Obsolete in that you’d probably not BUY it if building new, and in that you’d probably be able to get a noticeably better one, but even then games were made to run on a wide gamut of hardware.

For a while there you did have noticeable gameplay differences: those with GLQuake could play better, that kind of thing.

  • The GP was talking about Unreal Engine 5 as if that engine doesn't optimize for the low end. That's a wild take. I've been playing Arc Raiders with a group of friends in the past month, and one of them hadn't upgraded their PC in 10 years, and it still ran fine (20+ fps) on their machine. When we grew up it would have been absolutely unbelievable that a game would run on a 10-year-old machine, let alone at bearable FPS. And the game is even on an off-the-shelf game engine; they possibly don't even employ game engine experts at Embark Studios.

    • > And the game is even on an off-the-shelf game engine; they possibly don't even employ game engine experts at Embark Studios.

      Perhaps, but they also turned off Nanite, Lumen and virtual shadow maps. I'm not a UE5 hater but using its main features does currently come at a cost. I think these issues will eventually be fixed in newer versions and with better hardware, and at that point Nanite and VSM will become a no-brainer as they do solve real problems in game development.

    • > it still ran fine (20+ fps)

      20 fps is not fine. I would consider that unplayable.

      I expect at least 60, ideally 120 or more, as that's where the diminishing returns really start to kick in.

      I could tolerate as low as 30 fps on a game that did not require precise aiming or reaction times, which basically eliminates all shooters.


> your GPU could be obsolete 9 months after buying

Or even before hitting the shelves, cue Trio3D and Mystique, but that's another story.