Comment by ronsor
1 day ago
> Crazy take, in the late 90s/early 00s your GPU could be obsolete 9 months after buying.
Not because the developers were lazy, but because newer GPUs were that much better.
1 day ago
> Crazy take, in the late 90s/early 00s your GPU could be obsolete 9 months after buying.
> Not because the developers were lazy, but because newer GPUs were that much better.
There were lazy devs back then too but I feel lazy devs have become the norm now.
I work in gamedev, historically AAA gamedev.
If you think that the programmers are unmotivated (lazy) or incompetent, you're wrong on both counts.
The amount of care and talent is unmatched in my professional career, and they are often working from incomplete (and changing) specifications towards a fixed deadline across multiple hardware targets.
The issue is that games have such high expectations that they didn’t have before.
There are very few “yearly titles” that allow you to nail down the software in a nicer way over time; it's always a mad dash to get it done, on a huge 1000+ person project that has to be permanently playable from MAIN, and where unit/integration tests would be completely useless the minute they were built.
The industry will end, but not because of “lazy devs”. It's the ballooned expectations, stagnant revenue opportunity, increased team sizes, and a pathological contingent of people using games as a (bad) political vehicle without regard for the fact that they will be laid off if they can't eventually generate revenue.
---
Finally, back in the early days of games, if the game didn't work, you assumed you needed better hardware, and you would put in the work: fixing drivers and settings, or even upgrading to something that worked. Now, if it doesn't run on hardware from before COVID, the consensus is that the game isn't optimised enough. I'm not casting aspersions at the mindset, but it's a different mentality.
Most gamers don't have the faintest clue regarding how much work and effort a game requires these days to meet even the minimum expectations they have.
4 replies →
> The amount of care and talent is unmatched in my professional career, and they are often working from incomplete (and changing) specifications towards a fixed deadline across multiple hardware targets.
I fully agree, and I really admire people working in the industry. When I see great games that are unplayable on the low end because of stupidly high minimum hardware requirements, I understand game devs are simply responding to internal trends within the industry, and especially going for a practical outcome by using an established game engine (such as Unreal 5).
But at some point I hope this GPU crunch forces this same industry to allocate time and resources, either at the engine or at the game level, to truly optimize for a realistic low end.
3 replies →
Because GPU developers started to be less lazy :)
No hardware T&L meant everything was culled, clipped, transformed, lit, and perspective-divided per vertex on the CPU.
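Roughly, the per-vertex work a pre-T&L engine had to do on the CPU can be sketched like this (in Python for readability; the function name and structure are illustrative, not from any real engine):

```python
# Sketch of the CPU-side per-vertex work before hardware T&L:
# transform by the combined model/view/projection matrix, do the
# perspective divide, and compute per-vertex lighting -- all in software.

def transform_and_light(vertex, normal, mvp, light_dir):
    # 4x4 matrix * vec4 transform, done on the CPU for every vertex.
    x, y, z = vertex
    clip = [sum(mvp[r][c] * v for c, v in enumerate((x, y, z, 1.0)))
            for r in range(4)]
    # Per-vertex perspective divide, also on the CPU.
    w = clip[3]
    ndc = (clip[0] / w, clip[1] / w, clip[2] / w)
    # Per-vertex diffuse lighting (simple Lambert term).
    ndotl = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return ndc, ndotl
```

Hardware T&L (GeForce 256 era) moved exactly this loop off the CPU, which is a big part of why new GPUs obsoleted old ones so quickly.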
Then you have the brute-force approach. Voodoo 1/2/3 doesn't employ any obvious speedup tricks in its pipeline. Every single triangle pushed into it gets textured (bilinear filtering, per-pixel divide) and shaded (lighting, blending, fog applied), and only in the last step does the card check the Z-buffer to decide between writing all this computed data to the framebuffer or simply throwing it away.
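That ordering can be modeled in a few lines (a toy sketch, not real rasterizer code; the cost accounting is the point, not the numbers):

```python
# Toy model of the "brute force" late-Z pipeline described above:
# every fragment pays full texturing + shading cost first, and the
# Z test happens only at the very end, so occluded fragments burn
# the same work as visible ones before being discarded.

def draw_fragment_late_z(frag_z, zbuffer, x, shade_cost):
    work = shade_cost          # texturing + shading, always paid up front
    if frag_z < zbuffer[x]:    # depth test is the LAST stage
        zbuffer[x] = frag_z
        return work, True      # fragment survives and is written
    return work, False         # same cost, result thrown away
```

With early-Z the test moves before shading, so a fragment that fails depth costs almost nothing instead of full shading work.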
It took a while before GPU devs started implementing low-hanging-fruit optimizations: https://therealmjp.github.io/posts/to-earlyz-or-not-to-early...
Hierarchical-Z, fast Z clears, compressed Z buffers, compressed textures, tiled rendering: it all got added one step at a time through the early 2000s.