
Comment by caseyy

3 days ago

There is a lot of hardware on the secondary market (resale) that's only 2-3x slower than pristine new primary-market devices. It's cheap, it's reuse, and it helps people save money in a hyper-consumerist society. The common complaint is that it no longer runs today's bloated software. And I don't think we can make non-bloated software, for a variety of reasons.

As a video game developer, I can add some perspective (N=1, if you will). Most top-20 game franchises started years ago on much weaker hardware, yet their current installments demand hardware no more than a few years old as the recommended/intended way to play. This is due to hyper-bloating of software and severe downskilling of game programmers across the industry to cut costs. Players often don't see any of this; they think the latest game is truly the greatest and "makes use" of the hardware. But the truth is that aside from current-generation graphics, most games haven't evolved much in the last 10 years, and current-gen graphics arrived with the PS4/Xbox One.

Ultimately, I don't know who or what is the culprit here. The market demands cheap software. Games used to cost up to $120 in the 90s, which is about $250 in today's money. A common price point for good-quality games was $80, roughly $170 today. But gamers absolutely decry any price increase beyond $60. So the industry has no option but to chase every cost saving, including passing costs on to the buyer through hardware upgrades.

Ironically, upgrading a graphics card by one generation (RTX 3070 -> 4070) costs about $300 if the old card is sold and $500 if it isn't. So gamers end up paying ~$400 every few years just to run the latest games, then rebel against paying $30 extra per game instead, which could very well work out cheaper than the GPU upgrade (let alone other PC upgrades) and would let companies spend much more time on optimization. Well, assuming the money wouldn't just go into publishers' pockets (but that is a separate topic).
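
For concreteness, here's a rough back-of-envelope sketch (Python) of that comparison, using the numbers above. The number of games bought per upgrade cycle is my own assumption for illustration, not data:

    # Rough back-of-envelope comparison using the numbers from the paragraph above.
    # "games_per_upgrade_cycle" is an assumed figure, not a measured one.
    upgrade_with_resale = 300      # RTX 3070 -> 4070 if the old card is sold
    upgrade_without_resale = 500   # if the old card is kept
    avg_upgrade_cost = (upgrade_with_resale + upgrade_without_resale) / 2  # ~$400

    price_bump_per_game = 30       # hypothetical increase over the $60 baseline
    games_per_upgrade_cycle = 10   # assumed full-price games bought per cycle

    extra_game_spend = price_bump_per_game * games_per_upgrade_cycle
    break_even_games = avg_upgrade_cost / price_bump_per_game

    print(f"GPU upgrade: ~${avg_upgrade_cost:.0f} per cycle")
    print(f"$30/game bump over {games_per_upgrade_cycle} games: ${extra_game_spend}")
    print(f"The price bump only costs more past ~{break_even_games:.0f} games per cycle")

Under those assumptions the $30 bump only overtakes the GPU cost once you buy 13+ full-price games between upgrades.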

It's an example of Scott Alexander's Moloch, where it's unclear who could end this race to the bottom. Maybe a culture shift could; perhaps we should become less consumerist and value older hardware more. But the problem of bad software has very deep roots. I think this is why Carmack, who has a practically perfect understanding of software in games, doesn't prescribe a solution.

One only needs to look at Horizon: Zero Dawn to see that how true this is varies widely across the games industry. World streaming architectures are incredible technical achievements. So are moddable engines. Plenty of technical limits are being pushed by devs; it's just not happening at every level.

  • You are right, but you picked a game by a studio known for its technical expertise, with plenty of points to prove about quality game development. I'd like them to be the future of this industry.

    But right now, 8 or 9 out of 10 game developers and publishers are deeply concerned with cash and rather unconcerned with technical excellence or with games as a form of interactive art (which, once again, Guerrilla and many other Sony studios do care about).

> Ultimately, I don't know who or what is the culprit here. The market demands cheap software. Games used to cost up to $120 in the 90s, which is about $250 in today's money. A common price point for good-quality games was $80, roughly $170 today. But gamers absolutely decry any price increase beyond $60. So the industry has no option but to chase every cost saving, including passing costs on to the buyer through hardware upgrades.

Producing games doesn't cost anything on a per-unit basis. That's not at all the reason for low quality.

Games could cost $1000 per copy and big game studios (which have investors to worry about) would still release buggy, slow games, because they would still be under pressure to get the game done by Christmas.