
Comment by poisonborz

3 days ago

The industry, and the gaming community at large, is long past being interested in graphics advancement. AAA games are too complicated and expensive; the whole notion of ever more complex and grandiose experiences doesn't scale. Gamers are fractured into thousands of small niches, even along timelines, with the 80s, 90s, and PS1 eras each having a small circle of businesses serving them.

The era of console giants, their fiefdoms, and the big game studios is coming to an end.

I'll take the other side of this argument and state that people are interested in better graphics, BUT they expect an equally improved simulation to go along with them. People aren't excited for GTA6 just because of the graphics, but because they know the simulation is going to be better than anything they've seen before. They need to go hand in hand.

  • That's totally where all this is going. More horsepower on a GPU doesn't necessarily mean it's all going towards pixels on the screen. People will get creative with it.

  • I'm almost certain that we'll see comments that GTA6 feels like a downgrade to big GTA5 fans, as there was a decade of content created for the online version of GTA5.

I disagree - current-gen consoles aren't enough to deliver smooth immersive graphics. I played BG3 on PS first and then on PC, and there's just no comparing the graphics. Cyberpunk, same deal. I'll pay to upgrade to consistent 4K/120 and better graphics, and I'll buy the games.

And there are AAA games that make, and will keep making, good money with graphics front and center.

  • > aren't enough to deliver smooth immersive graphics

    I'm just not sold.

    Do I really think that BG3 being slightly prettier than, say, Dragon Age / Skyrim / etc made it a more enticing game? Not to me, certainly. Was Cyberpunk prettier than The Witcher 3? Did it need to be for me to play it?

    My query isn't whether you can get people to upgrade to play new stuff (always true), but whether they'd still upgrade if they could play that new stuff on the old console with worse graphics.

    I also don't think anyone is going to suddenly start playing video games because the graphics improve further.

    • > Do I really think that BG3 being slightly prettier than, say, Dragon Age / Skyrim / etc made it a more enticing game?

      Absolutely - graphical improvements make the game more immersive for me, and I don't want to go back and replay the games I spent hundreds of hours in during the mid-2000s, like, say, NWN or Icewind Dale (never played BG 2). It's just not the same feeling now that I've played games with incomparable graphics, polished mechanics, and movie-level voice acting/mocap cutscenes. I even picked up Mass Effect recently out of nostalgia but gave up fast because it just isn't as captivating as it was back when it was peak graphics.

      13 replies →

    • Two aspects I keep thinking about:

      - How difficult it must be for the art/technical teams at game studios to figure out how much of all the detail they're capable of putting on screen will actually be appreciated by gamers. Essentially, making sure that anything they budget a significant amount of worker time to create isn't something gamers run right past and ignore, and that it contributes meaningfully to 'more than the sum of its parts'.

      - As much as technology is an enabler for art, alongside the install-base issue there's the question of how well pursuing new methods fits how a studio is used to working, and whether the payoff is there if they spend time adapting. A lot of the gaming business is about shipping product, and a studio's concern is primarily getting content to gamers rather than chasing tech, since that is what lets their business continue; selling GPUs/consoles is another company's business.

  • Being an old dog who still cares about gaming, I would assert that many games are also not taking advantage of current-gen hardware, being coded in Unreal and Unity, a kind of Electron for games as far as exploiting existing hardware goes.

    There is a reason there are so many complaints on social media about how obvious it is to gamers which engine a game was written in.

    It used to be that game development quality was taken more seriously, back when games were sold on physical media and there was a hard deadline to burn those discs/cartridges.

    Now they just ship whatever is done by the deadline, and updates come later via DLC, if at all.

    • They're both great engines. They're popular and gamers will lash out at any popular target.

      If it were that simple to bootstrap an engine, no one would pay the percentage points to Unity and Epic.

      The reality is the quality bar is insanely high.

      1 reply →

    • Unreal and Unity would be less problematic if these engines were engineered to match the underlying reality of graphics APIs/drivers, but they're not. Neither of them can systematically fix the shader stuttering they cause architecturally, and so essentially all games built on these platforms are sentenced to always stutter, regardless of hardware (a toy sketch of the compile-on-first-use hitch follows after this comment).

      Both of these seem to suffer from incentive issues similar to enterprise software: they're not marketing and selling to end users or professionals, but to studio executives. So it's important to have - preferably a steady stream of - flashy headline features (e.g. Nanite, Lumen) instead of a product that actually works on the most basic level (consistently rendering frames). It doesn't really matter to Epic Games that UE4/5 RT is largely unplayable; even for game publishers, if you can pull nice-looking screenshots out of the engine or do good-looking 24p offline renders (and slap "in-game graphics" on them), that's good enough.

      15 replies →
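      For anyone unfamiliar with what the shader stuttering above refers to: the hitch happens when a pipeline/shader variant is compiled by the driver the first time it is needed, i.e. in the middle of a frame. Below is a minimal, purely illustrative C++ sketch; the class and names are made up and the driver compile is simulated with a sleep, so this is not engine or API code.

      ```cpp
      // Toy illustration of shader/pipeline compilation stutter (not real engine code).
      // "Compiling" is simulated with a sleep; a real engine calls the graphics
      // driver, which can take tens of milliseconds per pipeline.
      #include <chrono>
      #include <cstdio>
      #include <string>
      #include <thread>
      #include <unordered_map>
      #include <vector>

      struct Pipeline { std::string key; };

      class PipelineCache {
      public:
          // Returns a cached pipeline, compiling it on first use (the stutter case).
          const Pipeline& get(const std::string& key) {
              auto it = cache_.find(key);
              if (it == cache_.end()) {
                  std::this_thread::sleep_for(std::chrono::milliseconds(40)); // simulated driver compile
                  it = cache_.emplace(key, Pipeline{key}).first;
              }
              return it->second;
          }
          // Precompiles known permutations up front (e.g. during a loading screen).
          void warm_up(const std::vector<std::string>& keys) {
              for (const auto& k : keys) get(k);
          }
      private:
          std::unordered_map<std::string, Pipeline> cache_;
      };

      static void render_frame(PipelineCache& cache, const std::vector<std::string>& draws) {
          auto t0 = std::chrono::steady_clock::now();
          for (const auto& d : draws) cache.get(d); // a new material appearing mid-game = cache miss
          auto ms = std::chrono::duration<double, std::milli>(std::chrono::steady_clock::now() - t0).count();
          std::printf("frame took %.1f ms\n", ms);
      }

      int main() {
          std::vector<std::string> draws = {"opaque", "foliage", "water", "particles"};

          PipelineCache cold;        // compiles at draw time -> visible hitch on the first frame
          render_frame(cold, draws);
          render_frame(cold, draws); // smooth once everything is cached

          PipelineCache warm;
          warm.warm_up(draws);       // pay the cost during loading instead
          render_frame(warm, draws);
      }
      ```

      The hard part for an engine is knowing the full set of shader permutations up front so a warm-up pass can cover them all; whenever it can't, the compile lands mid-frame and the frame hitches, no matter how fast the GPU is.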

  • Just get a PC then? ;) In the end, game consoles haven't been much more than "boring" subsidized low-end PCs for quite a while now.

    • A PC costs a lot and depreciates fast; by the end of a console's lifecycle I can still count on developers targeting it, whereas PC performance on 6+ year-old hardware is guaranteed to suck. And I'm not a heavy gamer - I'll spend ~100h on games per year, but so will my wife and my son, and a PC sucks for multiple people sharing it, while the PS is amazing for that. I know I could concoct some remote-play setup over LAN to the TV to let my wife and kids play, but I just want something I spend a few hundred EUR on, plug into the TV, and it works.

      Honestly, the only reason I caved on the GPU purchase (which cost the equivalent of a PS Pro) was the local AI - but in retrospect that was useless as well.

      2 replies →

  • > current-gen consoles aren't enough to deliver smooth immersive graphics

    The Last of Us franchise, especially Part II, has delivered the most immersive experiences I have had in gaming.

    This game pretty much told me that the PlayStation is more than capable of delivering this kind of experience.

    Now, if some of those high-budget, so-called AAA games cannot deliver even a fraction of that, then - I believe - that is on them.

  • > current-gen consoles aren't enough to deliver smooth immersive graphics

    They have been enough to deliver smooth, immersive graphics since the PS4 era.

Advancements in lighting can help all games, not just AAA ones.

For example, Tiny Glade and Teardown have ray traced global illumination, which makes them look great with their own art style, rather than expensive hyper-realism.

But currently this is technically hard to pull off, and works only within certain constrained environments.

Devs are also constrained by the need to support multiple generations of GPUs. That's great from the perspective of preventing e-waste and making games more accessible. But technically it means that assets/levels still have to be built with workarounds for rasterized lights and inaccurate shadows. Simply plugging in better lighting makes things look worse by exposing those workarounds, while also lacking polish for the new lighting system. This is why optional ray tracing effects are underwhelming.
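To make the "plugging in better lighting exposes the workarounds" point concrete, here is a toy C++ sketch with made-up numbers (nothing here is engine code): the fake indirect terms are tuned so the rasterized image looks right, so real GI either double-counts them or, if they're stripped out without re-lighting the scene, changes the look the artists signed off on.

```cpp
// Toy numbers only: why bolting ray-traced GI onto a scene authored with
// "fake" indirect light (boosted ambient, hand-placed fill lights) tends to
// look wrong in both directions.
#include <cstdio>

int main() {
    // Values an artist might have tuned so the rasterized version looks right.
    const float direct       = 0.50f; // analytic lights + shadow maps
    const float fake_fill    = 0.25f; // hand-placed fill light standing in for bounce
    const float flat_ambient = 0.20f; // constant ambient term hiding the missing GI

    // What real ray-traced GI would contribute in the same spot.
    const float rt_gi        = 0.40f;

    const float authored     = direct + fake_fill + flat_ambient;         // tuned target look
    const float rt_on_top    = authored + rt_gi;                          // bounce counted twice: overlit
    const float rt_replaces  = direct + rt_gi;                            // fakes removed, scene not re-lit

    std::printf("authored (raster)     : %.2f\n", authored);
    std::printf("RT GI added on top    : %.2f (washed out)\n", rt_on_top);
    std::printf("RT GI replacing fakes : %.2f (no longer the tuned look)\n", rt_replaces);
}
```

Either way the scene has to be re-authored for the new lighting path, which is exactly the polish that optional ray tracing modes tend not to get.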

Nintendo dominated last generation with the Switch. The games were only HD, and many ran at 30fps. Some AAA titles didn't even get ported to it. But Nintendo sold a ton of units and a ton of games, and few complained, because people were having fun, which is what gaming is all about anyway.

  • That is a different audience than people playing on pc/xbox/ps5. Although arguably each console has a different audience, so there is that.

    • > That is a different audience than people playing on pc/xbox/ps5.

      Many PC users also own a Switch; it is in fact one of the most common pairings. There is very little from PS/Xbox that I want and can't get on PC, so there's very little point in owning one, whereas I won't get any of the Nintendo titles on PC, so keeping a Switch around makes significantly more sense if I want to cover my bases for exclusives.

      1 reply →

idk, Battlefield 6 came out today to very positive reviews and it's absolutely gorgeous.

  • It's fine, but definitely a downgrade compared to previous titles like Battlefield 1. At moments it looks pretty bad.

    I'm curious why graphics are stagnating and even getting worse in many cases.

    • Exploding production cost is pretty much the only reason (e.g. we hit diminishing returns in overall game asset quality vs production cost at least a decade ago), plus, on the tech side, a brain drain from rendering tech to AI tech (or whatever the current best-paid mega-hype is). Also, working in gamedev simply isn't "sexy" anymore, since it has been industrialized into essentially assembly-line jobs.

      1 reply →

    • Have you played it? I haven't, so I'm just basing my opinion on some YouTube footage I've seen.

      BF1 is genuinely gorgeous, I can't lie. I think it's the photogrammetry. Do you think the lighting is better in BF1? I'm gonna go out on a limb and say that BF6's lighting is more dynamic.

      1 reply →

  • It looks like Frostbite 4.0 is so much better than Unreal 5.x. I can't wait to see a comparison.