
Comment by rafaelmn

3 days ago

I disagree - current gen consoles aren't enough to deliver smooth immersive graphics. I played BG3 on PS first and then on PC, and there's just no comparing the graphics. Cyberpunk, same deal. I'll pay to upgrade to consistent 120/4K and better graphics, and I'll buy the games.

And there are AAA titles that make, and will keep making, good money with graphics front and center.

> aren't enough to deliver smooth immersive graphics

I'm just not sold.

Do I really think that BG3 being slightly prettier than, say, Dragon Age / Skyrim / etc made it a more enticing game? Not to me, certainly. Was Cyberpunk prettier than Witcher 3? Did it need to be for me to play it?

My question isn't whether you can get people to upgrade to play new stuff (always true), but whether they'd still upgrade if they could play on the old console with worse graphics.

I also don't think anyone is going to suddenly start playing video games because the graphics improve further.

  • > Do I really think that BG3 being slightly prettier than, say, Dragon Age / Skyrim / etc made it a more enticing game?

    Absolutely - graphical improvements make the game more immersive for me, and I don't want to go back and replay the games I spent hundreds of hours on in the mid-2000s, like, say, NWN or Icewind Dale (never played BG2). It's just not the same feeling now that I've played games with incomparably better graphics, polished mechanics and movie-level voice acting/mocap cutscenes. I even picked up Mass Effect recently out of nostalgia but gave up fast because it just isn't as captivating as it was back when it was peak graphics.

    • Well this goes to show that, as some other commenter said, the gamer community (whatever that is) is indeed very fragmented.

      I routinely re-play games like Diablo 2 or BG1/2 and I couldn't care less about graphics, voice acting or motion capture.

    • > Absolutely - graphical improvements make the game more immersive for me

      Exactly. Graphics are not the be-all and end-all for assessing games, but it's odd how quickly people handwave away graphics in a visual medium.

      9 replies →

    • For me, the better the graphics, mocap etc., the stronger the uncanny valley feeling - i.e. I stop perceiving it as a video game and instead see it as an incredibly bad movie.

    • > I don't want to go back and replay the games I spent hundreds of hours on in the mid-2000s, like, say, NWN or Icewind Dale (never played BG2). It's just not the same feeling now that I've played games with incomparably better graphics, polished mechanics and movie-level voice acting/mocap cutscenes. I even picked up Mass Effect recently out of nostalgia but gave up fast because it just isn't as captivating as it was back when it was peak graphics.

      And yet many more have no such issue doing exactly this. Despite having a machine capable of the best graphics at the best resolution, I have exactly zero issues going back and playing older games.

      Just in the past month alone, with some time off for surgery, I played and completed Quake, Heretic and Blood. All easily as good, as fun and as compelling as modern titles, if not better in some ways.

  • Two aspects I keep thinking about:

    - How difficult it must be for the art/technical teams at game studios to figure out, of all the detail they are capable of putting on screen, how much will actually be appreciated by gamers. Essentially, making sure that anything they budget significant worker time to creating isn't something gamers run right past and ignore, and that it contributes meaningfully to 'more than the sum of its parts'.

    - As much as technology is an enabler for art, alongside the install-base issue: how well does pursuing new methods fit how their studio is used to working, and is the payoff there if they spend time adapting? A lot of the gaming business is about shipping product, and a studio's concern is primarily getting content to gamers rather than chasing tech, since that is what lets their business continue; selling GPUs/consoles is another company's business.

Being an old dog that still cares about gaming, I would assert many games are also not taking advantage of current gen hardware: coded in Unreal and Unity, a kind of Electron for games as far as exploiting existing hardware goes.

There is a reason there are so many complaints on social media about it being obvious to gamers which game engine a game was written in.

It used to be that game development quality was taken more seriously, back when games were sold on storage media and there was a hard deadline to burn those discs/cartridges.

Now they just ship whatever is done by the deadline, and updates come later via DLC, if at all.

  • They're both great engines. They're popular and gamers will lash out at any popular target.

    If it were so simple to bootstrap an engine, no one would pay the percentage points to Unity and Epic.

    The reality is the quality bar is insanely high.

    • It is pretty simple to bootstrap an engine. What isn't simple is supporting asset production pipelines on which dozens/hundreds of people can work simultaneously, and on which new hires/contractors can start contributing right away - which is what modern game businesses require and what Unity/Unreal provide.

  • Unreal and Unity would be less problematic if these engines were engineered to match the underlying reality of graphics APIs/drivers, but they're not. Neither of these can systematically fix the shader stuttering they are causing architecturally, and so essentially all games built on these platforms are sentenced to always stutter, regardless of hardware.

    Both of these seem to suffer from incentive issues similar to enterprise software: they're not marketing and selling to either end users or working professionals, but to studio executives. So it's important to have - preferably a steady stream of - flashy headline features (e.g. Nanite, Lumen) instead of a product that actually works at the most basic level (consistently rendering frames). It doesn't really matter to Epic Games that UE4/5 RT is largely unplayable; even for game publishers, if you can pull nice-looking screenshots out of the engine or do good-looking 24p offline renders (and slap "in-game graphics" on them), that's good enough.
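
    Concretely, the stutter being described here is typically a shader/pipeline state being compiled the first time it's needed in the middle of a frame, instead of up front during loading. Below is a minimal, self-contained sketch of that distinction - a toy simulation with a made-up compile cost and made-up names, not Unreal/Unity code:

    ```cpp
    // Toy simulation: on-demand shader/pipeline compilation causes frame hitches,
    // while precompiling known variants at load time keeps frame times flat.
    // The 40 ms sleep stands in for a driver-side shader compile.
    #include <chrono>
    #include <cstdio>
    #include <string>
    #include <thread>
    #include <unordered_map>
    #include <vector>

    struct Pipeline { std::string variant; };

    // Stand-in for an expensive driver compile.
    Pipeline compile_variant(const std::string& variant) {
        std::this_thread::sleep_for(std::chrono::milliseconds(40));
        return Pipeline{variant};
    }

    std::unordered_map<std::string, Pipeline> cache;

    // On-demand path: compile the first time a variant is drawn -> mid-frame hitch.
    const Pipeline& get_pipeline_lazy(const std::string& variant) {
        auto it = cache.find(variant);
        if (it == cache.end())
            it = cache.emplace(variant, compile_variant(variant)).first;
        return it->second;
    }

    int main() {
        const std::vector<std::string> variants = {"opaque", "skinned", "foliage"};

        // Precompiled path: pay the cost up front, e.g. during a load screen.
        // Comment this loop out to see ~40 ms spikes on the first use of each variant.
        for (const auto& v : variants) cache.emplace(v, compile_variant(v));

        // "Frame loop": every lookup now hits the cache, so frame times stay flat.
        for (int frame = 0; frame < 6; ++frame) {
            auto t0 = std::chrono::steady_clock::now();
            get_pipeline_lazy(variants[frame % variants.size()]);
            auto ms = std::chrono::duration<double, std::milli>(
                          std::chrono::steady_clock::now() - t0).count();
            std::printf("frame %d: %.2f ms\n", frame, ms);
        }
    }
    ```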

    • The shader stutter issues are non-existent on console, which is where most of their sales are. PC, as it has been for almost two decades, is an afterthought rather than a primary focus.

      12 replies →

    • Imagine living in a reality where the studio exec picks the engine based on getting screenshots 3 years later when there's something interesting to show.

      I mean, are you actually talking from experience at all here?

      It's really more that engines are an insane expense in money and time, and buying one gets your full team in-engine far sooner. That's why they're popular.

Just get a PC then? ;) In the end, game consoles haven't been much more than "boring" subsidized low-end PCs for quite a while now.

  • A PC costs a lot and depreciates fast. By the end of a console lifecycle I can still count on developers targeting it, whereas PC performance for 6+ year old hardware is guaranteed to suck. And I'm not a heavy gamer - I'll spend ~100h on games per year, but so will my wife and my son - and a PC sucks for multiple people sharing it, while the PS is amazing for that. I know I could concoct some remote play setup over LAN to the TV to let my wife and kids play, but I just want something I spend a few hundred EUR on, plug into the TV, and it just works.

    Honestly, the only reason I caved on the GPU purchase (which cost the equivalent of a PS Pro) was local AI - but in retrospect that was useless as well.

    • > by the end of a console lifecycle I can still count on developers targeting it

      And I can count on those games still being playable on my six-year-old hardware, because they are in fact developed for six-year-old hardware.

      > PC performance for 6+ year old hardware is guaranteed to suck

      For new titles at maximum graphics levels, sure. For new titles at the kind of fidelity six-year-old consoles are putting out? Nah. You just drop your settings from "ULTIMATE MAXIMUM HYPER FOR NEWEST GPUS ONLY" to "the same low-to-medium-at-best settings the consoles are running" and off you go.

> current gen consoles aren't enough to deliver smooth immersive graphics

The Last of Us franchise, especially Part II, has delivered the most immersive experiences I have had in gaming.

That game pretty much told me that the PlayStation is more than capable of delivering this kind of experience.

Now, if some of those high-budget, so-called AAA games cannot deliver even a fraction of that, then that - I believe - is on them.

> current gen consoles aren't enough to deliver smooth immersive graphics

They have been enough to deliver smooth, immersive graphics since the PS4 era.