Comment by Negitivefrags

3 days ago

There are a lot of theoretical arguments I could give you about how, in almost all the cases where hardware BVH can be used, there are better and smarter algorithms to be using instead. Being proud of your hardware BVH implementation is kind of like being proud of your ultra-optimised hardware bubblesort implementation.
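
For reference, this is roughly the kind of work a hardware BVH unit accelerates; a simplified sketch of a ray/AABB slab test and stack-based traversal, with made-up structs, not any vendor's actual implementation:

    // Rough sketch of a ray/AABB "slab" test plus stack-based BVH traversal.
    // Simplified binary BVH; all structs and names are illustrative.
    #include <algorithm>
    #include <vector>

    struct Ray  { float ox, oy, oz, dx, dy, dz; };
    struct AABB { float min[3], max[3]; };
    struct Node { AABB box; int left, right, triCount; }; // leaf if triCount > 0

    // Does the ray hit the box anywhere in [0, tmax]?
    bool hitAABB(const Ray& r, const AABB& b, float tmax) {
        float t0 = 0.0f, t1 = tmax;
        const float o[3] = { r.ox, r.oy, r.oz };
        const float d[3] = { r.dx, r.dy, r.dz };
        for (int a = 0; a < 3; ++a) {
            float inv   = 1.0f / d[a];          // relies on IEEE inf when d[a] == 0
            float tnear = (b.min[a] - o[a]) * inv;
            float tfar  = (b.max[a] - o[a]) * inv;
            if (inv < 0.0f) std::swap(tnear, tfar);
            t0 = std::max(t0, tnear);
            t1 = std::min(t1, tfar);
            if (t0 > t1) return false;
        }
        return true;
    }

    // Depth-first traversal with an explicit stack; a real tracer would
    // run ray/triangle tests at the leaves.
    int countVisitedNodes(const std::vector<Node>& nodes, const Ray& r, float tmax) {
        int stack[64], sp = 0, visited = 0;
        stack[sp++] = 0;                        // node 0 is the root
        while (sp > 0) {
            const Node& n = nodes[stack[--sp]];
            if (!hitAABB(r, n.box, tmax)) continue;
            ++visited;
            if (n.triCount > 0) continue;       // leaf: triangle tests would go here
            stack[sp++] = n.left;
            stack[sp++] = n.right;
        }
        return visited;
    }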

But how about a practical argument instead. Enabling raytracing in games tends to suck. The graphical improvements on offer are simply not worth the performance cost.

A common argument is that we don't have fast enough hardware yet, or that developers haven't been able to use raytracing to its fullest yet, but it's been a pretty damn long time since this hardware went mainstream.

I think the most damning evidence of this is the just-released Battlefield 6. This is a franchise that previously had raytracing as a top-level feature, and this new release doesn't support it and doesn't intend to.

And in a world where basically every AAA release is panned for performance problems, BF6 has articles like this: https://www.pcgamer.com/hardware/battlefield-6-this-is-what-...

> But how about a practical argument instead. Enabling raytracing in games tends to suck. The graphical improvements on offer are simply not worth the performance cost.

Pretty much this. Even in games that have good ray tracing, I can't tell whether it's on or off (except for the FPS hit). I cared so little that I bought a card not known to be good at it (a 7900 XTX), because the two games I play the most don't support it anyway.

They oversold the technology/benefits and I wasn't buying it.

  • There always have been, and always will be, people who swear they can't see the difference with anything above 25 Hz, 30 Hz, 60 Hz, 120 Hz, HD, Full HD, 2K, 4K. Now it's ray tracing, right.

    • Glad you know how my perception of lighting in games works better than I do, though I'm curious how you manage that.

    • I can see the difference in all of those. I can even see the difference between 120 Hz and 240 Hz, and now I play at 240 Hz.

      Ray tracing looks almost indistinguishable from really good rasterized lighting in MOST conditions. In scenes with high amounts of gloss and reflections, it's a little more pronounced. A little.

      From my perspective, you're getting, like, a 5% improvement in only one specific aspect of graphics in exchange for a 200% cost.

      It's just not worth it.


    • There’s an important distinction between being able to see the difference and caring about it. I can tell the difference between 30Hz and 60Hz but it makes no difference to my enjoyment of the game. (What can I say - I’m a 90s kid and 30fps was a luxury when I was growing up.) Similarly, I can tell the difference between ray traced reflections and screen space reflections because I know what to look for. But if I’m looking, that can only be because the game itself isn’t very engaging.

  • I think one of the challenges is that game designers have gotten so good at working within the non-RT constraints (and pushing them back) that it's a tall order for RT's improvements to pay back its performance cost and new rendering quirks. There's also the fact that most companies don't want to cut off potential customers whose hardware either can't do RT at all or can't do it fast enough. The other big question is whether a studio is just recreating a similar environment with RT, or actually leaning on what only the new technique makes possible, such as fully dynamic lighting, and whether that matters to the game they want to make.

  • To me, the appeal is that game environments can now be way more dynamic because we're no longer limited by prebaked lighting. The Finals does this; it doesn't require ray tracing, but it's pretty easy to tell when ray tracing is enabled: https://youtu.be/MxkRJ_7sg8Y

    But that's a game design change, and that takes longer.

> Enabling raytracing in games tends to suck.

Because enabling raytracing usually means the game has to support non-raytracing too, which limits how much of the game's design can take advantage of raytracing being realtime.

The only exception to this I've seen is The Finals: https://youtu.be/MxkRJ_7sg8Y . Made by ex-Battlefield devs, its dynamic environment from two years ago is on a whole other level even compared to Battlefield 6.

> But how about a practical argument instead.

With raytracing, lighting a scene goes from an hours-to-days baking process to simply designating which objects emit light.
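
Roughly, the difference looks like this (a simplified sketch with made-up structs, not any particular engine's API):

    // In a path-traced pipeline a "light" is just a material property; there is
    // no offline bake step. With baked lighting, the same change means re-running
    // a lightmap bake and shipping the resulting textures.
    struct Vec3     { float x, y, z; };
    struct Material {
        Vec3 albedo;
        Vec3 emission;   // non-zero => this object lights the scene; nothing else to author
    };

    // The renderer "discovers" the light whenever a bounced ray hits the emitter:
    Vec3 shadeHit(const Material& m, const Vec3& incomingRadiance) {
        // outgoing radiance = what the surface emits + what it reflects from the bounce
        return { m.emission.x + m.albedo.x * incomingRadiance.x,
                 m.emission.y + m.albedo.y * incomingRadiance.y,
                 m.emission.z + m.albedo.z * incomingRadiance.z };
    }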

Naive question: could games detect when the user is "looking around" at breathtaking scenery and raytrace only those moments? Offer a "take picture" button and let the user specify how long to raytrace? Then for heavy action and motion, ditch the raytracing. Even better, as the user passes through "scenic" areas, automatically take pictures in the background. Heck, this could even be an upsell, kind of like the real-life photos you get on a roller coaster... #donthate

(sorry if obvious / already done)
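
A rough sketch of what that heuristic could look like; all hook names and thresholds are made up:

    #include <cmath>

    struct CameraSample { float yaw, pitch, posX, posY, posZ; };

    enum class RtMode { Off, Hybrid, FullQualityStill };

    // Watch camera motion and only spend ray-tracing budget when the player is
    // lingering on scenery (ignoring yaw wrap-around for brevity).
    RtMode pickRtMode(const CameraSample& prev, const CameraSample& cur,
                      float dt, bool inCombat) {
        float turnSpeed = (std::fabs(cur.yaw - prev.yaw) +
                           std::fabs(cur.pitch - prev.pitch)) / dt;      // deg/s
        float moveSpeed = std::sqrt((cur.posX - prev.posX) * (cur.posX - prev.posX) +
                                    (cur.posY - prev.posY) * (cur.posY - prev.posY) +
                                    (cur.posZ - prev.posZ) * (cur.posZ - prev.posZ)) / dt;

        if (inCombat || turnSpeed > 90.0f || moveSpeed > 4.0f)
            return RtMode::Off;              // heavy action: drop RT, keep the frame rate
        if (turnSpeed < 5.0f && moveSpeed < 0.2f)
            return RtMode::FullQualityStill; // admiring the view: accumulate samples
                                             // over many frames ("take a picture")
        return RtMode::Hybrid;               // in between: cheap RT effects only
    }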

  • Even without RT, I think it'd be beneficial to tune graphics settings depending on context; in an action/combat scene there are likely aspects the player isn't paying attention to. The challenge is that it's more developer work, whether it's done by implementing some automatic detection or by setting it manually scene by scene during development (which studios probably already do where they can set up specific arenas). An additional task would be making sure there's no glaring difference between tuning levels, and setting a baseline you never go beneath.
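
    A rough sketch of the scene-by-scene option, with made-up names; designers tag each area with a preset and the engine swaps settings on load, never dropping below the authored baseline:

      #include <string>
      #include <unordered_map>

      struct GraphicsPreset {
          bool  rayTracedGI;
          bool  rayTracedReflections;
          float resolutionScale;
      };

      // Authored during development: calm hub areas get the expensive effects,
      // combat arenas fall back to the cheaper rasterized path.
      const std::unordered_map<std::string, GraphicsPreset> kSceneBaseline = {
          { "hub_vista",    { true,  true,  1.0f  } },
          { "combat_arena", { false, false, 0.85f } },   // the baseline you never go beneath
      };

      GraphicsPreset presetForScene(const std::string& sceneTag) {
          auto it = kSceneBaseline.find(sceneTag);
          return it != kSceneBaseline.end() ? it->second
                                            : GraphicsPreset{ false, false, 1.0f };
      }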