
Comment by Negitivefrags

3 days ago

I really hope that this doesn't come to pass. It's all in on the two worst trends in graphics right now. Hardware Raytracing and AI based upscaling.

The amount of drama about AI based upscaling seems disproportionate. I know framing it in terms of AI and hallucinated pixels makes it sound unnatural, but graphics rendering works with so many hacks and approximations.

Even without modern deep-learning based "AI", it's not like the pixels you see with traditional rendering pipelines were all artisanal and curated.

  • AI upscaling is equivalent to lowering the bitrate of compressed video.

    Given Netflix's popularity, most people obviously don't value image quality as much as other factors.

    And it's even true for myself: for gaming, given the choice of 30fps at a higher bitrate or 60fps at a lower one, I'll take the 60fps.

    But I want high bitrate and high fps. I am certainly not going to celebrate the reduction in image quality.

    • > AI upscaling is equivalent to lowering the bitrate of compressed video.

      When I was a kid, people had dozens of CDs with movies, while pretty much nobody had DVDs. DVDs were simply too expensive, while Xvid let you compress an entire movie onto a CD while keeping good quality. Of course the original DVD release would've been better, but we were too poor, and watching ten movies at 80% quality beat watching one movie at 100% quality.

      DLSS makes it possible to effectively quadruple FPS with minimal subjective quality impact. Of course a natively rendered image would've been better, but most people are simply too poor to buy a gaming rig that plays the newest games at 4K 120FPS on maximum settings. You can keep arguing as much as you want that the natively rendered image is better, but unless you send me money to buy a new PC, I'll keep using DLSS.

    • I would rather play at 60fps with no upscaling or frame generation than at 120fps with them.

    • > I am certainly not going to celebrate the reduction in image quality

      What about perceived image quality? If you are just playing the game, the chances of you noticing anything (unless you crank the upscaling up to the maximum) are near zero.

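A back-of-the-envelope sketch of the tradeoff being debated above: upscalers render internally at a fraction of the output resolution and reconstruct the rest. The per-axis scale factors below (0.67/0.58/0.50 for Quality/Balanced/Performance) are the commonly cited DLSS values, treated here as assumptions rather than vendor-confirmed specifics.

```python
# Internal render resolution vs. a 4K output for common upscaler quality
# modes. Scale factors are the commonly cited per-axis DLSS values and
# should be treated as assumptions, not vendor-confirmed specifics.
out_w, out_h = 3840, 2160
modes = {"Quality": 0.67, "Balanced": 0.58, "Performance": 0.50}

for name, scale in modes.items():
    in_w, in_h = int(out_w * scale), int(out_h * scale)
    savings = (out_w * out_h) / (in_w * in_h)
    print(f"{name}: renders {in_w}x{in_h}, shades {savings:.1f}x fewer pixels")
```

Performance mode shades exactly a quarter of the pixels, which is where the "quadruple FPS" framing comes from, before accounting for the upscaler's own overhead.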

  • The contentious part, from what I gather, is the overhead of hallucinating these pixels, on cards that also cost a lot more than the previous generation for otherwise minimal gains outside of DLSS.

    Some [0] are seeing a 20 to 30% drop in actual rendered frames when activating DLSS, which means correspondingly more latency as well.

    There are still games where it should be a decent tradeoff (racing or flight simulators? Infinity Nikki?), but it's definitely not a no-brainer.

    [0] https://youtu.be/EiOVOnMY5jI
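To put rough numbers on the latency point (the 25% overhead below is an illustrative figure inside the 20-30% range cited above, not a measurement):

```python
# Back-of-the-envelope latency math for frame generation. Assumption:
# enabling it costs 25% of the "real" (simulated) frames, and each real
# frame yields one interpolated frame.
native_fps = 60.0
real_fps = native_fps * (1 - 0.25)   # real frames after overhead: 45
displayed_fps = real_fps * 2         # with one interpolated frame each: 90

# Input latency tracks the real frame time, not the displayed one.
native_latency_ms = 1000 / native_fps
fg_latency_ms = 1000 / real_fps

print(f"displayed {displayed_fps:.0f} fps, but input latency "
      f"{fg_latency_ms:.1f} ms vs {native_latency_ms:.1f} ms native")
```

So the counter shows a higher number while the game responds more slowly than it did without the feature, which is the crux of the complaint.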

I also find them completely useless for any of the games I want to play. I wish AMD would release a card that just drops both of these, but that's probably not realistic.

  • They will never drop ray tracing; some new games require it. The only case where I think it's not needed is specialized office prebuilt desktops or mini PCs.

What's wrong with hardware raytracing?

  • There are a lot of theoretical arguments I could give you about how almost all cases where hardware BVH can be used, there are better and smarter algorithms to be using instead. Being proud of your hardware BVH implementation is kind of like being proud of your ultra-optimised hardware bubblesort implementation.

    But how about a practical argument instead. Enabling raytracing in games tends to suck. The graphical improvements on offer are simply not worth the performance cost.

    A common argument is that we don't have fast enough hardware yet, or that developers haven't been able to use raytracing to its fullest yet, but it's been a pretty long damn time since this hardware became mainstream.

    I think the most damning evidence of this is the just-released Battlefield 6. This is a franchise that previously had raytracing as a top-level feature. This new release doesn't support it and doesn't intend to.

    And in a world where basically every AAA release is panned for performance problems, BF6 has articles like this: https://www.pcgamer.com/hardware/battlefield-6-this-is-what-...

    • > But how about a practical argument instead. Enabling raytracing in games tends to suck. The graphical improvements on offer are simply not worth the performance cost.

      Pretty much this. Even in games that have good ray tracing, I can't tell whether it's off or on (except for the FPS hit). I cared so little that I bought a card not known to be good at it (7900 XTX), because the two games I play the most don't support it anyway.

      They oversold the technology/benefits and I wasn't buying it.


    • > Enabling raytracing in games tends to suck.

      Because enabling raytracing means the game has to support non-raytracing too, which limits how the game's design can take advantage of raytracing being realtime.

      The only exception to this I've seen is The Finals: https://youtu.be/MxkRJ_7sg8Y . Made by ex-Battlefield devs; the dynamic environments they shipped two years ago are on a whole other level even compared to Battlefield 6.


    • > But how about a practical argument instead.

      With raytracing, lighting a scene goes from taking hours or days to just designating which objects emit light.

    • naive q: could games detect when the user is "looking around" at breathtaking scenery and raytrace those? offer a button to "take picture" and let the user specify how long to raytrace? then for heavy action and motion, ditch the raytracing? even better, as the user passes through "scenic" areas, automatically take pictures in the background. Heck, this could be an upsell kind of like the RL pictures you get on the roller coaster... #donthate

      (sorry if obvious / already done)


  • It will never be fast enough to work in real time without compromising some aspect of the player's experience.

    Ray tracing solves the light transport problem in the hardest way possible. Each additional bounce adds exponentially more computational work. The control flow also gets very branchy once you hit the wilder indirect-lighting scenarios, and GPUs prefer straight SIMD flows, not hierarchical rabbit-hole exploration. Disney still uses CPU-based render farms; there's no way you are reasonably emulating that experience in <16 ms.

    The closest thing we have to functional ray tracing for gaming is light mapping. This is effectively just ray tracing done ahead of time, but the advantage is you can bake for hours to get insanely accurate light maps and then push 200+ fps on moderate hardware. It's almost like you are cheating the universe when this is done well.

    The human brain has a built in TAA solution that excels as frame latencies drop into single digit milliseconds.
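The bounce-complexity claim above can be sketched numerically: a tracer that spawns k secondary rays per hit traces on the order of k^depth rays per pixel (k = 8 below is an arbitrary illustrative branching factor):

```python
# Illustration of the exponential blow-up: a branching tracer that spawns
# `branching` secondary rays per bounce traces 1 + k + k^2 + ... + k^d
# rays per pixel. Real-time tracers avoid this with one path per sample
# plus denoising, which is where the noise/blur tradeoffs come from.
def rays_per_pixel(branching: int, max_bounces: int) -> int:
    return sum(branching ** depth for depth in range(max_bounces + 1))

for bounces in (1, 2, 4, 8):
    print(f"{bounces} bounces -> {rays_per_pixel(8, bounces):,} rays per pixel")
```

Even at modest bounce counts the per-pixel ray budget explodes, which is why real-time implementations trace a handful of rays and lean on denoisers instead.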

    • The problem is the demand for dynamic content in AAA games: large exterior and interior worlds with dynamic lights, day and night cycles, glass and translucent objects, mirrors, water, fog and smoke. Everything should be interactable and destructible. And everything should be easy for artists to set up.

      I would say the closest we can get is workarounds like radiance cascades. But anything other than raytracing is just an ugly workaround that falls apart in dynamic scenarios. And don't forget that baking times, and having to store the results (leading to massive game sizes), are a huge negative.

      Funnily enough raytracing is also just an approximation to the real world, but at least artists and devs can expect it to work everywhere without hacks (in theory).

    • Manually placed lights and baking not only take time away from iteration but also take a lot of disk space for the shadow maps. RT makes development faster for the artists; I think DF even mentioned that doing Doom Eternal without RT would take so much disk space it wouldn't have been possible to ship it.

      edit: not Doom Eternal, it's Doom: The Dark Ages, the latest one.


    • It's fast enough today. Metro Exodus, an RT-only game, runs just fine at around 60 fps for me on a 3060 Ti. Looks gorgeous.

      Light mapping is a cute trick and the reason why Mirror's Edge still looks so good after all these years, but it requires doing away with dynamic lighting, which is a non-starter for most games.

      I want my true-to-life dynamic lighting in games thank you very much.


    • How is Metro Exodus Enhanced Edition (that is purely raytraced) compromised compared to regular version that uses traditional lighting?

    • > It will never be fast enough to work in real time ...

      640Kb surely is enough!

  • Much higher resource demands, which then require tricks like upscaling to compensate. You also get uneven competition between GPU vendors, because in practice it's not generic hardware raytracing but Nvidia raytracing.

    On a more subjective note, you get less interesting art styles, because studios somehow have to cram raytracing in there as a value proposition.

  • Not OP, but a lot of the current kvetching about hardware-based ray tracing is that it's basically an Nvidia-exclusive party trick, similar to DLSS and PhysX. AMD has this inferiority complex where Nvidia must not be allowed to innovate with a hardware+software solution; it must be pure hardware so AMD can compete on their terms.

  • 1. People somehow think that just because today's hardware can't handle RT all that well it will never be able to. A laughable position of course.

    2. People turn on RT in games not designed with it in mind and therefore observe only minor graphical improvements for vastly reduced performance. Simple chicken-and-egg problem, hardware improvements will fix it.

The gimmicks aren't the product, and the customers of frontier technologies aren't the consumers. The gamers and redditors and smartphone fanatics, the fleets of people who dutifully buy, are the QA teams.

In accelerated compute, the largest areas of interest for advancement are 1) simulation and modeling and 2) learning and inference.

That's why this doesn't make sense to a lot of people. Sony and AMD aren't trying to extend current trends, they're leveraging their portfolios to make the advancements that will shape future markets 20-40 years out. It's really quite bold.

So far, AI upscaling/interpolation has mostly just been used to ship horribly optimized games at a somewhat acceptable framerate.

  • And they're achieving "acceptable" frame rates and resolutions by sacrificing image quality in ways that aren't as easily quantified, so those downsides can be swept under the rug. Nobody's graphics benchmark emits metrics for how much ghosting is caused by the temporal antialiasing, or how much blurring the RT denoiser causes (or how much noise makes it past the denoiser). But they make for great static screenshots.

I disagree. From what I've read, if the game can leverage RT, artists save a considerable amount of time when iterating on level designs. Before RT they had to place lights manually, and any change to the level involved a lot of rework. It also saves storage, since there's no need to bake shadow maps.

  • So what stops developers from iterating on a raytraced version of the game during development, and then executing a shadow precalculation step once the game is ready to ship? Make it an optional download, like the high-resolution texture packs. As it stands, they are offloading the processing power and energy needed for this onto consumer PCs, and doing so in a very inefficient manner.
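For illustration, the "bake once before shipping" step proposed above is essentially what lightmappers already do; here is a toy sketch of such a precomputation, where the scene and visibility test are made-up stand-ins rather than a real renderer:

```python
import random

random.seed(0)

def visible_to_light(texel: int, sample: int) -> bool:
    # Stand-in for a real ray cast against scene geometry.
    return (texel + sample) % 3 != 0

def bake_lightmap(num_texels: int, samples_per_texel: int) -> list[float]:
    # Precompute, per texel, the fraction of sample rays that reach the
    # light: the kind of work that runs once at build time, not per frame.
    lightmap = []
    for texel in range(num_texels):
        hits = sum(visible_to_light(texel, random.randrange(1000))
                   for _ in range(samples_per_texel))
        lightmap.append(hits / samples_per_texel)
    return lightmap

baked = bake_lightmap(num_texels=16, samples_per_texel=256)
print(f"{len(baked)} texels baked, values in [{min(baked):.2f}, {max(baked):.2f}]")
```

The catch, per the earlier comments, is that this only works for static geometry and lights; any dynamic element invalidates the baked data.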