Comment by Pannoniae

13 days ago

I mostly understand your perspective here from a developer point of view. I'd like to offer the view from the other side: how /r/FuckTAA and similar communities see things from the gamers' perspective.

What many gamers associate with TAA and other modern anti-aliasing options is higher system requirements and worse image quality. Obviously, this isn't caused by just one factor, but it's very noticeable if you compare today's games to the games of yesteryear.

Here is a good collection of anti-aliasing methods: https://www.pcgamingwiki.com/wiki/Glossary:Anti-aliasing_(AA...

Modern graphics are generally less "accurate" in terms of rendering, in exchange for much better visual effects like ray tracing and better shading. There's a fairly linear progression of rendering quality throughout the history of 3D rendering. Games like Half-Life 2, Halo and the first CoDs used mostly exact forward rendering: the shading was fairly simple, but the rendered image had no spatial or temporal artifacts, and individual pixels could still be discerned. If you increased the resolution, you got a crisp image with a crisp UI. These games usually offered SSAA, MSAA or CSAA. Those methods were very expensive because they effectively rendered at a higher resolution, which could mean double the shading cost or more. (I know that MSAA only does this for the edges, but I'm keeping the explanation fairly simple.)
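
To make the cost argument concrete, here's a toy sketch of what a supersampled resolve boils down to: shade several samples per output pixel, then average them down. The buffer layout and names are made up for illustration, and real MSAA only supersamples coverage/depth and shades once per pixel except at edges.

    /* Toy 2x2 SSAA resolve: average four high-res samples per output pixel.
       'hi' is a (2*w) x (2*h) grayscale buffer, 'out' is w x h (hypothetical layout). */
    void ssaa_resolve_2x2(const float *hi, float *out, int w, int h)
    {
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int sx = 2 * x, sy = 2 * y, sw = 2 * w;
                float sum = hi[sy * sw + sx]       + hi[sy * sw + sx + 1]
                          + hi[(sy + 1) * sw + sx] + hi[(sy + 1) * sw + sx + 1];
                out[y * w + x] = sum * 0.25f; /* box filter over the 2x2 footprint */
            }
        }
    }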

Next, developers started to switch to framebuffer-based post-process methods (FXAA and SMAA). These introduced blur at the edges of objects, which was a downgrade in image clarity. However, these methods are fairly fast, and the games didn't exhibit motion-based artifacts. The shader is relatively cheap, and the only real cost is an extra framebuffer in memory, which the game probably already had for other post-processing effects.
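
To give a rough idea of why these passes are cheap but soften the image, here's an extremely simplified FXAA-style pass over a luma buffer. This is not the actual FXAA algorithm (which estimates edge direction and blends along it); the threshold and names are made up for the sketch.

    #include <math.h>

    /* Where local contrast exceeds a threshold, blend the pixel with its 4 neighbours.
       Border pixels are left untouched to keep the sketch short. */
    void fxaa_like_pass(const float *luma, float *out, int w, int h, float threshold)
    {
        for (int y = 1; y < h - 1; y++) {
            for (int x = 1; x < w - 1; x++) {
                float c = luma[y * w + x];
                float n = luma[(y - 1) * w + x], s = luma[(y + 1) * w + x];
                float e = luma[y * w + x + 1],   wv = luma[y * w + x - 1];
                float lo = fminf(fminf(n, s), fminf(e, wv));
                float hi = fmaxf(fmaxf(n, s), fmaxf(e, wv));
                float contrast = fmaxf(hi, c) - fminf(lo, c);
                out[y * w + x] = (contrast > threshold)
                    ? 0.5f * c + 0.125f * (n + s + e + wv) /* soften edge pixels */
                    : c;                                   /* leave flat areas alone */
            }
        }
    }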

Developers also started to use deferred rendering and more buffers in general. This led to explosive growth in VRAM usage, so for optimisation purposes they started using the console techniques of checkerboard rendering and lower-resolution buffers. A good example of this is GTA V, where you can see plenty of dithering artifacts in the distance and a generally blurrier picture. These games still didn't have temporal artifacts: looking around did not generally reduce image quality.
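
For illustration, here's roughly what checkerboard rendering amounts to in a toy form: shade only half the pixels each frame and fill the holes from their neighbours. Real console implementations reproject the previous frame into the holes instead (which is where the dithery look in motion comes from); the parity convention and names here are made up.

    /* Fill the unshaded half of a checkerboard frame by averaging the shaded
       horizontal neighbours. Assumes pixels with (x + y + frame) odd were skipped. */
    void checkerboard_fill(float *img, int w, int h, int frame)
    {
        for (int y = 0; y < h; y++) {
            for (int x = 1; x < w - 1; x++) {
                if (((x + y + frame) & 1) != 0) { /* this pixel was not shaded */
                    img[y * w + x] = 0.5f * (img[y * w + x - 1] + img[y * w + x + 1]);
                }
            }
        }
    }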

Graphical demands only got more intense, so developers started to "cheat" with forms of temporal anti-aliasing, which (as the name suggests) introduce temporal artifacts to the image. The stereotypical poster child for this is Unreal Engine: it's seriously impressive in many respects, but developers just started slapping TAA on everything instead of optimising their shaders and LoDs. TAA is not really one single method; its quality varies a lot depending on how many past frames are used, how artifacts are rejected, and so on.
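
At its core, TAA is just an exponential blend of the current frame into a history buffer; the hard part, and where implementations differ wildly, is reprojecting that history with motion vectors and rejecting stale samples (e.g. via neighbourhood clamping). A minimal sketch of only the accumulation step, with made-up buffer and parameter names:

    /* Blend the current frame into the history buffer with weight 'alpha'
       (typically around 0.05-0.1 for the new frame). Reprojection and
       history rejection are deliberately omitted. */
    void taa_accumulate(const float *current, float *history, int n, float alpha)
    {
        for (int i = 0; i < n; i++) {
            history[i] = alpha * current[i] + (1.0f - alpha) * history[i];
        }
    }

With alpha around 0.05 you're effectively averaging on the order of the last 20 frames, which is exactly why any reprojection or rejection mistake smears across that many frames as ghosting or blur.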

A while ago, NVIDIA introduced DLSS and (real-time) ray tracing, which changed things quite a lot. With these techniques, not even the final framebuffer is rendered at native resolution; instead it is upscaled from a lower-resolution framebuffer using ML-based methods. Ray tracing also requires denoising, because it's infeasible to trace every pixel at native resolution. It works surprisingly well, but there's another reduction in clarity. And of course, since the image is effectively being reconstructed (twice!), the temporal artifacts are horrible; even moving the mouse around can produce nauseating artifacts on more complex objects.
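
DLSS itself is ML-based and proprietary, so the following is not its algorithm; a plain bilinear upscale stands in just to show the structural point that only the low-resolution pixels are actually shaded and everything above that is reconstruction. Buffer names and the mapping convention are made up.

    #include <math.h>

    /* Upscale a lw x lh buffer to w x h by bilinear filtering (clamped at edges). */
    void upscale_bilinear(const float *low, int lw, int lh, float *out, int w, int h)
    {
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                /* map the output pixel centre back into the low-res grid */
                float u = (x + 0.5f) * lw / w - 0.5f;
                float v = (y + 0.5f) * lh / h - 0.5f;
                int x0 = (int)floorf(u), y0 = (int)floorf(v);
                float fx = u - x0, fy = v - y0;
                int x1 = x0 + 1, y1 = y0 + 1;
                if (x0 < 0) x0 = 0;
                if (y0 < 0) y0 = 0;
                if (x1 > lw - 1) x1 = lw - 1;
                if (y1 > lh - 1) y1 = lh - 1;
                float a = low[y0 * lw + x0], b = low[y0 * lw + x1];
                float c = low[y1 * lw + x0], d = low[y1 * lw + x1];
                out[y * w + x] = (a * (1 - fx) + b * fx) * (1 - fy)
                               + (c * (1 - fx) + d * fx) * fy;
            }
        }
    }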

Currently, the gaming industry is at a stage where massively better GPUs are not used to progressively enhance games but to cut corners and reduce rendering quality for everyone. Games now even assume DLSS in their recommended system requirements, instead of treating it as an optional enhancer for weaker GPUs. Right now, if you don't have a top-end GPU (a 4080 or 4090, basically), you have to make heavy compromises on image quality in the newest games or use DLSS, even without RT. I predict that in the future, even top-end GPUs like the 5090 or 6090 will have to use DLSS to get playable framerates at 1440p with acceptable graphics, not just at 4K.

There is a reason why quite a few people are pissed off. Obviously, they might not know the exact causes of these problems, but it's easy to see the correlation: a game uses TAA, the rendering quality is bad and the FPS is choppy, so they scapegoat TAA. I can't really blame them; I don't think it's realistic to expect gamers to be intimately familiar with the render pipeline.

GP here, did you look at the Digital Foundry video [1]? If so, what about motion stability, i.e. not having flickering artifacts when looking around, something most traditional AA solutions fail terribly at?

I think the current wave of using temporal upscalers that take care of AA by design, as a way to unlock free performance and/or allow lazier development approaches, is misguided.

My bigger gripe with the state of the art in real-time 3D rendering is frametime inconsistency. UE5 is a particularly bad offender here. As a counterpoint, "Ratchet and Clank: Rift Apart" [2] uses TAA, dynamic resolution, ray tracing and checkerboard rendering for certain passes, and I find it looks stunning. No blurriness to be found there, and rock-solid frametimes. If anything, we are seeing the plague of developers chasing photorealism, with Epic as their dealer.

[1] https://youtu.be/WG8w9Yg5B3g

[2] https://youtu.be/7xtJYpwvHjY