Comment by pfranz

1 day ago

Check out this old post: https://www.realtimerendering.com/blog/thought-for-the-day/

HDR in games would frequently mean clipping highlights and adding bloom. Before "HDR," exposure looked rather flat.

That's not what it means since 2016 or so when consumer TVs got support for properly displaying brighter whites and colors.

It definitely adds detail now, and for the last 8-9 years.

Though consumer TVs obviously still fall short of the real world's peak brightness. We'll probably never want a TV to burn out our vision like the sun, but hitting highs in at least the 1,000-2,000 nit range, versus the 500-700 nits a lot of sets peak at right now, would be nice for most uses.

OK, so it doesn't mean real HDR but simulated HDR.

Maybe when proper HDR support becomes mainstream in 3D engines, that problem will go away.

  • Right. Just like the article says, "HDR" is too vague to mean anything specific, a label that gets slapped onto products. In gaming, it often meant engines were finally simulating light and exposure separately, clipping highlights that would previously have been shown: in the author's opinion, reducing the fidelity. Same with depth of field blurring things that used to have no blur.

  • It's HDR at the world-data level, but SDR at the rendering level. It simulates the way film can't handle real-life high dynamic range and clips it, instead of compressing it the way "HDR" does in photography.

    • > Instead of compressing it like "HDR" in photography

      That's not HDR either, that's tone mapping to SDR. The entire point of HDR is that you don't need to compress it because your display can actually make use of the extra bits of information. Most modern phones take true HDR pictures that look great on an HDR display.
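The clipping-versus-compressing distinction in the replies above can be sketched numerically. This is a minimal illustration, not any engine's actual pipeline: it compares hard-clipping scene luminance at display white against the simple Reinhard tone-mapping curve, one common way to compress highlights into an SDR range (the values are made up for illustration).

```python
def clip_to_sdr(luminance: float) -> float:
    """Hard clip: anything above 1.0 (display white) is lost."""
    return min(luminance, 1.0)

def reinhard(luminance: float) -> float:
    """Reinhard operator L/(1+L): compresses highlights instead of
    clipping, approaching 1.0 asymptotically for very bright inputs."""
    return luminance / (1.0 + luminance)

# Scene-referred luminance values, from shadow to bright highlight.
hdr_values = [0.1, 0.5, 2.0, 10.0]
clipped = [clip_to_sdr(v) for v in hdr_values]
compressed = [reinhard(v) for v in hdr_values]
# Clipping maps both 2.0 and 10.0 to the same 1.0, so highlight
# detail is gone; Reinhard keeps them distinct (about 0.667 vs 0.909).
```

On a true HDR display neither step is needed for in-range values, which is the point of the last reply: the extra bits of luminance are shown directly rather than squeezed into SDR.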