Comment by brokenmachine
1 day ago
I'm with you on depth of field, but I don't understand why you think HDR reduces the fidelity of a game.
If you have a good display (eg an OLED) then the brights are brighter and simultaneously there is more detail in the blacks. Why do you think that is worse than SDR?
Check out this old post: https://www.realtimerendering.com/blog/thought-for-the-day/
HDR in games would frequently mean clipping highlights and adding bloom. Prior to "HDR", exposure looked rather flat.
That's not what it means since 2016 or so when consumer TVs got support for properly displaying brighter whites and colors.
It definitely adds detail now, and for the last 8-9 years.
Though consumer TVs obviously still fall short of real-world peak brightness. (We'll probably never want a TV that burns out our vision like the sun, but peaks in the 1,000-2,000 nit range, versus the 500-700 a lot of sets top out at right now, would be nice for most uses.)
OK, so it doesn't mean real HDR but simulated HDR.
Maybe when proper HDR support becomes mainstream in 3D engines, that problem will go away.
Right. Just like the article says, "HDR" is too vague to mean anything specific; it's a label that gets slapped onto products. In gaming it often meant engines were finally simulating light and exposure separately, clipping highlights that would previously have been shown, which in their opinion reduces fidelity. Same with depth of field blurring things that used to have no blur.
It's HDR at the world data level, but SDR at the rendering level. It's simulating the way film cannot handle real-life high dynamic range and clips it instead of compressing it like "HDR" in photography.
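To make that concrete, here's a minimal sketch (Python, with made-up scene luminance values) of the difference between clipping scene-referred HDR to the SDR range and compressing it with a simple tone-mapping curve like Reinhard:

```python
# Minimal sketch (illustrative values only): the renderer works with
# scene-referred "HDR" luminance, but the output is still SDR, so bright
# values either get clipped or compressed by a tone-mapping curve.

def clip_to_sdr(l):
    """Film-style handling: anything above display white is simply lost."""
    return min(l, 1.0)

def reinhard_to_sdr(l):
    """Simple Reinhard tone mapping: compresses highlights instead of clipping."""
    return l / (1.0 + l)

scene_luminance = [0.05, 0.5, 1.0, 4.0, 16.0]  # arbitrary scene-referred values

for l in scene_luminance:
    print(f"scene {l:6.2f} -> clipped {clip_to_sdr(l):.3f}, tone-mapped {reinhard_to_sdr(l):.3f}")
```

With clipping, everything above 1.0 maps to the same display white, which is exactly the lost highlight detail described above; the tone-mapped version still distinguishes 4.0 from 16.0, at the cost of compressing the midtones.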
The “HDR” here is in the sense of “tone mapping to SDR”. It should also be said that even “H”DR displays only have a stop or two more range, still much less than real-world high-contrast scenes.
It's still better though.
HDR displays are >1000 nits while SDR caps out at less than 500 nits even on the best displays.
E.g. for the Samsung S90C, HDR peaks at 1022 nits and SDR at 487 nits: https://www.rtings.com/tv/reviews/samsung/s90c-oled#test_608 https://www.rtings.com/tv/reviews/samsung/s90c-oled#test_4
Double the range is undeniably still better.
And also 10-bit instead of 8-bit, so less posterization as well (rough numbers at the end of this comment).
Just because the implementations have been subpar until now doesn't mean it's worthless tech to pursue.
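For a sense of scale, a quick back-of-the-envelope sketch in Python using the rtings figures quoted above (peak brightness only, ignoring black level):

```python
import math

# Peak-brightness headroom in photographic stops (each stop is a doubling of
# luminance), using the Samsung S90C figures quoted above from rtings.
hdr_peak_nits = 1022
sdr_peak_nits = 487
extra_stops = math.log2(hdr_peak_nits / sdr_peak_nits)
print(f"~{extra_stops:.2f} extra stops of peak headroom")  # ~1.07 stops

# Code values per channel: a 10-bit signal has 4x as many steps as 8-bit,
# which is what reduces visible banding/posterization on smooth gradients.
print(2 ** 8, "levels at 8-bit vs", 2 ** 10, "levels at 10-bit")
```

So "double the range" at the top end works out to roughly one extra stop of highlight headroom, which matches the "a stop or two" estimate upthread, while real-world high-contrast scenes span far more than that.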