
Comment by Sharlin

2 months ago

The “HDR” here is in the sense of “tone mapping to SDR”. It should also be said that even “H”DR displays only have a stop or two more range, still much less than a real-world high-contrast scene.

It's still better though.

HDR displays are >1000 nits, while SDR caps out at less than 500 nits even on the best displays.

E.g., for the Samsung S90C, HDR peak brightness is 1022 nits and SDR is 487 nits: https://www.rtings.com/tv/reviews/samsung/s90c-oled#test_608 https://www.rtings.com/tv/reviews/samsung/s90c-oled#test_4

Double the range is undeniably still better.
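For concreteness, here's that ratio expressed in photographic stops (one stop = a doubling of luminance), as a quick sketch using the RTINGS numbers above:

```python
from math import log2

# Peak luminances from the RTINGS S90C review linked above.
hdr_peak_nits = 1022
sdr_peak_nits = 487

# One photographic stop is a 2x change in luminance, so the extra
# headroom is the base-2 log of the ratio.
extra_stops = log2(hdr_peak_nits / sdr_peak_nits)
print(f"~{extra_stops:.2f} extra stops of peak brightness")  # ~1.07
```

So "double the range" works out to roughly one extra stop of highlight headroom, consistent with the "stop or two" estimate above.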

HDR also gives you 10-bit instead of 8-bit color, so less posterization as well.
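A quick sketch of the bit-depth side, assuming plain 8-bit vs 10-bit integer quantization (nothing display-specific):

```python
# 10-bit gives 4x as many code values as 8-bit, so smooth gradients
# are quantized into finer steps and band less visibly.
levels_8bit = 2 ** 8     # 256 levels
levels_10bit = 2 ** 10   # 1024 levels

# Relative size of one quantization step across a normalized 0..1 ramp.
step_8bit = 1 / (levels_8bit - 1)    # ~0.39% of full range per step
step_10bit = 1 / (levels_10bit - 1)  # ~0.098% of full range per step

print(levels_10bit // levels_8bit)                    # 4x the levels
print(f"{step_8bit / step_10bit:.1f}x finer steps")   # ~4.0x finer
```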

Just because the implementations have been subpar until now doesn't mean it's worthless tech to pursue.

  • > HDR displays are >1000 nits

    Displays as low as 400 nits have been marketed as "HDR".

    But nits are only part of the story. What really matters in the end is the range between the darkest and brightest colors the display can show under the lighting conditions you want to use it in. 400 nits in a darkened room, where blacks are actually black, can have much more actual range than 1000 nits with very bright "blacks" due to shitty display tech or excessive external illumination.
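
    A minimal sketch of that trade-off; the black levels and reflected-ambient terms below are illustrative assumptions, not measurements:

    ```python
    from math import log2

    def effective_contrast(peak_nits, black_nits, ambient_nits):
        # Reflected ambient light adds to both the brightest and the
        # darkest level the panel can actually show the viewer.
        return (peak_nits + ambient_nits) / (black_nits + ambient_nits)

    # Hypothetical 400-nit OLED in a dark room vs 1000-nit LCD in a lit room.
    dark_oled = effective_contrast(peak_nits=400, black_nits=0.0005, ambient_nits=0.05)
    lit_lcd = effective_contrast(peak_nits=1000, black_nits=0.5, ambient_nits=5.0)

    print(f"400-nit OLED, dark room: {dark_oled:,.0f}:1 (~{log2(dark_oled):.1f} stops)")
    print(f"1000-nit LCD, lit room:  {lit_lcd:,.0f}:1 (~{log2(lit_lcd):.1f} stops)")
    ```

    On those numbers the dimmer panel in the dark room ends up with about five more stops of usable range, which is exactly the point.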