Comment by esperent
1 day ago
What we really need is a set of standards that everybody follows. The reason normal displays work so well is that everyone settled on sRGB, and as long as a display gets close to that, say 95% sRGB, everyone except maybe a few graphic designers will have an equivalent experience.
But HDR is a minefield of different display qualities, color spaces, and standards. It's no wonder that nobody gets it right and everyone ends up confused.
HDR on a display with a peak brightness of 2000 nits will look completely different than on a display with 800 nits, and both get to claim they're HDR.
We should have a standard equivalent to color spaces. Set, say, 2000 nits as 100% of HDR. Then a 2000 nit display gets to claim it's 100% HDR, an 800 nit display gets to claim 40% HDR, and so on. A 2500 nit display could even use 125% HDR in its marketing.
It's still not perfect - some displays (OLEDs in particular) can only hit peak brightness over a portion of the screen. But it would be an improvement.
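To make the arithmetic concrete, here's a rough sketch of how such a rating could work. The 2000-nit reference and the percent_hdr name are just the proposal above turned into code, not part of any real standard:

```python
# Hypothetical "percent HDR" rating, using 2000 nits as the proposed
# 100% reference point. None of this exists in any actual spec.

HDR_REFERENCE_NITS = 2000.0

def percent_hdr(peak_nits: float) -> float:
    """Rate a display as a percentage of the 2000-nit reference."""
    return 100.0 * peak_nits / HDR_REFERENCE_NITS

for nits in (800, 2000, 2500):
    print(f"{nits} nits -> {percent_hdr(nits):.0f}% HDR")
# 800 nits -> 40% HDR
# 2000 nits -> 100% HDR
# 2500 nits -> 125% HDR
```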
The DisplayHDR standard is supposed to be exactly that, but they've ruined its reputation by allowing DisplayHDR 400 to exist when DisplayHDR 1000 should have been the minimum.
Besides, HDR quality is more complex than just max nits, because it depends on viewing conditions and black levels (and everyone cheats with their contrast metrics).
OLEDs can peak at 600 nits and look awesome — in a pitch black room. LCD monitors could boost to 2000 nits and display white on grey.
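A back-of-the-envelope version of that point, with made-up but plausible reflectance and ambient-light numbers (the effective_contrast function and all the constants are illustrative, not measurements of any real panel):

```python
import math

def effective_contrast(peak_nits: float, native_black_nits: float,
                       ambient_lux: float, panel_reflectance: float = 0.02) -> float:
    """Contrast ratio once ambient light bouncing off the panel is added
    to the panel's own black level."""
    # Luminance of a diffuse reflector (nits) ~= illuminance * reflectance / pi
    reflected_nits = ambient_lux * panel_reflectance / math.pi
    effective_black = native_black_nits + reflected_nits
    return peak_nits / effective_black

# 600-nit OLED in a dark room vs. 2000-nit LCD near a window (values assumed)
print(f"{effective_contrast(600, 0.0005, ambient_lux=5):,.0f}:1")     # ~18,500:1
print(f"{effective_contrast(2000, 0.1, ambient_lux=10_000):,.0f}:1")  # ~31:1
```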
We have sRGB kinda working for color primaries and gamma, but it's not the real sRGB spec, which pegs white at 80 nits. In practice it ended up being relative instead of absolute.
A lot of the mess comes from having to adapt content mastered at 2000 nits for a pitch-black cinema down to 800-1000 nits in daylight. That needs very careful processing to preserve highlights and saturation, but software can't rely on the display doing it properly, and doing it in software sends a false signal and risks the display correcting it a second time.
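For the curious, here's a minimal sketch of the kind of luminance roll-off being described, using an extended-Reinhard curve as a stand-in. Real displays and operating systems each do their own thing, and this ignores the saturation side entirely:

```python
def tonemap_nits(value_nits: float, mastered_peak: float = 2000.0,
                 display_peak: float = 800.0) -> float:
    """Compress luminance (in nits) mastered for `mastered_peak` so it fits
    a display whose peak is `display_peak`, keeping low/mid tones close to
    linear and rolling off the highlights."""
    # Work in units of the display peak; lum_white is the input level that
    # should land exactly on the display's maximum.
    lum = value_nits / display_peak
    lum_white = mastered_peak / display_peak
    lum_out = lum * (1.0 + lum / (lum_white ** 2)) / (1.0 + lum)  # extended Reinhard
    return min(lum_out, 1.0) * display_peak

# A 2000-nit mastered highlight lands at 800 nits; mid-tones move far less.
print(tonemap_nits(2000))  # 800.0
print(tonemap_nits(100))   # ~90.7
```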