
Comment by pornel

2 days ago

The DisplayHDR standard is supposed to be it, but VESA ruined its reputation by allowing DisplayHDR 400 to exist when DisplayHDR 1000 should have been the minimum.

Besides, HDR quality is more complex than just max nits, because it depends on viewing conditions and black levels (and everyone cheats with their contrast metrics).

OLEDs can peak at 600 nits and look awesome, at least in a pitch-black room. LCD monitors can boost to 2000 nits and still end up displaying white on grey, because their black level rises along with the backlight.
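To put rough numbers on that (all illustrative, none measured), here's a quick sketch of how reflected ambient light eats contrast, using the standard Lambertian lux-to-nits conversion:

```python
import math

# Back-of-the-envelope: ambient light reflected off the panel raises the
# effective black level. All numbers here are illustrative, not measured.
def effective_contrast(peak_nits, black_nits, ambient_lux, reflectance=0.02):
    # Lambertian approximation: luminance (nits) = illuminance (lux) * R / pi
    reflected = ambient_lux * reflectance / math.pi
    return peak_nits / (black_nits + reflected)

# OLED, 600 nits peak, pitch-black room: near-zero black, enormous contrast.
print(f"{effective_contrast(600, 0.0005, ambient_lux=0):,.0f}:1")   # ~1,200,000:1
# 2000-nit LCD in a bright room: backlight bleed plus reflections = white on grey.
print(f"{effective_contrast(2000, 0.2, ambient_lux=5000):,.0f}:1")  # ~62:1
```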

We have sRGB kinda working for color primaries and gamma, but it's not the real sRGB, which pins reference white at 80 nits. In practice it ended up being relative instead of absolute.
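For reference, a minimal sketch of the gap between spec and practice; the 80 cd/m² figure comes from IEC 61966-2-1, and the 300-nit example is just a typical monitor setting:

```python
# The sRGB spec (IEC 61966-2-1) pins reference white at an absolute
# 80 cd/m^2; real displays treat the same signal as relative to
# whatever brightness the user happened to pick.

SRGB_REFERENCE_WHITE_NITS = 80.0  # per the spec; almost nobody runs this

def srgb_to_linear(c: float) -> float:
    # Standard piecewise sRGB decoding (encoded signal in 0..1).
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def srgb_to_nits(c: float, display_peak_nits: float = SRGB_REFERENCE_WHITE_NITS) -> float:
    # "Absolute" sRGB would hard-code 80 here; in practice displays
    # substitute their own peak, so identical pixels can mean anything
    # from 80 to 500+ nits depending on the brightness slider.
    return srgb_to_linear(c) * display_peak_nits

print(srgb_to_nits(1.0))                         # 80.0, what the spec says
print(srgb_to_nits(1.0, display_peak_nits=300))  # 300.0, what monitors do
```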

A lot of the mess comes from the need to adapt content mastered at 2000 nits for a pitch-black cinema down to 800-1000 nits in daylight. That takes very careful processing to preserve highlights and saturation, but software can't rely on the display doing it properly, and doing it in software sends a false signal down the chain and risks the display "correcting" it a second time.
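To illustrate the kind of processing involved, here's a toy soft-knee rolloff; it's not BT.2390's actual EETF, and the knee and peak values are assumptions picked for the example:

```python
# Toy soft-knee highlight rolloff (not BT.2390's EETF; knee placement is
# an arbitrary illustrative choice). Midtones pass through untouched, and
# the 1000-2000 nit highlight range is compressed instead of hard-clipped.

def tonemap_nits(x: float, display_peak: float = 1000.0, knee: float = 0.75) -> float:
    k = knee * display_peak  # start rolling off at 750 nits
    if x <= k:
        return x             # shadows and midtones untouched
    # Reinhard-style compression: slope 1 at the knee (no visible band),
    # asymptotically approaching display_peak instead of clipping.
    headroom = display_peak - k
    return k + (x - k) / (1.0 + (x - k) / headroom)

for nits in (100, 500, 1000, 1500, 2000):
    print(f"{nits} -> {tonemap_nits(nits):.0f}")
# 100 -> 100, 500 -> 500, 1000 -> 875, 1500 -> 938, 2000 -> 958
```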