Comment by mort96
3 months ago
What does that achieve? Isn't it simpler to just not support HDR than to support HDR but tone map away the HDR effect?
Anyway, which web browsers have a setting to tone map HDR images such that they look like SDR images? (And why should "don't physically hurt my eyes" be an opt-in setting anyway instead of just the default?)
> What does that achieve?
Because then a user who wants to see the HDR image in its full glory can do so. If the base image is not HDR, there is nothing they can do about it.
> And why should "don't physically hurt my eyes" be an opt-in setting anyway instead of just the default?
While I very much support more HDR in the online world, I fully agree with you here.
However, I suspect the reason will boil down to what it usually does: almost no users change the default settings ever. And so, any default which goes the other way will invariably lead to a ton of support cases of "why doesn't this work".
However, web browsers are already dark-mode aware; they could be HDR-aware as well and do what you prefer based on that.
What user wants the web to look like this? https://floss.social/@mort/115147174361502259
That video is clearly not encoded correctly. If it were, the levels would match the background, given there is no actual HDR content visible in that video frame.
Anyway, even if the video was of a lovely nature scene in proper HDR, you might still find it jarring compared to the surrounding non-HDR desktop elements. I might too, depending on the specifics.
However, like I said, it's up to the browser to handle this.
One suggestion I saw from some browser devs was to make tone mapping the default when the page is not in fullscreen mode, and to switch to the full HDR range when it is.
Even if that doesn't become the default, it could be a behavior the browser lets the user select.
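Roughly what that could look like approximated from the page (or extension) side, using the draft dynamic-range-limit property linked further down the thread; the property name, its values, and browser support are all still experimental, so treat this as a sketch rather than working code:

  // Tone map HDR down to SDR unless the page is fullscreen.
  // 'dynamic-range-limit' is from the css-color-hdr-1 draft;
  // 'standard' limits content to SDR, 'high' allows the full HDR range.
  function applyDynamicRangePolicy(): void {
    const fullscreen = document.fullscreenElement !== null;
    document.documentElement.style.setProperty(
      'dynamic-range-limit',
      fullscreen ? 'high' : 'standard'
    );
  }

  document.addEventListener('fullscreenchange', applyDynamicRangePolicy);
  applyDynamicRangePolicy();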
If you want to avoid eye pain, then you want caps on how much brightness can be in what percent of the image, not to throw the baby out with the bathwater and disable HDR entirely.
And if you're speaking from iPhone experience, my understanding is that the main problem there isn't extra-bright things in the image; it's the renderer ignoring your brightness settings when HDR shows up, which is obviously stupid and not a problem with HDR in general.
If the brightness cap of the HDR image is full SDR brightness, what value remains in HDR? As far as I can see, it's all bath water, no baby.
> If the brightness cap of the HDR image is full SDR brightness, what value remains in HDR?
If you set #ffffff to be a comfortable max, then that would be the brightness cap for HDR flares that fill the entire screen.
But filling the entire screen like that rarely happens. Smaller flares would have a higher cap.
For example, let's say an HDR scene has an average brightness that's 55% of #ffffff, but a tenth of the screen is up at 200% of #ffffff. That should give you a visually impressive boosted range without blinding you.
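To make that concrete, here's a hypothetical cap function; the shape (inverse square root of the flare's screen fraction) and the constants are invented purely for illustration, not taken from any standard:

  // Allowed peak brightness as a multiple of SDR white (#ffffff).
  // A full-screen flare is capped at SDR white; smaller flares may
  // go brighter, up to an absolute ceiling.
  function peakCap(flareFraction: number): number {
    const fullScreenCap = 1.0; // whole screen: no brighter than #ffffff
    const ceiling = 4.0;       // never more than 4x SDR white
    const f = Math.max(flareFraction, 1 / 16);
    return Math.min(ceiling, fullScreenCap / Math.sqrt(f));
  }

  peakCap(1.0); // 1.0  -> full-screen flare held to SDR white
  peakCap(0.1); // ~3.2 -> a tenth of the screen may go ~3x brighter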
It actually is somewhat an HDR problem, because the HDR standards made some dumb choices. SDR standardizes relative brightness, but HDR (at least the PQ variant) uses absolute brightness, even though that's an obviously dumb idea, and in practice no one with a brain actually implements it.
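For concreteness, PQ (SMPTE ST 2084, the transfer function behind HDR10 and Dolby Vision) is the absolute one: a code value decodes directly to a luminance in nits, regardless of the display it ends up on. A quick sketch of its EOTF:

  // PQ (ST 2084) EOTF: [0,1] code value -> absolute luminance in nits.
  function pqToNits(codeValue: number): number {
    const m1 = 2610 / 16384;      // 0.1593...
    const m2 = 2523 / 4096 * 128; // 78.84375
    const c1 = 3424 / 4096;       // 0.8359375
    const c2 = 2413 / 4096 * 32;  // 18.8515625
    const c3 = 2392 / 4096 * 32;  // 18.6875
    const e = Math.pow(Math.min(Math.max(codeValue, 0), 1), 1 / m2);
    return 10000 * Math.pow(Math.max(e - c1, 0) / (c2 - c3 * e), 1 / m1);
  }

  pqToNits(1.0);  // 10000 nits, whether or not your panel can do that
  pqToNits(0.58); // ~201 nits, about the BT.2408 "HDR reference white"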
In a modern image chain, capture is more often than not HDR.
These images are then graded for HDR or SDR, i.e. sacrifices are made to the image data so that it suits a given display standard.
If you have an HDR image, it's relatively easy to tone map it into SDR space; see e.g. BT.2408 for an approach in video.
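As a toy illustration of the operation (this is not BT.2408's actual curve, just the general shape: pass the SDR range through and roll off the highlights, assuming BT.2408's 203-nit HDR reference white):

  const HDR_REF_WHITE = 203; // nits; BT.2408's HDR reference white

  // Map absolute HDR luminance to relative SDR luminance in [0, 1].
  // Linear below the knee; values near and above reference white get
  // compressed into the remaining headroom via a Reinhard-style shoulder.
  function toneMapToSdr(nits: number): number {
    const knee = 0.75;
    const x = nits / HDR_REF_WHITE; // 1.0 == reference white
    if (x <= knee) return x;
    const t = x - knee;
    return knee + (1 - knee) * (t / (1 + t)); // approaches 1, never clips
  }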
The underlying problem here is that the Web isn't ready for HDR at all, and I'm almost 100% confident browsers don't do the right things yet. HDR displays have enormous variance, from "slightly above SDR" to experimental displays at Dolby Labs. So to display an image correctly, you need to render it to the display's capabilities; likewise if you want to display an HDR image on an SDR monitor. In other words, tone mapping is a required part of the solution.
A correctly graded HDR image of the real world will have something like 95% of its pixel values falling within the typical SDR (Rec.709/sRGB) range. You only use the "physically hurt my eyes" values sparingly, and you take the room conditions into consideration when choosing the peak value.

As an example: cinema using DCI-P3 peaks at 48 nits, because the cinema is completely dark; 48 nits is more than enough for a pure white in that environment. But take that same image and put it on a display indoors during the day, and it's nowhere near enough for a white. Add HDR peaks into the mix, and it's easy to see that in a cinema you probably shouldn't peak at 1000 nits (which is about 4.4 stops of light above the DCI-P3 peak). In short: rendering to the display's capabilities requires that you probe the light conditions in the room.
It's also why you shouldn't be able to manipulate brightness on an HDR display. We need that to be part of the image rendering chain such that the right decisions can be made.
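To put numbers on the stops bit: stops are factors of two, so a 1000-nit peak sits log2(1000 / 48) ≈ 4.4 stops above the 48-nit cinema peak. A room-aware policy (invented here purely for illustration) might pick an SDR white for the viewing environment and grant a fixed HDR headroom above it:

  // Stops are factors of two.
  function stopsBetween(fromNits: number, toNits: number): number {
    return Math.log2(toNits / fromNits);
  }

  // Hypothetical policy: HDR peak = SDR white plus fixed headroom.
  function hdrPeakForRoom(sdrWhiteNits: number, headroomStops: number): number {
    return sdrWhiteNits * Math.pow(2, headroomStops);
  }

  stopsBetween(48, 1000); // ~4.4
  hdrPeakForRoom(48, 2);  // dark cinema: 192-nit peak, not 1000
  hdrPeakForRoom(200, 2); // bright room: 800-nit peak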
How about a user stylesheet that uses https://www.w3.org/TR/css-color-hdr-1/#the-dynamic-range-lim... ?
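Presumably something like this, assuming the property behaves as the draft describes (support is still very limited):

  /* User stylesheet: tone map all HDR content down to SDR. */
  * {
    dynamic-range-limit: standard !important;
  }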
How about websites just straight up aren't allowed to physically hurt me, by default?
Web sites aren't made for just you. If images on your screen are causing you issues, that is a you / your device problem, not a web site problem.
Note the spec does recommend providing a user option: https://drafts.csswg.org/css-color-hdr-1/#a11y
You asked “which web browsers have a setting to tone map HDR images such that they look like SDR images?”; I answered. Were you not actually looking for a solution?