Comment by thrdbndndn

18 hours ago

The whole HDR scene still feels like a mess.

I know how bad the support for HDR is on computers (particularly Windows and cheap monitors), so I avoid consuming HDR content on them.

But I just purchased a new iPhone 17 Pro, and I was very surprised at how these HDR videos on social media still look like shit on apps like Instagram.

And even worse, the HDR video I shoot with my iPhone looks like shit even when playing it back on the same phone! After a few trials I had to just turn it off in the Camera app.

I wonder if it fundamentally only really makes sense for film, video games, etc. where a person will actually tune the range per scene. Plus, only when played on half decent monitors that don’t just squash BT.2020 so they can say HDR on the brochure.

  • Even without tuning, it shouldn't look worse than squishing to SDR at capture time. There are significant ecosystem failures that could be fixed.
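A minimal sketch of why "squishing to SDR at capture time" is the worse option: hard-clipping at SDR white destroys all highlight detail, while even a crude global tone-map (Reinhard here, as a stand-in for whatever the pipeline actually uses) keeps highlights distinguishable. The light values are hypothetical, normalized so 1.0 is SDR white.

```python
def clip_to_sdr(x):
    """Naive capture-time squash: anything above SDR white is clipped."""
    return min(x, 1.0)

def reinhard(x):
    """Simple global tone-map (Reinhard): compresses highlights smoothly
    instead of clipping, so highlight detail survives."""
    return x / (1.0 + x)

# Hypothetical linear scene-light values (1.0 = SDR white, >1.0 = HDR highlights)
highlights = [0.5, 2.0, 4.0, 8.0]

clipped = [clip_to_sdr(v) for v in highlights]  # 2.0, 4.0, 8.0 all collapse to 1.0
mapped = [reinhard(v) for v in highlights]      # stay distinct, detail is kept
```

The point is only that distinctness survives the tone-map and not the clip; a real renderer would use a curve tuned per display, not this one.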

The HDR implementation in Windows 11 is fine, and it's not even that bad in terms of titles and content officially supporting HDR. Most of the idea that it's "bad" comes from the "cheap monitor" part, not Windows.

I have zero issues and only an exceptional image on W11 with a PG32UQX.

  • Also, if you get flashbanged by SDR content on Windows 11, there is a slider in HDR settings that lets you turn down the brightness of SDR content. I didn't know about this at first and had HDR disabled because of it for a long time.

  • IIRC Windows still uses the sRGB curve for tone mapping of SDR content in HDR, so you have to toggle it on and off all the time.

    KDE Wayland went the better route and uses Gamma 2.2
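The sRGB-vs-gamma-2.2 mismatch above is easy to see numerically: sRGB's transfer function (IEC 61966-2-1) has a linear segment near black, while most SDR displays approximate a pure 2.2 power law. Decoding with sRGB on a gamma-2.2 display produces noticeably more light in the shadows, which is the raised/washed-out look people complain about.

```python
def srgb_to_linear(c):
    """Piecewise sRGB transfer function (IEC 61966-2-1)."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def gamma22_to_linear(c):
    """Pure power-law 2.2, what most SDR displays actually approximate."""
    return c ** 2.2

# Near black the two curves disagree by almost an order of magnitude:
shadow = 0.02
srgb_lin = srgb_to_linear(shadow)     # ~0.00155
g22_lin = gamma22_to_linear(shadow)   # ~0.00018, much darker
```

So if the compositor maps SDR into the HDR signal with the sRGB curve but the content was mastered against gamma 2.2, shadows get lifted; KDE's choice of gamma 2.2 avoids that.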

The only time I shoot HDR on anything is when I plan on crushing the shadows / raising the highlights after the fact. S-curves all the way: get all the dynamic range you can, then dial in the look. Otherwise it just looks like a flat, washed-out mess most of the time.
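A toy version of that S-curve grade, using smoothstep as a minimal stand-in for a real grading curve: shadows get crushed, highlights get lifted, and midtones stay roughly put, which is exactly what turns flat log/HDR footage into a "look".

```python
def s_curve(x):
    """Smoothstep as a minimal S-curve on a normalized [0, 1] signal:
    pushes shadows down, pulls highlights up, leaves the midpoint fixed."""
    return x * x * (3.0 - 2.0 * x)

# Shadows drop, highlights rise, contrast goes up:
# s_curve(0.1) = 0.028 (crushed), s_curve(0.5) = 0.5, s_curve(0.9) = 0.972 (lifted)
```

A real grade would use a spline with adjustable pivot and slope, but the shape is the same idea.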