Comment by gyomu
2 days ago
As a photographer, I get the appeal of (this new incarnation of) HDR content, but the practical reality is that the photos I see posted in my feeds go from looking normal to searing my retinas, while other content that was uniform white a second prior now looks dull gray.
It's late at night here, so I was reading this article in dark mode at a low display brightness - and when I got to the HDR photos I had to turn my display down even further to avoid straining my eyes, then back up again when I scrolled to the text.
For fullscreen content (games, movies) HDR is alright, but for everyday computing it's a pretty jarring experience as a user.
While it isn't touched on in the post, I think the issue with feeds is that platforms like Instagram have no interest in moderating HDR.
For context: YouTube automatically edits the volume of videos that have an average loudness beyond a certain threshold. I think the solution for HDR is similar penalization based on log luminance or some other reasonable metric.
I don't see this happening on Instagram any time soon, because bad HDR likely makes view counts go up.
As for the HDR photos in the post, well, those are a bit strong to show what HDR can do. That's why the Mark III beta includes a much tamer HDR grade.
> YouTube automatically edits the volume of videos that have an average loudness beyond a certain threshold.
For anyone else who was confused by this, it seems to be a client-side audio compressor feature (not a server-side adjustment) labeled as "Stable Volume". On the web, it's toggleable via the player settings menu.
https://support.google.com/youtube/answer/14106294
I can't find exactly when it appeared but the earliest capture of the help article was from May 2024, so it is a relatively recent feature: https://web.archive.org/web/20240523021242/https://support.g...
I didn't realize this was a thing until just now, but I'm glad they added it because (now that I think about it) it's been a while since I felt the need to adjust my system volume when a video was too quiet even at 100% player volume. It's a nice little enhancement.
YouTube has long been normalizing the volume of videos in the standard feed, switching to a -14 LUFS target in 2019. But LUFS is a global measure, meant to allow higher peaks and troughs over the course of the whole video, and the normalization happens at a global level: if you exceed the target by 3 dB, the whole video gets its volume lowered by 3 dB, regardless of whether a given part is quiet or loud.
The stable volume feature is meant to essentially level out all of the peaks and troughs, and IIRC it's actually computed server-side - I think yt-dlp can download stable-volume streams if asked to.
The client-side toggle might be new as of 2024, but the volume normalisation has been a thing for a long time.
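The difference between the two behaviors is easy to sketch. In the toy version below, plain RMS-in-dB stands in for a real LUFS measurement (which uses K-weighting and gating per ITU-R BS.1770), and the window size and target are illustrative only:

```python
import math

TARGET_DB = -14.0  # YouTube's stated loudness target

def rms_db(samples):
    """Loudness of a chunk as RMS in dB (a crude stand-in for LUFS)."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms + 1e-12)

def global_gain_db(samples):
    """What the 2019-era normalization does: one gain for the whole video.
    It only ever attenuates, and quiet sections get turned down just as much
    as loud ones."""
    return min(0.0, TARGET_DB - rms_db(samples))

def stable_volume(samples, window=1024):
    """Roughly what 'Stable Volume' does: push each window toward the target,
    flattening loud and quiet stretches alike."""
    out = []
    for i in range(0, len(samples), window):
        w = samples[i:i + window]
        gain = 10 ** ((TARGET_DB - rms_db(w)) / 20)
        out.extend(s * gain for s in w)
    return out
```

The global version preserves the video's internal dynamics; the windowed version deliberately destroys them, which is why it makes sense as an opt-in client toggle rather than something baked into every stream.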
Instagram has to allow HDR for the same reason that Firefox spent the past twenty years displaying web colors like HN orange at the display's maximum gamut rather than as calibrated sRGB: a brighter red than anyone else's draws people in and makes the competition seem lifeless by comparison, especially in a mixed-profiles environment. Eventually that comes to be regarded as 'garishly bright', so to speak, and people push back against it. I assume Firefox is already fixing this to support the latest CSS color spec (which defines #rrggbb as sRGB and requires it to be presented as such unless stated otherwise in CSS), but I doubt Instagram is willing to literally dim their feed; instead, I would expect them to begin AI-HDR'ing SDR uploads so that all videos are captivatingly, garishly bright.
> I think the solution for HDR is similar penalization based on log luminance or some other reasonable metric.
I completely understand the desire to address the issue of content authors misusing or intentionally abusing HDR with some kind of auto-limiting algorithm similar to the way the radio 'loudness wars' were addressed. Unfortunately, I suspect it will be difficult, if not impossible, to achieve without also negatively impacting some content applying HDR correctly for artistically expressive purposes. Static photos may be solvable without excessive false positive over-correction but cinematic video is much more challenging due to the dynamic nature of the content.
As a cinephile, I'm starting to wonder if HDR on mobile devices simply isn't a solvable problem in practice. While I think it's solvable technically, and certainly addressable from a standards perspective, the reality of having so many stakeholders in the mobile ecosystem (hardware, OS, app, content distributors, original creators) with diverging priorities makes whatever we do at the base technology and standards level unlikely to work in practice for most users. Maybe I'm too pessimistic, but as a high-end home theater enthusiast I'm continually dismayed by how hard it is to correctly display diverse HDR content from different distribution sources even in a less complex ecosystem where the stakeholders are more aligned and the leading standards bodies have been around for many decades (SMPTE et al).
I believe everything could be solved the same way we solved high dynamic range in audio, with a volume control.
I find it pretty weird that all TVs and most monitors hide the brightness adjustment under piles and piles of menus when it could be right there on the remote alongside the volume buttons. Maybe phones could have hardware brightness buttons too - at least something as convenient as the dedicated brightness Fn keys on notebooks.
Such a brightness slider could also control the amount of tonemapping applied to HDR content: high brightness would mean little to no tonemapping, while low brightness would use a very aggressive tonemapper, producing an image similar to the SDR content alongside it.
Also note that good audio volume attenuation requires proper loudness-contour compensation (as you lower the volume you also boost the bass and treble) for things to sound reasonably good and tonally balanced. So adjusting the tonemapping based on brightness isn't far off from what we already do with audio.
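A rough sketch of what such a combined slider could do; the blend between a pass-through and an aggressive Reinhard curve is my own illustrative choice, not any shipping implementation:

```python
def tonemap(linear, brightness):
    """Map scene luminance (relative to SDR white = 1.0) to output,
    blending between pass-through and an aggressive tonemapper based
    on the user's brightness slider in [0, 1]."""
    reinhard = linear / (1.0 + linear)   # compresses highlights toward SDR
    t = max(0.0, min(1.0, brightness))   # high brightness -> keep HDR as-is
    return t * linear + (1.0 - t) * reinhard
```

At full brightness an HDR highlight passes through untouched; with the slider at the bottom, the same highlight is squeezed back into roughly the SDR range, which is the "very aggressive tonemapper" end of the idea.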
Why is nobody talking about standards development? The standards (OS, image formats) could just say that everything is assumed to be SDR by default, and that even when a media file explicitly calls for HDR it cannot have sharp transitions except in special cases, with the software blocking or clamping any non-conforming images. The OS should have had something like this for sound 25-30 years ago. For example, a brightness-aware OS/monitor combo could outright disallow anything above X nits, and disallow certain contrast levels in the majority of content.
Btw, YouTube doesn't moderate HDR either. I saw one video of a child's violin recital that was insanely bright, probably just by accident of using a bad HDR recorder.
FYI: You wrote Chrome 14 in the post, but I believe you meant Android 14.
Thanks. Updated.
The effect of HDR increasing views is explicitly mentioned in the article
You are replying to the article's author.
> because bad HDR likely makes view counts go up
Another related parallel trend recently is that bad AI images get very high view and like counts, so much so that I've lost a lot of motivation for doing real photography because the platforms cease to show them to anyone, even my own followers.
Completely agree. To me, HDR feels like the system is ignoring my screen brightness settings.
I set my screen brightness to a certain level for a reason. Please don’t just arbitrarily turn up the brightness!
There is no good way to disable HDR on photos for iPhone, either. Sure, you can turn off HDR display for photos on your iPhone. But then, when you cast to a different display, the TV tries to display the photos in HDR, and it won't look half as good.
> To me, HDR feels like the system is ignoring my screen brightness settings.
You might be on to something there. Technically, HDR is mostly about profile signaling, and therefore about interop. To support it in MPEG-DASH or HLS media you need to make sure certain codec attributes are present in the XML or m3u8, but the actual media payload stays the same.
Any bit or bob being misconfigured or misinterpreted in the streaming pipeline will result in problems ranging from a slightly suboptimal experience to nothing working at all.
Besides HDR, "spatial audio" formats like Dolby Atmos are notorious for interop issues.
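To illustrate the signaling point: in HLS, opting a variant stream into HDR10 is largely a matter of manifest attributes like VIDEO-RANGE and the codec string. The values below are a plausible example, not taken from any real stream:

```
#EXT-X-STREAM-INF:BANDWIDTH=8000000,RESOLUTION=3840x2160,CODECS="hvc1.2.4.L153.B0",VIDEO-RANGE=PQ
hdr/variant.m3u8
```

Get any of these attributes wrong and players will happily misrender or reject the stream, even though the media segments themselves are untouched - which is exactly the "slightly suboptimal to nothing works" failure range.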
> To me, HDR feels like the system is ignoring my screen brightness settings.
On Android, iOS, and macOS it's not that HDR is ignoring your screen brightness; rather, the brightness slider controls the SDR range, and yes, HDR can exceed that - that's the singular purpose of HDR, to be honest. All the other purported benefits of HDR are at best just about HDR video profiles and at worst just nonsense bullshit. The only thing HDR actually does is allow for brighter colors vs. SDR. Used selectively, this really enhances a scene. But restraint is hard, and most HDR content production is shit. The HDR images that newer iPhones and Pixel phones capture are generally quite good because they are actually restrained, but then, ironically, both have horrible HDR video that's just obnoxiously bright.
"On both Android & iOS/MacOS it's not that HDR is ignoring your screen brightness, but rather the brightness slider is controlling the SDR range and then yes HDR can exceed that"
Doesn't this mean HDR is ignoring my brightness setting? Looking at the Mac color profiles, the default HDR profile has a fixed max brightness regardless of the brightness slider - and it's very bright: 1600 nits vs. the SDR max of 600 nits. At least I was able to pick another option capping HDR at 600 nits, but that still allows HDR video to force my screen to its normal full brightness even when I've dimmed it.
>HDR can exceed that
It's not just the HDR content that gets brighter, but SDR content too. When I test it in Chrome on Android, if an HDR image shows up on screen the phone starts overriding the brightness slider completely, making everything brighter, including the phone's system UI.
>The only thing HDR actually does is allow for brighter colors vs. SDR.
Not just brighter, but also darker, so it can preserve detail in dark areas rather than crushing them.
You're right, but at least in my experience it's very easy for a modern iPhone to capture a bad HDR photo - usually because some small, strong highlight (often a specular reflection off a metallic object) causes the whole photo to be HDR when the content doesn't need it.
It isn't just about the brightness (range).
In practice, the 'HDR' standards are also about wider color gamuts (than sRGB), packed into more bits and encoded differently, so as to minimise banding while keeping file sizes in check.
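The bit-depth half of that is easy to demonstrate: quantize the same smooth ramp at 8 and 10 bits and count the distinct steps that survive (transfer functions ignored here for simplicity):

```python
# A smooth gradient sampled densely, then quantized at two bit depths.
ramp = [i / 16383 for i in range(16384)]

steps_8 = len({round(x * 255) for x in ramp})    # 8-bit quantization
steps_10 = len({round(x * 1023) for x in ramp})  # 10-bit quantization

# 256 vs 1024 distinct levels over the same range: the 8-bit version takes
# four-times-coarser jumps between adjacent shades, which is what shows up
# as visible banding in smooth gradients like skies.
```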
I’m under the impression this is caused by the use of “HDR mode”(s) and poor adaptive brightness implementations on devices. Displays such as the iPad Pro w/ OLED are phenomenal and don’t seem to implement an overactive adaptive brightness. HDR content has more depth without causing brightness distortion.
In contrast, my TV will change brightness modes to display HDR content and disables some of the brightness adjustments when displaying HDR content. It can be very uncomfortably bright in a dark room while being excessively dim in a bright room. It requires adjusting settings to a middle ground resulting in a mixed/mediocre experience overall. My wife’s laptop is the worst of all our devices, while reviews seem to praise the display, it has an overreactive adaptive brightness that cannot be disabled (along with decent G2G response but awful B2W/W2B response that causes ghosting).
Apple’s method involves a good deal of what they call “EDR”, wherein the display gamma is ramped down in concert with ramping the brightness up, so that the brighter areas get brighter while the non-bright areas remain dark due to gamma math; that term is helpful for searching their WWDC developer videos for more details.
It still looks like what they describe: way too bright, making formerly "bright" whites appear neutral grey around the HDR content. Personally I find it extremely jarring when just a frame within the overall screen is HDR; it's much nicer when the HDR content is full screen. IMO I wish I could just disable partial-screen HDR and keep the full-screen implementation, because it is that distracting.
I experience the same thing you do - but my take on it is different. Being hit with HDR images (and videos on YouTube), while unsettling, makes me realize just how damned dull the SDR world I had been forced to succumb to has been.
Let the whole experience be HDR and perhaps it won't be jarring.
That's not inherent to HDR, though. BFV (unless I'm confusing it with something else) has an HDR adjustment routine where you push a slider until the HDR white and the SDR white are identical. The same could be done for desktop environments. In my experience, HDR support is very lacking on PCs at the moment. You can't even play Dolby Vision on Windows, and it's the only widely used HDR format with dynamic metadata.
If you mean https://i.imgur.com/0LtYuDZ.jpeg that is probably the slider GP wants but it's not about matching HDR white to SDR white, it's just about clamping the peak HDR brightness in its own consideration. The white on the left is the HDR brightness according to a certain value in nits set via the bottom slider. The white on the right is the maximally bright HDR signal. The goal of adjusting the slider is to find how bright of an HDR white your display can actually produce, which is the lowest slider value at which the two whites appear identical to a viewer.
Some games also have a separate slider https://i.imgur.com/wenBfZY.png for adjusting "paper white", which is the HDR white one might normally associate with matching SDR reference white (100 nits in a dark room according to the SDR TV color standards, higher in other situations or standards). Extra note: the peak brightness slider in this game (Red Dead Redemption 2) is the same knob as the brightness slider in the Battlefield V screenshot above.
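For the curious, the nit values these sliders deal in map to the actual signal through the SMPTE ST 2084 (PQ) curve. Below is a sketch of the encode side, with the calibrated peak applied as a simple clamp first - the function name and the clamping placement are my own illustration, not how any particular game does it:

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits, peak_nits=10000.0):
    """Map absolute luminance (nits) to a [0, 1] PQ signal value,
    clamping to the calibrated peak first - roughly what honoring an
    in-game peak-brightness slider amounts to."""
    y = min(nits, peak_nits) / 10000.0
    p = y ** M1
    return ((C1 + C2 * p) / (1 + C3 * p)) ** M2
```

One consequence worth noticing: SDR reference white (100 nits) already lands at a PQ code value of about 0.51, so roughly the top half of the PQ signal range is reserved for HDR headroom above paper white.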
Thanks for clarifying this!
>HDR support is very lacking in PCs atm.
I think it's because no one wants it.
No; Windows 10 barely supports it, and of my hundreds of Steam games, exactly none have any code making use of it; seemingly only AAA mega-budget games have the pipeline bandwidth, but e.g. Dune Imperium and Spaceways have no reason to use it and so don’t bother. Windows 11 focused on improving wide color support which is much more critical, as any game shipping for Mac/Win already has dealt with that aspect of the pipeline and has drabified their game for the pre-Win11 ICC nightmares. Even games like Elite Dangerous, which would be a top candidate for both HDR and wide color, don’t handle this yet; their pipeline is years old and I don’t envy them the work of trying to update it, given how they’re overlaying the in-game UI in the midst of their pipeline.
(It doesn’t help that Windows only allows HDR to be defined in EDID and monitor INF files, and that PC monitors start shutting off calibration features when HDR is enabled because their chipsets can’t keep up — just as most modern Sony televisions can’t do both Dolby Vision and VRR because that requires too much processing power for their design budget.)
I want it. And I'm hard-pressed to imagine a multimedia consumer who doesn't care about color and brightness ranges, unless they don't understand what that is. Every movie is recorded in - and many phones now capture - a range that can't be encoded in 8 bits.
On the browser spec side, this is just starting to get implemented as a CSS property https://caniuse.com/mdn-css_properties_dynamic-range-limit so I expect it might become a more common thing in web-tech-based feeds in time.
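For reference, the property is meant to let a page opt its own content down toward SDR; something like the following, with value names per the current CSS draft and support still very limited, so treat this as a sketch:

```css
/* Ask the browser to clamp HDR media in a feed to the SDR range */
.feed img,
.feed video {
  dynamic-range-limit: standard;
}
```

The draft also defines values that allow limited or full HDR headroom, which is what would let a platform expose a "tame HDR" middle ground rather than an all-or-nothing toggle.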
This happens on Snapchat too with HDR videos. Brightness increases while everything else dims... including the buttons.
This seems more like a "your feeds" problem than an HDR problem, much in the same way people screencap and convert images willy-nilly. I suggest blocking non-HDR content