Comment by sandofsky
2 days ago
While it isn't touched on in the post, I think the issue with feeds is that platforms like Instagram have no interest in moderating HDR.
For context: YouTube automatically edits the volume of videos that have an average loudness beyond a certain threshold. I think the solution for HDR is similar penalization based on log luminance or some other reasonable metric.
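A metric like that could be as simple as the geometric-mean ("log-average") luminance used in tone-mapping literature. A minimal sketch, where the 203-nit budget is a made-up policy knob (not anything a platform actually publishes):

```python
import numpy as np

def log_average_luminance(rgb, eps=1e-6):
    """Geometric-mean ("log-average") luminance of a frame.
    rgb: float array (H, W, 3) in linear light, values in nits."""
    # Rec. 709 weights map linear RGB to luminance
    lum = rgb @ np.array([0.2126, 0.7152, 0.0722])
    return float(np.exp(np.mean(np.log(eps + lum))))

def exceeds_hdr_budget(rgb, max_log_avg_nits=203.0):
    # 203 nits is the HDR "reference white" in BT.2408; using it as a
    # penalization threshold here is purely a hypothetical policy choice.
    return log_average_luminance(rgb) > max_log_avg_nits
```

A uniformly dim frame passes while a blown-out one trips the check, which is roughly the shape a feed-ranking penalty would need.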
I don't see this happening on Instagram any time soon, because bad HDR likely makes view counts go up.
As for the HDR photos in the post, well, those are a bit strong to show what HDR can do. That's why the Mark III beta includes a much tamer HDR grade.
> YouTube automatically edits the volume of videos that have an average loudness beyond a certain threshold.
For anyone else who was confused by this, it seems to be a client-side audio compressor feature (not a server-side adjustment) labeled as "Stable Volume". On the web, it's toggleable via the player settings menu.
https://support.google.com/youtube/answer/14106294
I can't find exactly when it appeared but the earliest capture of the help article was from May 2024, so it is a relatively recent feature: https://web.archive.org/web/20240523021242/https://support.g...
I didn't realize this was a thing until just now, but I'm glad they added it because (now that I think about it) it's been a while since I felt the need to adjust my system volume when a video was too quiet even at 100% player volume. It's a nice little enhancement.
YouTube has long been normalizing videos in the standard feed, switching to a -14 LUFS target in 2019. But LUFS is a global, whole-video measure, meant to allow higher peaks and troughs across the video, and the normalization happens at that global level: if you exceed the target by 3 dB, the whole video has its volume lowered by 3 dB, quiet parts included.
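The static, whole-video behavior described above amounts to something like this (illustrative only; YouTube's actual pipeline isn't public):

```python
def normalization_gain_db(integrated_lufs, target_lufs=-14.0):
    """One gain value applied to the entire video: anything over the
    target is turned down by the excess, while the relative dynamics
    within the video are left untouched. Quiet videos pass through."""
    excess = integrated_lufs - target_lufs
    return -excess if excess > 0 else 0.0
```

So a video measuring -11 LUFS integrated gets a flat -3 dB applied everywhere, which is exactly why loud and quiet passages keep their relationship to each other.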
The stable volume feature is meant to essentially level out those peaks and troughs, and IIRC it's actually computed server-side; I think yt-dlp can download stable-volume streams if asked to.
The client-side toggle might be new as of 2024, but the volume normalisation has been a thing for a long time.
I know they've automatically boosted brightness in dark scenes for a long time too. It's not rare for someone to upload a video game clip with a very dark scene and find it way brighter after upload than it was when they played, or than it looks in the file they uploaded.
Yes, and I love it! The volume knob was finally taken back from producers chasing gimmicks to push their productions.
There are still gimmicks, but at least they no longer include music so badly clipped as to be unlistenable. Hint: go get the DVD or Blu-ray release of whatever it is and you are likely to enjoy a non-clipped album.
It is all about maximizing the overall sonic impact the music is capable of. When levels are sane, and song elements are well differentiated and equalized so that no range of frequencies (or only a minor one) is crushed by many sounds competing for it, the result sounds full, great, and not tiring!
Thanks audio industry. Many ears appreciate what was done.
Instagram has to allow HDR for the same reason that Firefox spent the past twenty years displaying web colors like HN orange at the display's maximum gamut rather than as calibrated sRGB: a brighter red than anyone else's draws people in, and makes the competition seem lifeless by comparison, especially in a mixed-profiles environment. Eventually that comes to be regarded as 'garishly bright', so to speak, and people push back against it. I assume Firefox is already fixing this to support the latest CSS color spec (which defines #rrggbb as sRGB and requires it to be presented as such unless stated otherwise in the CSS), but I doubt Instagram is willing to literally dim their feed; instead, I would expect them to begin AI-HDR'ing SDR uploads so that all videos are captivatingly, garishly, bright.
> I think the solution for HDR is similar penalization based on log luminance or some other reasonable metric.
I completely understand the desire to address the issue of content authors misusing or intentionally abusing HDR with some kind of auto-limiting algorithm similar to the way the radio 'loudness wars' were addressed. Unfortunately, I suspect it will be difficult, if not impossible, to achieve without also negatively impacting some content applying HDR correctly for artistically expressive purposes. Static photos may be solvable without excessive false positive over-correction but cinematic video is much more challenging due to the dynamic nature of the content.
As a cinephile, I'm starting to wonder whether HDR on mobile devices is simply not a solvable problem in practice. While I think it's solvable technically, and certainly addressable from a standards perspective, the reality of having so many stakeholders in the mobile ecosystem (hardware, OS, app, content distributors, original creators) with diverging priorities makes whatever we do at the base technology and standards level unlikely to work in practice for most users. Maybe I'm too pessimistic, but as a high-end home theater enthusiast I'm continually dismayed by how hard it is to correctly display diverse HDR content from different distribution sources, even in a less complex ecosystem where the stakeholders are more aligned and the leading standards bodies have been around for many decades (SMPTE et al).
I believe everything could be solved the same way we solved high dynamic range in audio, with a volume control.
I find it pretty weird that all TVs and most monitors hide the brightness adjustment under piles and piles of menus when it could be right there on the remote alongside the volume buttons. Maybe phones could have hardware brightness buttons too, or at least something as easy as the dedicated brightness Fn keys on notebooks.
Such a brightness slider could also control the amount of tonemapping applied to HDR content. High brightness would mean little to no tonemapping, and low brightness would use a very aggressive tonemapper producing an image similar to the SDR content alongside it.
Also note that good audio volume attenuation requires proper loudness-contour compensation (as you lower the volume you also boost the bass and treble) for things to sound reasonably good and tonally well balanced. So adjusting the tonemapping based on brightness isn't that far off from what we already do with audio.
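As a rough sketch of the slider idea: blend between passing HDR luminance through untouched and a Reinhard-style curve that saturates near SDR white. The 100-nit white point and the linear blend are assumptions for illustration, not any standard's behavior:

```python
def tonemap_nits(l, slider):
    """l: scene luminance in nits; slider in [0, 1].
    slider = 1.0 -> bright display, HDR passed through untouched;
    slider = 0.0 -> dim display, aggressive SDR-like compression."""
    # Reinhard-style curve that asymptotically approaches 100-nit white
    sdr = 100.0 * l / (l + 100.0)
    return slider * l + (1.0 - slider) * sdr
```

At slider = 0 even extreme highlights stay under 100 nits, while at slider = 1 nothing is compressed, and intermediate positions degrade gracefully between the two.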
> because bad HDR likely makes view counts go up
Another related parallel trend recently is that bad AI images get very high view and like counts, so much so that I've lost a lot of motivation for doing real photography because the platforms cease to show them to anyone, even my own followers.
Why is nobody talking about the standards development? The standards (OS, image formats) could just say that all content is assumed SDR by default, and that even when a media file explicitly calls for HDR it cannot have sharp luminance transitions except in special cases, with the software blocking or clamping any non-conforming images. The OS should have had something like this for sound 25-30 years ago. For example, a brightness-aware OS/monitor combo could outright disallow anything above x nits, and disallow certain contrast levels in the majority of content.
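A minimal sketch of that kind of OS/monitor-level enforcement, with both limits as hypothetical policy numbers rather than anything from a real spec:

```python
import numpy as np

def enforce_hdr_policy(lum_nits, max_nits=1000.0, max_contrast=1e4):
    """Clamp a frame's luminance to a hypothetical OS-level policy:
    peaks are clipped to max_nits, and blacks are lifted so the frame's
    overall contrast ratio stays within max_contrast."""
    lum = np.minimum(np.asarray(lum_nits, dtype=float), max_nits)
    # Raise the floor only if the frame's contrast exceeds the budget
    floor = max(float(lum.min()), float(lum.max()) / max_contrast)
    return np.maximum(lum, floor)
```

Conforming content passes through unchanged; only frames that exceed the peak or contrast budget get altered, which is the "truncate non-conforming images" behavior described above.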
FYI: You wrote Chrome 14 in the post, but I believe you meant Android 14.
Thanks. Updated.
Btw, YouTube doesn't moderate HDR either. I saw one video of a child's violin recital that was insanely bright, probably just the accidental result of a bad HDR recording setup.
The effect of HDR increasing views is explicitly mentioned in the article.
You are replying to the article's author.