Comment by jiggawatts

3 months ago

2026 is nearly upon us, and Google, Microsoft, and Apple remain steadfast in their refusal to let anyone share wide-gamut or HDR images.

Every year, I go on a rant about how my camera can take HDR images natively, but the only way to share these with a wider audience is to convert them to a slideshow and make a Rec.2020 HDR movie that I upload to YouTube.

It's absolutely bonkers to me that we've all collectively figured out how to stream a Hollywood movie to a pocket device over radio with a quality exceeding that of a typical cinema theatre, but these multi-trillion market cap corporations have all utterly failed to allow users to reliably send a still image with the same quality to each other!

Any year now, maybe in the 2030s, someone will get around to a ticket currently sitting at position 11,372, buried below thousands of internal bullshit tasks that nobody needed done (rearranging a dashboard nobody has ever opened, or whatever), and finally let computers be used for images. You know, utilising the screen, the only part billions of users ever look at, with their human eyes.

I can't politely express my disgust at the ineptitude, the sloth, the foot-dragging, the uncaring unprofessionalism of people who get paid more annually than I earn in a decade, all too distracted making Clippy 2.0 instead of getting right the most utterly fundamental aspect of consumer computing.

If I could wave a magic wand, I would force a dev team from each of these companies to remain locked in a room until this was sorted out.

I’m wondering if HDR means something different to me, because I see HDR images all the time. I can share HDR images via phones (this seems to be the default behavior on iPhone/Mac messages), I can see HDR PNG stills on the web (https://github.com/swankjesse/hdr-emojis), I can see wide gamut P3 images on the web as well (https://webkit.org/blog-files/color-gamut/).

What am I missing?

  • > I can share HDR images via phones

    Sure, me too! I can take a HDR P3 gamut picture with my iPhone and share it with all my friends and relatives... that have iPhones.

What I cannot do is take a picture with a $4000 Nikon DSLR and share it in the same way... unless I also buy a Mac so I can encode it in the magic Apple-only format[1] that works... for Mac and iOS users. I have a Windows PC. Linux users are similarly out in the cold.

This situation is so incredibly bad that I can pop the SD card from my camera into a reader plugged into my iPhone, process the RAW image on the iPhone with the Lightroom iPhone app in full, glorious HDR... and then be unable to export the HDR image onto the same device for viewing because oh-my-fucking-god-why!?

    [1] They claim it is a standards-compliant HEIF file. No, it isn't. That's a filthy lie. My camera produces a HDR HEIF file natively, in-body. Everything opens it just fine, except all Apple ecosystem devices. I suspect the only way to get Apple to budge is to sue them for false advertising. But... sigh... they'll just change their marketing to remove "HEIF" and move on.

It is incredibly annoying that instead of adopting JPEG XL they decided to use Ultra HDR, a giant hack which works very poorly.

  • That's backwards compatibility for you.

I think Ultra HDR (and Apple's take on it, ISO 21496-1) makes a lot of sense in a scenario where shipping alternate formats/codecs is not viable because renderer capabilities are not known or vary, similarly to how HDR was implemented on 4K Blu-Ray discs with the backwards-compatible Dolby Vision profiles.

    It's also possible to do what Apple has done for HEIC on iOS: Store the modern format, convert to the best-known supported format at export/sharing time.

  • > A giant hack which works very poorly.

    Indeed. I tried every possible export format from Adobe Lightroom including JPG + HDR gainmaps, and it looks... potato.

    With a narrow gamut like sRGB it looks only slightly better than JPG, but with a wider gamut you get terrible posterization. People's faces turn grey and green and blue skies get bands across them.

    Meanwhile my iPhone creates flawless 10-bit Dolby Vision video with the press of a button that I can share with anyone without it turning into a garbled mess.

    Just last week I checked up on the "state of the art" for HDR still image sharing with Gemini Deep Research and after ten minutes of trawling through obscure forum posts it came back with a blunt "No".

We've figured out how to make machines think, but not how to exchange pictures at the quality my 12-year-old DSLR is capable of capturing!

    ... unless I make a YouTube video with the images. That -- and only that -- works!

  • JPEG XL supports UltraHDR.

    JPEG XL's normal HDR capabilities were not harmed in the process when UltraHDR was added.

It was added to reach parity with JPEG 1 and HEIF/AVIF for the needs of UltraHDR developers and believers.
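For the curious, the gain-map idea behind Ultra HDR can be sketched in a few lines. This is a toy illustration of the general math (an SDR base value scaled by 2 raised to an interpolated log2 gain), not the spec's actual metadata layout; the constants and function name here are illustrative, not defaults from the standard:

```python
import math

def apply_gain_map(sdr_linear, gain_sample, min_log2_gain=0.0, max_log2_gain=4.0, weight=1.0):
    """Recover an HDR-ish linear value from an SDR base pixel and a
    normalized gain-map sample in [0, 1]. Follows the general Ultra HDR
    idea: hdr = sdr * 2^(lerp(min, max, g) * w). Illustrative constants."""
    log2_gain = min_log2_gain + gain_sample * (max_log2_gain - min_log2_gain)
    return sdr_linear * math.pow(2.0, log2_gain * weight)

# A mid-gray SDR pixel with a gain-map sample of 0.5 is boosted 2^2 = 4x:
print(apply_gain_map(0.18, 0.5))  # 0.72
```

The posterization complained about downstream comes largely from the gain map itself usually being stored as a low-resolution, 8-bit JPEG: quantizing the exponent is much more visible than quantizing the base image.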

Just use PNG: https://www.w3.org/TR/png-3/ (for HDR content, see the cICP, mDCV and cLLI chunks; also note that PNG supports up to 16-bit channel depth out of the box).

PNG-encoding a 16-bit HDR photo from my old camera needs about 150 MB; from the new camera it's more like 220 MB.

That's the size of a 30-minute TV show for a single happy snap.

    Not exactly user-friendly for sharing in a text message.
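For what it's worth, the cICP signalling the parent comment mentions is mechanically simple. A minimal sketch of serializing that chunk; the four payload bytes are CICP code points (9 = BT.2020 primaries, 16 = PQ transfer, 0 = identity/RGB matrix, 1 = full range), and splicing it into a real file after IHDR is left out:

```python
import struct
import zlib

def png_chunk(chunk_type: bytes, data: bytes) -> bytes:
    """Serialize one PNG chunk: 4-byte big-endian length, 4-byte type,
    payload, then CRC-32 computed over type + payload."""
    return (struct.pack(">I", len(data)) + chunk_type + data
            + struct.pack(">I", zlib.crc32(chunk_type + data)))

# cICP payload: colour primaries, transfer function, matrix, full-range flag.
cicp = png_chunk(b"cICP", bytes([9, 16, 0, 1]))  # BT.2020 + PQ, RGB, full range
```

Of course, as the file sizes above show, correct signalling doesn't fix the real problem: PNG is lossless, so there's no small-file option for casual sharing.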

> 2026 is nearly upon us, and Google, Microsoft, and Apple remain steadfast in the refusal to ever allow anyone to share wide-gamut or HDR images.

Huh? Safari seems to render HDR JPEG XL files without any issues these days (e.g. [1]), and supports wide gamut in even more formats, as far as I remember.

[1] https://jpegxl.info/resources/hdr-test-page.html

  • "Share" is the key word in my rant. I know spotty support exists here and there for one format or another.

The problem is that I can't, in general, widely share a HDR image and have it be correctly displayed via ordinary chat applications, social media, email, or what have you. If it works at all, it only works with that One Particular Format in One Specific Scenario.

If you disagree, find me something "trivial", such as a photo-sharing site that supports HDR image uploads where those images are viewable as wide-gamut HDR on mobile devices, desktops, etc... without any endpoint ever displaying the image incorrectly, such as very dark, very bright, or with shifted colors.

> the only way to share these with a wider audience is to convert them to a slideshow and make a Rec.2020 HDR movie that I upload to YouTube

i understand some of this frustration, but really you just have to use ffmpeg to convert it to a web format (which can be done by ffmpeg.js running in a service worker if your cpu is expensive) and spell <img as <video muted autoplay playsinline which is only a little annoying

> I can't politely express my disgust at the ineptitude, the sloth, the foot dragging, the uncaring unprofessionalism of people that get paid more annually then I get in a decade who are all too distracted making Clippy 2.0 instead of getting right the most utterly fundamental aspect of consumer computing.

hear hear

> If I could wave a magic wand, I would force a dev team from each of these companies to remain locked in a room until this was sorted out.

i can think of a few better uses for such a wand...
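The ffmpeg route above might look something like this. The file names (photo.png, out.mp4) are placeholders and the flags are one plausible HDR10 tagging recipe, not the only correct one; exact values depend on your source:

```python
import subprocess

# One plausible ffmpeg invocation for wrapping a single HDR still as a
# short 10-bit HEVC clip tagged as BT.2020 + PQ (HDR10-style).
cmd = [
    "ffmpeg",
    "-loop", "1", "-i", "photo.png",   # loop the still as video input
    "-t", "2",                         # two seconds of output
    "-c:v", "libx265",
    "-pix_fmt", "yuv420p10le",         # 10 bits per channel
    "-color_primaries", "bt2020",
    "-color_trc", "smpte2084",         # PQ transfer function
    "-colorspace", "bt2020nc",
    "-tag:v", "hvc1",                  # helps Apple players pick it up
    "out.mp4",
]
# subprocess.run(cmd, check=True)  # uncomment to actually transcode

# Then embed it where an <img> would have gone:
video_tag = '<video muted autoplay playsinline loop src="out.mp4"></video>'
```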

  • > <img as <video muted autoplay playsinline which is only a little annoying

    Doesn't work for sharing images in text messages, social media posts, email, Teams, Wikipedia, etc...

    > i can think of a few better uses for such a wand...

    We all have our priorities.

The web has supported 16-bit PNGs for decades. That's enough bits for more dynamic range than a human eye with a fixed pupil size.
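The dynamic-range claim checks out with quick arithmetic: 16 bits of linear code values span roughly 16 photographic stops, in the same ballpark as common estimates of the eye's static range (often quoted around 10-14 stops):

```python
import math

# Linear 16-bit code values span a brightest:darkest ratio of 65535:1.
# Each doubling of that ratio is one photographic stop, so:
ratio = (2 ** 16 - 1) / 1
stops = math.log2(ratio)
print(round(stops, 2))  # 16.0
```

(Whether 16 linear bits are *distributed* well for display is another matter; that's why PQ and gamma encodings exist.)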

I wish I could upvote this multiple times. Spot on, the situation is completely batshit bonkers insane.

> It's absolutely bonkers to me that we've all collectively figured out how to stream a Hollywood movie to a pocket device over radio with a quality exceeding that of a typical cinema theatre, but these multi-trillion market cap corporations have all utterly failed to allow users to reliably send a still image with the same quality to each other!

You act like this is some kind of mistake or limit of technology, but really it's an obvious intentional business decision.

Under late stage capitalism, it'd be weird if this wasn't the case in 2026.

Address the underlying issue, or don't be surprised by the race to the bottom.

  • This theory utterly fails Hanlon's razor (or whatever the organizational/societal equivalent is).

On one hand, there have been (and still are!) several competing HDR formats for video (HDR10+, Dolby Vision, "plain" HLG, Dolby Vision in HLG, etc.), and it took years for a winner to pull ahead – that race just started earlier, and the set of stakeholders is different (and arguably a bit smaller) than that for still images.

On the other hand, there are also several still-image HDR formats competing with each other right now (JPEG with gain map metadata, i.e. Ultra HDR and ISO 21496-1, Apple's older custom metadata, HEIF, AVIF, JPEG XL...), and JPEG XL isn't the clear winner yet.

    Format wars are messy, and always have been. Yes, to some extent they are downstream of the lack of a central standardization body, but there's no anti-HDR cabal anywhere. If anything, it's the opposite – new AV formats requiring new hardware is just about the best thing that can happen to device manufacturers.

What are you talking about? You extract 3 exposure values from the raw camera buffer and merge and tone map them manually into a single HDR image. The final exported image format may not have the full supported color space, but that’s on you. Apple uses the P3 space by default.

This has been supported by both Apple and third party apps for over a decade. I’ve implemented it myself.
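The merge-and-tone-map pipeline described above can be sketched in a few lines. The hat weighting and Reinhard curve here are common textbook choices chosen for illustration, not any vendor's actual firmware:

```python
def merge_exposures(pixels_by_exposure, exposure_times):
    """Merge bracketed shots of one scene point into a relative radiance
    value. `pixels_by_exposure` holds linear values in [0, 1]; a hat
    weight de-emphasizes clipped shadows and highlights."""
    num = den = 0.0
    for p, t in zip(pixels_by_exposure, exposure_times):
        w = 1.0 - abs(2.0 * p - 1.0)  # peaks at mid-tones, 0 at 0 and 1
        num += w * (p / t)            # radiance estimate from this shot
        den += w
    return num / den if den else pixels_by_exposure[-1] / exposure_times[-1]

def tone_map(radiance):
    """Simple global Reinhard curve: compress [0, inf) into [0, 1)."""
    return radiance / (1.0 + radiance)

# The same scene point captured at 1/100 s, 1/25 s, and 1/6 s:
radiance = merge_exposures([0.06, 0.24, 0.95], [0.01, 0.04, 0.16])
sdr = tone_map(radiance)
```

This is exactly the sense in which the reply below calls it "pretend HDR": the output is a single SDR-range value, however many exposures went in.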

  • That's not HDR. That's pretend HDR in an SDR file, an artistic effect, nothing more.

    Actual HDR needs at least 10 bits per channel and a modern display with peak brightness far in excess of traditional monitors. Ideally over 1,000 nits compared to typical LCD brightness of about 200.

    You also don't need "three pictures". That was a hack used for the oldest digital cameras that had about 8 bits of precision in their analog to digital converters (ADC). Even my previous camera had a 14-bit ADC and in practice could capture about 12.5 bits of dynamic range, which is plenty for HDR imaging.

    Lightroom can now edit and export images in "true" HDR, basically the same as a modern HDR10 or Dolby Vision movie.

    The problem is that the only way to share the exported HDR images is to convert them to a movie file format, and share them as a slide show.

    There is no widely compatible still image format that can preserve 10-bit-per-channel colours, wide-gamut, and HDR metadata.

    • > Actual HDR needs at least 10 bits per channel and a modern display with peak brightness far in excess of traditional monitors. Ideally over 1,000 nits compared to typical LCD brightness of about 200.

      In the Apple Silicon era, the MacBook Pro has a 1,000 nit display, with peak brightness at 1,600 nits when displaying HDR content.

      Affinity Studio [1] also supports editing and exporting "true" HDR images.

      [1]: https://www.affinity.studio


I never mentioned a file format. These operations are performed on the raw buffer; there is no hack. There is no minimum bit depth for HDR (except maybe 2); that's just silly. High dynamic range images just remap the physical light waves to match human perception, but collecting those waves can be done at any resolution or bit depth.

I wrote camera firmware. I've implemented HDR at both the firmware level and, later, at the higher client level when devices became faster. You're either overloading terminology to the point where we are just talking past each other, or you're very confused.
