
Comment by jiggawatts

3 months ago

That's not HDR. That's pretend HDR in an SDR file, an artistic effect, nothing more.

Actual HDR needs at least 10 bits per channel and a modern display with peak brightness far in excess of what traditional monitors can manage: ideally over 1,000 nits, compared to typical LCD brightness of about 200.
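
For a sense of how the bits and the nits fit together: 10-bit HDR pipelines typically use the SMPTE ST 2084 (PQ) transfer function, which maps absolute luminance up to 10,000 nits onto the 10-bit code range. A rough sketch (the constants are the published ST 2084 values; the function name is just for illustration):

    # Rough PQ (SMPTE ST 2084) inverse-EOTF sketch: absolute nits -> 10-bit code value.
    m1 = 2610 / 16384          # 0.1593017578125
    m2 = 2523 / 4096 * 128     # 78.84375
    c1 = 3424 / 4096           # 0.8359375
    c2 = 2413 / 4096 * 32      # 18.8515625
    c3 = 2392 / 4096 * 32      # 18.6875

    def pq_encode_10bit(nits: float) -> int:
        """Map absolute luminance in nits (0..10,000) to a 10-bit PQ code."""
        y = max(nits, 0.0) / 10000.0
        e = ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2
        return round(e * 1023)

    # ~200-nit SDR-ish white lands a bit past the middle of the code range;
    # everything above it is headroom that an SDR pipeline simply doesn't have.
    for nits in (100, 200, 1000, 1600, 10000):
        print(nits, pq_encode_10bit(nits))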

You also don't need "three pictures". That was a hack used for the oldest digital cameras, which had only about 8 bits of precision in their analog-to-digital converters (ADCs). Even my previous camera had a 14-bit ADC and in practice could capture about 12.5 bits of dynamic range, which is plenty for HDR imaging.
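
Roughly, each bit of usable ADC precision is a stop of dynamic range, minus whatever the read noise eats. Back-of-the-envelope (the noise figure here is a made-up example, not a measurement of any particular sensor):

    import math

    # Dynamic range in stops ~= log2(full-scale signal / noise floor).
    # An ideal N-bit ADC would give N stops; real read noise costs a stop or two.
    adc_bits = 14
    full_scale = 2 ** adc_bits - 1      # highest code the ADC can output
    read_noise_codes = 2.8              # hypothetical noise floor, in ADC codes

    usable_stops = math.log2(full_scale / read_noise_codes)
    print(f"ideal: {adc_bits} stops, usable: {usable_stops:.1f} stops")
    # With ~2.8 codes of noise this lands near 12.5 stops, matching the figure above.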

Lightroom can now edit and export images in "true" HDR, basically the same as a modern HDR10 or Dolby Vision movie.

The problem is that the only way to share the exported HDR images is to convert them to a movie file format, and share them as a slide show.

There is no widely compatible still image format that can preserve 10-bit-per-channel colour, a wide gamut, and HDR metadata.

> Actual HDR needs at least 10 bits per channel and a modern display with peak brightness far in excess of what traditional monitors can manage: ideally over 1,000 nits, compared to typical LCD brightness of about 200.

In the Apple Silicon era, the MacBook Pro has a 1,000-nit display, with peak brightness of 1,600 nits when displaying HDR content.

Affinity Studio [1] also supports editing and exporting "true" HDR images.

[1]: https://www.affinity.studio

  • I have a 4K HDR OLED plugged into my Windows PC that works just fine for editing and viewing my photos.

    I have no way, in general, to share those photos with you, not without knowing ahead of time what software you’re using. I’d also have to whip up a web server with custom HTML and a bunch of hacks to encode my images in a way that works for you but not for my friends with Android phones or Linux PCs.

I never mentioned a file format. These operations are performed on the raw buffer; there is no hack. There is no minimum bit depth for HDR (except for maybe 2); that’s just silly. High dynamic range imaging just remaps physical light levels to match human perception, and that light can be collected at any resolution or bit depth.

I wrote camera firmware. I’ve implemented HDR at both the firmware level and, later, at the higher client level once devices became fast enough. You’re either overloading terminology to the point where we are just talking past each other, or you’re very confused.
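
To make "remap" concrete: a common global tone-mapping curve is the Reinhard operator, and a quick sketch shows why the capture bit depth doesn’t matter (this is an illustration, not what any particular camera’s firmware ships):

    import numpy as np

    # Minimal global tone-mapping sketch (Reinhard): compress a scene-linear HDR
    # buffer into display range. Real pipelines add exposure control, local
    # operators, gamut handling, etc.
    def tone_map_reinhard(linear: np.ndarray) -> np.ndarray:
        """linear: scene-linear values; the source could be 8-, 12- or 14-bit upstream."""
        return linear / (1.0 + linear)

    hdr_buffer = np.array([0.05, 0.5, 1.0, 4.0, 16.0])   # made-up scene-linear samples
    print(tone_map_reinhard(hdr_buffer))                  # everything lands in [0, 1)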

  • What you are talking about is also called HDR, but it has nothing to do with what the other person is talking about. The other person means the still-image equivalent of HDR video formats. When displayed on an HDR-capable monitor, the brightest parts of the image are mapped to the monitor’s extended headroom instead of being tone mapped down for a standard SDR monitor. To be even more clear: it defines brightness levels beyond what is normally 100%.

    • Even when tone mapping HDR in real time, as in a game engine or a raw video feed, you would still be merging two or four multi-sampled tile memory blocks into a single output image. That is not fundamentally different, just a fancier pipeline on modern GPUs. And it’s completely unrelated to the OP’s rant about stupid developers preventing them from sharing their HDR images or whatever.
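
      Concretely, that pass boils down to: resolve (average) the per-pixel samples, then compress the resulting linear value into display range. A toy sketch of the idea (real versions live in shaders, not Python):

          import numpy as np

          # Toy resolve + tone-map pass: average the multi-sampled values per pixel,
          # then squash the scene-linear result into display range with a simple curve.
          def resolve_and_tonemap(samples: np.ndarray) -> np.ndarray:
              """samples: float array of shape (height, width, sample_count), scene-linear."""
              resolved = samples.mean(axis=-1)      # MSAA-style resolve
              return resolved / (1.0 + resolved)    # simple Reinhard-style compression

          # 1x2 "image" with 4 made-up linear samples per pixel:
          samples = np.array([[[0.20, 0.30, 0.25, 0.22], [5.0, 7.5, 6.0, 9.0]]])
          print(resolve_and_tonemap(samples))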
