Comment by fingerlocks

3 months ago

I never mentioned a file format. These operations are performed on the raw buffer; there is no hack. There is no minimum bit depth for HDR (except maybe 2 bits), that's just silly. High dynamic range images just remap the physical light hitting the sensor to match human perception, and collecting that light can be done at any resolution or bit depth.
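To make that concrete, here is a minimal sketch (my own illustration, not any shipping firmware) of mapping a linear, scene-referred radiance buffer into display range and then quantizing it to an arbitrary bit depth. The Reinhard curve and the function name are assumptions for the example; the point is only that the remapping step doesn't care how many bits the output uses.

```python
import numpy as np

def tone_map_reinhard(radiance: np.ndarray, bit_depth: int = 8) -> np.ndarray:
    """Map linear scene radiance (any range) into display range, then
    quantize to an arbitrary bit depth. The remapping itself is
    independent of how many bits the output uses."""
    # Global Reinhard operator: compresses [0, inf) into [0, 1).
    mapped = radiance / (1.0 + radiance)
    # Simple gamma so the output codes are spaced perceptually.
    encoded = mapped ** (1.0 / 2.2)
    levels = (1 << bit_depth) - 1
    return np.round(encoded * levels).astype(np.uint16)

# The same scene buffer can be quantized to 8, 10, or 12 bits.
scene = np.random.exponential(scale=2.0, size=(4, 4)).astype(np.float32)
for bits in (8, 10, 12):
    print(bits, tone_map_reinhard(scene, bits).max())
```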

I wrote camera firmware. I've implemented HDR at both the firmware level and, later, at the higher client level when devices became faster. You're either overloading terminology to the point where we are just talking past each other, or you're very confused.

What you are talking about is also called HDR, but it has nothing to do with what the other person is talking about. The other person means the still-image equivalent of HDR video formats: when displayed on an HDR-capable monitor, the image maps its brightest parts to the monitor's extended headroom instead of being tone mapped down for a standard SDR monitor. So to be even more clear: it defines brightness levels beyond what is normally 100%.
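Roughly what that looks like in code, heavily simplified (the real ISO 21496-1 math has per-channel content boosts, offsets, and a gamma applied to the map, so treat the names and formula here as an illustrative assumption): a gain map stored alongside the SDR base image tells an HDR display how far above SDR reference white each pixel may be pushed, while an SDR display simply ignores it and shows the base image.

```python
import numpy as np

def apply_gain_map(sdr_linear: np.ndarray,
                   gain_map: np.ndarray,
                   headroom_stops: float) -> np.ndarray:
    """Boost a linear SDR base image into HDR display headroom.

    sdr_linear:     base rendition, linear light, 1.0 == SDR reference white
    gain_map:       per-pixel values in [0, 1] shipped alongside the image
    headroom_stops: how many stops above SDR white the display can show
    """
    # Each pixel is scaled by up to 2**headroom_stops; 1.0 in the output
    # is still SDR white, anything above it lands in the headroom.
    return sdr_linear * np.exp2(gain_map * headroom_stops)

sdr = np.array([[0.2, 1.0]])           # mid grey and SDR white
gain = np.array([[0.0, 1.0]])          # the highlight gets the full boost
print(apply_gain_map(sdr, gain, 2.0))  # [[0.2, 4.0]] -> 4x SDR white
```

Dropping the gain map and displaying `sdr_linear` as-is gives exactly the SDR fallback behavior described in the second bullet below.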

  • Even when tone mapping HDR in real time, such as in a game engine or a raw video feed, you are still merging two or four multi-sampled tile memory blocks into a single output image. That is not fundamentally different, just a fancier pipeline on modern GPUs (see the merge sketch after this list). And it's completely unrelated to OP's rant about stupid developers preventing them from sharing their HDR images or whatever.

    • HDR photos taken on iOS or Android devices are displayed as SDR images when opened on Windows. The gain map that they contain (see ISO 21496-1) is ignored. Before the ISO standard it didn’t even work between iOS and Android. This is what OP’s frustration is about.
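For contrast, here is the exposure-merge flavor of HDR the first bullet refers to: several bracketed frames are combined into one linear radiance buffer, with near-clipped and near-black pixels weighted down. This is a minimal sketch under my own naming and a simple hat weighting, not any particular firmware's or GPU's pipeline.

```python
import numpy as np

def merge_exposures(frames, exposure_times):
    """Merge bracketed exposures into one linear radiance buffer.

    frames:         list of linear images, values in [0, 1]
    exposure_times: relative shutter times for each frame
    """
    acc = np.zeros_like(frames[0], dtype=np.float64)
    weight_sum = np.zeros_like(acc)
    for img, t in zip(frames, exposure_times):
        # Trust mid-tones most; near-black and near-clipped pixels get
        # little weight (a simple hat function).
        w = 1.0 - np.abs(2.0 * img - 1.0)
        acc += w * (img / t)          # scale back to scene radiance
        weight_sum += w
    return acc / np.maximum(weight_sum, 1e-6)

short = np.array([[0.05, 0.4]])
long_ = np.array([[0.4, 0.99]])        # highlight nearly clipped here
print(merge_exposures([short, long_], [1.0, 8.0]))
```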