Comment by the__alchemist

Not in the more general sense! It can refer to what its acronym spells out directly: a bigger range between the dimmest and brightest capabilities of a display, imaging technique, etc.

No. HDR can encode high dynamic range because (typically) it uses floating point encoding.

From a technical point of view, HDR is just a set of standards and formats for encoding absolute-luminance scene-referred images and video, along with a set of standards for reproduction.

  • No. HDR video and images don't use floating point encoding. They generally use a higher bit depth (10 or more bits vs 8) to reduce banding, plus different transfer characteristics (PQ or HLG vs sRGB or BT.709), different YCbCr matrices, and mastering metadata.

    And no, it's not necessarily absolute luminance. PQ is absolute, HLG is not.
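
    For a concrete sense of the difference, here is a minimal sketch of the two transfer functions in Python (constants are from ST 2084 and BT.2100; the sample code values are my own illustration, not from any codec):

      import math

      # SMPTE ST 2084 (PQ) EOTF: maps a normalized code value in [0, 1]
      # to absolute luminance in cd/m^2 (nits). Constants from the spec.
      M1, M2 = 2610 / 16384, 2523 / 4096 * 128
      C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

      def pq_eotf(code):
          """Absolute luminance in nits for a PQ code value in [0, 1]."""
          p = code ** (1 / M2)
          return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

      # BT.2100 HLG OETF: maps relative scene light in [0, 1] to a code
      # value in [0, 1]. No absolute luminance appears anywhere; the
      # display picks the peak, which is why HLG is relative.
      A = 0.17883277
      B = 1 - 4 * A
      C = 0.5 - A * math.log(4 * A)

      def hlg_oetf(scene):
          """HLG code value in [0, 1] for relative scene light in [0, 1]."""
          if scene <= 1 / 12:
              return math.sqrt(3 * scene)
          return A * math.log(12 * scene - B) + C

      # What actually gets stored is a 10-bit (or 12-bit) integer code
      # word, not a float:
      code10 = 769                   # a code word from a 10-bit stream
      print(pq_eotf(code10 / 1023))  # ~998 nits, absolute
      print(hlg_oetf(0.5))           # ~0.87, relative to display peak

    The floats above only appear in the math; the bitstream itself carries plain integers.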

    • Isn’t HLG using floating point(s)?

      Also, DCI-P3 should fit in here somewhere, as it seems to be the most widely standardized color space for HDR. I would share more insight if I had it. I thought I understood color profiles well, but I have run into challenges when trying to display in one, edit in another, and print “correctly”. And every device seems to treat color profiles a little differently.

  • I think most HDR formats do not typically use 32-bit floating point. The first HDR file format I can remember is Greg Ward’s RGBE format, which is now more commonly known as .HDR and, I think, is pretty widely used.

    https://www.graphics.cornell.edu/~bjw/rgbe.html

    It uses a type of floating point, in a way, but it’s a shared 8-bit exponent across all three channels, and the channels are still 8 bits each, so the whole thing fits in 32 bits. Even the .txt file description says it’s not “floating point” per se, since that implies IEEE single-precision floats.
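
    As a toy illustration of the shared-exponent trick (the decode logic follows the rgbe.txt description; the names and the sample pixel are mine):

      import math

      def rgbe_to_float(r, g, b, e):
          """Decode one 4-byte RGBE pixel into linear RGB floats.

          The 8-bit exponent e is shared by all three channels; each
          channel byte acts as a mantissa. e == 0 encodes pure black.
          """
          if e == 0:
              return (0.0, 0.0, 0.0)
          # ldexp(1, e - 136) == 2**(e - 128) / 256, matching Ward's rgbe.h
          scale = math.ldexp(1.0, e - (128 + 8))
          return (r * scale, g * scale, b * scale)

      # A pixel 1.5x brighter than nominal white: mantissas at 192,
      # exponent one step above the bias of 128.
      print(rgbe_to_float(192, 192, 192, 129))  # (1.5, 1.5, 1.5)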

    Cameras and displays don’t typically use floats, and even CG people working in HDR and using, e.g., OpenEXR, might use half floats more often than float.
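
    For a rough sense of why half floats suffice, Python’s struct module can round-trip IEEE 754 binary16 (the sample values are my own illustration):

      import struct

      def through_half(x):
          """Round-trip a value through binary16 (struct format 'e')."""
          return struct.unpack('<e', struct.pack('<e', x))[0]

      # 16 bits buy a huge range (largest finite value is 65504) at the
      # cost of coarse steps up high: near 1000 the step size is 0.5.
      print(through_half(1000.0))   # 1000.0, exactly representable
      print(through_half(1000.2))   # 1000.0, quantized
      print(through_half(65504.0))  # 65504.0, the binary16 max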

    Some standards do exist, and the situation is improving over time, but the ideas and execution of HDR in various forms preceded any standards, so I think it’s not helpful to define HDR as a set of standards. From my perspective working in CG, HDR began as a way to break away from 8-bits-per-channel RGB; it included improving both color range and color resolution, and it started the discussion of using physical metrics as opposed to relative [0..1] ranges.