Comment by dahart

2 days ago

It seems like a mistake to lump HDR capture, HDR formats, and HDR display together; these are very different things. The claim that Ansel Adams used HDR is super likely to cause confusion, and isn’t particularly accurate.

We’ve had HDR formats and HDR capture and edit workflows since long before HDR displays. The big benefit of HDR capture & formats is that your “negative” doesn’t clip super bright colors and doesn’t lose color resolution in super dark colors. As a photographer, with HDR you can re-expose the image when you display/print it, where previously that wasn’t possible. Previously when you took a photo, if you over-exposed it or under-exposed it, you were stuck with what you got. Capturing HDR gives the photographer one extra degree of freedom, allowing them to adjust exposure after the fact. Ansel Adams wasn’t using HDR in the same sense we’re talking about; he was just really good at capturing the right exposure for his medium without needing to adjust it later. There is a very valid argument to be made for doing the work up-front to capture what you’re after, but ignoring that for a moment, it is simply not possible to re-expose Adams’ negatives to reveal color detail he didn’t capture. That’s why he’s not using HDR, and why saying he is will only further muddy the water.
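
To make the deferred-exposure point concrete, here’s a minimal sketch (my own toy example, not any particular tool’s API) of what “re-exposing” scene-referred data amounts to, assuming the capture is stored as unclipped linear floats:

    import numpy as np

    def expose(linear, ev):
        # linear: scene-referred, unclipped float values (an HDR "negative").
        # Shifting exposure after capture is just a multiply in linear light;
        # with a clipped 8-bit positive, blown highlights are already gone.
        return linear * (2.0 ** ev)

    scene = np.array([0.02, 0.5, 9.7])  # hypothetical luminances
    print(expose(scene, -2.0))          # pull back two stops, detail intact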

Arguably, even considering HDR a distinct thing is itself weird and inaccurate.

All mediums have a range, and they've never all matched. Sometimes we've tried to calibrate things to match, but anyone watching SDR content for the past many years probably didn't do so on a color-calibrated and brightness-calibrated screen - that wouldn't allow you to have a brightness slider.

HDR on monitors is about communicating content brightness and monitor capabilities, but then you have the question of whether to clip the highlights or just map the range when the content is mastered for 4000 nits but your monitor manages 1000-1500 nits, and only in a small window.
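
As a rough sketch of that choice (toy curves of my own, not any standard's):

    import numpy as np

    def clip(nits, display_peak=1000.0):
        # Hard clip: everything mastered above the display's peak is lost.
        return np.minimum(nits, display_peak)

    def rolloff(nits, display_peak=1000.0, content_peak=4000.0):
        # Map the range instead: linear below a knee, then smoothly
        # compress the remaining highlights into what's left.
        knee = 0.75 * display_peak
        t = np.clip((nits - knee) / (content_peak - knee), 0.0, 1.0)
        return np.where(nits <= knee, nits,
                        knee + (display_peak - knee) * t * (2.0 - t))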

  • This! Yes I think you’re absolutely right. The term “HDR” is in part kind of an artifact of how digital image formats evolved, and it kind of only makes sense relative to a time when the most popular image formats and most common displays were not very sophisticated about colors.

    That said, there is one important part that is often lost. One of the ideas behind HDR, sometimes, is to capture absolute values in physical units, rather than relative brightness. This is the distinguishing factor that film and paper and TVs don’t have. Some new displays are getting absolute brightness features, but historically most media display relative color values.
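
    The PQ curve (SMPTE ST 2084) is the clearest example: a code value decodes to an absolute luminance in nits, the same on every compliant display. A sketch of its EOTF, with constants from the standard:

        import numpy as np

        # SMPTE ST 2084 (PQ) EOTF constants.
        M1, M2 = 2610 / 16384, 2523 / 4096 * 128
        C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

        def pq_eotf(code):
            # Maps a [0,1] PQ code value to absolute luminance in cd/m^2 (nits).
            p = np.asarray(code, dtype=np.float64) ** (1.0 / M2)
            return 10000.0 * (np.maximum(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

        print(pq_eotf(1.0))  # 10000.0 - code value 1.0 means 10,000 nits, everywhere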

    • "Absolute" is also a funny notion. From the perspective of human visual perception, an absolute brightness only matters if the entire viewing environment is also controlled to the same absolute values. Visual perception is highly contextual, and we are not only seeing the screen.

      It's not fun being unable to watch dark scenes during the day or evening in a living room, nor is vaporizing your retinas because the ambient environment went dark in the meantime. People want a good viewing experience in the environment they actually have, one that is perceptually similar to what the content intended - but that is not always the same as reproducing the exact photons the director's mastering monitor sent towards their eyeballs at the time of production.

  • The term "HDR" arguably makes more sense for the effect achieved by tone mapping multiple exposures of the same subject onto a "normal" (e.g. SRGB) display. In this case, the "high" in "HDR" just means "from a source with higher dynamic range than the display."
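
    In code terms, a minimal version of that pipeline might look like this (a hypothetical hat-weighted merge, not Debevec's exact method): bracketed shots are combined into one radiance estimate with more range than any single exposure, then squashed back onto the display's range:

        import numpy as np

        def merge_exposures(images, times):
            # images: linearized float arrays in [0,1], one per bracketed shot.
            # Mid-tones are trusted most; clipped or noisy pixels get low weight.
            num = np.zeros_like(images[0])
            den = np.zeros_like(images[0])
            for img, t in zip(images, times):
                w = np.exp(-4.0 * (img - 0.5) ** 2)
                num += w * img / t          # per-shot radiance estimate
                den += w
            return num / np.maximum(den, 1e-8)

        def tonemap(radiance):
            # Global curve back down to a "normal" (e.g. sRGB) display range.
            return radiance / (1.0 + radiance)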

    • Remember "wide gamut" screens?

      This is part of 'HDR' standards too...

      And it's quite annoying that 'HDR' (and which specific one?) is treated as just being 'on' or 'off' even for power users...

  • > but your monitor manages 1000-1500 and only in a small window.

    Owning a display that can do 1300+ nits sustained across a 100% window has been the biggest display upgrade I think I have ever had. It's given me a tolerance for LCD, a technology I've hated since the death of CRTs, and it's turned me away from OLED.

    There was a time I would have said I'd never own a non-OLED display again. But a capable HDR display changed that logic in a big way.

    Too bad the motion resolution on it, especially compared to OLED, is meh. Again, at one point motion was the most important aspect to me (it's why I still own CRTs), but this level of HDR is... transformative, for lack of a better word.

    • Motion resolution? Do you mean the pixel response time?

      CRTs technically have quite a few artifacts in this area, but since content displayed on CRTs tends to be built for CRTs this is less of an issue, and in many cases even required. The input expects specific distortions and effects from scanlines and phosphor, which a "perfect" display wouldn't exhibit...

      The aggressive OLED ABL (automatic brightness limiting) is simply a thermal issue. It can be mitigated with thermal design in smaller devices, and anything that increases efficiency (be it micro lens arrays, stacked "tandem" panels, quantum dots, alternative emitter technology) will lower the thermal load and increase the max full-panel brightness.

      (LCD with zone dimming would also be able to pull this trick to get even brighter zones, but because the base brightness is high enough it doesn't bother.)

Adams adjusted heavily with dodging and burning, even working to invent a new chemical process to provide more control when developing. He was great at determining exposure for his process as well. A key skill was having a vision for what the image would be after adjusting. Adams talked a lot about this as a top priority of his process.

  • > It's even more incredible that this was done on paper, which has even less dynamic range than computer screens!

    I came here to point this out. You have a pretty high dynamic range in the captured medium, and then you can use the tools you have to darken or lighten portions of the photograph when transferring it to paper.

    • Indeed so. Printing on paper and other substrates is inherently subtractive in nature which limits the gamut of colors and values that can be reproduced. Digital methods make the job of translating additive to subtractive media easier vs. the analog techniques available to film photographers. In any case, the image quality classic photography was able to achieve is truly remarkable.

      Notably, the dodging and burning used by photographers aren't obsolete. There's a reason these tools are included in virtually every image-editing program out there. Manipulating dynamic range, particularly in printed images, remains part of the craft of image-making.
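
      In digital terms, dodging and burning amount to a locally varying exposure shift; a toy sketch (mine, not any particular program's tool):

          import numpy as np

          def dodge_burn(linear, mask, ev):
              # linear: linear-light image; mask in [0,1] selects the region.
              # ev > 0 lightens (dodge), ev < 0 darkens (burn), feathered by mask.
              return linear * (2.0 ** (ev * mask))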

> The claim that Ansel Adams used HDR is super likely to cause confusion

That isn't what the article claims. It says:

"Ansel Adams, one of the most revered photographers of the 20th century, was a master at capturing dramatic, high dynamic range scenes."

"Use HDR" (your term) is vague to the point of not meaning much of anything, but the article is clear that Adams was capturing scenes that had a high dynamic range, which is objectively true.

  • I think about the Ansel Adams zone system

    https://www.kimhildebrand.com/how-to-use-the-zone-system/

    where my interpretation is colored by the experience of making high quality prints and viewing them under different conditions, particularly poor illumination quality - but you could also count "small handheld game console" or "halftone screened and printed on newsprint" as other degraded conditions. In those cases you might imagine that the eye can only differentiate between 11 tones, so even if an image has finer detail, it ought to connect well with people if colors were quantized. (I think about concept art from Pokémon Sun and Moon, which looked great printed with a thermal printer because it was designed to look great on a cheap screen.)

    In my mind, the ideal image would look good quantized to 11 zones but also has interesting detail in texture in 9 of the zones (extreme white and black don't show texture). That's a bit of an oversimplification (maybe a shot outdoors in the snow is going to trend really bright, maybe for artistic reasons you want things to be really dark, ...) but Ansel Adams manually "tone mapped" his images using dodging, burning and similar techniques to make it so.
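
    A quick way to preview that property, as a sketch (posterizing display-referred tones to 11 evenly spaced levels - a crude stand-in for the real zone system, which works in one-stop increments):

        import numpy as np

        def to_zones(gray, n_zones=11):
            # gray: display-referred values in [0,1]. Posterize to n_zones
            # tones to check how the image "reads" at zone-level resolution.
            z = np.round(gray * (n_zones - 1))
            return z / (n_zones - 1)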

  • Literally the sentence preceding the one you quoted is “What if I told you that analog photographers captured HDR as far back as 1857?”.

> It seems like a mistake to lump HDR capture, HDR formats and HDR display together, these are very different things.

These are all related things. When you talk about color, you can be talking about color cameras, color image formats, and color screens, but the concept of color transcends the implementation.

> The claim that Ansel Adams used HDR is super likely to cause confusion, and isn’t particularly accurate.

The post never said Adams used HDR. I very carefully chose the words, "capturing dramatic, high dynamic range scenes."

> Previously when you took a photo, if you over-exposed it or under-exposed it, you were stuck with what you got. Capturing HDR gives the photographer one degree of extra freedom, allowing them to adjust exposure after the fact.

This is just factually wrong. Film negatives have 12 stops of useful dynamic range, while photo paper has 8 stops at best. That gave photographers exposure latitude during the print process.
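
Spelled out, since stops are log2 units:

    film_stops, paper_stops = 12, 8
    print(2 ** film_stops)           # 4096:1 contrast the negative can hold
    print(2 ** paper_stops)          # 256:1 contrast the print can show
    print(film_stops - paper_stops)  # 4 stops of slack for re-placing exposure in the print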

> Ansel Adams wasn’t using HDR in the same sense we’re talking about, he was just really good at capturing the right exposure for his medium without needing to adjust it later.

There's a photo of Ansel Adams in the article, dodging and burning a print. How would you describe that if not adjusting the exposure?

  • > Film negatives have 12-stops of useful dynamic range

    No, that’s not inherently true. AA used 12 zones, but that doesn’t mean every negative stock has 12 stops of latitude. Stocks are different; you need to look at the curves.

    But yes, most modern negatives are very forgiving. FP4, for example, has barely any shoulder at all, iirc.

  • I agree capture, format and display are closely related. But HDR capture and processing developed independently of, and long before, HDR display devices, and the use of HDR displays changes how HDR images are used compared to LDR displays.

    > The post never said Adams used HDR. I very carefully chose the words

    Hey, I’m sorry for criticizing, but I honestly feel like you’re being slightly misleading here. The sentence “What if I told you that analog photographers captured HDR as far back as 1857?” is explicitly claiming that analog photographers used “HDR” capture, and the Ansel Adams sentence that follows appears to be merely a specific example of that claim. The result of the juxtaposition is that the article did in fact claim Adams used HDR, even if you didn’t quite intend to.

    I think you’re either misunderstanding me a little, or maybe unaware of some of the context of HDR and its development as a term of art in the computer graphics community. Film’s 12 stops is not really “high” range by HDR standards, and a little exposure latitude isn’t where “HDR” came from. The more important part of HDR was the intent to push toward absolute physical units like luminance. That doesn’t just enable deferred exposure, it enables physical and perceptual processing in ways that aren’t possible with film. It enables calibrated integration with CG simulation that isn’t possible with film. And it enables a much wider range of exposure push/pull than you can do when going from 12 stops to 8. And of course non-destructive digital deferred exposure at display time is quite different from a print exposure.

    Perhaps it’s useful to reflect on the fact that HDR has a counterpart called LDR that’s referring to 8 bits/channel RGB. With analog photography, there is no LDR, thus zero reason to invent the notion of a ‘higher’ range. Higher than what? High relative to what? Analog cameras have exposure control and thus can capture any range you want. There is no ‘high’ range in analog photos, there’s just range. HDR was invented to push against and evolve beyond the de-facto digital practices of the 70s-90s, it is not a statement about what range can be captured by a camera.

    • > The sentence “What if I told you that analog photographers captured HDR as far back as 1857?” is explicitly claiming that analog photographers use “HDR” capture,

      No, it isn't. It's saying they captured HDR scenes.

      > The result of the juxtaposition is that the article did in fact claim Adams used HDR

      You can't "use" HDR. It's an adjective, not a noun.

      > Film’s 12 stops is not really “high” range by HDR standards, and a little exposure latitude isn’t where “HDR” came from.

      The Reinhard tone mapper, a benchmark that regularly appears in research papers, specifically cites Ansel Adams as inspiration.

      "A classic photographic task is the mapping of the potentially high dynamic range of real world luminances to the low dynamic range of the photographic print."

      https://www-old.cs.utah.edu/docs/techreports/2002/pdf/UUCS-0...
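
      For reference, the global operator from that paper, sketched in a few lines (the key value plays much the same role as placing middle grey in the zone system):

          import numpy as np

          def reinhard(lum, key=0.18, l_white=None):
              # Global operator from Reinhard et al. 2002. lum: linear luminance.
              l_avg = np.exp(np.mean(np.log(lum + 1e-6)))  # log-average luminance
              l = (key / l_avg) * lum                      # the "exposure" step
              if l_white is None:
                  l_white = l.max()                        # burn out only the peak
              return l * (1.0 + l / l_white ** 2) / (1.0 + l)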

      > Perhaps it’s useful to reflect on the fact that HDR has a counterpart called LDR that’s referring to 8 bits/channel RGB.

      8 bits per channel does not describe dynamic range. If I attach an HLG transfer function to an 8-bit signal, I have HDR. Furthermore, assuming you actually meant 8-bit sRGB, nobody calls that "LDR." It's SDR.
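
      To illustrate, here's the HLG OETF from ITU-R BT.2100 (constants from the spec); the dynamic range lives in the curve, not in the bit depth:

          import numpy as np

          A, B, C = 0.17883277, 0.28466892, 0.55991073  # BT.2100 HLG constants

          def hlg_oetf(e):
              # Scene-linear light in [0,1] -> signal in [0,1]; quantizing the
              # result to 8 bits still carries HDR by this definition.
              e = np.asarray(e, dtype=np.float64)
              hi = A * np.log(np.maximum(12.0 * e - B, 1e-12)) + C
              return np.where(e <= 1.0 / 12.0, np.sqrt(3.0 * e), hi)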

      > Analog cameras have exposure control and thus can capture any range you want.

      This sentence makes no sense.

But the article even shows Adams dodging/burning a print, which is 'adjusting the exposure' in a localised fashion, exploiting the high dynamic range of the film to reveal detail in the LDR print that otherwise wouldn't have been visible.

If I look at one of the photography books on my shelf, they even talk about 18 stops and such for some film material, how this doesn't translate to paper, all the things that can be done to render it visible in print, and how things behave at both extreme ends (towards black and white). Read: tone mapping (i.e. trimming down a high-DR image to a lower-DR output medium) is really old.

The good thing about digital is that it can deal with color at decent tonal resolutions (if we assume 16 bits, not the limited 14 bits or even less) and in environments where film has technical limitations.

No, Adams, like everyone who develops their own film (or RAW digital photos), definitely worked in HDR. Film has much more DR than photographic paper, as noted by the TFA author (and large digital sensors have more than either SDR or HDR displays), especially if you're such a master of exposure as Adams; preserving the tonalities when developing and printing your photos is the real big issue.

Is there a difference in capturing in HDR vs RAW?

  • Good question. I think it depends. They are kind of different concepts, but in practice they can overlap considerably. RAW is about using the camera’s full native color resolution, and not having lossy compression. HDR is overloaded, as you can see from the article & comments, but I think HDR capture is conceptually about expressing brightness in physical units like luminance or radiance, and delaying the ‘exposure’ until display time. Both RAW and HDR typically mean using more than 8 bits/channel and capturing high quality images that will withstand more post-processing than ‘exposed’ LDR images can handle.