Comment by arghwhat

2 days ago

Arguably, even considering HDR a distinct thing is itself weird and inaccurate.

All media have a range, and they've never all matched. Sometimes we've tried to calibrate things to match, but anyone watching SDR content over the past many years probably didn't do so on a color- and brightness-calibrated screen - that wouldn't allow you to have a brightness slider.

HDR on monitors is about communicating content brightness and monitor capabilities, but then you have the question of whether to clip the highlights or just map the range when the content is mastered for 4000 nits but your monitor manages 1000-1500 and only in a small window.
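
A toy sketch of that choice (numbers made up for illustration, not any particular standard's tone-mapping operator): either hard-clip everything above the panel's peak, or roll the mastered range off into the panel's remaining headroom.

    import numpy as np

    DISPLAY_PEAK = 1000.0   # what the panel can actually sustain, in nits
    CONTENT_PEAK = 4000.0   # what the content was mastered to, in nits

    def clip(nits):
        # Option 1: hard clip - everything above the panel peak flattens out.
        return np.minimum(nits, DISPLAY_PEAK)

    def rolloff(nits, knee=0.75):
        # Option 2: track the content 1:1 up to a knee, then compress the
        # rest of the mastered range into the remaining headroom.
        knee_nits = knee * DISPLAY_PEAK
        t = np.clip((nits - knee_nits) / (CONTENT_PEAK - knee_nits), 0.0, 1.0)
        compressed = knee_nits + (DISPLAY_PEAK - knee_nits) * (1.0 - (1.0 - t) ** 2)
        return np.where(nits <= knee_nits, nits, compressed)

    scene = np.array([100.0, 800.0, 2000.0, 4000.0])
    print(clip(scene))     # [ 100.  800. 1000. 1000.] - the 2000 and 4000 nit highlights merge
    print(rolloff(scene))  # [ 100.  ~758.  ~905. 1000.] - highlights dimmed but still distinct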

This! Yes I think you’re absolutely right. The term “HDR” is in part an artifact of how digital image formats evolved, and it really only makes sense relative to a time when the most popular image formats and most common displays were not very sophisticated about color.

That said, there is one important part that is often lost. One of the ideas behind HDR is to capture absolute values in physical units, rather than relative brightness. This is the distinguishing factor that film and paper and TVs don’t have. Some new displays are getting absolute-brightness features, but historically most media have displayed relative color values.
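
For what it’s worth, that is what the PQ transfer function (SMPTE ST 2084, used by HDR10-style formats) encodes: a code value is defined as an absolute luminance in nits, rather than a fraction of whatever the display’s maximum happens to be. A rough sketch of the PQ EOTF (constants from the spec, the code itself just illustrative):

    # PQ (SMPTE ST 2084) EOTF: normalized code value in [0, 1] -> absolute luminance in nits.
    M1 = 2610 / 16384
    M2 = 2523 / 4096 * 128
    C1 = 3424 / 4096
    C2 = 2413 / 4096 * 32
    C3 = 2392 / 4096 * 32

    def pq_to_nits(code_value: float) -> float:
        e = code_value ** (1.0 / M2)
        return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1.0 / M1)

    # The same code value is supposed to mean the same luminance on any compliant display
    # (e.g. 0.75 comes out around 1000 nits, 1.0 is the full 10,000-nit ceiling).
    for v in (0.25, 0.5, 0.75, 1.0):
        print(f"code {v:.2f} -> {pq_to_nits(v):8.1f} nits")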

  • Absolute is also a funny one. From the perspective of human visual perception, an absolute brightness only matters if the entire viewing environment is also controlled to the same absolute values. Visual perception is highly contextual, and we are not only seeing the screen.

    It's not fun being unable to watch dark scenes during the day or evening in a living room, nor is having your retinas vaporized if the ambient environment went dark in the meantime. People want a good viewing experience in the available environment, one that is logically similar to what the content intended - but that is not always the same as reproducing the exact same photons that the director's mastering monitor sent towards their eyeballs at the time of production.

    • Indeed. For a movie scene depicting the sky including the Sun, you probably wouldn't want your TV to achieve the same brightness as the Sun. You might want your TV to become significantly brighter than it is in the rest of the scenes, to achieve an effect something like the Sun catching your eye.

      Of course, the same thing goes for audio in movies. You probably want a gunshot or explosion to sound loud and even be slightly shocking, but you probably don't want it to be as loud as a real gunshot or explosion would be from the depicted distance.

      The difference is that for 3+ decades, ubiquitous audio formats (like 16-bit PCM on audio CDs and DVDs) have provided far more dynamic range than is comfortably usable in normal listening environments. So we're very familiar with audio being mastered with a much smaller dynamic range than the medium supports.
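
      The back-of-the-envelope numbers (rough figures, just to make the point): 16-bit linear PCM spans roughly 96 dB between the quietest and loudest representable levels, which is far more than the gap between a living room's noise floor and any level you'd want peaks to reach.

          import math

          # Theoretical dynamic range of n-bit linear PCM: 20 * log10(2**n) dB.
          bits = 16
          dynamic_range_db = 20 * math.log10(2 ** bits)
          print(f"{bits}-bit PCM: ~{dynamic_range_db:.1f} dB")  # ~96.3 dB

          # Rough comparison: a quiet living room sits around a 30-40 dB SPL noise floor,
          # so using the full range would push peaks well past the threshold of pain.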

    • Yep, absolutely! ;)

      This brings up a bunch of good points, and it tracks with what I was trying to say about conflating HDR processing with HDR display. But keep in mind that even when you have absolute-value images, that doesn’t imply anything about how you display them. You can get large benefits from an HDR workflow even when your output or display is low dynamic range. Assume that some tone-mapping step will happen, and that the way you map tones depends on the display medium and its capabilities, and on the context and environment of the viewing. Using the term “HDR” shouldn’t imply any mismatch or disconnect in the viewing environment; it only did so in the article because the article wasn’t very careful about its terms and definitions.

The term "HDR" arguably makes more sense for the effect achieved by tone mapping multiple exposures of the same subject onto a "normal" (e.g. SRGB) display. In this case, the "high" in "HDR" just means "from a source with higher dynamic range than the display."

  • Remember "wide gamut" screens ?

    This is part of 'HDR' standards too...

    And it's quite annoying that 'HDR' (and which specific one ?) is treated as just being 'on' or 'off' even for power users...

> but your monitor manages 1000-1500 and only in a small window.

Owning a display that can do 1300+ nits sustained across a 100% window has been the biggest display upgrade I think I have ever had. It's given me a tolerance for LCD, a technology I've hated since the death of CRTs, and it's turned me away from OLED.

There was a time I would have said I'd never own a non-OLED display again. But a capable HDR display changed that logic in a big way.

Too bad the motion resolution on it, especially compared to OLED, is meh. Again, at one point motion was the most important aspect to me (it's why I still own CRTs), but this level of HDR is... transformative, for lack of a better word.

  • Motion resolution? Do you mean the pixel response time?

    CRTs technically have quite a few artifacts in this area, but as content displayed on CRTs tends to be built for CRTs, this is less of an issue - and in many cases the artifacts are even required. The input is expecting specific distortions and effects from scanlines and phosphor, which a "perfect" display wouldn't exhibit...

    The aggressive OLED ABL is simply a thermal issue. It can be mitigated with thermal design in smaller devices, and anything that increases efficiency (be it micro lens arrays, stacked "tandem" panels, quantum dots, alternative emitter technology) will lower the thermal load and increase the max full panel brightness.

    (LCD with zone dimming would also be able to pull this trick to get even brighter zones, but because the base brightness is high enough it doesn't bother.)

    • > Motion resolution? Do you mean the pixel response time?

      I did indeed mean motion resolution, which pixel response time only partially affects. It’s about how clearly a display shows motion, whereas static resolution really only reflects a still image. Even with fast pixels, sample-and-hold displays blur motion unless the framerate and refresh rate are high, or BFI/strobing is used. That blur immediately lowers perceived resolution the moment anything moves on screen.
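
      The usual back-of-the-envelope estimate for that blur (rough numbers, not a measurement of any specific panel): perceived smear while eye-tracking is roughly the on-screen speed multiplied by how long each frame is held.

          # Eye-tracking blur on a sample-and-hold display, roughly:
          #   blur (pixels) ~= speed (pixels/second) * persistence (seconds per frame)
          def blur_px(speed_px_per_s, refresh_hz, persistence_fraction=1.0):
              # persistence_fraction < 1.0 models BFI/strobing (image lit only part of the frame).
              return speed_px_per_s * persistence_fraction / refresh_hz

          speed = 1920.0                     # a pan crossing a 1080p screen in one second
          print(blur_px(speed, 60))          # ~32 px of smear at 60 Hz sample-and-hold
          print(blur_px(speed, 240))         # ~8 px at 240 Hz
          print(blur_px(speed, 60, 0.25))    # ~8 px at 60 Hz with 25% persistence BFI/strobing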

      > The input is expecting specific distortions and effects from scanlines and phosphor, which a "perfect" display wouldn't exhibit...

      That's true for many CRT purists, but is not a huge deal for me personally. My focus is motion performance. If LCD/OLED matched CRT motion at the same refresh rate, I’d drop CRT in a heartbeat, slap on a CRT shader, and call it a day. Heresy to many CRT enthusiasts.

      Ironically, this is an area in which I feel we are getting CLOSE enough with the new higher-refresh OLEDs for non-HDR retro content, in combination with https://blurbusters.com/crt-simulation-in-a-gpu-shader-looks... (which hopefully will continue to be improved).

      > The aggressive OLED ABL is simply a thermal issue.

      Theoretically, yes, and there’s been progress, but it’s still unsolved in practice. If someone shipped an OLED twice as thick and full of fans and heatsinks, I’d buy it tomorrow. But that’s not what the market wants, so obviously it's not what they make.

      > It can be mitigated with thermal design in smaller devices, and anything that increases efficiency (be it micro lens arrays, stacked "tandem" panels, quantum dots, alternative emitter technology) will lower the thermal load and increase the max full panel brightness.

      Sure, in theory. But so far the improvements (like QD-OLED or MLA) haven’t gone far enough; I already own panels using these. Beyond that, much of the tech isn’t in the display types I care about, or isn’t ready yet. Which is a pity, because the tandem-based displays I have seen in use are really decent.

      That said, the latest G5 WOLEDs are the first I’d call acceptable for HDR at high APL for my preferences, with very decent real-scene brightness, at least in film. Sadly, I doubt we’ll see comparable performance in PC monitors until many years down the track, and monitors are my preference.

  • Hello fellow CRT owner. What is your use case? Retro video games? PC games? Movies?

    • Hello indeed!

      > What is your use case? Retro video games? PC games? Movies?

      All of the above! Most of my interest stems from the fact that, for whatever reason, I am INCREDIBLY sensitive to sample-and-hold motion blur. Whilst I tolerate it for modern gaming because I largely have no choice, CRTs mean I don't have to for my retro gaming, which I very much enjoy. (I was very poor growing up, so most of it for me is not even nostalgia; most of these games are new to me.)

      Outside of that, we have a "retro" corner in our home with a 32" Trinitron. I collect LaserDisc/VHS, and we have "retro video" nights where, for whatever reason, we watch the worst-possible-quality copies of movies we could otherwise get in significantly higher definition. Much like with video games, I was not exposed to a lot of media growing up, and my wife has also not seen many things because she was in Russia back then, so there is a ton for us to catch up on very slowly, and it makes for a fun little date night every now and again.

      Sadly though, as I get ready to take on a mortgage, it's likely most of my CRTs will be sold, or at least the broadcast monitors. I do not look forward to it, haha.
