Comment by theshackleford

2 days ago

> but your monitor manages 1000-1500 and only in a small window.

Owning a display that can do 1300+ nits sustained across a 100% window has been the biggest display upgrade I think I have ever had. It's given me a tolerance for LCD, a technology I've hated since the death of CRTs, and it's turned me away from OLED.

There was a time I would have said I'd never own a non-OLED display again. But a capable HDR display changed that logic in a big way.

Too bad the motion resolution on it, especially compared to OLED, is meh. Again, at one point motion was the most important aspect to me (it's why I still own CRTs), but this level of HDR... transformative, for lack of a better word.

Motion resolution? Do you mean the pixel response time?

CRTs technically have quite a few artifacts in this area, but since content displayed on CRTs tends to be built for CRTs, this is less of an issue, and in many cases it's even required. The input is expecting specific distortions and effects from scanlines and phosphor, which a "perfect" display wouldn't exhibit...

The aggressive OLED ABL is simply a thermal issue. It can be mitigated with thermal design in smaller devices, and anything that increases efficiency (be it micro lens arrays, stacked "tandem" panels, quantum dots, or alternative emitter technology) will lower the thermal load and increase the max full-panel brightness.

(LCD with zone dimming would also be able to pull this trick to get even brighter zones, but because its base brightness is already high enough, it doesn't bother.)
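
To make the thermal budget point concrete, here's a back-of-envelope sketch. All the numbers are hypothetical and chosen only to illustrate the scaling, and `max_sustained_nits` is just an illustrative name: full-window brightness is capped by what the panel can dissipate, small windows run into the per-emitter drive limit instead, and any efficiency gain lifts the full-panel ceiling proportionally.

```python
# Toy model of OLED ABL as a power/thermal budget problem.
# All numbers are hypothetical and only illustrate the scaling.

PANEL_POWER_BUDGET_W = 120.0  # what the panel/heatsink can dissipate (assumed)
WATTS_PER_NIT_FULL = 0.4      # power to light the entire panel at 1 nit (assumed)
EMITTER_LIMIT_NITS = 1500.0   # per-pixel drive limit, independent of heat (assumed)

def max_sustained_nits(window_fraction: float, efficiency_gain: float = 1.0) -> float:
    """Max sustainable brightness for a lit window covering `window_fraction` of the panel.

    `efficiency_gain` stands in for MLA, tandem stacks, QD, etc.:
    more light per watt means more nits inside the same thermal budget.
    """
    watts_per_nit = WATTS_PER_NIT_FULL * window_fraction / efficiency_gain
    thermal_ceiling = PANEL_POWER_BUDGET_W / watts_per_nit
    return min(thermal_ceiling, EMITTER_LIMIT_NITS)

for window in (0.02, 0.10, 0.50, 1.00):
    print(f"{window:4.0%} window: {max_sustained_nits(window):6.0f} nits")
print(f"100% window, 2x efficiency: {max_sustained_nits(1.0, 2.0):6.0f} nits")
```

With these made-up numbers a 2% window holds 1500 nits while a full field drops to 300, and doubling efficiency doubles the full-field figure, which is the ABL behavior described above.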

  • > Motion resolution? Do you mean the pixel response time?

    I indeed meant motion resolution, which pixel response time only partially determines. It’s about how clearly a display shows motion, unlike static resolution, which realistically only describes a still image. Even with fast pixels, sample-and-hold displays blur motion unless framerate and refresh rate are high, or BFI/strobing is used. This blur immediately lowers perceived resolution the moment anything moves on screen.
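
    A quick way to see why refresh rate and persistence dominate: the usual rule of thumb (popularized by Blur Busters) is that eye-tracked blur width is roughly tracking speed times persistence, where persistence is the frame time on sample-and-hold or the pulse width with BFI/strobing. A minimal sketch with assumed numbers:

    ```python
    # Rule of thumb for eye-tracked blur on sample-and-hold displays:
    # blur width (px) ~= tracking speed (px/s) * persistence (s).
    # Persistence = frame time on sample-and-hold, or the strobe/BFI
    # pulse width when strobing. The speed below is an assumed example.

    def blur_px(speed_px_per_s: float, persistence_ms: float) -> float:
        return speed_px_per_s * persistence_ms / 1000.0

    SPEED = 1000.0  # a fast pan, in pixels per second (assumed)

    print(f"60 Hz sample-and-hold:  {blur_px(SPEED, 1000 / 60):4.1f} px of blur")
    print(f"240 Hz sample-and-hold: {blur_px(SPEED, 1000 / 240):4.1f} px of blur")
    print(f"1 ms BFI/strobe pulse:  {blur_px(SPEED, 1.0):4.1f} px of blur")
    ```

    Even instant pixels at 60 Hz smear that pan across roughly 17 pixels, which is why CRTs (with their brief phosphor flash) and strobed modes look so much sharper in motion.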

    > The input is expecting specific distortions and effects from scanlines and phosphor, which a "perfect" display wouldn't exhibit...

    That's true for many CRT purists, but it's not a huge deal for me personally. My focus is motion performance. If LCD/OLED matched CRT motion at the same refresh rate, I’d drop CRT in a heartbeat, slap on a CRT shader, and call it a day. Heresy to many CRT enthusiasts.

    Ironically, this is an area in which I feel we are getting CLOSE enough with the new higher-refresh OLEDs for non-HDR retro content, in combination with: https://blurbusters.com/crt-simulation-in-a-gpu-shader-looks... (which hopefully will continue to be improved).

    > The aggressive OLED ABL is simply a thermal issue.

    Theoretically, yes, and there’s been progress, but it’s still unsolved in practice. If someone shipped an OLED twice as thick and full of fans and heatsinks, I’d buy it tomorrow. But that’s not what the market wants, so obviously it's not what they make.

    > It can be mitigated with thermal design in smaller devices, and anything that increases efficiency (be it micro lens arrays, stacked "tandem" panels, quantum dots, or alternative emitter technology) will lower the thermal load and increase the max full-panel brightness.

    Sure, in theory. But so far the improvements (like QD-OLED or MLA) haven’t gone far enough; I already own panels using these. Beyond that, much of the tech isn’t in the display types I care about, or isn’t ready yet. Which is a pity, because the tandem-based displays I have seen in use are really decent.

    That said, the latest G5 WOLEDs are the first I’d call acceptable for HDR at high APL for my preferences, with very decent real-scene brightness, at least in film. Sadly, I doubt we’ll see comparable performance in PC monitors until many years down the track, and monitors are my preference.

Hello fellow CRT owner. What is your use case? Retro video games? PC games? Movies?

  • Hello indeed!

    > What is your use case? Retro video games? PC games? Movies?

    All of the above! The majority of my interest largely stems from the fact that, for whatever reason, I am INCREDIBLY sensitive to sample-and-hold motion blur. Whilst I tolerate it for modern gaming because I largely have no choice, CRTs mean I do not have to for my retro gaming, which I very much enjoy. (I was very poor growing up, so most of it for me is not even nostalgia; most of these games are new to me.)

    Outside of that, we have a "retro" corner in our home with a 32" Trinitron. I collect laserdisc/VHS, and we have "retro video" nights where, for whatever reason, we watch the worst possible quality copies of movies we could get in significantly higher definition. Much the same as with video games, I was not exposed to a lot of media growing up, and my wife has also not seen many things because she was in Russia back then, so there is a ton for us to catch up on very slowly, and it just makes for a fun little date night every now and again.

    Sadly though, as I get ready to take on a mortgage, it's likely most of my CRTs will be sold, or at least the broadcast monitors. I do not look forward to it, haha.

    • > Outside of that, we have a "retro" corner in our home with a 32" Trinitron.

      A 32” Trinny. Nice. I have the 32” JVC D-series, which I consider my crown jewel. It’s for retro gaming, and I have a laserdisc player but a very limited selection of movies. Analog, baby.

      > Sadly though, as I get ready to take on a mortgage, it's likely most of my CRTs will be sold

      Mortgage = space. You won’t believe the nooks and crannies you can fit CRTs into. Attic. Shed. Crawl space. Space under the basement stairs. Heck, even the neighbor's house. I have no less than 14 CRTs ferreted away in the house. Wife thinks I have only 5. Get creative. Don’t worry about the elements; these puppies were built to survive nuclear blasts. Do I have a sickness? Probably. But analog!!!
