Comment by nfriedly
2 days ago
A lot of monitors that advertise HDR support really shouldn't. Many of them can decode the signal but don't have the hardware to accurately reproduce it, so you just end up with a washed-out, muddy-looking mess, and you're better off disabling HDR entirely.
As others here have said, OLED monitors are generally excellent at reproducing an HDR signal, especially in a darker space. But they're terrible for productivity work because images that don't change much will burn in. They're fantastic for movies and gaming, though.
There are a few good non-OLED HDR monitors, but not many. I have an AOC Q27G3XMN; it's a 27" 1440p 180 Hz monitor that is good for entry-level HDR, especially in brighter rooms. It has over 1000 nits of brightness and no major flaws. It only has 336 backlight zones, though, so you might notice some blooming around subtitles or other fine details where dark and light content sit close together. (VA panels are better than IPS at suppressing that, though.) It's also around half the price of a comparable OLED.
Most of the other non-OLED monitors with good HDR support have some other deal-breaking flaw or at least a major annoyance: latency, screwing up SDR content, buggy controls, etc. The Monitors Unboxed channel on YouTube and rtings.com are both good places to check.
I think an OLED is essentially a necessity for HDR content.
My current monitor is an OLED and HDR in games looks absolutely amazing. My previous was an IPS that supported HDR, but turning it on caused the backlight to crank to the max, destroying black levels and basically defeating the entire purpose of HDR. Local dimming only goes so far.
Modern mini-LED monitors are very good. The “local” dimming is so local that there isn't much light bleed even in worst-case situations (a cursor over a black background is where any bleed is most apparent).
The advantage of mini-LED is brightness. For example, compare two modern Asus ProArt displays: their mini-LED (PA32UCXR) at 1600 nits and their OLED (PA32DC) at around 300 nits. The OLED is 20% more expensive. These two monitors have otherwise comparable specs. Brightness matters a lot for HDR because if you're in a bright room, the monitor's peak brightness needs to overpower the room.
Plus, for color-managed work, I think LED-backlit monitors are supposed to retain their calibration well, whereas OLEDs have to be recalibrated frequently.
And so-called micro-LEDs are coming soon, which promise to make “local” so small that it’s imperceptible. I think the near-term future of displays is really good LEDs.
> My previous was an IPS that supported HDR, but turning it on caused the backlight to crank to the max, destroying black levels and basically defeating the entire purpose of HDR
Yeah, that's kind of what I meant when I said that most monitors that advertise HDR shouldn't.
The AOC monitor is the third or fourth one I've owned that advertised HDR, but the first one that doesn't look like garbage when it's enabled.
I haven't gone OLED yet because of both the cost and the risk of burn-in in my use case (lots of coding and other productivity work, occasional gaming).
Avoid cranking the OLED brightness over 70% for static content, and absolutely never drive SDR reds into the HDR range with fake HDR modes while the brightness is high.
I have a 2018 LG OLED with some burnt-in Minecraft hearts because of that: not from Minecraft itself, but from just a few hours of Minecraft YouTube videos watched with those settings in the built-in YouTube client. Otherwise it has virtually no detectable issues after years of heavy use with static content.
You only see them against fairly uniform background colors, where color banding would usually be my bigger complaint anyway.
So burn-in definitely happens, but it's far from a deal breaker given the obvious benefits over other types of displays.
And running everything you can in dark mode (white text on a dark background) is the logical thing to do on these displays anyway. Then you don't need much peak brightness and you even save some energy.