Comment by throwaway2037
5 days ago
My response is somewhat tangential: when I look at GPUs strictly from the perspective of gaming performance, the last few generations have been so underwhelming. I am not a gamer, but games basically look life-like at this point. What kind of improvements are gamers expecting going forward? Seriously, a mid-level GPU delivers life-like ray tracing at 4K/60Hz. What else do you need for gaming? (Please don't read this as looking down on gaming; I am only questioning what else gamers need from their GPUs.)
To me, the situation is similar with monitors. Once we got the pixel density of 4K at 27 inches with a 60Hz refresh rate (enough pixels, enough inches, enough refresh rate), how can it get any better for normies? OK, maybe we can add HDR, but monitors are mostly finished, similar to mobile phones. Ah, one last one: I guess we can upgrade to OLED when the prices are not so scandalous. Still, the corporate normies, who account for the lion's share of people sitting in front of 1990s-style desktop PCs with a monitor, are fine with 4K at 27 inches at 60Hz forever.
I can't answer the first part, since I don't play any modern games but keep returning to RTS titles like the C&C and Starcraft series.
However, I can talk about monitors. Yes, a 27" 4K@60 monitor is really, really good, but panel quality (backlighting, uniformity, and color accuracy) goes a long way. After using Dell and HP "business" monitors for so long, most "normal monitors for normies" look bad to me: uncomfortable, with harsh lighting and poor uniformity.
So, monitor quality is not "finished" yet. I also don't like OLEDs on big screens, because I tend to use what I buy for a very long time, and I don't want my screen to age non-uniformly, especially if I'm looking at it every day for long periods.
Is OLED burn-in still a thing? If yes, then you are probably right: normies will not upgrade to OLED until that issue is fixed or a new technology replaces it.