Comment by gwbas1c
2 days ago
> I don't own a single 4k or HDR display
Don't feel like you have to. I bought a giant fancy TV with it, and even though it's impressive, it's kinda like ultra-hi-fi audio: I don't miss it when I watch the same show on one of my older TVs.
If you ever do get it, I suggest going for a TV that you watch with your full attention, and watching TV / movies in the dark. It's not very useful on a TV that you might turn on while doing housework, but very useful when you're actively watching with your full attention.
I totally love HDR on my OLED TV, and definitely miss it on others.
Like a lot of things, it’s weird how some people are more sensitive to visual changes than others. For example:
- At this point, I need 120 Hz displays. I can easily notice when my wife’s phone is in power saver mode at 60 Hz.
- 4K vs 1080p. This is certainly more subtle, but I definitely miss detail in lower-res content.
- High bitrate. This is way more important than 4K vs 1080p, or even HDR. It’s so easy to tell when YouTube lowers the quality setting on me, or when a TV show is streaming at a crappy bitrate.
- HDR is tricky, because it relies completely on the content creator doing a good job producing HDR video. When done well, the image basically sparkles: water looks actually wet, parts of the image glow… it looks so good.
I 100% miss this when watching equivalent content on other displays. The problem is that a lot of content isn’t produced to take advantage of HDR very well. The HDR 4K Blu-ray of several Harry Potter movies, for example, has extremely muted colors and dark scenes… so how is the image going to pop? I’m glad we’re seeing more movies rely on bright colors and rich, contrasty color grading. There are so many old film restorations that look excellent in HDR because the original color grade had rich, detailed, contrasty colors.
On top of that, budget HDR implementations, ESPECIALLY in PC monitors, just don’t get very bright, which makes their HDR basically useless. It’s impossible to replicate the “shiny, wet look” of really good HDR water if the screen can’t get bright enough to make it look shiny. Plus, the display needs to be selective about what gets bright, and cheap TVs don’t have enough backlight zones to do that very well.
So whereas I can plug in a 4K 120 Hz monitor and immediately see the benefit in everything I do for normal PC stuff, you can’t get that with HDR unless you have good source material and a decent display.
> At this point, I need 120 Hz displays. I can easily notice when my wife’s phone is in power saver mode at 60 Hz.
Yeah, the judder is a lot more noticeable on older TVs now that I have a 120 Hz TV. IMO, CRTs handled this the best, but I'm not going back.
I don't see the point of a 4K TV vs a 1080p TV either. To me it's just marketing: I have both a 4K and a 1080p TV at my house, and from a normal viewing distance (3-4 meters) you don't see a difference.
Also, in my country (Italy) TV transmissions are 1080i at best, and a lot are still 576i (PAL resolution). Streaming media can be 4K, if you have enough bandwidth to stream it at that resolution, which I don't have at my house. Sure, if you download pirated movies you can find them in 4K, provided you have the bandwidth for it.
But even then, a well-encoded 1080p movie is sometimes better than a hyper-compressed 4K one, since you can see the compression artifacts.
To me 1080p, and maybe even 720p, is enough for TV viewing. Sometimes I miss CRT TVs: they were low resolution, but they had much better picture quality than most modern 4K LCD TVs, where black scenes look gray (I know OLED exists, but it's too expensive and has other issues).
For TVs under ~80" I feel like you'd have to be sitting abnormally close to your TV for it to matter much. At the same time I think the cost difference between producing 1080p and 4k panels is so low it probably doesn't matter. Like you say, things like the backlight technology (or lack thereof) make a much bigger difference in perceived quality but that's also where the actual cost comes in.
I agree about 4K vs non-4K. I will say going OLED was a huge upgrade, even for SDR content. HDR content is hit or miss... I find some of it is tastefully done, but in many cases it's overdone.
My own movie collection is mostly 2-4 GB SDR 1080p files and looks wonderful.
You still watch broadcast TV?
Jokes aside, when a 4K TV has a good upscaler, it's hard to tell the difference between 1080p and 4K. Not impossible (I certainly can), but 1080p isn't distracting.
I feel the same way. Honestly, even the laptop Retina screen is overkill. I sometimes go back to a 2012 non-Retina MacBook Pro, and at normal laptop viewing distances you can't really discern the pixels. The biggest difference is display scaling, but I have my Retina display scaled to what the old display would be anyhow, because otherwise everything is too small.
It's kind of crazy that no one thinks about this aspect and we just march on to higher resolutions and the hardware required to drive them.