
Comment by aidenn0

2 days ago

> A big problem is that it costs the TV, Film, and Photography industries billions of dollars (and a bajillion hours of work) to upgrade their infrastructure. For context, it took well over a decade for HDTV to reach critical mass.

This is also true for consumers. I don't own a single 4k or HDR display. I probably won't own an HDR display until my TV dies, and I probably won't own a 4k display until I replace my work screen, at which point I'll also replace one of my home screens so I can remote into it without scaling.

Few things are absolute. Yes, most consumers won't have HDR or 4K on every screen, but most consumers use a modern smartphone, and just about every modern smartphone from the past half decade or more has HDR of some level.

I absolutely loathe consuming content on a mobile screen, but the reality is that the vast majority are using phones and tablets most of the time.

  • Funnily enough, HDR content works absolutely perfectly as long as it stays on a device that has both HDR recording and display tech, aka smartphones.

    The problem starts when sending HDR content to SDR-only devices, or even just to other HDR standards. Not even talking about printing here.

    This step can inherently only be automated so much, because it's also a stylistic decision about what information to keep or emphasize. This is an editorial process, not something you want to burden casual users with. What works for some images won't work for others. Even with AI, the preference would still need to be aligned.
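
    To make that tone-mapping step concrete, here is a minimal sketch of one common global operator (extended Reinhard) that compresses HDR luminance into the 0-1 range an SDR display can reproduce. This is an illustrative example under assumed conventions, not code from any real pipeline, and real conversions also involve gamut mapping, not just luminance; the whitePoint parameter is exactly the kind of editorial knob described above.

        // Extended Reinhard global tone mapping (illustrative sketch).
        // luminance: scene-referred relative luminance; HDR highlights can exceed 1.0.
        // whitePoint: the HDR luminance that maps to full SDR white -- an editorial choice.
        fun reinhardToneMap(luminance: Double, whitePoint: Double = 4.0): Double {
            val l = luminance
            return (l * (1.0 + l / (whitePoint * whitePoint))) / (1.0 + l)
        }

        // A low whitePoint keeps midtone contrast but clips bright highlights;
        // a high whitePoint preserves highlight detail but flattens the rest of the image.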

  • How would I know if my android phone has an HDR screen?

    [edit]

    Some googling suggested I check in the Netflix app; at least Netflix thinks my phone does not support HDR. (Unihertz Jelly Max)
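
    For what it's worth, the panel's self-reported HDR support can also be queried programmatically via Display.getHdrCapabilities() (Android API 24+). Below is a minimal sketch, assuming you pass it the current display from an Activity; note that whether an app like Netflix actually streams HDR additionally depends on its own device certification, so the two checks can disagree.

        // Minimal sketch: list the HDR formats the display itself claims to support (API 24+).
        import android.view.Display

        fun hdrSummary(display: Display): String {
            val types = display.hdrCapabilities?.supportedHdrTypes ?: intArrayOf()
            if (types.isEmpty()) return "No HDR formats reported"
            return types.joinToString(prefix = "Reported HDR formats: ") { t ->
                when (t) {
                    Display.HdrCapabilities.HDR_TYPE_DOLBY_VISION -> "Dolby Vision"
                    Display.HdrCapabilities.HDR_TYPE_HDR10 -> "HDR10"
                    Display.HdrCapabilities.HDR_TYPE_HDR10_PLUS -> "HDR10+"
                    Display.HdrCapabilities.HDR_TYPE_HLG -> "HLG"
                    else -> "Unknown ($t)"
                }
            }
        }

        // e.g. from an Activity: hdrSummary(windowManager.defaultDisplay)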

I have to think you are the 1-3% outlier though. Everyone I know has an HDR screen, even my friend who never buys anything new; even he ran out and bought an HDR TV to replace his old one, which he gave to his son.

  • I honestly do not know if I have any screen that supports HDR. At least I've never noticed any improved image quality when viewing HDR video content and comparing the image on my M3 MacBook Pro screen vs. an old external IPS monitor. Maybe my eyes are just broken?

To demonstrate some contrast (heh) with another data point from someone closer to the other extreme, I’ve owned a very HDR-capable monitor (the Apple Pro Display XDR) since 2020, so that’s 5 years now. Content that takes full advantage of it is still rare, but it’s getting better slowly over time.

  • I have a screen which is "HDR", but what that means is that when you turn the feature on it just makes everything more muted; it doesn't actually have any more dynamic range. When you turn HDR on for a game, it basically just makes most things a muddier grey.

    I also have a screen which has a huge gamut and blows out colors in a really nice way (a bit like the aftereffects of hallucinogens, it has colors other screens just don't) and you don't have to touch any settings.

    My OLED TV has HDR and it actually seems like HDR content makes a difference while regular content is still "correct".

    • Cheap displays adding broken HDR400 support destroyed so much public opinion on HDR. Accepting the HDR signal without actually providing a wider range would at least have been a minor improvement if the tone mapping weren't so completely broken that most people now just associate HDR with a washed-out picture.


> I don't own a single 4k or HDR display

Don't feel like you have to. I bought a giant fancy TV with it, and even though it's impressive, it's kinda like ultra-hi-fi audio. I don't miss it when I watch the same show on one of my older TVs.

If you ever do get it, I suggest going for a TV that you watch with your full attention, and watching TV / movies in the dark. It's not very useful on a TV that you might turn on while doing housework, but very useful when you are actively watching TV with your full attention.

  • I totally love HDR on my OLED TV, and definitely miss it on others.

    Like a lot of things, it’s weird how some people are more sensitive to visual changes. For example:

    - At this point, I need 120hz displays. I can easily notice when my wife’s phone is in power saver mode at 60hz.

    - 4k vs 1080p. This is certainly more subtle, but I definitely miss detail in lower res content.

    - High bitrate. This is way more important than 4k vs 1080p or even HDR. But it’s so easy to tell when YouTube lowers the quality setting on me, or when a TV show is streaming at a crappy bitrate.

    - HDR is tricky, because it relies completely on the content creator to do a good job producing HDR video. When done well, the image basically sparkles, water looks actually wet, parts of the image basically glow… it looks so good.

    I 100% miss this HDR when watching equivalent content on other displays. The problem is that a lot of content isn’t produced to take advantage of this very well. The HDR 4k Blu-ray of several Harry Potter movies, for example, has extremely muted colors and dark scenes… so how is the image going to pop? I’m glad we’re seeing more movies rely on bright colors and rich, contrasty color grading. There are so many old film restorations that look excellent in HDR because the original color grade had rich, detailed, contrasty colors.

    On top of that, budget HDR implementations, ESPECIALLY in PC monitors, just don’t get very bright. Which means their HDR is basically useless. It’s impossible to replicate the “shiny, wet look” of really good HDR water if the screen can’t get bright enough to make it look shiny. Plus, it needs to be selective about what gets bright, and cheap TVs don’t have a lot of backlighting zones to make that happen very well.

    So whereas I can plug in a 4k 120hz monitor and immediately see the benefit in everything I do for normal PC stuff, you can’t get that with HDR unless you have good source material and a decent display.

    • > At this point, I need 120hz displays. I can easily notice when my wife’s phone is in power saver mode at 60hz.

      Yeah, the judder is a lot more noticeable on older TVs now that I have a 120hz TV. IMO, CRTs handled this the best, but I'm not going back.

  • I don't see the point of having a 4K TV vs a 1080p TV either. To me it's just marketing: I have both a 4K and a 1080p TV at my house, and from a normal viewing distance (that is, 3-4 meters) you don't see a difference.

    Also, in my country (Italy) TV transmissions are 1080i at best, and a lot are still 576i (PAL resolution). Streaming media can be 4K, if you have enough bandwidth to stream it at that resolution, which I don't have at my house. Sure, if you download pirated movies you can find them in 4K, if you have the bandwidth for it.

    But even there, a well-done 1080p movie is sometimes better than a hyper-compressed 4K one, since the latter shows visible compression artifacts.

    To me 1080p, and maybe even 720p, is enough for TV viewing. Sometimes I miss CRT TVs: they were low resolution but, for example, had much better picture quality than most modern 4K LCD TVs, where black scenes are gray (I know there is OLED, but it's too expensive and has other issues).

    • For TVs under ~80" I feel like you'd have to be sitting abnormally close to your TV for it to matter much. At the same time I think the cost difference between producing 1080p and 4k panels is so low it probably doesn't matter. Like you say, things like the backlight technology (or lack thereof) make a much bigger difference in perceived quality but that's also where the actual cost comes in.

    • I agree about 4k vs non-4k. I will say going OLED was a huge upgrade, even for SDR content. HDR content is hit-or-miss... I find some of it is tastefully done, but in many cases it's overdone.

      My own movie collection is mostly 2-4GB SDR 1080p files and looks wonderful.

    • You still watch broadcast TV?

      Jokes aside, when a 4k TV has a good upscaler, it's hard to tell the difference between 1080 and 4k. Not impossible; I certainly can, but 1080 isn't distracting.

    • I feel the same way. To be honest, even the laptop Retina screen is excess. I sometimes go back to a 2012 non-Retina MacBook Pro, and at normal laptop viewing distances you can’t really discern pixels. The biggest difference is display scaling, but I have my Retina display scaled to what the old display would be anyway, because otherwise everything is too small.

      Kind of crazy that no one thought about this aspect and we just march on to higher resolutions and the hardware required to drive them.

Pretty much any display you can buy today will be HDR capable, though that doesn't mean much.

I think the industry is strangling itself by putting "DisplayHDR 400" certification on edge-lit/backlit LCD displays. For HDR to look "good" you need either high-resolution full-array local dimming backlighting (which still isn't perfect) or a panel type that doesn't use any kind of backlighting, like OLED.

Viewing HDR content on these cheap LCDs often looks worse than SDR content. You still get the wider color gamut, but the contrast just isn't there. Local dimming often loses all detail in shadows whenever there is something bright on the screen.

  • HDR marketing on monitors almost seems like a scam. Monitors will claim HDR compatibility when what that actually means is that they will take the HDR data stream and display it exactly the same as SDR content, because they don't actually have the contrast and brightness capability of a proper HDR monitor.

> This is also true for consumers. I don't own a single 4k or HDR display. I probably won't own an HDR display until my TV dies, and I probably won't own a 4k display until I replace my work screen, at which point I'll also replace one of my home screens so I can remote into it without scaling.

People in the HN echo chamber over-estimate hardware adoption rates. For example, there are millions of people who went straight from CDs to streaming, without hitting the iPod era.

A few years ago on HN, there was someone who couldn't wrap their brain around the notion that even though VCRs were invented in the early 1960s, in 1980 not everyone owned one, and if they did, they only had one for the whole family.

Normal people aren't magpies who trash their kit every time something shiny comes along.

  • > A few years ago on HN, there was someone who couldn't wrap their brain around the notion that even though VCRs were invented in the early 1960s, in 1980 not everyone owned one, and if they did, they only had one for the whole family.

    Point of clarification: While the technology behind the VCR was invented in the '50s and matured in the '60s, consumer-grade video tape systems weren't really a thing until Betamax and VHS arrived in 1975 and 1976 respectively.

    Early VCRs were also incredibly expensive, with prices ranging from $3,500 to almost $10,000 after adjusting for inflation. Just buying into the VHS ecosystem at the entry level was a similar investment to buying an Apple Vision Pro today.

    • Exactly my point. But people on HN, especially the person I referenced, don't understand that we didn't just throw stuff away and go into debt to buy the latest gadgets because we were told to.

  • > there are millions of people who went straight from CDs to streaming, without hitting the iPod era

    Who?

    There was about a decade there where everyone who had the slightest interest in music had an MP3 player of some kind, at least in the 15-30 age bracket.

    • I imagine this depends a LOT on your specific age and what you were doing in the 00's when MP3 player usage peaked.

      I finished high school in 2001 and didn't immediately go to college, so I just didn't have a need for a personal music player anymore. I was nearly always at home or at work, and I drove a car that actually had an MP3 CD player. I felt no need to get an iPod.

      In 2009, I started going to college, but then also got my first smartphone, the Motorola Droid, which acted as my portable MP3 player for when I was studying in the library or taking mass transit.

      If you were going to school or taking mass transit in the middle of the '00s, then you were probably more likely to have a dedicated MP3 player.

    • I don't know if I count, but I never owned a dedicated MP3 player[1]. I listened to MP3s on my computer but used CDs and cassettes while on the move, until I got an Android phone with enough storage to hold my music collection.

      1: Well my car would play MP3s burned to CDs in its CD player; not sure if that counts.

    • My father, for one. He was entirely happy with radio in the car and CDs at home.

    • I skipped two generations of portable music: I went straight from cassettes to a smartphone with MP3 (and radio).