Stranger Things creator says turn off "garbage" settings

13 hours ago (screenrant.com)

It would help if TV manufacturers would clearly document what these features do, and use consistent names that reflect that.

It seems they want to make these settings usable without specialist knowledge, but the end result of their opaque naming and vague descriptions is that anybody who actually cares about what they see and thinks they might benefit from some of the features has to either systematically try every possible combination of options or teach themselves video engineering and try to figure out for themselves what each one actually does.

This isn't unique to TVs. It's amazing, really, how much effort a company will put into adding a feature to a product, only to completely negate any value it might have by assuming that any attempt at clearly documenting it, even buried deep in a manual, will cause their customers' brains to explode.

  • They will set up their TVs with whatever settings make them sell better than the other TVs in the shop.

  • > It would help if TV manufacturers would clearly document what these features do, and use consistent names that reflect that.

    It would also help if there was a common, universal, perfect "reference TV" to aim for (or multiple such references for different use cases), with the job of the TV being to approximate this reference as closely as possible.

    Alas, much like documenting the features, this would turn TVs into commodities, which is what consumers want, but TV vendors very much don't.

  • I'm sure part of it is so that marketing can say that their TV has new putz-tech smooth vibes AI 2.0, but honestly I also see this same thing happen with products aimed at technical people who would benefit from actually knowing what a particular feature or setting really is. Even in my own work on tools aimed at developers, non-technical stakeholders push really hard to dumb down and hide what things really are, believing that makes the tools easier to use, when really it just makes it more confusing for the users.

    • I don't think you are the target audience of the dumbed-down part; the people paying for it are. They don't need detailed documentation on those things, so why write it?

  • "Our users are morons who can barely read, let alone read a manual", meet "our users can definitely figure out how to use our app without a manual".

  • The purpose of the naming is generally to overwhelm consumers and drive long-term repeat buys. You can't remember if your TV has the fitzbuzz, but you're damn sure this fancy new TV in the store looks a hell of a lot better than your current TV, and they're really pushing this fitzbuzz thing.

    • Cynically, I think it's a bit, just a little, to do with how we handle manuals today.

      It wasn't that long ago that the manual spelled out everything in enough detail that a kid could understand it, absorb it, and decide he was going to dive in on his own and end up in the industry. I wouldn't have broken or created nearly as much without it.

      But a few things challenged the norm. For many, many reasons, manuals became less about the specification and more about the functionality. Then they became even more simplified, because of the need to translate them into thirty different languages automatically. And smaller still, never admitting anything, to discourage people from blaming the company rather than themselves.

      What I would give for a return to fault-repair guides [0].

      [0] https://archive.org/details/olivetti-linea-98-service-manual...

      1 reply →

    • That doesn't preclude clearly documenting what the feature does somewhere in the manual or online. People who either don't care or don't have the mental capacity to understand it won't read it. People who care a lot, such as specialist reviewers or your competitors, will figure it out anyway. I don't see any downside to adding the documentation for the benefit of paying customers who want to make an informed choice about when to use the feature, even in this cynical world view.

      1 reply →

The fact that I have to turn on closed captioning to understand anything tells me these producers have no idea what we want and shouldn’t be telling us what settings to use.

  • One problem is that the people mixing the audio already know what is being said:

    Top-down processing

    (or more specifically, top-down auditory perception)

    This refers to perception being driven by prior knowledge, expectations, and context rather than purely by sensory input. When you already know the dialog, your brain projects that knowledge onto the sound and experiences it as “clear.”

  • English is my second language and I always thought my lack of understanding was a skill issue.

    Then I noticed that native speakers also complain.

    Then I started to watch YouTube channels, live TV and old movies, and I found out I could understand almost everything! (depending on the dialect)

    When even native speakers can't properly enjoy modern movies and TV shows, you know that something is very wrong...

    • The sound mixing does seem to have gotten much worse over time.

      But also, people in old movies often enunciated very clearly as a stylistic choice. The Transatlantic accent sounds a bit unnatural, but you can follow the plot.

      3 replies →

    • To be fair, the diction in modern movies is different from the diction in the other examples you mentioned. YouTube and live TV are very articulate, and old movies are theater-like in style.

      1 reply →

    • I "upgraded" from a 10 year old 1080p Vizio to a 4K LG and the sound is the worst part of the experience. It was very basic and consistent with our old TV but now it's all over the place. It's now a mangled mess of audio that's hard to understand.

      2 replies →

  • Apple TV (the box) has an Enhance Dialogue option built-in. Even that plus a pair of Apple-native HomePods on full volume didn’t help me hear wtf was going on in parts of Pirates of the Caribbean (2003) on Disney. If two of the biggest companies on the planet can’t get this right, I don’t know who can.

  • Perhaps a mixing issue on your end? Multi-channel audio has the dialog track separated. So you can increase the volume of the dialog if you want. Unfortunately I think there is variability in hardware (and software players) in how to down-mix, which sometimes results in background music in the surround channels drowning out the dialog in the centre channel.
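
    For what it's worth, a standard (ITU-style) fold-down just mixes the centre and surrounds into left/right at fixed gains, and a player's "dialog boost" is effectively a larger centre coefficient. A rough sketch of the idea in Python/numpy, assuming the usual FL, FR, FC, LFE, SL, SR channel order (gains and names here are illustrative, not any particular player's implementation):

      import numpy as np

      def downmix_51_to_stereo(x, center_gain=0.707, surround_gain=0.707):
          # x: (n_samples, 6) float array, channels FL, FR, FC, LFE, SL, SR
          fl, fr, fc, lfe, sl, sr = x.T  # the LFE channel is usually dropped in a stereo fold-down
          left = fl + center_gain * fc + surround_gain * sl
          right = fr + center_gain * fc + surround_gain * sr
          out = np.stack([left, right], axis=1)
          peak = np.max(np.abs(out))
          return out / peak if peak > 1.0 else out  # avoid clipping after summing channels

      # "voice boost": raise the centre relative to everything else
      demo = np.random.randn(48000, 6).astype(np.float32)  # one second of dummy 5.1 audio
      stereo = downmix_51_to_stereo(demo, center_gain=1.2)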

    • > Multi-channel audio has the dialog track separated. So you can increase the volume of the dialog if you want

      Are you talking about the center channel on an X.1 setup or something else? My Denon AVR certainly doesn't have a dedicated setting for dialog, but I can turn up the center channel which yields variable results for improved audio clarity. Note that DVDs and Blurays from 10+ years ago are easily intelligible without any of this futzing.

    • It's reasonable for the 5.1 mix to have louder atmosphere and be more dependent on directionality for the viewer to pick the dialog out of the center channel. However, all media should also be supplying a stereo mix where the dialog is appropriately boosted.

      1 reply →

    • Is there a way to do this in vlc? I run into this problem constantly - especially when 5.1 audio gets down mixed to my stereo setup.

    • Sometimes it's because the original mix was for theater surround sound and lower mixes were generated via software.

  • I think it isn't a mixing issue, it's an acting issue.

    It's the obsession with accents, mixed with the native speakers' conviction that vowels are the most important part.

    Older movies tended to use some kind of unplaceable ("mid atlantic") accent, that could be easily understood.

    But modern actors try to imitate accents and almost always focus on the vowels. Sure, English has a huge number of vowels, but they are almost completely redundant: it's hard to find cases where vowels really matter for comprehension, which is why they can vary so much across accents without impeding communication. So actors focus on the vowels but slur the consonants, and without the consonants you are pretty much completely lost.

    • The Mid-Atlantic accent has fallen out of favor since at least the latter part of the 50s. The issue with hard to understand dialog is a much more recent phenomenon.

      1 reply →

  • I have the same sound issues with a lot of stuff. My current theory at this point is that TVs have gotten bigger and we're sitting further away from them, but the speakers have stayed kinda shitty... while everything is being mixed by people using headphones or otherwise good sound equipment.

    It's very funny how, when watching a movie on my MacBook Pro, it's better for me to send the video to my TV over HDMI but keep using the MBP speakers for the audio, since they are just much better.

    • If anything I'd say speakers have only gotten shittier as screens have thinned out. And it used to be fairly common for people to have dedicated speakers, but not anymore.

      Just anecdotally, I can tell speaker tech has progressed slowly. Stepping into a car from 20 years ago, the sound is... pretty good, actually.

      4 replies →

    • It is a well known issue: https://zvox.com/blogs/news/why-can-t-i-hear-dialogue-on-tv-...

      I can't find the source anymore, but I think I read that it was even a kind of small conspiracy in TV streaming: you turn your speakers up, and then when the advertisements arrive you hear them louder than your movie.

      Officially it is just that they switch to a better encoding for ads (like MPEG-2 to MPEG-4 for DVB), but unofficially it's for the money, as always...

      5 replies →

  • Never had an issue with Stranger Things. Maybe you're using the internal speakers?

    • I watch YouTube with internal TV speakers and I understand everything, even muddled accents. I cannot understand a single TV show or movie with the same speakers. Something tells me it's about the source material, not the device.

    • I agree. There are absolutely tons of movies and TV series with indecipherable dialogue, but Stranger Things isn't among them.

    • > Maybe you're using the internal speakers?

      Which is just another drama that should not be on consumers' shoulders.

      Every time I visit friends with a newer TV than mine, I am floored by how bad their speakers are. Even the same brand and price range. Plus the "AI sound" settings (often on by default) are really bad.

      I'd love to swap my old TV as it shows its age, but spending a lot of money on a new one that can't play a show correctly is ridiculous.

      12 replies →

  • Sounds like you are just using internal speakers.

    They are notorious for bad vocal range audio.

    I have a decent surround sound setup and had no issues at all.

    • As mentioned elsewhere: no problem with YouTube videos (even with hard accents like Scottish) but a world of pain for TV shows and movies. On the same TV.

      Oh, and the YouTube videos don't have the infamous mixing issues of "voices too low, explosions too high".

      It's the source material, not the device. Stop blaming TV speakers; they are OK-tier.

      2 replies →

    • I listen to the majority of video content through stereo headphones on my PC. They are good, and the quality of every source is good. Everything sounds fine except for some movies and some TV shows specifically. And those are atrocious in clarity.

      Regarding internal speakers, I have listened to several cheap-to-midrange TVs on internal speakers, and yes, on some models the sound was bad. But it doesn't matter, because the most mangled frequencies are the highs and lows, and those aren't the voice frequencies. When I listen on a TV with meh internal speakers I can clearly understand, without any distortion, voices in normal TV programming, in sports TV, in old TV shows and old movies. The only offenders, again, are some of the new content.

      So no, it's not the internal speakers that are at fault, at all.

      2 replies →

  • Conspiracy theory... TVs have bad sound so you're compelled to buy a soundbar for $$$

    I've certainly had the experience of hard to hear dialog but I think (could be wrong) that that's only really happened with listening through the TV speakers. Since I live in an apartment, 99% of the time I'm listening with headphones and haven't noticed that issue in a long time.

    • I don't think the bad sound is necessarily deliberate; it's more a casualty of TVs becoming so very thin that there's not enough room for a decent cavity inside.

      I had a 720p Sony Bravia from around 2006 and it was chunky. It had nice large drivers and a big resonance chamber; it absolutely did not need a sound bar and was very capable of filling a room on its own.

    • Soundbars are usually a marginal improvement and the main selling point is the compact size, IMO. I would only get a soundbar if I was really constrained on space.

      Engineering tradeoffs: when you make speakers smaller, you have to sacrifice something else. This applies to both soundbars and the built-in speakers.

    • Nah, it's just the smaller space that's available. A big CRT had room for a half-decent one; a super-flat panel doesn't.

    • I assume that TVs have bad sound because better speakers just don't fit into their form factor.

    • Like all conspiracy theories, this seems rooted in a severe lack of education. How exactly do you expect a thin tiny strip to produce any sort of good sound? It's basic physics. It's impossible for a modern TV to produce good sound in any capacity.

      1 reply →

  • I had the same thing with Severance (last show I watched, I don't watch many) but I'm deaf, so thought it was just that. Seemed like every other line of dialogue was actually a whisper, though. Is this how things are now?

    • Our TV's sound is garbage, so I was forced to buy a soundbar and got a Sonos one. Night mode seems to compress the soundtrack: loud bits are quieter and quiet bits are louder.

      Voice boost makes the dialogue louder.

      Everyone in the house loves these two settings and can tell when they are off.

  • The specific suggestions they made are good in this case though, they want people to turn off the soap-opera-effect filters.

  • Using some cheap studio monitors for my center channel helped quite a bit. It ain't perfect, I still use CC for many things, but the flat mid channel response does help with speech.

  • Your speakers are probably garbage.

    • This is a gross simplification. It can be part of the explanation, but not the whole one, not even the most important.

      It mostly boils down to filmmaker choices:

      1. Conscious and purposeful, like choosing "immersion" instead of "clarity". Yeah, nothing says "immersion" like being forced to put subtitles on...

      2. Not purposeful. Don't attribute to malice what can be explained by incompetence... Bad downmixing (from Atmos to lesser formats like 2.0). Even when they do it, they are not using the technology ordinary consumers have. The most glaring example is the way text/titles/credits have been shrinking on screen to the point of being hard to read. Heck, I often have difficulty with text size on my Full HD TV, just because the editing was done on some kind of fancy 4K+ display standing 1m from the editor. Imagine how garbage it looks on 720 or ordinary 480!

      For a recent example, check the size (and the font used) of the movie title in the Alien Isolation movie and compare it to movies made in the 80s-90s. It's ridiculous!

      There are many good YouTube videos that explain the problem in more detail.

      https://youtu.be/VYJtb2YXae8

      https://youtu.be/wHYkEfIEhO4

  • This is probably the sound settings on your TV. Turn off Clear Voice or the equivalent; disable Smart Surround, which ignores 2.0 streams and badly downmixes 5.1 streams; and finally, check your speaker config on the TV. They're often set to Showroom by default, which kills voice but boosts music and sfx, and there should also be options for wall proximity, which do matter and will make the sound a muddy mess if set incorrectly.

  • For an interesting example that goes in the opposite direction, I've noticed that big YouTube creators like MrBeast optimize their audio to sound as clear as possible on smartphone speakers, but if you listen to their content with headphones it's rather atrocious.

  • My personal theory of the case is that mid-band hearing loss is more common than people want to admit and tends to go undiagnosed until old age.

  • One big cause of this is playing a multi-channel audio track when all you have is stereo speakers. All of the dialog that should be going to the center speaker just fades away; when you actually do have a center, the dialog usually isn't anywhere near as quiet.

    Depending on what you're using, there could be settings like stereo downmix or voice boost that can help. Or see if the media you're watching lets you pick a stereo track instead of 5.1.

    • We've been mixing vocals and voices in stereo since forever, and that was never a problem for clarity. The whole point of the center channel is to avoid the phantom-center collapse that happens with stereo content when listening off-center. It is purely an imaging problem, not a clarity one.

      Also, in consumer setups with a center channel speaker, it is rather common for it to have a botched speaker design, be of much poorer quality than the front speakers, and actually have a deleterious effect on dialog clarity.

      1 reply →

My main computer monitor, ancient now (a Dell U2711), was a calibrated sRGB display when new and still gives very good colour rendition.

Are movies produced in this colour space? No idea. But they all look great in sRGB.

A work colleague got himself a 40" HD TV as a big computer monitor a few years ago. I was shocked at the overamped colour and contrast. I went through all the settings, and with everything turned to minimum (every colour saturation slider, everything that could be found) it was almost realistic but still garish compared to sRGB.

But that's what people want, right? Overamped everything is how those demo loops at Costco are set up, that's what sells, that's what people want in their living rooms, right?

With many modern TVs, just turning off the energy-saving settings will already improve picture quality enormously.

I set up my TV (LG OLED CX) with Filmmaker Mode in all relevant places and turned off a lot of knobs based on HDTVTest's [1] recommendations. LG definitely has better ways of tuning the picture just right than my old Samsung had. For this TV I had to calibrate the settings manually.

The interesting thing when turning on Filmmaker Mode is the feeling that the colors are too warm and dark. It goes away once your eyes get used to it, and then it lets the image pop when it's meant to pop, etc. I also turned off the auto-brightness [2] feature that is supposed to guard the panel from burn-in but just fails in prolonged dark scenes, like in Netflix's Ozark.

[1] https://youtu.be/uGFt746TJu0?si=iCOVk3_3FCUAX-ye [2] https://youtu.be/E5qXj-vpX5Q?si=HkGXFQPyo6aN7T72

If only the directors didn't make everything so dark and hard to see, and stopped messing with the sound, making it impossible to hear the dialogue.

  • I'm surprised they didn't mention turning off closed captioning, because understanding the dialog is less important than experiencing the creator's intent.

    • I haven’t experienced issues understanding dialogue in Stranger Things, for what it’s worth.

  • Incidentally, that's the reason why I love the photography in Nolan's movies: he seems to love scenes with bright light in which you can actually see what's going on.

    Most other movies/series instead are so dark that they make my mid-range TV look like crap. And no, it's not an HW fault, as 500 nits should be enough to watch a movie.

    • Very ironic that it is Nolan who is widely known for consciously making movies with incomprehensible dialogue.

  • I've watched Silo season 2, and it is basically impossible to watch during the day. Only at night, with brightness cranked up to 100%.

    • Game of Thrones S8E3.

      Could barely tell what was going on, everything was so dark, and black crush killed it completely, making it look blocky and janky.

      I watched it again a few years later, on max brightness, sitting in the dark, and I got more of what was going on, but it still looked terrible. One day I'll watch the 'UHD' 4k HDR version and maybe I'll be able to see what it was supposed to look like.

      1 reply →

    • My LG OLED will go darker by itself during prolonged dark scenes. It's not noticeable (other than that you can't see anything and you're not sure whether it's correct or not) until you get to a slightly brighter scene. You can get it to stop for a bit by opening a menu.

If your TV supports a "gaming" mode, I always recommend enabling that, because it usually turns off all the "enhancements".

TVs should not try to be anything more than a large monitor.

  • It turns off any features that introduce latency, but it will still mess up the colour space/brightness/saturation/... on most TVs.

All the settings in the world won't change the story.

  • Careful what you wish for, or we might get AI-powered "Vibrant Story" filters that reduce 62 minutes of plot-less filler to a 5 minute summary of the only relevant points. Or that try to generate some logic to make the magic in the story make narrative sense.

    • Just so you know, this is already very much a thing on TikTok: AI-generated movie summaries with a narrator voice explaining the plot while showing only the major beats, reducing a movie from 2h to shorts totaling 10min.

      It's honestly not the worst AI content out there! There are lots of movies I wouldn't consider watching but am curious enough to see summarized (e.g., a series where only the first title was good but two more were still published).

      1 reply →

    • I just said to a friend that the season 5 writing is so bad that I think AI would have done a better job. I hope someone tries that out once we get the final episode: give an LLM the scripts for the first 4 seasons and the outcome of the finale, and let it have a go at drafting a better season 5.

      And no, I'm not talking about the gay thing. The writing is simply atrocious. Numerous plot holes, leaps of reasoning, and terrible character interactions.

      1 reply →

Are there any creators who have evolved and shoot at high frame rates, eliminating the need for motion interpolation and its artifacts, or is the grip of the bad old film culture still too strong? (There are at least some 48fps films.)

  • Most of the issues (like "judder") that people have with 24fps are due to viewing it on 60 fps screens, which will sometimes double a frame, sometimes triple it, creating uneven motion. Viewing a well shot film with perfect, expressive motion blur on a proper film screen is surprisingly smooth.

    The "soap opera" feel is NOT from bad interpolation that can somehow be done right. It's inherent from the high frame rate. It has nothing to do with "video cameras", and a lot to do with being simply too real, like watching a scene through a window. There's no magic in it.

    Films are more like dreams than like real life. That frame rate is essential to them, and its choice, driven by the technical constraints of the era when films added sound, was one of the happiest accidents in the history of the arts.
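
    To make the 60Hz problem concrete, here's a back-of-the-envelope sketch (plain Python, nothing assumed beyond the two rates) of the 3:2 cadence a 60Hz panel is forced into. Every other film frame sits on screen 50% longer, and that unevenness is the judder:

      REFRESH_HZ = 60
      FILM_FPS = 24

      # 60 / 24 = 2.5, so frames alternate between 3 and 2 refreshes ("3:2 pulldown")
      cadence = [3 if i % 2 == 0 else 2 for i in range(FILM_FPS)]
      assert sum(cadence) == REFRESH_HZ

      for i, n in enumerate(cadence[:4]):
          print(f"film frame {i}: {n} refreshes = {1000 * n / REFRESH_HZ:.1f} ms on screen")
      # film frame 0: 3 refreshes = 50.0 ms on screen
      # film frame 1: 2 refreshes = 33.3 ms on screen
      # ...whereas a 120Hz panel gives every frame exactly 5 refreshes, an even cadence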

    • > Most of the issues (like "judder") that people have with 24fps are due to viewing it on 60 fps screens

      That can be a factor, but I think this effect can be so jarring that many would realize that there's a technical problem behind it.

      For me 24fps is usually just fine, but if I find myself tracking something with my eyes that wasn't intended to be tracked, it can look jumpy/snappy. Like watching fast-flowing end credits but, instead of following the text, keeping the eyes fixed at some point.

      > Films are more like dreams than like real life. That frame rate is essential to them, and its choice, driven by technical constraints of the time when films added sound, was one of happiest accidents in the history of Arts.

      I wonder though, had the industry started with 60 fps, would people now applaud the 24/30 fps as a nice dream-like effect everyone should incorporate into movies and series alike?

    • > Films are more like dreams than like real life.

      Yes! The other happy accident of movies that contribute to the dream-like quality, besides the lower frame rate, is the edit. As Walter Murch says in "In the Blink of an Eye", we don't object to jumps in time or location when we watch a film. As humans we understand what has happened, despite such a thing being impossible in reality. The only time we ever experience jumps in time and location is when we dream.

      I would go further and say that a really good film, well edited, induces a dreamlike state in the viewer.

      And going even further than that, a popular film being viewed by thousands of people at once is as though those people are dreaming the same dream.

      2 replies →

    • I have a 120 fps TV. Panning shots at 24 fps still give me an instant headache.

      Real is good, it’s ergonomic and accessible. Until filmmakers understand that, I’ll have to keep interpolation on at the lowest setting.

      4 replies →

    • Variable refresh rate displays are becoming popular in smartphones and PCs, hopefully this won't be a technical issue soon.

    • 24 fps looks like terrible judder to me in the cinema too. I'm not afraid to admit it even if it will ruffle the feathers of the old 24 fps purists. It was always just a compromise between film cost and smoothness. A compromise that isn't relevant any longer with digital medium. But we can't have nice things it seems, because some people can't get over what they're used to.

      2 replies →

    • The problem is modern OLED TVs: they have no motion blur, so it's a chopfest at 24Hz (or with 24fps content at 120Hz) when you turn off all motion settings.

    • Yes, and records sound better than digital audio.

      You've just learned to associate good films with this shitty framerate. Also, most established filmmakers have yet to learn (and probably never will) how to make stuff look good at high frame rates. It's less forgiving.

      It'll probably take the next generation of viewers and directors.

He is absolutely right. The soap opera effect totally ruins the look of most movies. I still use a good old 1080p plasma on default settings. It always looks good.

  • I watched the most recent Avatar and it was some HDR variant that had this effect turned up. It definitely dampens the experience. There's something about that slightly fuzzed movement that just makes things on screen look better.

> Whatever you do, do not switch anything on ‘vivid’ because it’s gonna turn on all the worst offenders. It’s gonna destroy the color, and it’s not the filmmaker’s intent.

To be fair, "vivid" mode on my old Panasonic plasma was actually an impressive option compared to how an LCD would typically implement it. It didn't change the color profile. It mostly changed how much wattage the panel was allowed to consume. Upward of 800w IIRC. I called it "light cannon" mode. In a dark room, it makes black levels look like they have their own gravitational field despite being fairly bright in absolute terms.

  • I miss my old Panasonic plasma. I chose to leave it at my old home because of its size and its age. It was rock solid after 10+ years, with many cycles to go. Solid gear! Sigh…

    • Plasma displays died because they couldn't be made in 4K resolution at an affordable price, and they used 10 times as much power as LCD or OLED.

Thanks for the thought but from what I’ve heard from friends I’ll be keeping the final season unwatched just like I did with the last 2 episodes of GoT.

  • I don't understand this at all. The episode 4 ending was up there with Dear Billy for me.

  • It's been a while - I remember liking the first two seasons. Season three felt a bit silly to me without going into much detail (we need a spoiler text wrapper for HN). Season four has a lot of "zombie-esque" stuff which just doesn't have near the dread horror that the first two seasons did IMHO. Haven't seen any of the final season.

  • Yes I also let my girlfriend skip the last two episodes. Tyrion Lannister did say "if you think this has a happy ending, you haven't been paying attention".

    • As someone who hasn't watched GoT, only heard of it from others, let me guess: In those two episodes everyone dies a very cruel and painful death, except for one or two main characters?

      2 replies →

  • It's almost like you're living in an alternate universe where everything is just a little bit better.

  • It’s very bad.

    • All of the characters are constantly arguing with each other. The story line requires constant suspension of disbelief given the endless succession of improbable events and improbable character behaviors. There are contradictions with earlier episodes and even between details within the same episode. It's really bad. I hope the final episode redeems it, but I have my doubts. I want to have an LLM rewrite season 5 and see how much it improves.

      2 replies →

    • It really isn't. I keep seeing comparisons to the last seasons of Game of Thrones, but while there is a dip in quality this season, it is nowhere near as bad as what happened to GoT.

      3 replies →

More importantly, I wish I could turn off the entire Samsung 'Smart' TV UI and bring back HDMI, TV, and Apps. I get bombarded with ads and recommendations every time.

  • I keep all that stuff off my LG TV by keeping the ethernet cable unplugged and let Apple TV handle all the streaming stuff. I still somewhat resent that I need to wait for the software to boot up just to change inputs, but at least I don't get ads. Hopefully Samsung works the same way?

  • I have my LG TV dumbed down with some firewall rules in OPNsense. Something similar may help you

I’m not turning off motion smoothing. I don’t like the ghosting it can introduce but I hate the stutter artifacts from fast motion at 24fps with a passion. I get that people who grew up on 24fps movies and 60fps soap operas have a negative association with HFR, but I didn’t and I dread the flickery edges you make me see. (yes, even with frame rate matching)

The "soap opera" effect is real, I don't enjoy it.

  • The TrueMotion stuff drives me crazy. Chalk it up to being raised on movies filmed at 24fps, plus a heavy dose of FPS games (Wolf, Doom, Quake) as a kid, but frame rate interpolation instantly makes it feel less like a movie and more like I’m watching a weird “Let’s Play.”

  • christmas day, walked into a relative’s living room to watch football and the players were literally gliding across the screen. lol

The only garbage I'm turning off is Stranger Things. How did they manage to keep going after the train-wreck that was Season 3??

Dynamic Contrast = Low is needed on LG TVs to actually enable HDR scene metadata or something weird like that. 60->120hz motion smoothing is also useful on OLEDs to prevent visual judder; you want either that or black frame insertion. I have no idea what Super Resolution actually does, it never seems to do anything.

Also, as a digital video expert I will allow you to leave motion smoothing on.

  • No, motion smoothing is terrible unless you like soap operas and not cinema. Black frame insertion is to lower the pixel persistence even more, which really does nothing for 24fps content that already has a smooth blur built into the image. The best is setting your TV to 120Hz so that your 24fps fits evenly and you don't get 3:2 pulldown judder.

    • > No, motion smoothing is terrible unless you like soap operas and not cinema

      That's what's so good about it. They say turning it off respects the artists or something, but when I read that I think "so I'm supposed to be respecting Harvey Weinstein and John Lasseter?" and it makes me want to leave it on.

      > Black frame insertion is to lower the pixel persistence even more, which really does nothing for 24fps content that already has a smooth blur built into the image

      That's not necessarily true unless you know to set it to the right mode for different content each time. There are also some movies without proper motion blur, eg animation.

      Or, uh, The Hobbit, which I only saw in theaters so maybe they added it for home release.

      > The best is setting your TV to 120Hz so that your 24fps fits evenly and you don't get 3:2 pulldown judder

      That's not really a TV mode; it's more about the thing on the other side of the TV, I think, but yes, you do want that or VRR.

    • Unlike older tech, OLED has no motion blur, as pixel response time is basically instant, making panning shots a judderfest when you turn off most settings. You can say that's how it should be, but the way it looked back then is also not how it appears on your OLED. If I go to a proper film-projector cinema I don't have a problem watching it.

      https://youtu.be/E5qXj-vpX5Q?t=514

  • I assume super resolution is for upscaling old content. Try it on a 240p YouTube video and see what it does there.

Shades of the Game of Thrones creators telling us our TV settings were at fault when they decided to release an entire episode filmed in the dark?

Game of Thrones Season 8 was lambasted for having an episode that was mostly in darkness...in 2019.

You'd think television production would be calibrated for the median watcher's TV settings by now.

  • But that would mean that everybody is experiencing a quality level based on the least common denominator.

    I think TV filters (vivid, dynamic brightness, speech lifting, etc) are actually a pretty decent solution to less-than-ideal (bright and noisy environment, subpar screen and audio) viewing conditions.

It’s funny to read about respecting content on that site, which has no respect for their own content.

Yes, I usually run ad blockers, Pi-hole, etc. I'm away from home and temporarily without my filters.

  • Especially when the "content" is a blatant AI summary:

    > Duffer’s advice highlights a conflict between technological advances and creators' goals. Features like the ones he mentioned are designed to appeal to casual viewers by making images appear sharper or more colorful, but they alter the original look of the content. By asking fans to turn these features off, he is stressing the importance of preserving the director’s vision.

From the first four episodes released before Christmas, I'm far more worried about whether the season is worth watching at all than about what TV settings to use.

The tone felt considerably different: constant action, little real plot, character interactions that felt like a shallow reflection of prior seasons, exposition rather than foreshadowing and development. I was cringing during the “Tom, Dick and Harry” section. From their body language, the actors seemed to feel the same way.

"Duffer’s advice highlights a conflict between technological advances and creators' goals. Features like the ones he mentioned are designed to appeal to casual viewers by making images appear sharper or more colorful, but they alter the original look of the content."

------------------------

Settings that make the image look less like how the material is supposed to look are not "advances".

Q: So why do manufacturers create them?

A: They sell TVs.

Assume that every manufacturer out there is equally capable of creating a screen that faithfully reproduces the content to the best ability of current technology. If every manufacturer does just that, then their screens will all look extremely similar.

If your TV looks like everybody else's, how do you get people strolling through an electronics store to say, "Wow! I want that one!"? You add gimmicky settings that make the image look super saturated, bizarrely smooth, free of grain etc.. Settings that make the image look less like the source, but which grab eyes in a store. You make those settings the default too, so that people don't feel ripped off when they take the TV out of the store.

If you take a typical TV set home and don't change the settings from default, you're typically not going to see a very faithful reproduction of the director's vision. You're seeing what somebody thought would make that screen sell well in a store. If you go to the trouble of setting your screen up properly, your initial reaction may be that it looks worse. However, once you get used to it, you'll probably find the resulting image to be more natural, more enjoyable, and easier on the eyes.

  • >Assume that every manufacturer out there is equally capable of creating a screen that faithfully reproduces the content to the best ability of current technology.

    That basically isn’t true. Or rather, there are real engineering tradeoffs required to make an actual consumer good that has to be manufactured and sold at some price. And, especially considering that TVs exist at different price points, there are going to be different tradeoffs made.

    • Yes, there are tradeoffs, but LCD, etc. technology is now sufficiently good that displays in the same general price category tend to look quite similar once calibrated. The differences are much more noticeable when they're using their default "gimmick" settings, and that's by design.

  • why even bother with "TV Screens"? Why not just get a big computer monitor instead, like 27" or something

    • I don't know what kind of a joke you tried here, but I think a vast majority of TV screens can be put in game or PC mode, and all the input lag and stupid picture processing goes away. I run a 43" LG 4K TV as a PC monitor and never have I had a (flat screen) monitor with a faster response rate! My cinema TV is an old FullHD 42" Philips that has laughably bad black levels. I run it also in PC mode but the real beauty of this TV is that without further picture processing it produces nice and cinemalike flat color that is true to the input material that I feed it. Flashy capeshit will be flashy and bright, and a muted period drama will stay muted.

Probably a good time to plug Filmmaker mode!

  • From what I’ve read, you want to make sure that the setting is spelled FILMMAKER MODE (in all caps) with a (TM) symbol, since that means that the body who popularized the setting has approved whatever the manufacturer does when you turn that on (so if there’s a setting called “Cinephile Mode” that could mean anything).

    With that being said, I’ve definitely seen TVs that just don’t have FILMMAKER MODE or have it, but it doesn’t seem to apply to content from sources like Chromecast. The situation is far from easy to get a handle on.

  • Typically, “Game” mode on TVs turns off post-processing to avoid the extra frames of lag it causes.

    • That doesn't necessarily mean it looks good or is tuned well, just that it has lower latency.

Implying that makes a bad season better. When you watch trash, settings don't really matter.

  • I don't think it implies that at all.

    It is perfectly understandable that people who really care about how their work was colour-graded would suggest you turn off all the features that shit all over that work. Similarly for the other settings he mentions.

    Don't get me wrong, I haven't seen the first season, so won't watch this, but creators / artists do and should care about this stuff.

    Of course, people can watch things in whatever dreaded settings they want, but lots of TVs default to bad settings, so awareness is good.

Anyone who mentions "the soap opera effect" is someone who used to watch soap operas. The reason they dislike it is their own bad taste.

I like how it looks, because to me it is the "high-quality videogame effect": 60Hz, 120Hz, 144Hz, you only get this on a good videogame setup.

  • Just because someone has different taste doesn't make it bad taste. Books have lower resolution still, and they evoke far greater imaginative leaps. For me, the magic lies in what is not shown; it helps aid the suspension of disbelief by requiring your imagination to do more work filling in the gaps.

    I'm an avid video game player, and while FPS and sports-adjacent games demand high framerates, I'm perfectly happy turning my render rates down to 40Hz or 30Hz on many games simply to conserve power. I generally prefer my own brain's antialiasing, I guess.

  • It is a well-known description for what each brand calls something different. As I wait in a physiotherapist's office I am being subjected to a soap opera against my will. Many will have seen snippets of The Bold and the Beautiful without watching a single episode, but enough to know that it looks "different".

  • The Godfather in 144hz with DNR and motion smoothing, just like Scorsese intended.

    • My counterargument is this: I would love it if Bruce Lee had been filmed at 144Hz.

      He had been told to slow down because 24fps simply could not capture his fast movements.

      At 144Hz, we would be able to better appreciate his abilities.

  • Real high framerate is one thing, but the TV setting is faking it with interpolation. There's not really a good reason to do this; it's trickery to deceive you. Recording a video at 60fps is fine, but that's just not what TV and movies do in reality. No one is telling you to watch something at half the intended framerate, just at the actual framerate.

    • In principle, I agree with you.

      I would vastly prefer original material at high frame rates instead of interpolation.

      But I remember the backlash against “The Hobbit: An Unexpected Journey” because it was filmed at 48fps, and that makes me think that people dislike high-frame-rate content no matter the source, so my comment also covers these cases.

      Also, because of that public response, we don't have more content actually filmed at high frame rates =)

      1 reply →

  • I disliked the effect (of an unfamiliar TV’s postprocessing) without calling it that and without ever having seen a soap opera. What’s your analysis, doc?

  • Films use cheap set decoration and materials. They use lighting and makeup tricks.

    If you watch at a higher frame rate, the mistakes become obvious rather than melting into the frames. Humans look plastic and fake.

    The people that are masters of light and photography make intentional choices for a reason.

    You can cook your steak well done if you like, but that's not how you're supposed to eat it.

    A steak is not a burger. A movie is not a sports event or video game.

    • The choice wasn't intentional; it was forced by technology, and in turn methods were molded by that technological limitation.

      What next, gonna complain resolution is too high and you can see costume seams?

      The film IS the burger; you said it yourself, it shows off where the movie cheaped out on things. If you want a steak, you need a steak framerate.

      3 replies →

    • Enter the Dragon would have been amazing if it had been filmed at 144 Hz.

      The technical limitations of the past century should not define what constitutes a film.

    • > You can cook your steak well done if you like, but that's not how you're supposed to eat it.

      Did you read an interview with the cow’s creator?

  • I find the rejection of higher frame rates for movies and TV shows baffling when people accepted the introduction of color and sound, which are much bigger changes.

  • I call it the "British comedy effect". And it's awful, and if you like it, you're awful too, sorry to say.

  • It's called the soap opera effect because soap operas were shot on video tape, instead of film, to save money. It wasn't just soap operas, either. Generally, people focus on frame rate, but there are other factors, too, like how video sensors capture light across the spectrum differently than film.

  • Wow, 2008 called.

    I haven't thought about it or noticed it in nearly two decades.

    My eyes 100% adjusted; I like higher frame and refresh rates now.

    I can't believe the industry just repeated a line about how magical 24fps feels for ages and nobody questioned it, until they magically had enough storage and equipment resources to abandon it. What a coincidence.

> regular backup of your mail. Google's Takeout service is a straightforward way to achieve this.

Takeout is a horrible way to do regular backups. You have to manually request it, it takes a long time to generate, and the download is manual... I only use it for monthly full backups.

A much better way to do continuous incremental backups is an IMAP client that locally mirrors incoming email (Mutt or Thunderbird). It can be configured to store every email in a separate file.
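
A minimal sketch of that mirroring approach using Python's standard imaplib, writing one .eml file per message and skipping anything already saved (host, credentials, and mailbox name are placeholders; a Gmail account would need an app password):

    import imaplib
    import pathlib

    HOST, USER, PASSWORD = "imap.example.com", "me@example.com", "app-password"  # placeholders
    dest = pathlib.Path("mail-backup")
    dest.mkdir(exist_ok=True)

    imap = imaplib.IMAP4_SSL(HOST)
    imap.login(USER, PASSWORD)
    imap.select("INBOX", readonly=True)

    # UIDs are stable within a mailbox, so "one file per UID" makes reruns incremental
    _, data = imap.uid("search", None, "ALL")
    for uid in data[0].split():
        path = dest / f"{uid.decode()}.eml"
        if path.exists():
            continue  # already mirrored
        _, msg_data = imap.uid("fetch", uid, "(RFC822)")
        path.write_bytes(msg_data[0][1])  # raw RFC 822 message, one file per email

    imap.logout()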

And how about the content garbage? Not spoiling anything, but man...

  • I don't know what you are talking about, there are zillions of 10-star ratings on IMDB! /s

    It was 10 stars before it was even released... Are humans still needed at all? Just have LLMs generate crappy content and bots upvote it.

He's right about the settings. Why would these be the default? Who watches TV that way?

Unfortunately settings won't help Season 5 be any better, it verges on being garbage itself, a profound drop in quality compared to previous seasons.

  • Let’s just have Sarah Connor be in it for no reason being angry all the time for no reason.

    • I like the idea that Linda Hamilton's actually playing Sarah Connor here.

      "After battling Skynet her whole life, Sarah Connor has vowed to even the playing field... no matter what the cost. Coming soon in Terminator: Hawkins!"

My TV is from around 2017 and some of those settings definitely suck on it. I'm curious if they have improved any of them on newer TVs.

Here's how bad it was in 2017. One of the earliest things I watched on that TV was "Guardians of the Galaxy" on some expanded basic cable channel. The fight between Peter and Gamora over the orb looked very jerky, like it was only at about 6 fps. I found some reviews of the movie on YouTube that included clips of that fight and it looked great on them, so I know that this wasn't some artistic choice of the director that I just didn't like. Some Googling told me about the motion enhancement settings of the TV, and how they often suck. I had DVRed the movie, and with those settings off the scene looked great when I watched it again.

I thought there was such a thing (although some TV sets probably do not have it) as "filmmaker mode" to show content according to the filmmaker's intention (although I don't know all of the details, so I don't even know how well it would work). "Dolby Vision Movie Dark" is something that I had not heard of.

(However, modern TV sets are often filled with enough other junk that maybe you will not want all of these things anyways)

At first I thought this was about turning off settings that allow me to watch garbage TV shows (or garbage ending seasons of initially decent TV shows, in this case).

Release your movie in native 120 fps and I'll turn off motion interpolation. Until then, minor flickering artifacts when it fails to resolve motion, or minor haloing around edges of moving objects, are vastly preferable to unwatchable judder that I can't even interpret as motion sometimes.

Every PC gamer knows you need high frame rates for camera movement. It's ridiculous that the movie industry is stuck at 24 like it's the stone age, only because of some boomers screaming about a "soap opera" effect they invented in their brains. I'd imagine most Gen Z people don't even know what a "soap opera" is supposed to be; I had to look it up the first time I saw someone say it.

My LG OLED G5 literally provides a better experience than going to the cinema, due to this.

I'm so glad 4k60 is being established as the standard on YouTube, where I watch most of my content now... it's just movies that are inexplicably stuck in the past...

  • > Every PC gamer knows you need high frame rates for camera movement.

    Obviously not, because generations of people saw "movement" at 24 fps. You're railing against other people's preferences, but presenting your personal preferences as fact.

    Also, there are technical limitations in cameras that aren't present in video games. The higher the frame rate, the less light that hits the sensor (rough numbers in the sketch below). To compensate, not only do you need better sensors, but you probably need to change the entire way that sets, costumes, and lighting are handled.

    The shift to higher frame rates will happen, but it's gonna require massive investment to shift an entire industry and time to learn what looks good. Cinematographers have higher standards than random Youtubers.
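
    The light penalty is easy to put numbers on. With a typical 180° shutter the exposure time is half the frame interval, so going from 24fps to 120fps costs roughly 2.3 stops (a back-of-the-envelope sketch, assuming the shutter angle is held constant):

      import math

      def exposure_seconds(fps, shutter_deg=180):
          # 180-degree shutter: the sensor is exposed for half of each frame interval
          return (shutter_deg / 360) / fps

      t24, t120 = exposure_seconds(24), exposure_seconds(120)
      print(f"24 fps:  1/{round(1 / t24)} s per frame")       # 1/48 s
      print(f"120 fps: 1/{round(1 / t120)} s per frame")      # 1/240 s
      print(f"light lost: {math.log2(t24 / t120):.2f} stops")  # ~2.32 stops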

    • > You're railing against other people's preferences, but presenting your personal preferences as fact.

      It is a fact that motion is smoother at 120 fps than 24, and therefore easier to follow on screen. There are no preferences involved.

      > Also, there are technical limitations in cameras that aren't present in video games.

      Cameras capable of recording high-quality footage at this frame rate already exist, and their cost is not meaningful compared to the full budget of a movie (and you can use them more than once, of course).

> Duffer’s advice highlights a conflict between technological advances and creators' goals. Features like the ones he mentioned are designed to appeal to casual viewers by making images appear sharper or more colorful, but they alter the original look of the content. By asking fans to turn these features off, he is stressing the importance of preserving the director's vision.

is it just me or does this article's last paragraph feel particularly AI generated..

whether the author did use AI or not isn't my main gripe; it's just that certain wording (like this) won't be free from scrutiny in my head anymore :(

I read a lot of comments here that make my blood freeze. What needs to be said is that there is something called creative intent.

For those unfamiliar with the term you should watch Vincent Teoh @ HDTVTest:

https://www.youtube.com/hdtvtest

Creative intent refers to the goal of displaying content on a TV precisely as the original director or colorist intended it to be seen in the studio or cinema.

A lot of work is put into this and the fact that many TVs nowadays come with terrible default settings doesn't help.

We have a whole generation who actually prefer the colors all maxed out with motion smoothing etc. turned to 11 but that's like handing the Mona Lisa to some rando down the street to improve it with crayons.

At the end of the day it's disrespectful to the creator and the artwork itself.

I hope AI tools allow for better fan edits. There's enough of a foundation and source footage to redo the later episodes of Stranger Things ... The Matrix ... etc.

  • I need to test the new audio demixing model out for fan edits. Separating music, dialog, and sound effects into stems would make continuity much easier. Minor rewrites would be interesting, but considering Tron: Ares botched AI rewrite dubbing so badly, I'm not holding my breath.

    • I wouldn't be surprised if the free/open voice cloning and lip-synch tools of today are better than whatever "professional" tools they were using however many months/year ago they did that edit.

  • Yes, I think that this is one place to be very bullish on AI content creation. There are many people with fantastic visions for beautiful stories that they will never be in a position to create the traditional way; oftentimes with better stories than what is actually produced officially.

    (You ever think about how many fantastic riffs have been wasted with cringe lyrics?)

    • Nothing is stopping you right now from buying or finding or creating a catalog of loops and samples that you can use to create your own Artistic Vision[tm]. The technology exists and has existed for decades, no AI required.

    • i often think about all the music ruined by self obsessed dorks singing soulless middle school poetry, and it's the main application of AI i'm quite excited for

Do most people still watch stuff on their TVs? I haven’t used my TV for anything in 2 years. I usually consume content on my smartphone or computer.

My Advice: Turn off your TV. Anything that you watch on TV is garbage.

  • It's true, I read Hacker News on my TV.

      TV means the media that is broadcast or streamed on TV, not the display device that you use to read the internet or whatever you do on your computer.

  • I cannot take your advice seriously unless you also recommend turning computers off

Totally agreed. I read somewhere that the only place these features help is sports. They should not be defaults. They make shows and films look like total crap.

  • Actually, they do not belong anywhere. If you look at the processing pipeline necessary to, for example, shoot and produce modern sporting events in both standard and high dynamic range, the last thing you want is a television that makes its own decisions based on some random setting that a clueless engineer at the manufacturer thought would be cool to have. Companies spend millions of dollars (hundreds of millions in the case of broadcasters) to deliver technically accurate data to televisions.

    These settings are the television equivalent of clickbait. They are there to get people to say "Oh, wow!" at the store and buy it. And, just like clickbait, once they have what they clicked on, the experience ranges from lackluster and distorted to being scammed.

    • As someone who has built multi-camera live broadcast systems and operated them you are 100% correct. There is color correction, image processing, and all the related bits. Each of these units costs many times more and is far more capable with much higher quality (in the right hands) than what is included in even the most high end TV.

    • They're the equivalent of the pointless DSP audio modes on 90's A/V receivers. Who was ever going to use "Concert Hall", "Jazz Club", or "Rock Concert", with distracting reverb and echo added to ruin the sound?

    • I think it is helpful to have settings that you can change, although the default settings should probably match those intended by whoever made the movie or TV show that you are watching, according to the specification of the video format. (The same applies to audio, etc.)

      This way, you should not need to change them unless you want nonstandard settings for whatever reason.

Yeah, televisions come full of truly destructive settings. I think part of the genesis of this virus is the need for TVs to stand out at the store. Brands and models are displayed side by side. The only way to stand out is to push the limits of over-enhancement along every possible axis (resolution, color, motion, etc.).

Since consumers are not trained to critically discern image and video quality, the "Wow!" often wins the sale. This easily explains the existence of local dimming solutions (now called miniLED or some other thing). In a super bright Best Buy or Walmart viewing environment they can look fantastic (although, if you know what to look for you can see the issues). When you get that same TV home and watch a movie in the dark...oh man, the halos jump off the screen. Now they are starting to push "RGB miniLED" as if that is going to fix basic optics/physics issues.

And don't get me started on horrible implementations of HDR.

This is clearly a case of the average consumer not knowing enough (they should not have to be experts, BTW) and effectively getting duped by marketing.

The soap opera effect is only a problem because no one is used to it. Higher FPS is objectively better. These motion interpolation settings are now ubiquitous, and pretty much nobody cares about said effect anymore, which is great, because maybe now we can start having movies above 24fps.

To preempt replies: ask yourself why 24 frames per second is optimal for cinema instead of just being an ancient spec that everyone got used to.

  • Personally, I have no issue watching things that are shot at 60fps (like YouTube videos, even live action) but the motion smoothing on TV shows makes it look off to me.

    I dunno if it's just a me thing, but I wonder if a subconscious part of my brain is pegging the motion smoothed content as unnatural movement and dislikes it as a result.

    • The motion smoother also has to guess which parts of the picture to modify. Is the quarterback throwing the ball the important part? The team on the sidelines? The people in the stands? The camera on wires zooming around over the field to get bird’s eye views? When it guesses wrong and enhances the wrong thing, it looks weird.

      Also imagine the hand of a clock rotating at 5 minutes’ worth of angle per frame, and 1 frame per second. If you watched that series of pictures, your brain might still fill in that the hand is moving in a circle every 12 seconds.

      Now imagine smoothing synthesizing an extra 59 frames per second. If it's only considering the change between 2 frames, it might show a bright spot moving in a straight line between the 12 and 1 position, then 1 and 2, and so on. Instead of a circle, the tip of the hand would trace a dodecagon. That's fine, but it's not how your brain knows clocks are supposed to move. (There's a small numeric sketch at the end of this comment.)

      Motion smoothing tries to do its best to generate extra detail that doesn’t exist and we’re a long way from the tech existing for a TV to be able to do that well in realtime. Until then, it’s going to be weird and unnatural.

      Film shot at 60FPS? Sure. Shot at 24 and slopped up to 60? Nah, I’ll pass.
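
      A minimal numeric sketch of the clock example above, assuming the naive case of straight-line interpolation between two source frames (real interpolators are more sophisticated, but the chord-versus-arc problem is the same):

          import math

          STEPS = 12  # the hand advances 5 minutes (30 degrees) per source frame

          def hand_tip(step):
              # True position of the hand tip at a given source frame.
              angle = 2 * math.pi * step / STEPS
              return (math.sin(angle), math.cos(angle))

          def lerp(p, q, t):
              # Naive synthesized in-between frame: a straight chord, not an arc.
              return (p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1]))

          mid = lerp(hand_tip(0), hand_tip(1), 0.5)

          # The interpolated tip sits inside the circle the hand really
          # traces: ~0.966 instead of 1.0. Chain these together and you
          # get the dodecagon, not the circle.
          print(math.hypot(*mid))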

  • Easy... because 24fps has that dream-like feel to it. The second you go past that, it starts to look like people on a stage and you lose the illusion. I couldn't watch The Hobbit because of it.

    Movies above 24fps won't become a thing; it looks terrible and should be left for documentaries and sports.

  • > The soap opera effect is only a problem because no one is used to it. Higher FPS is objectively better.

    But synthesizing these frames yields a higher frame rate with the same shutter angle / motion blur as the original frame rate, which looks off to me. Same reason the shutter angle is adjusted for footage that is intended to be slow motion.
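
    A back-of-the-envelope version of that mismatch, using the standard shutter-angle relation (exposure time = (angle / 360) ÷ frame rate):

        def exposure_time(fps, shutter_angle_deg=180):
            # Shutter-angle convention: the shutter is open for
            # (angle / 360) of each frame's duration.
            return (shutter_angle_deg / 360.0) / fps

        blur_24 = exposure_time(24)  # 1/48 s of blur baked into each frame
        blur_60 = exposure_time(60)  # 1/120 s -- what native 60fps looks like

        # Interpolating 24fps up to 60fps synthesizes new frames, but each
        # inherits the 1/48 s blur of its sources: 60 frames per second
        # wearing 24fps-sized blur trails, hence the "off" look.
        print(blur_24, blur_60)  # 0.0208... vs 0.0083...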

  • > To preempt replies: ask yourself why 24 frames per second is optimal for cinema instead of just being an ancient spec that everyone got used to.

    "Everyone" includes the filmmakers. And in those cases where the best filmmakers already found all kinds of artistic workarounds for the lower framerate in the places that mattered, adding interpolation will fuck up their films.

    For example, golden age animators did their own interpolation by hand. In Falling Hare, Bugs' utter despair after looking out the window of a nosediving airplane is animated by a violent turn of his head that moves farther than what could be smoothly animated at 24fps. To avoid the jumpcut, there is a tween of an elongated bunny head with four ears, seven empty black eye sockets, four noses, and eight teeth. It's absolutely terrifying if you pause on that frame[1], but it does a perfect job of connecting the other cels and evoking snappier motion than what 24fps could otherwise show.

    Claiming that motion interpolation makes for a better Falling Hare is like claiming that keeping the piano's damper pedal down through the entirety of Bach's Prelude in C produces better Bach than on a harpsichord. In both cases, you're using objectively better technology poorly, in order to produce worse results.

    1: https://www.youtube.com/watch?v=zAPf5fSDGVk

    • Agreed, the idea that there’s anything “objective” about art is kind of hilarious. Yes, it may be technically better in that there are more frames but does it make a more enjoyable film?

  • You’d need to actually support your assertion that higher FPS is objectively better, especially higher FPS via motion interpolation which inherently degrades the image by inserting blurry duplicated frames.

    People are “used to” high FPS content: Live TV, scripted TV shot on video (not limited to only soap operas), video games, most YouTube content, etc. are all at 30-60FPS. It’d be worth asking yourself why so many people continue to prefer the aesthetic of lower framerates when the “objectively better” higher FPS has been available and moderately prevalent for quite some time.

  • Films rely on 24 fps or, rather, low motion resolution to help suspend disbelief. There are things that the viewer is not meant to see, or at least not see clearly. Yes, part of that specific framerate is nostalgia and what the audience expects a movie to look like, but it holds a purpose.

    Higher frame rates are superior for shooting reality. But for something that is fictional, the lower rate helps the audience suspend their disbelief.

    • I'm not sure I buy that it helps the audience suspend their disbelief.

      If it did horror films would be filmed at higher frame rates for extra scares.

      Humans have a long history of suspending disbelief in both oral and written lore. I think that 'fps' may be functionally equivalent to the Santa Claus stories: fun for kids, but the adults need to pick up the bill.

> It’s gonna destroy the color, and it’s not the filmmaker’s intent.

I don't care about the "filmmaker's intent", because it is my TV. I will enable whatever settings look best to me.

This article seems to imply that the default settings are the manufacturer-recommended ones for streaming movies - is that bad UX? Should Netflix be able to push recommended settings to your TV?

  • The problem is it can be subjective. Some people really like the “smooth motion” effect, especially if they never got used to watching 24fps films back in the day. Others, like me, think seeing stuff at higher refresh rates just looks off. It may be a generational thing. Same goes for “vivid color” mode and those crazy high contrast colors. People just like it more.

    On the other hand, things that are objective like color calibration, can be hard to “push down” to each TV because they might vary from set to set. Apple TV has a cool feature where you can calibrate the output using your phone camera, it’s really nifty. Lots of people comment on how good the picture on my TV looks, it’s just because it’s calibrated. It makes a big difference.

    Anyways, while I am on my soapbox, one reason I don’t have a Netflix account any more is because you need the highest tier to get 4k/hdr content. Other services like Apple TV and Prime give everyone 4k. I feel like that should be the standard now. It’s funny to see this thread of suggestions for people to get a better picture, when many viewers probably can’t even get 4k/hdr.

> Duffer’s advice highlights a conflict between technological advances and creators' goals

I wouldn't call it a "technological advance" to make even the biggest blockbuster look like it was filmed with a 90s camcorder with cardboard sets.

TruMotion and friends are indeed garbage, and I don't understand how people can leave it on.

What about not filming the entire show in darkness? Or, I don't know, filming it in a way that will look OK on modern televisions without having to turn off settings?

  • > filming it in a way that will look OK on modern televisions without having to turn off settings

    That's a lost cause. You never know what sort of random crap and filters a clueless consumer may inflict on the final picture. You cannot possibly make it look good on every possible config.

    What you can do is make sure your movie looks decent on most panels out there, assuming they're somewhat standard and aren't configured to go out of their way to nullify most of your work.

    The average consumer either never knew these settings existed, or played around with them once when they set up their TV and promptly forgot. As someone who often gets to set up/fix setups for aforementioned people, I'd say this is a good reminder.

That article ends with AI slop (perhaps all of it is):

"Duffer’s advice highlights a conflict between technological advances and creators' goals. Features like the ones he mentioned are designed to appeal to casual viewers by making images appear sharper or more colorful, but they alter the original look of the content. By asking fans to turn these features off, he is stressing the importance of preserving the director’s vision."

When people say “creator’s intent”, it sounds like a flavor. Like how food comes out of the kitchen before you put toppings on it to make it your own.

But vivid mode (et al) literally loses information. When the TV tries to make everything look vibrant, it’s effectively squishing all of the colors into a smaller color space. You may not be able to even tell two distinct objects apart because everything is similarly bright and vibrant.
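
A crude sketch of that information loss, assuming a hypothetical gain-and-clip "vivid" curve (real TVs apply fancier processing, but hard clipping destroys detail the same way):

    def vivid_boost(rgb, gain=1.3):
        # Naive "vivid" enhancement: scale each channel, clip to 8 bits.
        return tuple(min(255, round(c * gain)) for c in rgb)

    # Two distinct highlight tones in the source material...
    cloud_bright = (240, 235, 230)
    cloud_dim = (220, 215, 210)

    # ...both clip to identical pure white after the boost; the
    # distinction is gone and nothing downstream can recover it.
    print(vivid_boost(cloud_bright))  # (255, 255, 255)
    print(vivid_boost(cloud_dim))     # (255, 255, 255)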

Same with audio. The famous “smile” EQ can cause some instruments to disappear, such as woodwinds.

At the end of the day, media is for enjoyment and much of it is subjective, so fine do what you need to do to be happy. But few people would deliberately choose lower resolution (except maybe for nostalgia), which is what a lot of the fancy settings end up doing.

Get a calibration if you can, or use Filmmaker Mode. The latter will make the TV relatively dark, but there’s usually a way to adjust it, or to copy its settings into a Custom mode and boost the brightness there, which is still a big improvement over the out-of-the-box defaults.

Without even clicking I know he’s talking about motion smoothing.

Went to the in-laws over the holidays and the motion smoothing on the otherwise very nice LG tv was absolutely atrocious.

My sister had her Nintendo Switch connected to it, and the worst thing was not the low resolution game on the 4k display - it was the motion smoothing. Absolutely unbearable. Sister was complaining about input lag and it was most definitely caused by the motion smoothing.

I keep my own TV on game mode regardless of the content because otherwise all the extra “features” - which includes more than just motion smoothing - pretty much destroys picture quality universally no matter what I’m watching.

Stranger Things creator is not aware of how stupid most Netflix viewers are. They literally watch algorithm-generated TV shows all day long, and he expects to explain relatively technical things to them. Good luck, Mr. Creator.

I'm not even convinced anyone really watches Stranger Things, so I don't see the point. Seems like something people put on as background noise while they are distracted by their phones.

  • The first seasons were captivating. This last one? I walked out of the room to do some housework, came back 10 minutes later, and asked what happened. The answer was a single sentence.

    I was also gradually switching to treating this season as background noise, as it fails to be better than that. It is insultingly bad in places even consumed this way.

  • I see a tonne of “fan” content on the video sites tagged #strangerthings, which is strange since I have that tag blocked. It's almost like it's all paid promotion…

    • I hope you don't imply that the 10 star ratings on IMDB are not organic... The system is definitely not rigged :D

  • People were clearly watching through at least season 4. That show used songs that most viewers would nowadays consider oldies, and they became hits again after the episodes containing them were released.

    For example, Kate Bush's 1985 "Running Up That Hill" became a huge worldwide hit after appearing in season 4.

    • “Running up that hill” becomes a huge worldwide hit approximately every ten years.

    • I never watched the show but I did catch the revival of interest in Kate Bush by osmosis, so I think the show probably does have some cultural impact.

  • Just for the synth intro

    • Ironically, the Apple TV Netflix app really wants to skip the intro - going so far as to mute it while offering the “skip” button. You have to hit “back” to get the audio back during the intro.

      Not sure why Netflix is destroying the experience themselves here.

Yeah, kiss m'ass. I agree that some of those settings do need to be turned off. When I visit someone and see their TV on soap opera mode, I fight the urge to fix it. Not my house, not my TV, not my problem if they like it that way, and yet, wow, is it ever awful.

But then getting into recommendations like "turn off vivid mode" is pretty freaking pretentious, in my opinion, like a restaurant where the chef freaks out if you ask for salt. Yes, maybe the entree is perfectly salted, but I prefer more, and I'm the one paying the bill, so calm yourself as I season it to my tastes. Yes, vivid modes do look different than the filmmaker intended, but that also presumes that the viewer's eyes are precisely as sensitive as the director's. What if I need higher contrast to make out what's happening on the screen? Is it OK if I calibrate my TV to my own personal viewing conditions? What if it's not perfectly dark in my house, or I want to watch during the day without closing all the blinds?

I tried watching the ending of Game of Thrones without tweaking my TV. I could not physically see what was happening on the screen, other than that a navy blue blob was doing something against a darker grey background, and parts of it seemed to be moving fast if I squinted. I cranked the brightness and contrast for those episodes so that I could actually tell what was going on. It might not have aligned with the director's idea of how I should experience their spectacle, but I can live with that.

Note that I’d also roll my eyes at a musician who told me how to set my equalizer. I’ll set it as I see fit for me, in my living room’s own requirements, thanks.

  • I agree that the viewer should change the settings if they want different settings than the film maker intended, although it also makes sense to have an option (not mandatory) to use the settings that the film maker intended (if these settings are known) in case you do not want to specify your own settings. (The same would apply to audio, web pages, etc.)

    • Sure. I’m all for having that as an option, or even the default. That’s a good starting place for most people. I think what I most object to is the pretentiousness I read into the quote:

      > Whatever you do, do not switch anything on ‘vivid’ because it’s gonna turn on all the worst offenders. It’s gonna destroy the color, and it’s not the filmmaker’s intent.

      I’m interested in trying the filmmaker’s intent, like I’ll try the chef’s dinner before adding salt because it’ll probably be wonderful. But if I think the meal still needs salt, or my TV needs more brightness or contrast, I’ll add it. And even if the filmmaker or chef thinks I’m ruining their masterpiece, if I like it better that way, that’s how I’ll enjoy it.

      And I’m very serious about the accessibility bit. My vision is great, but I need more contrast now than I did when I was 20. Maybe me turning up the brightness and contrast, or adding salt, lets me perceive the vision or taste the meal the same way as the director or chef does.

  • 100% agree. I’ve tried multiple times to use the cinema modes in my TVs, the ones that are supposed to be “as the director intended”, but in the end they’re always too dark and I find things hard to see. It turns out I just subjectively like the look of movies in normal mode (or gasp sometimes vivid, if it’s really bright in the room) more than in the “proper” cinema mode. I don’t really care what the creator thinks; it looks better to me, so it’s better for me.

    The equalizer analogy is perfect.

    • Movies are mastered for a dark room. It's not going to look good with accurate settings if you are in a lit room.

      Having said that, there are a lot of bad HDR masters.

  • > What if I need higher contrast to make out what's happening on the screen?

    The point you make isn't incorrect at all. I would say that TVs should ship without any such enhancements enabled. The user should then be able to configure it as they wish.

    Plenty of parallel examples of this: Microsoft should ship a "clean" version of Windows. Users can then opt into whatever they might want to add.

    Social media sites should default to the most private non-public sharing settings. Users can open it up to the world if they wish. Their choice.

    Going back to TVs: they should not ship with spyware, log-ware, behavioral tracking, and advertising crap. Users can opt into that stuff if the value proposition being offered appeals to them.

    Etc.

    • > I would say that TV's should ship without any such enhancements enabled.

      I strongly agree with that. The default settings should be… well, “calibrated” is the wrong word here, but that. They shouldn’t be in “stand out among the others on the showroom floor” mode, but set up to show an accurate picture in the average person’s typical viewing environment. Let the owner tweak as they see fit from there. If they want soap opera mode for some bizarre reason, fine, they can enable it once it’s installed. Don’t make the rest of us chase down whatever this particular brand calls it.

Is there a setting to make it stop being orange and blue? Such color grading is an instant tell the show (or video game) is creatively bankrupt trash.

> Duffer’s advice highlights a conflict between technological advances and creators' goals. Features like the ones he mentioned are designed to appeal to casual viewers by making images appear sharper or more colorful, but they alter the original look of the content.

I know I'm pretty unsophisticated when it comes to stuff like art, but I've never been able to appreciate takes like this. If I'm watching something on my own time from the comfort of my home, I don't really care about what the filmmaker thinks if it's different than what I want to see. Maybe he's just trying to speak to the people who do care about seeing his exact vision, but his phrasing is so exaggerated in how negatively he sees these settings that it seems like he genuinely thinks what he's saying applies universally. Honestly, I'd have a pretty similar opinion even for art outside of my home. If someone told me I was looking at the Mona Lisa wrong because it's "not what the artist intended" I'd probably laugh at them. It doesn't really seem like you're doing a good job as an artist if you have to give people instructions on how to look at it.

  • The tone might be a miss, but I enjoy having access to information on the intended experience, for my own curiosity, to better understand the creative process and intentions of the artist, and to have the option to tweak my approach if I feel like I'm missing something other people aren't.

    I hear you, artists (and fans) are frequently overly dogmatic on how their work should be consumed but, well, that strikes me as part-and-parcel of the instinct that drives them to sink hundreds or thousands of hours into developing a niche skill that lets them express an idea by creating something beautiful for the rest of us to enjoy. If they didn't care so much about getting it right, the work would probably be less polished and less compelling, so I'm happy to let them be a bit irritating since they dedicated their life to making something nice for me and the rest of us, even if it was for themselves.

    Up to you whether or not this applies to this or any other particular creator, but it feels appropriate to me for artists to be annoying about how their work should be enjoyed in the same way it's appropriate for programmers to be annoying about how software should be developed and used: everyone's necessarily more passionate and opinionated about their domain and their work, that's why they're better at it than me even if individual opinions aren't universally strictly right!

  • > If someone told me I was looking at the Mona Lisa wrong because it's "not what the artist intended" I'd probably laugh at them.

    That's arguably a thing, due to centuries of aged and yellowed varnish.

    You can watch whatever you want however you want, but it's entirely reasonable for the creator of art to give tips on how to view it the way it was intended. If you'd prefer that it look like a hybrid-cartoon Teletubby episode, then I say go for it.

  • To me it's not about art. It's about this setting making the production quality of a billion dollar movie look like a cardboard SNL set.

    When walking past a high-end TV, I've honestly mistaken a billion dollar movie for a teen weekend project because of this. It's only when I think "hang on, why is Famous Actor in this?" that I realize: oh, this is a Marvel movie?

    To me it's as if people who don't see it are saying "oh, I didn't even realise I'd set the TV to black and white".

    This is not high art. It's... well... the soap opera effect.

    • If films were shot at a decent enough frame rate, people wouldn’t feel the need to try to fix them. And snobs can have a setting that skips every other frame.

      Similar is the case for sound and (to a much lesser extent) contrast.

      Viewers need to be able to see and hear in comfort.
