Comment by omnicognate
19 hours ago
It would help if TV manufacturers would clearly document what these features do, and use consistent names that reflect that.
It seems they want to make these settings usable without specialist knowledge, but the end result of their opaque naming and vague descriptions is that anybody who actually cares about what they see, and thinks they might benefit from some of the features, has to either systematically try every possible combination of options or teach themselves video engineering and work out for themselves what each one actually does.
This isn't unique to TVs. It's amazing really how much effort a company will put into adding a feature to a product only to completely negate any value it might have by assuming any attempt at clearly documenting it, even if buried deep in a manual, will cause their customers' brains to explode.
"Filmmaker mode" is the industry's attempt at this. On supported TVs it's just another picture mode (like vivid or standard), but it disables all the junk the other modes have enabled by default without wading though all the individual settings. I don't know how widely adopted it is though, but my LG OLED from 2020 has it.
The problem with filmmaker mode is I don't trust it more than other modes. It would take no effort at all for a TV maker to start fiddling with "filmmaker mode" to boost colors or something to "get an edge", then everyone does it, and we're back to where we started. I just turn them off and leave it that way. Companies have already proven time and again that they'll make changes we don't like just because they can, so it's important to take every opportunity to deny them even the chance.
"Filmmaker mode" is a trademark of the UHD Alliance, so if TV makers want to deviate from the spec they can't call it "Filmmaker mode" anymore. There's a few different TV makers in the UHD Alliance so there's an incentive for the spec to not have wiggle room that one member could exploit to the determent of the others.
It's true that Filmmaker Mode might at some point in the future be corrupted, but in the actual world of today, if you go to a TV and set it to Filmmaker Mode, it's going to move most things to correct settings, and all things to correct settings on at least some TVs.
(The trickiest thing is actually brightness. LG originally used to set brightness to 100 nits in Filmmaker Mode for SDR, which is correct dark room behavior -- but a lot of people aren't in dark rooms and want brighter screens, so they changed it to be significantly brighter. Defensible, but it now means that if you are in a dark room, you have to look up which brightness level is close to 100 nits.)
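If you do have a meter, finding the control value that lands nearest 100 nits is just interpolation over a handful of measurements. A minimal sketch in Python, with made-up readings standing in for whatever your meter reports (the setting names and numbers are illustrative, not from any real TV):

    # Hypothetical: luminance measured at a few "OLED Light" settings.
    # Replace with your own meter readings.
    readings = [(0, 4.0), (25, 55.0), (50, 130.0), (75, 230.0), (100, 360.0)]
    target_nits = 100.0

    def setting_for_nits(readings, target):
        """Linearly interpolate between the two readings bracketing the target."""
        readings = sorted(readings)
        for (s0, y0), (s1, y1) in zip(readings, readings[1:]):
            if y0 <= target <= y1:
                return s0 + (target - y0) / (y1 - y0) * (s1 - s0)
        raise ValueError("target outside measured range")

    print(round(setting_for_nits(readings, target_nits)))  # -> 40 for these numbers

Panel response between detents isn't truly linear, but over a narrow bracket this gets you close enough to confirm with one more measurement.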
On my Samsung, Film mode has an insane amount of processing. Game Mode is the setting where the display is most true to what's being sent to it.
Not "Film mode", but "Filmmaker mode". The latter is a trademark with specific requirements.
Game mode will indeed likely turn off any expensive latency-introducing processing but it's unlikely to provide the best color accuracy.
Game mode being latency-optimized really is the saving grace in a market segment where the big brands try to keep hardware cost as cheap as possible. Sure, you _could_ have a game mode that does all of the fancy processing closer to real-time, but now you can't use a bargain-basement CPU.
Yup, it's great, at least for live action content. I've found that for anime, a small amount of motion interpolation is absolutely needed on my OLED, otherwise the content has horrible judder.
I always found that weird. Anime relies on motion blur for smoothness when panning or scrolling, and motion interpolation works as an upgraded version of that... until it starts to interpolate the actual animation.
On my LG OLED I think it looks bad. Whites are off and I feel like the colours are squashed. Might be more accurate, but it's bad for me. I prefer to use standard, disable everything and put the white balance on neutral, neither cold nor warm.
I had just recently factory reset my Samsung S90C QD-OLED, and had to work through the annoying process of dialing the settings back to something sane and tasteful. Filmmaker mode only got it part of the way there. The white balance was still set to warm, and inexplicably HDR was set to static (ignoring the content 'hints'), and even then the contrast seemed off; I had to set the dynamic contrast to 'low' (whatever that means) to keep everything from looking overly dark.
It makes me wish that there was something like an industry standard 'calibrated' mode that everyone could target - let all the other garbage features be a divergence from that. Hell, there probably is, but they'd never suggest a consumer use that and not all of their value-add tacky DSP.
The whites in Filmmaker Mode are not off. They'll look warm to you if you're used to the too-blue settings, but they're completely and measurably correct.
I'd suggest living with it for a while; if you do, you'll quickly get used to it, and then going to the "standard" (sic) setting will look too blue.
I'm sure part of it is so that marketing can say that their TV has new putz-tech smooth vibes AI 2.0, but honestly I also see this same thing happen with products aimed at technical people who would benefit from actually knowing what a particular feature or setting really is. Even in my own work on tools aimed at developers, non-technical stakeholders push really hard to dumb down and hide what things really are, believing that makes the tools easier to use, when really it just makes them more confusing for the users.
I don't think you are the target audience of the dumbed-down part; the people paying them for it are. They don't need detailed documentation on those things, so why make it?
> It would help if TV manufacturers would clearly document what these features do, and use consistent names that reflect that.
It would also help if there was a common, universal, perfect "reference TV" to aim for (or multiple such references for different use cases), with the job of the TV being to approximate this reference as closely as possible.
Alas, much like documenting the features, this would turn TVs into commodities, which is what consumers want, but TV vendors very much don't.
"reference TVs" exist, they're what movies/tv shows are mastered on, e.g. https://flandersscientific.com/XMP551/
I wonder if there's a video equivalent to the Yamaha NS-10[1], a studio monitor (audio) that (simplifying) sounds bad enough that audio engineers reckon if they can make the mix sound good on them, they'll sound alright on just about anything.
[1]: https://en.wikipedia.org/wiki/Yamaha_NS-10
$21k for a 55-inch 4K is rough, but this thing must be super delicate because basic US shipping is $500.
(Still cheaper than a Netflix subscription though.)
I disable all video processing features and calibrate my sets. Bought a meter years ago and it’s given me endless value.
Yup - this is the way. Your room color and lighting affect your TV, so proper calibration with a meter is always ideal.
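For anyone curious what "calibrating with a meter" boils down to, the core loop is: measure XYZ off the screen, compare against the target (D65 white, in this case), adjust, repeat. A rough Python sketch using the CIE76 delta-E formula, with an invented reading (real numbers come from your probe's software):

    # Compare a measured white point against D65 using CIE76 delta E.
    D65 = (95.047, 100.0, 108.883)  # reference white, XYZ scaled to Y=100

    def xyz_to_lab(X, Y, Z, white=D65):
        def f(t):
            return t ** (1/3) if t > (6/29) ** 3 else t / (3 * (6/29) ** 2) + 4/29
        fx, fy, fz = (f(v / w) for v, w in zip((X, Y, Z), white))
        return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

    def delta_e76(lab1, lab2):
        return sum((a - b) ** 2 for a, b in zip(lab1, lab2)) ** 0.5

    measured = xyz_to_lab(93.0, 100.0, 115.0)  # hypothetical slightly-blue white
    print(f"dE: {delta_e76(measured, xyz_to_lab(*D65)):.1f}")  # ~5 here; <3 is a common goal

Real calibration software (HCFR, Calman, DisplayCAL) does this across a full grayscale ramp and color sweeps, but the math underneath is no deeper than this.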
These exist, typically made by Panasonic or Sony, and cost upwards of 20k USD. HDTVtest has compared them to the top consumer OLED TVs in the past. Film studios use the reference models for their editing and mastering work.
Sony specifically targets the reference with their final calibration on their top TVs, assuming you are in Cinema or Dolby Vision mode, or whatever they call it this year.
My local hummus factory puts the product destined for Costco into a different sized tub than the one destined for Walmart. Companies want to make it hard for the consumer to compare.
You think the factory decided this?
Costco’s whole thing is selling larger quantities, most times at a lower per unit price than other retailers such as Walmart. Walmart’s wholesale competitor to Costco is Sam’s Club. Also, Costco’s price labels always show the per unit price of the product (as do Walmart’s, in my experience).
There is! That is precisely how TVs work! Specs like BT.2020 and BT.2100 define the color primaries, white point, and how colors and brightness levels should be represented. Other specs define other elements of the signal. SMPTE ST 2080 defines what the mastering environment should be, which is where you get the recommendations for bias lighting.
This is all out there -- but consumers DO NOT want it, because in a back-to-back comparison, they believe they want (as you'll see in other messages in this thread) displays that are over-bright, over-blue, over-saturated, and over-contrasty. And so that's what they get.
But if you want a perfect reference TV, that's what Filmmaker Mode is for, if you've got a TV maker that's even trying.
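To give a sense of how nailed-down these specs are: the chromaticities BT.2020 publishes fully determine the RGB-to-XYZ conversion, with no room for vendor interpretation. A sketch of the standard derivation (the primaries and white point are from the spec; the code itself is just illustrative):

    import numpy as np

    def xy_to_XYZ(x, y):  # chromaticity -> XYZ at unit luminance
        return np.array([x / y, 1.0, (1 - x - y) / y])

    # BT.2020 primaries and D65 white point
    R, G, B = xy_to_XYZ(0.708, 0.292), xy_to_XYZ(0.170, 0.797), xy_to_XYZ(0.131, 0.046)
    white = xy_to_XYZ(0.3127, 0.3290)

    P = np.column_stack([R, G, B])
    scales = np.linalg.solve(P, white)  # channel gains so R=G=B=1 hits D65
    M = P * scales                      # linear RGB -> XYZ

    np.set_printoptions(precision=4, suppress=True)
    print(M)  # first row ~ [0.637, 0.1446, 0.1689]

Any two displays that implement this correctly agree on what every pixel value means; everything else manufacturers pile on top is a deviation from that agreement.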
They will set up their TVs with whatever settings make them sell better than the other TVs in the shop.
I don't particularly like that, but even so, it doesn't preclude having a "standard" or "no enhancement" option, even if it's not the default.
On my TCL TV I can turn off "smart" image and a bunch of other crap, and there's a "standard" image mode. But I'm not convinced that's actually "as close to reference as the panel can get". One reason is that there is noticeable input lag when connected to a pc, whereas if I switch it to "pc", the lag is basically gone, but the image looks different. So I have no idea which is the "standard" one.
Ironically, when I first turned it on, all the "smart" things were off.
Sometimes PC mode reduces image quality (like lowering bit depth) in exchange for lower input lag.
I'm not certain this is true. TVs have become so ludicrously inexpensive that it seems the only criteria consumers shop for are a bigger screen and a lower price.
"Our users are morons who can barely read, let alone read a manual", meet "our users can definitely figure out how to use our app without a manual".
I just went through this learning curve with my new Sony Bravia 8 II.
I also auditioned the LG G5.
I calibrated both of them. It is not that much effort after some research on avsforum.com. I think this task would be fairly trivial for the Hacker News crowd.
Agreed. And I’m not going to flip my TV’s mode every time I watch a new show. I need something that does a good job on average, where I can set it and forget it.
The purpose of the naming is generally to overwhelm consumers and drive long-term repeat buys. You can't remember if your TV has the fitzbuzz, but you're damn sure this fancy new TV in the store looks a hell of a lot better than your current TV, and they're really pushing this fitzbuzz thing.
Cynically, I think it's a bit, just a little, to do with how we handle manuals today.
It wasn't that long ago that the manual spelled everything out in enough detail that a kid could understand it, absorb it, and decide he was going to dive in on his own and end up in the industry. I wouldn't have broken or created nearly as much without it.
But a few things challenged the norm. For many, many reasons, manuals became less about the specification and more about the functionality. Then they became even more simplified, because of the need to translate them into thirty different languages automatically. And smaller still, to discourage people from blaming the company rather than themselves, by never admitting anything in the manual.
What I would do for a return to fault repair guides [0].
[0] https://archive.org/details/olivetti-linea-98-service-manual...
Going a level deeper, more information can be gleaned from how closely modern technology mimics kids' toys that don't require manuals.
A punch card machine certainly requires specs, and would not be confused with a toy.
A server rack, same, but the manuals are pieced out and specific, with details being lost.
You’ll notice anything with dangerous implications naturally wards off tampering near natively.
Desktop and laptop computers sit in between, depending on sharp edges, design language, and whether they use a touch screen. Almost kids' toys; the manual now lives in collective common sense for most.
Tablet, colorful case, basically a toy. Ask how many people using one can write bit-transition diagrams for OR/AND, let alone XOR.
We’ve drifted far away from where we started. Part of me feels like the youth are losing their childhoods earlier and earlier as our technology becomes easier to use. Being cynical of course.
Another factor is the increased importance of the software part of the product, and how it changes via updates that can make a manual outdated. Or at least a printed manual: if they push updates up to product launch, it might not match what a customer gets straight out of the box, or any later production runs where new firmware is included. This would be somewhat mitigated if there were an onus to keep online/downloadable manuals updated alongside the software. I know my motherboard BIOS no longer matches the manual, but even then most descriptions are so simple they do nothing more than list the options with no explanation.
That doesn't preclude clearly documenting what the feature does somewhere in the manual or online. People who either don't care or don't have the mental capacity to understand it won't read it. People who care a lot, such as specialist reviewers or your competitors, will figure it out anyway. I don't see any downside to adding the documentation for the benefit of paying customers who want to make an informed choice about when to use the feature, even in this cynical world view.
That costs money.
Why let consumers educate themselves as easily as possible when it's more profitable to deter that behaviour and keep them confused? Especially when some of the tech is entirely false (iirc, about a decade ago TVs were advertised as '360Hz', which was not related to the actual refresh rate).
I’m with you personally, but the companies that sell TVs are not.
TVs are on their way to free, and are thoroughly enshittified. The consumer is the product, so compliance with consumer preferences is going to plummet. They don't care if you know what you want; you're going to get what they provide.
They want a spy device in your house, recording and sending screenshots and audio clips to their servers, providing hooks into every piece of media you consume, and allowing them a detailed profile of you and your household. By purchasing the device, you're agreeing to waive any and all expectations of privacy.
Your best bet is to get a projector, or spend thousands to get an actual dumb display. TVs are a lost cause - they've discovered how to exploit users and there's no going back.
Worst is graphics settings for games. You need a PhD to understand them.
They just need 3 settings for games, 1) make room hot, 2) make room warm, 3) maintain room temperature.
I use that first setting to keep my basement living room warm in the winter.