Comment by alberth
7 months ago
Will streaming services ever stop over-compressing their content?
I have a top-of-the-line 4K TV and gigabit internet, yet the compression artifacts make everything look like putty.
Honestly, the best picture quality I’ve ever seen was over 20 years ago using simple digital rabbit ears.
You especially notice the compression on gradients and in dark movie scenes.
And yes — my TV is fully calibrated, and I’m paying for the highest-bandwidth streaming tier.
Not my TV, but a visual example: https://www.reddit.com/media?url=https%3A%2F%2Fpreview.redd....
Content delivery costs a lot for streaming services. After content is produced, this is basically the only remaining cost. It’s not surprising that they would go to extreme measures to reduce bitrate.
That’s why, presumably, Netflix came up with the algorithm for removing camera grain and adding synthetically generated noise on the client[0], and why YouTube shorts were recently in the news for using extreme denoising[1]. Noise is random and therefore difficult to compress while preserving its pleasing appearance, so they really like the idea of serving everything denoised as much as possible. (The catch, of course, is that removing noise from live camera footage generally implies compromising the very fine details captured by the camera as a side effect.)
[0] https://news.ycombinator.com/item?id=45022184
So:
1. camera manufacturers and film crews both do their best to produce a noise-free image
2. in post-production, they add fake noise to the image so it looks more "cinematic"
3. to compress better, streaming services try to remove the noise
4. to hide the insane compression and make it look even slightly natural, the decoder/player adds the noise back
Anyone else finding this a bit...insane?
> camera manufacturers and film crews both do their best to produce a noise-free image
This is not correct: camera manufacturers and filmmakers engineer _aesthetically pleasing_ noise (randomized grain appears smoother to the human eye than clean, uniform pixels). The rest is still as silly as it sounds.
Yes, just stop doing step 2 the way they're doing it, and instead, if they _must_ have noise, modify the parameters for step 4 directly.
> 1. camera manufacturers and film crews both do their best to produce a noise-free image
> 2. in post-production, they add fake noise to the image so it looks more "cinematic"
This is patently wrong. The rest builds on this false premise.
This isn't strictly Netflix per se; it's part of the AV1 codec itself, e.g. https://github.com/BlueSwordM/SVT-AV1/blob/master/Docs/Appen...
Yes, I was not strictly correct: it is a feature of AV1, but Netflix played an active role in its development, in rolling out the first implementation, and in AV1 codec development overall.
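For the curious, the grain-synthesis path discussed above is exposed as ordinary encoder settings. A minimal sketch using ffmpeg with libsvtav1 (assumes an ffmpeg build with SVT-AV1 support; the file names and strength value are only illustrative):

    import subprocess

    # Encode with AV1 film grain synthesis: the encoder estimates a grain
    # model from the source, strips the actual noise from the coded frames,
    # and the decoder re-synthesizes grain from the model on playback.
    subprocess.run([
        "ffmpeg", "-i", "source.mkv",
        "-c:v", "libsvtav1", "-crf", "30",
        # film-grain: grain model strength (0 = off, higher = stronger)
        # film-grain-denoise: remove the estimated noise before encoding
        "-svtav1-params", "film-grain=8:film-grain-denoise=1",
        "out.mkv",
    ], check=True)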
It feels to me like there are two different things going on:
1. Video codecs like the denoise / compress / synthetic-grain approach because their purpose is to get the perceptually closest video to the original with a given number of bits. I think we should be happy to spend the bits on more perceptually useful information. Certainly I am happy with this.
2. Streaming services want to send as few bytes as they can get away with. So improvements like #1 tend to be spent on decreasing bytes while holding perceived quality constant rather than increasing perceived quality while holding bitrate constant.
I think one should focus on #2 and not be distracted by #1, which I think is largely orthogonal.
For #1, the problem with keeping grain in the compressed video is that it doesn't follow the motion of the scene, so it makes future frames much more expensive to code.
I disagree, because 1) complete denoising is simply impossible while preserving fine detail and 2) noise is a serious artistic choice—just like anamorphic flare, lens FOV with any distortion artifacts, chromatic aberration, etc. Even if it is synthetic film grain that is added in post, that has been somebody’s artful decision; removing it and simulating noise on the client butchers the work.
>Content delivery costs a lot for streaming services.
The hard disk space to store an episode of a show is $0.01. With peering agreements, the bandwidth of sending the show to a user is free.
>With peering agreements bandwidth of sending the show to a user is free.
I'm not sure why you think this, but it's one of the oddest things I've seen today.
The more streams you can send from a single server the lower your costs are.
There might also be copyright owners' requirements, e.g. a contract that limits the quality of the material.
I recall Netflix saying that streaming cost was nothing compared to all other costs.
I am curious to see their breakdown. It seems very counter-intuitive of them to invest so much into reducing bitrate if the cost of delivery is negligible. R&D and codec design efforts cost money, and running more optimized codecs and aggressive denoising costs compute.
Improved compression also saves on storage costs. We would have to hear them state the same about storage.
It probably is, but the bean counters do not want to hear this. They want to cut everything to the point that it's just above the limit consumers will accept before they throw in the towel and cancel their membership (ads, low-quality compression, etc.).
> You especially notice the compression on gradients and in dark movie scenes.
That's not a correctly calibrated TV. The contrast is tuned WAY up. People do that to see what's going on in the dark, but you aren't meant to really be able to see those colors. That's why it's a big dark blob. It's supposed to be barely visible on a well calibrated display.
A lot of video codecs will erase details in dark scenes because those details aren't supposed to be visible. Now, I will say that streaming services are tuning that too aggressively. But I'll also say that a lot of people have miscalibrated displays. People simply like to be able to make out every detail in the dark. Those two things come into conflict with one another, causing the effect you see above.
> but you aren't meant to really be able to see those colors
Someone needs to tell filmmakers. They shoot dark scenes because they can - https://www.youtube.com/watch?v=Qehsk_-Bjq4 - and it ends up looking like shit after compression that assumes normal lighting levels.
> Someone needs to tell filmmakers. They shoot dark scenes because they can…
I disagree completely. I watch a movie for the filmmaker's story; I don't watch movies to marvel at compression algorithms.
It would be ridiculous to watch movies shot with only bright scenes because streaming-service accountants won't stop abusing compression to save some pennies.
> …ends up looking like shit after compression that assumes normal lighting levels.
It's entirely normal to have dark scenes in movies. Streaming services are failing if they're using compression settings untuned for dark scenes when soooo many movies and series are absolutely full of night shots.
As I said, I think the streaming services have settings there that are too aggressive. But that doesn't change the fact that a lot of people have their contrast settings overtuned.
It should be noted, as well, that this generally isn't a "not enough bits" problem. There are literally codec settings to tune that decide when to start smearing the darkness. On a few codecs (such as VP1) those values are pretty badly set by default. I suspect streaming services aren't far off from those defaults. The codec settings are instead prioritizing putting bits into the lit parts of a scene rather than sparing a few for the darkness like you might like.
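For context, these are the kinds of knobs involved. A hedged sketch with x264 via ffmpeg (the values are purely illustrative, not anything a streaming service is known to use):

    import subprocess

    # Shift more bits into dark/flat regions instead of smearing them.
    # aq-mode=3 is x264's auto-variance AQ with a bias toward dark scenes;
    # aq-strength above 1.0 pushes additional bits into flat/dark areas.
    subprocess.run([
        "ffmpeg", "-i", "source.mkv",
        "-c:v", "libx264", "-crf", "18",
        "-x264-params", "aq-mode=3:aq-strength=1.2",
        "out.mkv",
    ], check=True)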
Video codecs aren't tuned for any particular TV calibration. They probably should be, because it is easier to spot single-bit differences in dark scenes, where the relative error is so high.
The issue is just that we don't code video with nearly enough bits. It's actually less than 8 bits, since it only uses 16-235.
If an eye is able to distinguish all 256 shades on a correctly calibrated display, then the content should be preserved.
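A back-of-the-envelope illustration of why dark gradients band so easily in 8-bit limited-range video (the 5% figure is just an example):

    # 8-bit "video range" luma only uses code values 16-235,
    # i.e. 220 usable values instead of 256.
    levels_limited = 235 - 16 + 1      # 220

    # A slow fade confined to the darkest ~5% of the signal has only about
    # a dozen distinct code values to work with, so each step shows up as
    # a visible band on a large smooth surface.
    dark_fraction = 0.05
    print(round(levels_limited * dark_fraction))   # ~11 steps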
>Will streaming services ever stop over-compressing their content?
Before COVID, Netflix was using at least 8 Mbps for 1080p content. With x264 / Beamr that is pretty good, and even better with HEVC. Then COVID hit, and every streaming service, not just Netflix, had an excuse to lower quality due to increased demand on limited bandwidth. Everything has gone downhill since then. Customers got used to the lower quality, and I don't believe they will ever bring it back up. Now it is only something like 3-5 Mbps, according to a previous test posted on HN.
And while it is easy for HEVC / AV1 / AV2 to achieve 50%+ real-world bitrate savings compared to H.264 in the 0.5-4 Mbps range, once you go past that the savings begin to shrink rapidly, to the point that the good old x264 encoder may perform better at much higher bitrates.
Most 1080p WEB-DLs are in the 6-8 Mbps range still, based on a quick glance.
Nice. There was a previous post on HN showing that the 5 or 6 samples he had were in the 3-5 Mbps range.
Netflix also has a huge incentive not to use H.265 and H.264: licensing costs.
Seems like piracy is the way for you
Sadly, many shows aren't released on Blu-ray anymore, so even piracy won't deliver better quality.
Piracy enables you to do things like debanding on playback, or more advanced video filtering to remove other compression issues.
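As a concrete example of playback-side filtering, mpv ships a deband filter you can toggle per file. A sketch of invoking it (the option values are illustrative starting points, not recommendations, and the file name is a placeholder):

    import subprocess

    # Play a local rip with mpv's built-in debanding enabled.
    subprocess.run([
        "mpv",
        "--deband=yes",            # enable the deband shader
        "--deband-iterations=2",   # more passes = stronger smoothing
        "--deband-threshold=48",   # how aggressively bands are detected
        "--deband-grain=32",       # dither/grain added to mask the bands
        "movie.mkv",
    ])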
I believe many sites still prefer Amazon webrips because their content is encoded at a higher bitrate than Netflix.
darn pirates should run the content through a super resolution model!
Not all video streaming services choose to use the same extremely low average video bit rate used by Netflix on some of their 4k shows.
Kate - Netflix - 11.15 Mbps
Andor - Disney - 15.03 Mbps
Jack Ryan - Amazon - 15.02 Mbps
The Last of Us - Max - 19.96 Mbps
For All Mankind - Apple - 25.12 Mbps
https://hd-report.com/streaming-bitrates-of-popular-movies-s...
Netflix has shown they're the mattress-company equivalent of streaming services.
You will be made to feel the springs on the cheapest plan/mattress, and it's on purpose so you'll pay them more for something that costs them almost nothing.
Are you sure about the black-areas blocking? I remember a long time ago, when I was younger and had time for this kind of tomfoolery, I noticed this exact issue in my Blu-ray backups. I figured I needed to up the bitrate, so I started testing, upping the bitrate over and over. Finally, I played the Blu-ray and it was still there. This was an old-school, dual-layer, 100GB disc of one of the Harry Potter movies. Still saw the blocking in very dark gradients.
The downside of 8 bits per channel is you really don't have enough to get a smooth gradient over dark colors.
Yup. I think that’s exactly it.
I’m still so surprised Disney+ degrades their content/streaming service so much. Of all the main services I’ve tried (Netflix, Prime, Hulu, HBO) Disney+ has some of the worst over-compression, lip-sync, and remembering-which-episode-is-next issues for me. Takes away from the “magic”.
Check your settings. I experienced the same until I altered an Apple TV setting, which fixed Disney+. If I recall, the setting was Match Content or Match Dynamic Range (not near the TV right now to confirm the exact name).
Netflix now does this on their lowest paid tier as well. I had to upgrade to the 4K tier just to get somewhat-OK 1080p playback...
This is interesting, because when Disney+ started out they were using a much higher bitrate, second only to Apple TV+.
Economically speaking, it doesn't make any sense for them to spend more on bandwidth and storage if they can get away with not spending more.
I don't quite follow why compression would cause this. It feels more like a side effect of an adaptive HTTPS streaming protocol that automatically adjusts based on your connection speed, and so aligns with any jitter on the wire. It could also be an issue with the software implementation, because they need to constantly switch between streams based on bandwidth.
> side effect of adaptive HTTPS streaming
Adaptive streaming isn't really adaptive anymore. If you have any kind of modern broadband, the most adaptive it will be is starting off in one of the lower bitrates for the first 6 seconds before jumping to the top, where it will stay for the duration of the stream. A lot of clients don't even bother with that anymore; they look at the manifest, find the highest stream, and just start there.
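A toy illustration of that client behavior, using a made-up HLS master playlist (the bitrate ladder and variant names are invented for the example):

    # A made-up HLS master playlist: a ladder of variants at different
    # bitrates that an adaptive client can choose from.
    manifest_lines = [
        "#EXTM3U",
        "#EXT-X-STREAM-INF:BANDWIDTH=1200000,RESOLUTION=842x480",
        "480p.m3u8",
        "#EXT-X-STREAM-INF:BANDWIDTH=3500000,RESOLUTION=1280x720",
        "720p.m3u8",
        "#EXT-X-STREAM-INF:BANDWIDTH=8000000,RESOLUTION=1920x1080",
        "1080p.m3u8",
    ]

    # "Look at the manifest, find the highest stream, and just start there."
    variants = []
    for i, line in enumerate(manifest_lines):
        if line.startswith("#EXT-X-STREAM-INF"):
            attrs = dict(a.split("=", 1) for a in line.split(":", 1)[1].split(","))
            variants.append((int(attrs["BANDWIDTH"]), manifest_lines[i + 1]))

    bandwidth, uri = max(variants)
    print(bandwidth, uri)   # 8000000 1080p.m3u8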
As a little experiment, I'd like you to set up your own little streaming service on a server and see how much bandwidth it uses, even for just a few users. It adds up extremely quickly, and the actual usage is quite surprising.
At the higher prices, I'd have to agree with you. If you pay for the best you should get the best.
> You especially notice the compression on gradients and in dark movie scenes.
That can happen at even the highest bitrates if "HDR" is not enabled in the video codec.
Related video: https://www.youtube.com/watch?v=h9j89L8eQQk
Whoa. That is the best thing i watched on YouTube in a long time. Thank you.
I pirate Blu-ray rips. Pirates are very fastidious about maintaining visual quality in their encodings. I often see them arguing over artifacts that I absolutely cannot see with my eyes.
>the best picture quality I’ve ever seen was over 20 years ago using simple digital rabbit ears.
The biggest jump in quality was when everything was still analog over the air, but getting ready for the digital transition.
Then digital over the air bumped it up a notch.
You could really see this happen on a big CRT monitor with the "All-in-Wonder" television receiver PCI graphics adapter card.
You plugged in your outdoor antenna or indoor rabbit ears to the back of the PC, then tuned in the channels using software.
These were made by ATI before it was acquired by AMD; the TV tuner was in a Faraday cage right on the same PCB as the early GPU.
The raw analog signal was upscaled to your adapter's resolution setting before going to the CRT so you had pseudo better resolution than a good TV like a Trinitron. You really could see more details and the CRT was smooth as butter.
As the TV broadcasters' entire equipment chain was replaced (camera lenses, digital sensors, signal processing), they eventually had everything in place and working. You could notice these incremental upgrades until a complete digital chain was established as designed. It was really jaw-dropping. This was well in advance of the deadline for digital deployment, so the signal over the air was still coming in analog the same old way.
Eventually the broadcast signal switched to digital and the analog lights went out; plus, the All-in-Wonder was not ideal with a cheap converter, the way analog TVs could get by with one.
But it was still better than most digital TVs for a few years, and then it took years more before you could see the ball in live sports as well as you could on a CRT anyway.
Now that's about all you've got for full digital resolution: live broadcasts from your local stations, especially live sports from a strong, interference-free station over an antenna. You can switch between the antenna and cable and tell the difference when they're both not overly compressed.
The only thing was, digital engineers "forgot" that TV was based on radio (who knew?). So for the vast majority of "listeners" in fringe reception areas who could get clear audio but usually not a clear picture, if any: too bad for you. You're gonna need a bigger antenna, good enough to have gotten you a clear picture during the analog days. Otherwise your "clean" digital audio may silently appear on the screen as video, "hidden" within sparse blocks of scattered random digital noise, when anything appears at all.
For the super affluent: https://www.kaleidescape.com/compare/
Funny that they're marketing the supposed advantages of higher bitrates using pictures with altered contrast and saturation lol. I would expect the target audience to be somewhat fluent in the actual benefits? Then again, I wouldn't expect somebody like Scorsese to be a video compression nerd.
Also the whole "you can hear more with lossless audio" is just straight up a lie.
This has been more or less proven to be a complete scam; the quality isn't any better than Blu-ray, and in many cases it's worse.
The "best" streaming quality you can get is Sony Pictures Core (https://en.wikipedia.org/wiki/Sony_Pictures_Core), but it has a rather limited library.
"Not any better than Blu-ray" is the same as saying "much better than streaming."
I think there are a few examples where the bitrate is higher than a native rip however.
Fascinating.
Pricing, if I am reading the site correctly: $7k-ish for a server (+$ for local disks, one assumes), $2-5k per client. So you download the movie locally to your server and play it on clients scattered throughout your mansion/property.
Not out of this world for people who drop tens of thousands on home theater.
I wonder if that's what the Elysium types use in their NZ bunkers.
No true self-respecting, self-described techie (Scotsman) would use it instead of building their own of course.
For the less affluent, you can set up a Jellyfin media server and rip your own Blu-rays with MakeMKV.
It's a little surprising to me that there generally aren't more subscription tiers where you can pay more for higher quality. Seems like free money, from people like you (maybe) and me.
You can already pay for 4K or "enhanced bitrate" but it's still relatively low bitrate and what's worse, this service quality is not guaranteed. I've had Apple TV+ downgrade to 1080p and lower on a wired gigabit connection so many times.
And on top of that a lot of streaming services don't go above 1080p on desktop, and even getting them to that point is a mess of DRM. I sometimes wonder if this is the YouTube powerhouse casting a bad shadow. As LTT says, don't try to compete with YouTube. They serve so much video bandwidth it's impossible to attempt. So all these kinda startup streaming services can't do 4k. Too much bandwidth.
I'm not surprised they don't offer an even higher tier. When you're pricing things, you often need to use proxies - like 1080p and 4K. It'd be hard to offer 3 pricing tiers: 1080p, 4K, 4K but actually good 4K that we don't compress to hell. That third tier makes it seem like you're being a bit fraudulent with the second tier. You're essentially admitting that you've created a fake-4K tier to take people's money without delivering them the product they think they're buying. At some point, a class-action lawsuit would use that as a sort of admission that you knew you weren't giving customers what they were paying for and that it was being done intentionally, both of which matter a lot.
Right now, Netflix can say stuff like "we think the 4K video we're serving is just as good." If they offer a real-4K tier, it's hard to make that argument.
YouTube does 1080p premium without much problem.
Ironically, piracy gives you yet again a better service. Thanks QxR.
Well, you'll be happy to learn that AV2 delivers 30% better quality for the same bitrate!
Isn't Sony Bravia Core supposed to be UHD Blu-ray quality?
And this is why I don't look down on those who choose to pirate Blu-ray/4K content.
4K Blu-ray is the top quality.
> Honestly, the best picture quality I’ve ever seen was over 20 years ago using simple digital rabbit ears.
That I find super hard to believe!
Why? ATSC is 19Mbps. A single 1080i video using that whole bitrate will look quite good.
Many confounding factors. That's just one dimension of image quality. Others include things like panel quality and production quality.
That's 19 Mbps including error correction; only about 10 after, and that's using MPEG-2, which is probably roughly equivalent to 6-7 Mbps of AV1.