Comment by strogonoff
7 months ago
Content delivery costs a lot for streaming services. After content is produced, this is basically the only remaining cost. It’s not surprising that they would go to extreme measures to reduce bitrate.
That’s why, presumably, Netflix came up with the algorithm for removing camera grain and adding synthetically generated noise on the client[0], and why YouTube shorts were recently in the news for using extreme denoising[1]. Noise is random and therefore difficult to compress while preserving its pleasing appearance, so they really like the idea of serving everything denoised as much as possible. (The catch, of course, is that removing noise from live camera footage generally implies compromising the very fine details captured by the camera as a side effect.)
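To make the idea concrete, here is a minimal Python sketch of the concept (all names and numbers here are hypothetical, and real pipelines such as AV1's film grain synthesis are far more sophisticated): estimate the grain strength, strip the grain before encoding, and ship only a small parameter so the player can synthesize statistically similar grain on playback.

```python
import numpy as np

def box_blur(frame):
    """Cheap 3x3 box blur, standing in for a real denoiser."""
    acc = np.zeros_like(frame, dtype=float)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            acc += np.roll(np.roll(frame, dy, axis=0), dx, axis=1)
    return acc / 9.0

def estimate_grain_strength(frame):
    """Rough noise estimate: std-dev of the high-frequency residual."""
    return float(np.std(frame - box_blur(frame)))

# "Server" side: denoise and keep only one tiny parameter.
rng = np.random.default_rng(0)
clean = np.full((256, 256), 128.0)                  # toy frame (a flat grey patch)
noisy = clean + rng.normal(0, 6.0, clean.shape)     # stand-in for camera grain
grain_strength = estimate_grain_strength(noisy)     # the parameter that gets transmitted
denoised = box_blur(noisy)                          # what actually gets encoded

# "Client" side: the player re-synthesizes statistically similar (not identical) grain.
regrained = denoised + rng.normal(0, grain_strength, denoised.shape)
```

The transmitted parameter is a handful of bytes per scene, whereas faithfully encoding the original random grain would cost bits in every single frame.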
So:
1. camera manufacturers and film crews both do their best to produce a noise-free image
2. in post-production, they add fake noise to the image so it looks more "cinematic"
3. to compress better, streaming services try to remove the noise
4. to hide the insane compression and make it look even slightly natural, the decoder/player adds the noise back
Anyone else finding this a bit...insane?
> camera manufacturers and film crews both do their best to produce a noise-free image
This is not correct: camera manufacturers and filmmakers engineer _aesthetically pleasing_ noise (randomized grains appear smoother to the human eye than clean uniform pixels). The rest is still as silly as it sounds.
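To see the effect being described, here is a toy Python sketch (my own illustration, not anything a camera vendor actually does): quantize a smooth gradient to a handful of levels, with and without a little random noise. The noisy version trades visible banding for grain that the eye averages out.

```python
import numpy as np

rng = np.random.default_rng(1)
gradient = np.tile(np.linspace(0.0, 1.0, 512), (64, 1))   # smooth ramp
levels = 8                                                 # deliberately coarse quantization

banded = np.round(gradient * (levels - 1)) / (levels - 1)  # "clean" pixels: visible stair-steps
dither = rng.uniform(-0.5, 0.5, gradient.shape) / (levels - 1)
dithered = np.round((gradient + dither) * (levels - 1)) / (levels - 1)  # randomized grain

# Mean error is comparable, but the dithered error is spatially random,
# so the eye blends it back into a smooth ramp instead of distinct bands.
print(np.abs(banded - gradient).mean(), np.abs(dithered - gradient).mean())
```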
Considering how heavily many camera brands boast about their super-low-noise sensors, I'd still say a very common goal is to have as little noise as possible and then let the director/DP/colorist add grain to their liking. Even something like ARRI's in-camera switchable grain profiles requires a low-noise sensor to begin with.
But yes, there are definitely also many DPs that like their grain baked-in and camera companies that design cameras for that kind of use.
> randomized grains appear smoother to the human eye than clean uniform pixels
Does this explain why I dislike 4K content on a 4K TV? Some series and movies look too realistic, which in turn gives me an amateur-film feeling (like somebody shot a movie with a smartphone).
Yes, just stop doing step 2 the way they're doing it; if they _must_ have noise, modify the parameters for step 4 directly instead.
> 1. camera manufacturers and film crews both do their best to produce a noise-free image
> 2. in post-production, they add fake noise to the image so it looks more "cinematic"
This is patently wrong. The rest builds on this false premise.
1.1: Google some low-light performance reviews of cinema cameras - you'll see that ISO noise is decreasing with every generation and that some cameras (like from Sony) have that as a selling feature.
1.2: Google "how to shoot a night scene" or something like that. You'll find most advice goes something along the lines of "don't crank the ISO up, add artificial lighting to brighten shadows instead". When given a choice, you'll also find cinematographers using cameras with particularly good low-light performance for dark scenes (that's why the dark scenes in Planet Earth were shot on the Sony A7: despite the "unprofessional" form factor, it simply had the best high-ISO performance at the time).
2: Google "film grain effect". You'll find a bunch of colorists explaining why film grain is different from ISO noise and why and how you should add it artificially to your films.
This isn't strictly Netflix per se; it's part of the AV1 codec itself, e.g. https://github.com/BlueSwordM/SVT-AV1/blob/master/Docs/Appen...
Yes, I was not strictly correct: it is a feature of AV1, but Netflix played an active role in its development, in rolling out the first implementation, and in AV1 codec development overall.
Prior codecs all had film grain synthesis too (at least back to H.264), but nobody used it, partly because it was obviously artificial and partly because it was too expensive to render, since you had to copy each frame to apply the effect to it.
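For context on how the AV1 feature works, the rough idea (as I understand it; the sketch below is a loose simplification, not the spec or SVT-AV1's actual code) is to generate a spatially correlated grain template with a small autoregressive filter and scale it by local intensity as the very last step of decoding:

```python
import numpy as np

def make_grain_template(size=64, ar_coef=0.25, seed=7):
    """White noise run through a tiny autoregressive filter, so the grain
    gets some spatial correlation instead of looking like per-pixel static."""
    rng = np.random.default_rng(seed)
    g = rng.normal(0.0, 1.0, (size, size))
    for y in range(1, size):
        for x in range(1, size):
            g[y, x] += ar_coef * (g[y - 1, x] + g[y, x - 1])
    return g / np.abs(g).max()

def apply_grain(frame, template):
    """Tile the template over the decoded frame and scale it with a simple
    intensity-dependent curve (a stand-in for AV1's piecewise-linear scaling)."""
    h, w = frame.shape
    reps = (h // template.shape[0] + 1, w // template.shape[1] + 1)
    grain = np.tile(template, reps)[:h, :w]
    scale = 4.0 + 8.0 * (frame / 255.0) * (1.0 - frame / 255.0)   # most visible in midtones
    return np.clip(frame + scale * grain, 0.0, 255.0)

decoded = np.tile(np.linspace(0, 255, 640), (360, 1))   # toy decoded frame
output = apply_grain(decoded, make_grain_template())
```

Because the grain is applied after decoding, the encoder never spends bits on it; the cost is a per-frame pass on the client, which is what made it expensive in older implementations.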
It feels to me like there are two different things going on:
1. Video codecs favor the denoise, compress, re-add synthetic grain approach because their purpose is to get the perceptually closest video to the original for a given number of bits. I think we should be happy to spend the bits on more perceptually useful information. Certainly I am happy with this.
2. Streaming services want to send as few bytes as they can get away with. So improvements like #1 tend to be spent on decreasing bytes while holding perceived quality constant rather than increasing perceived quality while holding bitrate constant.
I think one should focus on #2 and not be distracted by #1 which I think is largely orthogonal.
For #1, the problem with keeping grain in the compressed video is that it doesn't follow the motion of the scene, so it makes coding future frames much more expensive.
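A quick numerical illustration of why (a toy one-dimensional example with made-up numbers): even with a perfect motion estimate, grain that is regenerated independently in every frame survives motion compensation as a residual the encoder must keep paying for.

```python
import numpy as np

rng = np.random.default_rng(2)
detail = rng.uniform(0, 255, 1000)        # underlying scene detail, identical in both frames
shift = 3                                 # "camera pan" between the two frames
sigma = 5.0                               # per-frame grain strength

frame1 = detail + rng.normal(0, sigma, detail.shape)
frame2 = np.roll(detail, shift) + rng.normal(0, sigma, detail.shape)   # same scene, fresh grain

# Even with a *perfect* motion estimate, the prediction residual is pure grain
# (about sigma * sqrt(2)), and the encoder has to spend bits on it every frame.
prediction = np.roll(frame1, shift)
print("residual std:", (frame2 - prediction).std())   # ~7.1 here; ~0 if there were no grain
```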
I disagree, because 1) complete denoising is simply impossible while preserving fine detail, and 2) noise is a serious artistic choice, just like anamorphic flare, lens FOV with its distortion artifacts, chromatic aberration, etc. Even if it is synthetic film grain added in post, that was somebody’s artful decision; removing it and simulating noise on the client butchers the work.
>Content delivery costs a lot for streaming services.
The hard disk space to store an episode of a show is $0.01. With peering agreements, the bandwidth of sending the show to a user is free.
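Back-of-the-envelope on the first claim (my own assumptions about bitrate and disk prices, not Netflix's numbers):

```python
bitrate_mbps = 5                 # assumed bitrate of a typical HD stream
runtime_s = 60 * 60              # a one-hour episode
size_gb = bitrate_mbps * 1e6 * runtime_s / 8 / 1e9    # ~2.25 GB
hdd_dollars_per_gb = 0.015       # assumed raw hard-disk price, roughly $15/TB
print(size_gb, size_gb * hdd_dollars_per_gb)          # a few cents, before replication/redundancy
```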
>With peering agreements bandwidth of sending the show to a user is free.
I'm not sure why you think this, but it's one of the oddest things I've seen today.
The more streams you can send from a single server the lower your costs are.
Sure, but buying a server is not buying bandwidth. The point of my post is to counter the narrative that streaming video is very expensive.
There might also be requirements from copyright owners, e.g. a contract that limits the quality of the material.
I recall Netflix saying that streaming cost was nothing compared to all other costs.
I am curious to see their breakdown. It seems very counterintuitive for them to invest so much in reducing bitrate if the cost of delivery is negligible. R&D and codec design efforts cost money, and running more optimized codecs and aggressive denoising costs compute.
Improved compression also saves on storage costs, though; we would have to hear them state the same about storage.
It probably is, but the bean counters do not want to hear this; they want to cut everything to the point that it's just above the limit consumers will accept before they throw in the towel and cancel their membership (ads, low-quality compression, etc.).