Comment by phantasmish
10 hours ago
I dunno, but if there is grain in the source, the encoder may erase it (discarding information) and then invent new grain (noise) later.
I'm skeptical of this (I think they avoid adding to the AV1 stream the grain they add to the other streams--and of course all grain is artificial in modern times), but even if it's true--like, all grain is noise! It's random noise from the sensor. There's nothing magical about it.
The grain’s got randomness because the distribution and size of the grains are random, but it’s not noise; it’s the “resolution limit” (if you will) of the picture itself. The whole picture is grain. The film is grain. Displaying that is accurately displaying the picture. Erasing it for compression’s sake is tossing out information, and adding it back later is just an effect to add noise.
I’m ok with that for things where I don’t care that much about how it looks (do I give a shit if I lose just a little detail on Happy Gilmore? Probably not), and I agree that faking the grain probably gets you a closer look to the original if you’re going to erase the grain for better compression anyway. But if I want actual high quality from a film source, then faked grain is no good: if you’re having to fake it, you’ve definitely already sacrificed a lot of picture quality (because, again, the grain is the picture; you only get rid of it by discarding information from the picture).
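(A toy numpy sketch, not any real codec's pipeline, to make the “grain is the picture” point concrete: denoise a synthetic grainy frame, then add fresh synthetic grain back. The regrained frame ends up with roughly the same grain level as the source, but its error against the source actually grows, because the original grain pattern and the fine detail removed with it are gone.)

```python
import numpy as np

rng = np.random.default_rng(0)

# A made-up "scanned film frame": smooth content plus fine random grain.
h, w = 256, 256
yy, xx = np.mgrid[0:h, 0:w]
clean = 0.5 + 0.3 * np.sin(xx / 20.0) * np.cos(yy / 25.0)
grain = rng.normal(0.0, 0.05, size=(h, w))
source = clean + grain                                   # what the scan delivered

def box_blur(img, k=5):
    # Crude stand-in for the encoder-side denoise step.
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

denoised = box_blur(source)                              # grain (and fine detail) discarded
regrained = denoised + rng.normal(0.0, 0.05, (h, w))     # fresh grain added at "decode" time

rms = lambda a: float(np.sqrt(np.mean(a ** 2)))
print("grain level, source vs regrained: %.3f vs %.3f"
      % ((source - clean).std(), (regrained - clean).std()))     # similar "look"
print("rms error vs source, denoised:  %.3f" % rms(denoised - source))
print("rms error vs source, regrained: %.3f" % rms(regrained - source))  # detail not recovered
```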
If you’re watching something from the 70s, sure. I would hope synthesized grain isn’t being used in this case.
But for anything modern, the film grain was likely added during post-production. So it really is just random noise, and there's no reason it can't be recreated (much more efficiently) on the client side.
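(For reference, this is roughly what AV1 film grain synthesis does: the encoder denoises, estimates grain parameters, and signals them; the decoder regenerates grain from noise and adds it back. Below is a loose numpy sketch of the decoder-side idea only; the real spec builds a 64x64 grain template with per-plane autoregressive coefficients and piecewise-linear scaling, none of which is modelled here.)

```python
import numpy as np

rng = np.random.default_rng(7)

def synth_grain(shape, ar_coeff=0.25, strength=6.0):
    # Start from white noise, run a tiny causal autoregressive filter so the
    # grain gets a characteristic size, then scale it to the signalled strength.
    h, w = shape
    g = rng.normal(0.0, 1.0, size=(h, w))
    for i in range(1, h):
        for j in range(1, w):
            g[i, j] += ar_coeff * (g[i, j - 1] + g[i - 1, j] + g[i - 1, j - 1])
    return strength * g / g.std()

decoded = np.full((64, 64), 128.0)   # stand-in for a clean decoded frame
grainy = np.clip(decoded + synth_grain(decoded.shape), 0, 255)
print("grain std added at the decoder: %.2f" % grainy.std())
```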