Comment by pornel

18 hours ago

There are other possible explanations, e.g. AVC and HEVC are set to the same bitrate, so AVC streams lose quality, while AV1 targets HEVC's quality. Or they compare AV1 traffic to the sum of all mixed H.26x traffic. Or the rates vary in more complex ways and that's an (over)simplified summary for the purpose of the post.

Netflix developed VMAF, so they're definitely aware of the complexity of matching quality across codecs and bitrates.

I have no doubt they know what they are doing. But it's a strange metric no matter how you slice it. Why compare AV1's bandwidth to the average of h.264 and h.265, without any further detail about resolution or compression ratio? Reading between the lines, it sounds like they use AV1 for low bandwidth, h.265 for high bandwidth, and h.264 as a fallback. If that is the case, why bring up this strange average bandwidth comparison?
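To make the objection concrete, here is a minimal sketch (with made-up bitrates, not Netflix numbers) of why an "average of the H.26x streams" baseline is so slippery: the claimed AV1 savings swing widely depending on how much of the legacy traffic happens to be h.264 versus h.265.

```python
# Hypothetical 1080p bitrates in kbps at roughly comparable quality.
# The absolute numbers are invented; the point is how the baseline moves.
h264_kbps = 4300
h265_kbps = 3000
av1_kbps = 2400

def baseline(h264_share: float) -> float:
    """Weighted average of the H.26x bitrates for a given traffic split."""
    return h264_share * h264_kbps + (1 - h264_share) * h265_kbps

for share in (0.2, 0.5, 0.8):
    base = baseline(share)
    savings = 1 - av1_kbps / base
    print(f"H.264 share {share:.0%}: baseline {base:.0f} kbps, "
          f"claimed AV1 savings {savings:.0%}")
```

With these toy numbers the "savings" figure ranges from roughly 26% to 41% purely as a function of the traffic mix, before resolution or quality targets even enter the picture.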

  • Yeah it's a weird comparison to be making. It all depends on how they selected the quality (VMAF) target during encoding. You could easily end up with different results had they, say, decided to keep the bandwidth the same but improve quality with AV1 instead (see the sketch below).
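A rough illustration of that point, using toy logarithmic rate-quality curves rather than any real VMAF model, and assuming (purely for the example) that AV1 reaches a given quality at about 70% of the H.26x bitrate:

```python
import math

# Toy rate-quality curves (NOT real VMAF models): quality grows roughly
# logarithmically with bitrate, capped at VMAF 100, and the hypothetical
# AV1 curve hits a given quality at ~70% of the H.26x bitrate.

def vmaf_h26x(kbps: float) -> float:
    return min(100.0, 20 * math.log2(kbps / 100))

def vmaf_av1(kbps: float) -> float:
    return min(100.0, 20 * math.log2(kbps / 70))

h26x_kbps = 2100                 # hypothetical 1080p H.26x rung
target = vmaf_h26x(h26x_kbps)    # ~88 VMAF on the toy curve

# Policy A: hold quality constant and spend fewer bits with AV1.
av1_kbps = 70 * 2 ** (target / 20)   # invert vmaf_av1 for the same target

# Policy B: hold bitrate constant and bank the gain as quality instead.
av1_vmaf = vmaf_av1(h26x_kbps)

print(f"H.26x:             {h26x_kbps} kbps at VMAF {target:.1f}")
print(f"AV1, same VMAF:    {av1_kbps:.0f} kbps "
      f"({1 - av1_kbps / h26x_kbps:.0%} bandwidth saved)")
print(f"AV1, same bitrate: VMAF {av1_vmaf:.1f} (no bandwidth saving to report)")
```

Same codec efficiency either way, but only the constant-quality policy produces a bandwidth-savings headline; the constant-bitrate policy shows up as a quality improvement and no bandwidth number at all.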