Comment by crazygringo

18 hours ago

Wow. To me, the big news here is that ~30% of devices now support AV1 hardware decoding. The article lists a bunch of examples of devices that have gained it in the past few years. I had no idea it was getting that popular -- fantastic news!

So now that h.264, h.265, and AV1 seem to be the three major codecs with hardware support, I wonder what will be the next one?

> To me, the big news here is that ~30% of devices now support AV1 hardware decoding

Where did it say that?

> AV1 powers approximately 30% of all Netflix viewing

That's admittedly a bit non-specific: it could be interpreted as 30% of users or 30% of hours of video streamed, which are very different metrics. If 5% of your users are using AV1 but that 5% watches far more than the average, you can have a minority userbase with an outsized share of hours viewed.

I'm not saying that's the case, just giving an example of how it doesn't necessarily translate to 30% of devices using Netflix supporting AV1.
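
To make that concrete, here's a toy calculation (the 5% user share and the 8x watch-time multiplier are made-up numbers, not anything from the post):

    // Made-up numbers: 5% of users on AV1, each watching 8x the average.
    const av1UserShare = 0.05;
    const av1WatchMultiplier = 8.0;

    const av1Hours = av1UserShare * av1WatchMultiplier;           // 0.40
    const otherHours = (1 - av1UserShare) * 1.0;                  // 0.95
    const av1ShareOfViewing = av1Hours / (av1Hours + otherHours);

    console.log((av1ShareOfViewing * 100).toFixed(0) + "%");      // "30%" of viewing, from 5% of users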

Also, the blog post identifies that there is an effective/efficient software decoder, which allows people without hardware acceleration to still view AV1 media in some cases (the case they defined was Android-based phones). So that kinda complicates any "X% of devices support AV1 playback" figure, since it doesn't necessarily mean those devices have hardware decoding.

  • That was one of the best decisions of AOMedia.

    AV1 was specifically designed to be friendly to hardware decoders, and that decision also makes it friendly to software decoding. This happened because AOMedia got hardware manufacturers on the board pretty early on and took their feedback seriously.

    VP8/9 took a long time to get decent hardware decoding, and part of the reason was that the stream was more complex than the AV1 stream.

    • Hmmm, disagree on your chain there. Plenty of algorithms that are easy in hardware are hard in software. For example, in hardware (including FPGAs), bit movement/shuffling is borderline trivial if the pattern is constant, while in software you have to shift and mask and OR over and over (see the sketch below). In hardware you literally just switch which wire is connected to what on the next stage. Same for weird bit widths: hardware doesn't care (too much) if you're operating on 9-bit or 33-bit or 65-bit quantities, while software isn't that granular and often you'll double your storage and waste a bunch.

      I think they certainly go hand in hand, in that an algorithm that's easier for software than its predecessor tends to be easier for hardware too, and vice versa, but the two are good at different things.
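
      A tiny sketch of that cost in software, using a constant 8-bit reversal as the shuffle (in hardware this permutation is literally just wire routing between stages):

          // Reverse the 8 bits of x. Each output bit costs a shift,
          // a mask, and an OR in software; in hardware it's free wiring.
          function reverseBits8(x: number): number {
            let out = 0;
            for (let i = 0; i < 8; i++) {
              const bit = (x >> i) & 1; // isolate input bit i
              out |= bit << (7 - i);    // place it at the mirrored position
            }
            return out;
          }

          console.log(reverseBits8(0b11010000).toString(2)); // -> "1011", i.e. 0b00001011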

      1 reply →

    • All I've read is that it's less hardware-friendly than H.264 and HEVC, and they were all complaining about it. AV2 should be better in this regard.

      Where did you read that it was designed to make creating a hardware decoder easier?

      1 reply →

  • “30% of viewing” I think clearly means either time played or items played. I’ve never worked with a data team that would possibly write that and mean users.

    If it was a stat about users they’d say “of users”, “of members”, “of active watchers”, or similar. If they wanted to be ambiguous they’d say “has reached 30% adoption” or something.

    • I'm not in data science so I can't validate your comment, but I would assume "30% of viewing" means users or unique/discrete viewing sessions, not watched minutes. I would appreciate it if Netflix would clarify.

    • Agreed, but this is the internet, the ultimate domain of pedantry, and they didn't say it explicitly. So I'm not going to put words in their mouth just to have a circular discussion about why I'm claiming they said something they didn't technically say; that's why I asked "Where did it say that?" at the very top.

      Also, either way, my point was and still stands: it doesn't say 30% of devices have hardware decoding.

> So now that h.264, h.265, and AV1 seem to be the three major codecs with hardware support, I wonder what will be the next one?

Hopefully AV2.

  • H266/VVC has a five-year head start over AV2, so probably that first, unless hardware vendors decide to skip it entirely. The final AV2 spec is due this year, so any day now, but it'll take a while to make its way into hardware.

    • H266 is getting fully skipped (except possibly by Apple). The licensing is even worse than H265, the gains are smaller, and Google+Netflix have basically guaranteed that they won't use it (in favor of AV1 and AV2 when ready).

      6 replies →

    • VVC is pretty much a dead end at this point. Hardly anyone is using it; its benefits over AV1 are extremely minimal and no one wants the royalty headache. Basically everyone learned their lesson with HEVC.

      1 reply →

    • If it has a five-year head start and we've seen almost zero hardware shipping, that's a pretty bad sign.

      IIRC AV1 decoding hardware started shipping within a year of the bitstream being finalized. (Encoding took quite a bit longer, but that's pretty reasonable.)

      1 reply →

    • When even H.265 is being dropped by the likes of Dell, adoption of H.266 will be even worse, making it basically DOA for anything promising. It's plagued by the same problems H.265 is.

      3 replies →

> So now that h.264, h.265, and AV1 seem to be the three major codecs with hardware support, I wonder what will be the next one?

Hopefully, we can just stay on AV1 for a long while. I don't feel any need to obsolete all the hardware that's now finally getting hardware decoding support for AV1.

That's not at all how I read it.

They mentioned they delivered a software decoder on Android first, then also targeted web browsers (presumably through wasm). So out of that 30%, a good chunk is software, not hardware.
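
For what it's worth, a web player can even ask whether AV1 decode would be power-efficient, which is a rough (not guaranteed) proxy for hardware decoding. A sketch using the standard Media Capabilities API; the codec string is just a typical AV1 Main-profile example, not anything Netflix-specific:

    // Runs in a browser. `powerEfficient` is a hint that decoding
    // won't burn much power, which in practice usually means hardware.
    const info = await navigator.mediaCapabilities.decodingInfo({
      type: "media-source",
      video: {
        contentType: 'video/mp4; codecs="av01.0.05M.08"',
        width: 1920,
        height: 1080,
        bitrate: 2_000_000,
        framerate: 24,
      },
    });
    console.log(info.supported, info.smooth, info.powerEfficient);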

That being said, it's a pretty compelling argument for phone and TV manufacturers to get their act together, as Apple has already done.

I'm not too surprised. It's similar to the metric that "XX% of the Internet is on IPv6" -- it's almost entirely driven by mobile devices, specifically phones. As soon as both mainstream Android phones and iPhones support it, adoption of AV1 should be very 'easy'.

(And yes, even for something like Netflix lots of people consume it with phones.)

Not trolling, but I'd bet something that's augmented with generative AI. Not to the level of describing scenes with words, but context-aware interpolation.

  • I don't want my video decoder inventing details that aren't there. I'd much rather have obvious compression artifacts than a codec whose "compression artifacts" look like perfectly realistic, high-quality hallucinated details.

    • For many textures (grass, sand, hair, skin, etc.) it makes little difference whether the high-frequency details are reproduced exactly or hallucinated. E.g. it doesn't matter whether the 1262nd blade of grass from the left is bending to the left or to the right.

      10 replies →

> So now that h.264, h.265, and AV1 seem to be the three major codecs with hardware support

That'd be h264 (the associated patents have expired in most of the world), vp9, and av1.

h265 aka HEVC is less common due to dodgy, abusive licensing. Some vendors even disable it in drivers despite hardware support, because it's nothing but legal trouble.

I mean... I bought a Samsung TV in 2020, and it already supported AV1 HW decoding.

2020 feels close, but that's 5 years.

  • Is that supposed to be long-lived for a TV?

    I'm running an LG initially released in 2013, and the only thing I'm not happy with is that about a year ago Netflix ended their app for that hardware generation (likely to phase out whatever codec it used). Now I'm running that unit behind an Amazon Fire Stick and the user experience is so much worse.

    (That LG was a "smart" TV from before they started enshittifying, such a delight. I had to use and set up a recent LG once on a family visit and it was even worse than the Fire Stick, omg, so much worse!)

  • Two years ago I bought a Snapdragon 8+ Gen 1 phone (TSMC 4nm, with 12 GB LPDDR RAM, 256 GB NAND flash, and a 200 megapixel camera). It still feels pretty modern but it has no AV1 support.