AV1 is Supposed To Make Streaming Better, So Why Isn't Everyone Using It? (theverge.com) 44

Despite promises of more efficient streaming, the AV1 video codec hasn't achieved widespread adoption seven years after its 2018 debut, even with backing from tech giants Netflix, Microsoft, Google, Amazon, and Meta. The Alliance for Open Media (AOMedia) claims AV1 is 30% more efficient than standards like HEVC, delivering higher-quality video at lower bandwidth while remaining royalty-free.

Major services including YouTube, Netflix, and Amazon Prime Video have embraced the technology, with Netflix encoding approximately 95% of its content using AV1. However, adoption faces significant hurdles. Many streaming platforms including Max, Peacock, and Paramount Plus haven't implemented AV1, partly due to hardware limitations. Devices require specific decoders to properly support AV1, though recent products from Apple, Nvidia, AMD, and Intel have begun including them. "In order to get its best features, you have to accept a much higher encoding complexity," Larry Pearlstein, associate professor at the College of New Jersey, told The Verge. "But there is also higher decoding complexity, and that is on the consumer end."
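AOMedia's efficiency claim translates directly into bandwidth arithmetic. A quick back-of-the-envelope sketch in Python (all bitrates and viewing figures below are hypothetical illustrations, not from the article):

```python
# Back-of-the-envelope: bandwidth saved by a codec that is 30% more efficient.
# The bitrate and hours-watched figures are hypothetical, for illustration only.

hevc_bitrate_mbps = 8.0            # assumed 4K HEVC streaming bitrate
efficiency_gain = 0.30             # AOMedia's claimed AV1 advantage over HEVC
av1_bitrate_mbps = hevc_bitrate_mbps * (1 - efficiency_gain)

hours_watched = 2.0                # assumed viewing per subscriber per day

def gb_per_hour(mbps: float) -> float:
    """Convert a stream bitrate in Mbit/s to data volume in GB per hour."""
    return mbps * 3600 / 8 / 1000

saved_gb_per_viewer_per_day = hours_watched * (
    gb_per_hour(hevc_bitrate_mbps) - gb_per_hour(av1_bitrate_mbps)
)
print(f"AV1 bitrate: {av1_bitrate_mbps:.1f} Mbps")
print(f"Saved per viewer per day: {saved_gb_per_viewer_per_day:.2f} GB")
```

Multiplied across millions of subscribers, even a couple of GB per viewer per day explains why the biggest services care.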

  • Clearly the other streaming services just don't care. Bandwidth is too low on their cost matrix for even a significant savings to matter to them. They just don't want to bother with the complexity of setting up another encoding system. It's not like they have to switch; they just need to encode shows in both, and then stream to customers with hardware that supports the new codec. If they just did this for their top hits, they could save a ton of bandwidth with minimal effort.

    • Apple just added AV1 hardware support in 2023, so that's a thing.
    • Re:They Don't Care (Score:5, Insightful)

      by Malc ( 1751 ) on Thursday April 03, 2025 @12:06PM (#65278793)

      No, it's lack of hardware support. I called in to an AOM webinar end of October entitled "Is Real-Time AV1 Ready For Prime Time?". Meta, Google and Microsoft for instance all complained that there isn't enough hardware support, e.g. only in higher-end phones or some GPUs. Agora didn't mind so much because most of their users were desktop browsers and they could cope with CPU decoding.

      There's also the cost factor. A streaming service isn't going to switch off AVC or HEVC streams just because they've started to use AV1. They want to support the long tail of users who can't decode AV1, so that means adopting a new codec increases their bandwidth, storage and CDN costs, as well as making their encoding and packaging pipelines more complex and expensive. Furthermore, many vendors that have already switched/added from AVC to HEVC aren't going to switch to AV1 because the savings aren't enough and are more likely to wait for something better such as VVC.
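That cost point can be sketched numerically: adding AV1 alongside the existing AVC/HEVC ladders grows storage rather than replacing anything, because the old ladders stay for the long tail. A toy illustration with hypothetical per-title figures:

```python
# Sketch of the cost point above: adopting AV1 adds a rendition ladder,
# it doesn't remove one. Per-title storage numbers are hypothetical.

ladders_gb = {"AVC": 40.0, "HEVC": 25.0}   # existing renditions per title
av1_ladder_gb = 18.0                        # assumed size of an AV1 ladder

before = sum(ladders_gb.values())
after = before + av1_ladder_gb              # old ladders kept for legacy devices
growth = (after - before) / before

print(f"Storage per title: {before:.0f} GB -> {after:.0f} GB (+{growth:.0%})")
```

The same multiplier hits CDN cache footprint and packaging pipelines, which is the "more complex and expensive" part of the comment above.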

      Complaining that usage is not widespread enough seems to be common and it shows a lack of understanding of codec adoption. HEVC usage is still growing and it was standardised in 2013. AV1 needs a few more years.

      Give it more time.

    • Re:They Don't Care (Score:4, Informative)

      by Kisai ( 213879 ) on Thursday April 03, 2025 @07:23PM (#65279783)

      Nah. Three issues:
      - AV1 playback hardware is not even across the board, particularly in mobile phones and smart TVs. The iPhone 15 Pro (2023) is the first iPhone that supports it. No Android device guarantees hardware support for it, and considering how utterly trash Android hardware is, the experience is not uniform. The Pixel 6 and Samsung S21 (both 2021 devices) were among the earliest Android phones with hardware decode.
      - AV1 encoding support only exists in cards most people do not have (RTX 40xx/50xx, Intel Arc series).
      - AV1 Playback iGPU support is only available in Intel 11th gen+

      So the content that really needs the codec (e.g. 4K HDR, 1080p60+) can already be served by fixed-function hardware that most of this stuff already has, e.g. HEVC (h.265), which has already been paid for.

      Now that said, for Google, Netflix, Amazon, Disney+ etc., you aren't going to encode the same show four times (h264, h265, AV1, and obviously whatever the media was made in); you're going to encode the show once at a high bitrate and convert that master to the other codecs, in HDR and non-HDR modes.
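The encode-once workflow described above can be sketched as follows. This builds (but does not run) ffmpeg command lines from a single hypothetical high-bitrate master; libx264, libx265 and libsvtav1 are stock ffmpeg software encoders, while the filename and CRF values are arbitrary placeholders:

```python
# Sketch of an "encode once, transcode many" pipeline: one mezzanine master
# is transcoded to each delivery codec. Commands are built, not executed.

MEZZANINE = "show_master.mov"  # hypothetical high-bitrate master file

RENDITIONS = {
    "h264": ["-c:v", "libx264", "-crf", "21"],
    "h265": ["-c:v", "libx265", "-crf", "23"],
    "av1":  ["-c:v", "libsvtav1", "-crf", "30"],
}

def build_cmd(codec: str) -> list[str]:
    """Return an ffmpeg invocation transcoding the master to one codec."""
    return ["ffmpeg", "-i", MEZZANINE, *RENDITIONS[codec], f"out_{codec}.mp4"]

for codec in RENDITIONS:
    print(" ".join(build_cmd(codec)))
```

A real pipeline would produce a whole bitrate ladder per codec, plus HDR variants, which is exactly why each extra codec multiplies the encode bill.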

      Unfortunately, until the current smartphone upgrade cycle is done, the majority of devices out there are not going to support AV1, and anyone who wants AV1 as the default can be assumed to have to wait. Twitch and YouTube have been talking about AV1, but never send you AV1 in any situation, even when your hardware supports it.

      • by Guspaz ( 556486 )

        Hardware decode support isn't critical for all devices. For example, an iPhone 14 or 15 non-pro can play AV1 just fine with software decoding, because they have more than enough compute available to do it, it's just much less efficient. Doesn't really change the big picture, lots of devices don't have the processing power to brute force it, so decode support isn't widespread enough, it just means that the situation isn't *quite* as bad.

      • and considering how utterly trash Android hardware is

        There is no such thing as Android hardware, there is just hardware for which there is Android support. Some of it is crap, some of it is great, and it all comes at a price point that you get to choose instead of being forced to buy functionality you may not need like you are with iDevices.

        Your characterization is a mischaracterization.

        Unfortunately until the current smartphone upgrade cycle is done, the majority of the devices out there are not going to support AV1, and it can be assumed that anyone who wants AV1 as the default has to wait.

        This part is true. Only powerful devices can support new codecs without dedicated hardware, and it costs battery life in the case of mobile devices.

  • Easy (Score:4, Interesting)

    by Valgrus Thunderaxe ( 8769977 ) on Thursday April 03, 2025 @11:41AM (#65278703)
    My video card doesn't support this in hardware, so there's no advantage for me in enabling this codec (I specifically blacklist it); I opt for others that are accelerated on my graphics hardware rather than my CPU.
    • Re:Easy (Score:4, Informative)

      by war4peace ( 1628283 ) on Thursday April 03, 2025 @11:45AM (#65278719)

      If you have a RDNA2 or above GPU, or an Intel 11th gen / AMD Ryzen 6000 or above CPU, they support AV1 decoding.

      • That is good news and definitely an improvement in efficiency. However, many older and less powerful CPUs have been able to software-decode high-bitrate 1080p single-tile AV1 video for a long time. AV1 hardware decoding is very useful for SoC-class devices, though: the Fire TV Stick 4K Max, for example, has hardware AV1 decoding, and without it that device couldn't play back AV1 at all.
      • A lot of us still have much older hardware that's still perfectly good and useful. My desktop that I use for email and web browsing at home was already old when those CPUs/GPUs were new. Replacing it for AV1 support isn't really a priority. The era of computing where a PC or device was halfway obsolete before you could get it out of the store has been over for a while now. Even phones are generally fine to keep for 5+ years now.
        • Well, OK, but in that case it's not AV1 that's to blame. I wonder if your machine could even hardware-decode HEVC.

      • by antdude ( 79039 )

        What about my old GeForce GTX 750 Ti and 8800 GT?

        • Nobody cares :)
          Yes, this may sound harsh, but it's true. Newer hardware decoders require newer hardware, strange but true :)

    • Hardware based encoders are great for live-streaming things but are terrible at producing high quality VOD items that I want on my home NAS. Even the better versions of these like ARC and NVENC fall significantly short of what good software encoders can achieve for VOD content.
      • I can't speak to every codec because I haven't tested them all, but Nvidia does a great job with H.265. I was disappointed with their H.264, but the 265 is peachy.

        • I only have an RTX 2080, but I found that using the GPU to convert H264 to H265 was very fast, though the file was 30% larger than the source H264. Software conversion (on an Intel i9-9900K) took a very long time, but the final file was about 60% smaller.
          • I have a 4060 Ti, but I'm not honestly sure if they're using different algorithms on different GPUs or if they just run faster on newer hardware. Both H264 and H265 were very fast, but for a given bitrate the H265 just craps on the H264. I did not compare to other encoders, though. I was just finding something I could live with that would play on my TV.

  • by DarkOx ( 621550 ) on Thursday April 03, 2025 @11:57AM (#65278761) Journal

    Prime Video has a lot of users because it is bundled. Netflix is probably the biggest name in streaming.

    Is there any reason to think the same general population has Peacock or Paramount+?

    I have had Paramount+ off and on so I could see Star Trek - Latest, but other than that it seems like a lot of back catalog that was not especially worth revisiting. I have no idea what the user makeup is, but maybe the people who pay to watch '80s and '90s sitcom reruns overlap with the people whose hardware can't use AV1?

    It's really not that much better than VP8/9, and you'll need a pretty recent Apple device to get hardware support for AV1. Maybe the percentage of subscribers who can use it just isn't there for the niche streamers.

    • We get both Paramount and Peacock because of bundles. The latter is standard for any long time Comcast customer. The former... I don't remember what bundled it (not our phone, I know that, I want to say it was a store's "plus" service) but that's how we have it.

      But yeah, they're bundled for a lot of people.

      • by DarkOx ( 621550 )

        That actually might explain it right there. My parents still have "cable" even though it has been IPTV underneath for over a decade. Still, that Comcast STB they have is probably at least 5 to 7 years old since it was last replaced.

        I bet that is true for a lot of subscribers and given the ownership situation with Peacock that probably means a huge portion of their viewership doesn't have a device that does AV1.

  • It will make services stick with an idea longer than a handful of episodes before cancelling?

  • AV1 is insanely complex to decode compared to H.264 or even H.265, and very few devices have a hardware decoder.

    The tradeoffs for lower bandwidth and no royalties really do not give any advantage at all - at least not now.

  • It's device support (Score:5, Informative)

    by thecombatwombat ( 571826 ) on Thursday April 03, 2025 @12:02PM (#65278779)

    And seriously nothing else. Everything else in the article is just fluff. I worked for a streaming company when h265 was taking hold; it was no different. We fairly regularly surveyed all of the devices using our service to weigh how many of them would actually use h265 if we started encoding all of our library in it.

    It's the same thing: Apple in particular was really slow to adopt AV1. No Apple TV supports it, and only very new iPads, iPhones and Macs do. (All of the ones for sale today do, but most streamers' customers will still be on older ones.) Android and PCs are a bit better, but not by much. Big streaming services will start to adopt it once they can say something like 40%+ of their users will get a good experience from it.

    Even if it saves 50% bandwidth, that isn't worth some very expensive re-encode of a huge library if only 20% of users get that benefit. Just like h265 ten years ago, the encoders are still really resource-intensive, but they're getting better every day, so they wait.
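That tradeoff is essentially a break-even calculation: a one-off re-encode cost against monthly CDN savings that scale with the share of AV1-capable users. A toy sketch with made-up dollar figures:

```python
# Break-even sketch for the tradeoff above: when does a library re-encode
# pay for itself? All dollar figures are hypothetical.

bandwidth_saving = 0.50        # assumed per-stream saving for AV1-capable users
monthly_cdn_cost = 1_000_000   # hypothetical monthly CDN bill, dollars
reencode_cost = 2_400_000      # hypothetical one-off library re-encode cost

def monthly_saving(user_share: float) -> float:
    """CDN dollars saved per month if `user_share` of viewers get AV1."""
    return monthly_cdn_cost * bandwidth_saving * user_share

for share in (0.20, 0.40, 0.60):
    months = reencode_cost / monthly_saving(share)
    print(f"{share:.0%} of users on AV1 -> payback in {months:.1f} months")
```

With these made-up numbers, a 20% capable-device share means a two-year payback, which is why services wait for the installed base to grow before committing.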

    • When it comes to hardware, it has to be in a chip somewhere. For Apple, that means designing it into their own chips. Other companies have to wait for chip vendors like Marvell and Realtek to put it into the chips they will use. While AV1 is royalty-free, there is still some additional cost. Bear in mind, AV1 is better at 4K streaming, but 4K is not everywhere yet; for 1080p streaming, h265 is perfectly acceptable.
    • by Bahbus ( 1180627 )

      Even if it saves 50% bandwidth, that isn't worth some very expensive reencode of a huge library if only 20% of users get that benefit.

      And this is the kind of mindset that shoots most businesses in the foot. It doesn't matter that not every customer can receive AV1; you serve each of them whatever the best version is for their hardware. If they can receive AV1, then the company immediately saves money on bandwidth, which is more expensive than encoding. AV1 hardware will only become more available over time. Might as well be somewhat ready. It helps to have content people actually want to watch - Max, Peacock, Paramount, and AppleTV are awful for t

  • by vyvepe ( 809573 ) on Thursday April 03, 2025 @12:27PM (#65278855)
    My small test indicated that it is only about 20% smaller (though TFA claims 30%) and requires about 45% more CPU to decode when compared to VP9. Encoding is also many times slower than VP9. Both encoding and decoding were done on CPU (no HW support; ffmpeg; linux). AV1 may be more interesting later when it is better supported.
    • by Kazymyr ( 190114 )

      I played with svt-av1/ffmpeg about a year ago and my conclusions were about the same as yours, or worse. Depending on the source material, the space/bandwidth savings were between about 15% and -10% (yes, 10% larger than the source material), while power consumption and CPU utilization were much larger than with software h264/h265.

      • Was that on 1080p or 4K? From what I remember AV1's major benefit would have been on 4K but it has taken time for 4K to be the standard. Even now while some new content is 4K, older content at best is upscaled. As a matter of need, AV1 has not really been a priority thus slow adoption rate.

        It parallels the discrepancy in Blu-ray vs. DVD adoption. Most consumers got immediate benefits upgrading from VHS to DVD. There were some benefits to upgrading to Blu-ray, but many still had older collections on DVD.

      • by msk ( 6205 )

        Last time I tried the svt-av1 encoder in ffmpeg, it still didn't pass captions/subtitles through like libx264/libx265 do. Getting those into the final encode is a requirement for me, and I don't like having to do several extra steps to make that happen.

        Also, only one of my three Roku units supports AV1 playback, so I haven't bothered for that reason. libx265 at defaults looks worse than libx264, so everything I've transcoded for archiving is with libx264.

    • by Bahbus ( 1180627 )

      20% smaller than VP9 is still in the ballpark. TFA says 30% vs h.264. Not to mention your tests don't mean anything because any real video encoding/decoding is not going to be purely CPU.

  • I played around with encoding in AV1 through ffmpeg. It was either unbearably slow, or had no significant advantage w.r.t. HEVC in hardware-accelerated versions. That was more than a year ago, though. Does anyone have more recent experience here?
    • by ffkom ( 3519199 )
      I had a similar experience a few months ago. My CPU supports both HEVC and AV1 encoding in hardware, both reasonably fast, but at the same result size the HEVC encoding looked subjectively a little better (most prominently in a rainy outdoor scene, where AV1 looked unreasonably blurry). My CPU-internal hardware encoders do, however, lack support for a lot of the more advanced features of those encoding standards, so my result does not say "HEVC better than AV1", it just says that with the hardware I have
  • A lot of older hardware that people may still be watching on doesn't support AV1 decoding. Only very recent gen video cards support AV1 encoding.

    Also, realistically, it's just not THAT much better, IMHO. H265 is a significant step above H264. AV1 vs H265 just isn't as extreme (quick Google searching says H265 offers a 50% reduction in file size versus equivalent-quality H264, whilst AV1 typically only gives a 12% reduction in size vs H265).

    Personally for my Jellyfin library where I archive
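The chained percentages quoted above are easy to sanity-check: if H265 halves the H264 size and AV1 shaves a further 12% off H265, AV1 lands at 44% of the H264 size. In Python:

```python
# Sanity-checking the chained compression figures quoted in the comment
# above (50% for H265 vs H264, a further 12% for AV1 vs H265).

h265_vs_h264 = 0.50          # H265 file is 50% of the H264 size
av1_vs_h265 = 1 - 0.12       # AV1 file is 12% smaller than H265

av1_vs_h264 = h265_vs_h264 * av1_vs_h265
print(f"AV1 size vs H.264: {av1_vs_h264:.0%}")
```

So by these (search-result) numbers, AV1's big win was really H265's big win: the extra step from H265 to AV1 is modest, which matches the comment's point.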

    • This absolutely is it: I need to pick up an Nvidia 4060 or a Radeon RX 7900 at least to get hardware encoding. We still have people fighting tooth and nail to stay on Windows 10 because their hardware doesn't support Windows 11.

      It will probably take about 10 years for support to be widespread enough to matter. Throw in support being subpar (as it's new) and H265 will hang on for a while. H265 is likely to hang on just as well as GIF and JPEG did, through literal momentum from how much already exists; AV1 will end up be

  • Encoding is more expensive and slower, and you get more bugs with all the random setups people use to play videos. H264 just works.

  • Support for hardware decoding is still pretty new but it will get there eventually. Nobody will notice the changeover except for obsessive nerds.

  • How about including all the old titles in your on-line catalog? I'd even put up with a bit more buffering delay if I could only just watch the movie.

  • by allo ( 1728082 )

    Publishers want a format that can do DRM. Even if they don't use DRM yet, they may want it later, or have some files protected and some not. Guess which formats work with common DRM and which don't.

    • Guess what formats work with common DRM and what don't.

      DRM is applied to the container, not the codec.

  • Energy-wise, how does it compare to current codecs?

    I'm talking about the total: encoding, decoding, and file transfer (but let's forget the cost of renewing hardware).

    Genuine question. Does anyone know?
