
The AV1 Video Codec Gains Broader Hardware Support

AV1 -- a next-generation, royalty-free video codec developed by the Alliance for Open Media, a consortium including tech giants like Google, Mozilla, Cisco, Microsoft, Netflix, Amazon, Intel, and Apple -- is finally making inroads. From a report: We are seeing more hardware support for this codec. The new M3 chips from Apple support AV1 decode. The iPhone 15 Pro and iPhone 15 Pro Max also feature an AV1 hardware decoder. The official Android 14 Compatibility Definition makes support for AV1 decoding mandatory. The Snapdragon 8 Gen 2 chipset, widely used by Android phones released in 2023, supports AV1. With the exception of Microsoft Edge, all major browsers support AV1.


  • I love h.265 for its quality and space savings compared to everything else I've used, but actually compressing something with it takes massive horsepower over a lot of time. I've experimented with CPU, Nvidia, and Radeon chips, and my finding is that if I'm doing a live stream I want to use one of the GPUs. I think the Radeon was actually a bit better at staying live and solid, but the Nvidia chip was hot on its heels, with slightly chunkier but slightly better results. The CPU was the absolute king for making an efficient file for storage and later playback, but not up to par for live streaming.

    Any background on how AV1 works, both front and back end?

    • by slaker ( 53818 )

      It takes longer to encode than H.265, even with an Arc GPU and 48 threads to throw at it. It looks fine for 4K content, similar to H.265 at similar bitrates. The files are somewhat smaller. I generally thought it wasn't worth the effort to mess with versus the already well-supported HEVC format.

      • So it does sound like this might be the codec of the future, but we need to wait a bit for mass deployment.

      • Comment removed based on user account deletion
        • by slaker ( 53818 )

          Iris Xe (found in 12th-gen+ Intel CPUs with integrated graphics) and Alchemist discrete cards do in fact support AV1 encoding in hardware. My ThinkPad engages its iGPU for AV1 encoding in Resolve rather than bothering the discrete Nvidia GPU it also has with that task.

          AV1 encoding just isn't fast on anything.

          My video editing workstation is an older Threadripper. I have an Intel A750 installed JUST for hardware encoding support in DaVinci Resolve Studio. There aren't even any displays hooked up to the stupi

    • AV1/HEIC (Score:5, Informative)

      by JBMcB ( 73720 ) on Tuesday October 31, 2023 @10:40AM (#63969070)

      I've played around with it. Encoding was abysmally slow the last time I checked, as the codec is new and unoptimized, and, as pointed out in the article, hardware encoding/decoding is only now becoming common.

      I have been using HEIC regularly, though (the still image format derived from AV1), via ImageMagick. It's phenomenal. I've been compressing old PNG images, and compared to JPEG the image quality is slightly better, while the resulting file is usually less than 1/10th the size of the equivalent JPEG. I've also compared it to HEIF, derived from H.265, and I think that at the same file size, AV1 looks slightly better than HEIF.
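      A minimal sketch of that kind of batch conversion (the poster clarifies below that the format in question is actually AVIF), assuming ImageMagick 7's `magick` CLI is on PATH and built with AVIF support; the folder name and quality setting are placeholders, not the poster's actual workflow:

      ```python
      # Sketch: batch-convert old PNGs to AVIF with ImageMagick and compare file sizes.
      # Assumes ImageMagick 7 ("magick" on PATH) built with AVIF support; the folder
      # name and quality value are placeholders.
      import subprocess
      from pathlib import Path

      def png_to_avif(src: Path, quality: int = 50) -> Path:
          """Convert one PNG to AVIF and return the output path."""
          dst = src.with_suffix(".avif")
          subprocess.run(["magick", str(src), "-quality", str(quality), str(dst)], check=True)
          return dst

      for png in Path("old_images").glob("*.png"):
          avif = png_to_avif(png)
          print(f"{png.name}: {png.stat().st_size} -> {avif.stat().st_size} bytes")
      ```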

      • You have just given me something to look into that I am unfamiliar with. As long as nearly everything can decode it, it may be the way to go for web hosting and even social media posting.

        Just like when I tried to be an early adopter of PNG, the question is support on the playback side.

        Then again, the lack of early PNG support was my favorite thing to throw in the face of IE huggers. I pointed out that IE really was the desktop turd I said it was, and when they argued back I noted that Mozilla browsers, O

      • by AmiMoJo ( 196126 )

        It's a shame that neither Chrome nor Firefox supports HEIC. No JPEG XL either. For some reason Safari supports both, but that's kinda useless because only Mac/iOS users can see those images on your website.

        AV1 encoding seems to be efficient enough with hardware support, similar to x265. Quality-wise it is usually thought to be better at low bitrates in streaming scenarios (constant bitrate); for higher-quality stuff it's mostly down to the encoder.

      • Clarification-AVIF (Score:4, Interesting)

        by JBMcB ( 73720 ) on Tuesday October 31, 2023 @12:23PM (#63969388)

        I'm using AVIF, which is the AV1 version of the H.265-based HEIC. I prefer AVIF to HEIC.

        AVIF is supported by most web browsers. I'd imagine the lack of HEIC support probably has to do with licensing issues, as from what I've read the commercial H.265 licenses are fairly onerous.

    • You are totally wrong on all fronts. h263 works, h264 works, h265 is decent, AV1 is even better. The Nvidia h265 support is pretty good; their AV1 support is even better. That's all there is. Intel is the best by far.

      • You don't understand the issue I am describing. HEVC is supported, but not the 422 color space that most cameras use. If you plan to do color grading in an editing application or would prefer to maintain color fidelity from the raw video files, it is in your best interest to keep 422 color. Most GPUs only handle the incomplete 420 color space in hardware, and therefore don't help for editing or color grading functions at the time when their help would be most useful. Throw some 8k Clog3 data at DaVinci Res

      • You might have me on which GPU is best for streaming - my testing wasn't extensive enough to prove much of anything beyond what I had on hand, which wasn't every card available. In fact I'm going to make an admission: every Radeon I seriously tested with was built into a Ryzen CPU, not a standalone card. The Nvidia card I tested with was a bit out of date (1080 Ti), older than the Radeon. I have a newer one now, but I've only done a little testing with it - cold storage, not streaming, so I

        • Tomshardware tested a bunch of different GPU encoders a little while back.

          https://www.tomshardware.com/n... [tomshardware.com]

          • That's a quality link, thank you. I'll read it more closely later; it looks like they had different testing criteria in mind than I did, but it's great nonetheless.

        • Why the hell would I want to chain my CPU down with HEVC encoding? I can get the same quality/file-size trade-offs at orders of magnitude more speed with NVENC, all while having the same options (for the most part). My 3070 Ti even offers the lesser-used 12-bit HEVC in the NVENC block.

          There isn't a single compelling reason for me to use CPU. Period. The tiny amount of space you MIGHT save can easily be offset with a few points less of the CQ setting... which will be visually no different since the losses
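          For concreteness, a minimal sketch of that kind of GPU encode, assuming an Nvidia card and an ffmpeg build with NVENC; the preset, CQ value, and file names are placeholder assumptions rather than the poster's settings:

          ```python
          # Sketch: constant-quality HEVC encode on the GPU via ffmpeg's hevc_nvenc.
          # Assumes an Nvidia GPU and an ffmpeg build with NVENC; settings are placeholders.
          import subprocess

          def encode_hevc_nvenc(src: str, dst: str, cq: int = 28) -> None:
              """Encode to 10-bit HEVC on the NVENC block at constant quality `cq`."""
              subprocess.run(
                  [
                      "ffmpeg", "-i", src,
                      "-c:v", "hevc_nvenc",
                      "-preset", "p5",         # p1 (fastest) .. p7 (slowest / best quality)
                      "-rc", "vbr", "-cq", str(cq),
                      "-pix_fmt", "p010le",    # 10-bit
                      "-c:a", "copy",
                      dst,
                  ],
                  check=True,
              )

          encode_hevc_nvenc("capture.mkv", "capture_hevc.mkv")
          ```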

    • The biggest change right now is that it's an openly licensed codec, unlike HEVC / H.265, which is patent-encumbered out the ass and requires shitloads of royalty payments to use commercially.

      The tech industry got pissed at that, so they came up with this. And now MPEG is fucked because they killed that golden goose with their greed.


    • by Ken_g6 ( 775014 )

      I've started using AV1 since Jellyfin can now transcode it back to h.264 for my media players. On CPU - I haven't tried GPU encoding - Handbrake's SVT-AV1 preset 6 takes 2-3 times as long as my x264 encodes did (on a custom preset similar to veryslow). But AV1 produces files up to half the size, often at better quality.

      A nice thing about AV1 is that it can be encoded in tiles to improve multi-core decoding speed, at a tiny efficiency cost (see the sketch below).

      The downside seems to be that slower presets also can't use
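      As a rough illustration of the tiled-encode point above, a minimal sketch that shells out to ffmpeg's SVT-AV1 encoder; the CRF, preset, tile counts, and file names are placeholder assumptions, and the exact -svtav1-params keys depend on the ffmpeg/SVT-AV1 build:

      ```python
      # Sketch: CPU AV1 encode via ffmpeg's libsvtav1 with tiling enabled, roughly
      # matching the "preset 6" workflow described above. Assumes an ffmpeg build
      # with libsvtav1; file names and settings are placeholders.
      import subprocess

      def encode_av1(src: str, dst: str, crf: int = 30, preset: int = 6) -> None:
          """Encode `src` to AV1 with 2x1 tiling for better multi-core decode."""
          subprocess.run(
              [
                  "ffmpeg", "-i", src,
                  "-c:v", "libsvtav1",
                  "-preset", str(preset),   # 0 = slowest/best .. 13 = fastest
                  "-crf", str(crf),
                  # tile_columns/tile_rows are log2 values: 1 -> 2 tile columns
                  "-svtav1-params", "tile_columns=1:tile_rows=0",
                  "-c:a", "copy",
                  dst,
              ],
              check=True,
          )

      encode_av1("input.mkv", "output_av1.mkv")
      ```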

  • Every non-cine mirrorless camera made in the last four or five years natively outputs HEVC 422. If you're lucky, your camera's implementation of that can be easily transcoded to a more editable format (e.g. ProRes 422; see the sketch below), or you have an external recorder such as an Atomos Ninja to capture ProRes in the first place.

    (Cine cameras generally output HEVC 444, I think, an even more rarified format, but if you're working with that, you have the budget to address that problem, too)

    Only Intel Iris Xe/Alchemist GPUs sup
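    As a reference for the transcode step mentioned above, a minimal sketch of a software (CPU-only) conversion of 10-bit 4:2:2 HEVC camera footage to ProRes 422 HQ with ffmpeg's prores_ks encoder; the file names and profile choice are placeholder assumptions:

    ```python
    # Sketch: software transcode of 10-bit 4:2:2 HEVC camera footage to ProRes 422 HQ
    # for easier editing. Assumes an ffmpeg build with the prores_ks encoder; the
    # file names are placeholders.
    import subprocess

    def hevc422_to_prores(src: str, dst: str) -> None:
        """Convert camera HEVC 4:2:2 to ProRes 422 HQ, keeping 10-bit color."""
        subprocess.run(
            [
                "ffmpeg", "-i", src,
                "-c:v", "prores_ks",
                "-profile:v", "3",          # 3 = ProRes 422 HQ
                "-pix_fmt", "yuv422p10le",  # preserve 10-bit 4:2:2
                "-c:a", "copy",
                dst,
            ],
            check=True,
        )

    hevc422_to_prores("clip_from_camera.mp4", "clip_prores.mov")
    ```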

    • by Saffaya ( 702234 )

      I think it makes sense for the graphics card makers to prioritize the formats that game streamers will use. But if Intel does it, yes, it would be nice for the other two to catch up and offer it.

      • by slaker ( 53818 )

        AMD and Nvidia are fully aware that cameras produce 422 output. They just don't care. Apple silicon and high-end Qualcomm and maybe(?) MediaTek SoCs support hardware transcoding to a more traditional editing format, but since most PC hardware does not, a common workaround is to record the camera's HDMI out with an external device such as an Atomos Ninja, which can transcode to a more editing-friendly format. Preserving the 10-bit color space is incredibly valuable, and it's an absolute travesty that PCs with

  • Until AV1 is over 20 years old there will always be some SCO-like company out there waiting to cash in, and even after that they could do the drug manufacturer's trick of saying the newer encoders/decoders infringe some patent.
    • True, but any company would have to go against Google. Anyone looking at SCO as a model should remember it took almost 20 years and bankrupted the company before they even got any money. Even after IBM settled (IMHO they did not have to settle), the money was most likely used to settle the debts of the former company.
    • by pecosdave ( 536896 ) on Tuesday October 31, 2023 @12:10PM (#63969350) Homepage Journal

      "Submarine Patents"

      That was Apple's flimsy excuse for not supporting Ogg/Vorbis anywhere with their own stuff (games and other software excepted as long as they bring their own CODEC).

      I never bought it. I always assumed it was to keep from legitimizing the OSS standards any more than they had to, and to make a lateral move between OSes more difficult for the average user. It just cemented me into the anti-Apple category; I play nicer with those who play nice in turn. When I did use Macs in the past I always put Clementine on them so my Ogg music would work, fuck iTunes.

      • Apple is a governing member of the Alliance for Open Media [aomedia.org], which owns the AV1 codec, along with all the other major tech names you can think of with a large interest in streaming video. AOMedia's open license covers all patents used in the codec, royalty-free; this means we may actually see it in open source projects such as Firefox, which still doesn't support HEVC / H.265 due to the required royalties and open-source-unfriendly patent terms.

        Oh, and MPEG-LA is completely fucked in the next 12 to

        • Nvidia has had AV1 8/10-bit hardware decode in all of the Ampere line of GPUs (the 30xx series) except the A100, which, if I remember correctly, is a datacenter GPU with no video out at all. If you are running an A100, you have more than enough CPU grunt not to notice CPU decoding anyway, if you don't just decode on a modern card that is doing the video out.

      • So you're anti-Apple because of some stupid conspiracy theory you came up with in your head?

        • Sure, totally from my head.

          https://digitalcitizen.info/20... [digitalcitizen.info]

          • So your conspiracy theory comes from a 2007 blog post, then. But let's examine the points in that blog post.

            Apple opposed Ogg/Vorbis in the HTML5 standard for a few reasons:

            1. Submarine patents: "Ogg Vorbis+Theora might be encumbered by patents we don’t yet know about." The blog poster then starts a strawman argument about how no software is free from patent infringement claims. No one argued that. The point that went over the author's head is that a patent infringement fight, even a frivolous one, can be drawn out and cost
      • That was Apple's flimsy excuse for not supporting Ogg/Vorbis anywhere with their own stuff (games and other software excepted as long as they bring their own CODEC).

        The issue for Ogg/Vorbis is that it does not solve many problems that aren't already served by another codec. Many devices use MP3 by default, with AAC as its successor. The main reason anyone uses Ogg is that it has no licensing requirements.

        I always assumed it was to keep from legitimizing the OSS standards.

        Considering that all of Apple's different OSes are based on BSD and many components are based on OSS, that would be a strange assumption. Apple releases its BSD variant, DarwinOS. OSS projects like CUPS and LLVM/Clang have been supported by Apple over the decades as well. I would

        • Right, but BSD allows you to close the code afterward. It's why Microsoft has been using BSD code in Windows since at least the earliest NT release - was there one before NT 3.51? Maybe earlier.

          Apple is only sort-of pro open source. I once did a search for the word "Linux" on the Apple website; it had only one hit, on the forums, and that post got deleted later. I know at one point in history they tried to make sure the word didn't exist on their domain. They bought CUPS after it was already estab

          • Right, but BSD allows you to close the code afterward. It's why Microsoft has been using BSD code in Windows since at least the earliest NT release - was there one before NT 3.51? Maybe earlier.

            BSD does not require users to release modifications, yet Apple has released and continues to release DarwinOS, which is their set of modifications to BSD. However, you suggested Apple did not support Ogg/Vorbis because they are somehow opposed to OSS standards. That is belied by the fact that they use OSS in many places and have supported it in many cases.

            Apple is only sort-of pro open source.

            "Only" meaning what? They have supported it. They have used it. Their software is not 100% open source. So what?

            I once did a search for the word "Linux" on the Apple website; it had only one hit, on the forums, and that post got deleted later. I know at one point in history they tried to make sure the word didn't exist on their domain.

            Open Source does not mean "Linux only". It never has.

        • I use Ogg Vorbis because it's better than MP3. The sound quality is great, especially at low bitrates. Plus the tagging is easier to work with.

    • If they want to take a legal poke at AV1, they'll be going against the combined legal might of Apple, Google, Microsoft, Nvidia, AMD, Netflix, Amazon, Samsung, Qualcomm, Broadcom, Cisco, Meta, and others.

      Even the MPEG consortium, which probably has the biggest patent portfolio and motivation for making that poke, hasn't done shit because they know there's nothing there.

      Good luck with that.

    • If there were a patent troll (specifically MPEG-LA) with a claim, they would have come out by now. AV1 is already widely used by several major companies, with content significant enough for any patent troll to make an absolute killing, a true massacre, if their patent claims were valid.

  • Aren't current iterations of Edge based on Chromium, which supports AV1?
    • For reasons probably known only to Microsoft, they don't use the Chromium video decoding code but instead roll their own (possibly something to do with patent licensing connected to the patented codecs that code handles alongside AV1).

      • For reasons probably known only to Microsoft, they don't use the Chromium video decoding code

        Chrome doesn't use Chromium's video decoding code either. The reason is DRM. The effect on consumers is that certain streaming services stream in higher resolutions / with better sound only when using Edge or Chrome.

  • I was going to buy an iPhone 15 but it looks like I'll be waiting on the 16 because I'm not going to step up to a Pro just for this codec. Looks like the 7+ will have to work for another year.

  • Edge users, don't worry: Windows recently got RAR support after a 30-year wait, so expect Edge to support AV1 circa 2053.

