AV1 Update Reduces CPU Encoding Times By Up To 34 Percent (tomshardware.com)

According to Phoronix, Google has released a new AOM-AV1 update, version 3.5, which drastically improves encode times when streaming, rendering, or recording on the CPU. At its best, the update cuts encoding times by up to 34%. Tom's Hardware reports: It is a fantastic addition to AV1's capabilities, and the encoder has become very popular among major video platforms such as YouTube. We are also seeing significant support for AV1 hardware acceleration on modern discrete GPUs, such as Intel's Arc Alchemist GPUs and, most importantly, Nvidia's RTX 40-series GPUs. Depending on the resolution, encoding times with the new update have improved by roughly 20% to 30%. For example, at 1080p, encode times with 16 threads are reduced by 18% to 34%. At 4K, render times improved by 18% to 20% with 32 threads. Google achieved this by adding frame-parallel encoding for heavily multi-threaded configurations, and it has made several other improvements that lift AV1's performance elsewhere, particularly in real-time encoding.

In other words, CPU utilization in programs such as OBS has been reduced, primarily on systems with 16 or more CPU threads. That frees those CPU resources for other tasks, or lets users push video quality higher at no additional performance cost. If you are editing video and rendering it out in AV1, processing times will be vastly reduced on a CPU with 16 threads or more.
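
For anyone who wants to try it, here is a minimal sketch of driving libaom-AV1 through FFmpeg from Python. The file names are placeholders, and the exact option that exposes the new frame-parallel mode can vary by build (check your encoder's documentation); the flags shown below are the wrapper's long-standing quality and threading options:

    import subprocess

    # Minimal sketch: encode a clip with libaom-AV1 using 16 threads.
    # "input.mp4" and "output.webm" are placeholder file names.
    subprocess.run([
        "ffmpeg", "-i", "input.mp4",
        "-c:v", "libaom-av1",
        "-crf", "30", "-b:v", "0",   # constant-quality mode
        "-cpu-used", "6",            # speed preset (0 = slowest, best quality)
        "-row-mt", "1",              # row-based multithreading
        "-threads", "16",
        "output.webm",
    ], check=True)

With -b:v 0, the -crf value alone controls quality; higher -cpu-used values trade compression efficiency for speed.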

  • Not worth it. (Score:4, Informative)

    by Guspaz ( 556486 ) on Saturday September 24, 2022 @05:46PM (#62911057)

    When it comes to streaming, you're better off using hardware h.264 encoding than AV1 software encoding. If you want to use AV1 for streaming, get a device with hardware acceleration for it.

    • Re: (Score:3, Funny)

      by Gravis Zero ( 934156 )

      Oh sure... let me just pick up a hardware encoder at the Doesn't Fucking Exist store.

      • by brad0 ( 6833410 )
        That is not true in the slightest.
      • Re:Not worth it. (Score:5, Informative)

        by Guspaz ( 556486 ) on Saturday September 24, 2022 @07:37PM (#62911265)

        There's hardware available for it. The Intel A380 can be stuck in a spare PCIe slot and used for hardware encode offloading, and the A310 might be an even cheaper choice for that. Next month, the latest nVidia cards will also support it.

        If you don't have hardware AV1 encoding, then use h.264 or something else that your GPU can handle with hardware encoding. Nobody wants the huge performance hit from trying to do real-time AV1 software encoding with OBS, as TFA suggests. (See the sketch after this thread.)

        • The Intel A380 can be stuck in a spare PCIe slot and used for hardware encode offloading

          No, this is just using GPU compute and offloading the processing. It's still software-based encoding... it just runs on a GPU.

    • Nonsense. Get a nice Ryzen CPU with cores to burn and all your AV1 streaming prayers will be answered.
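
On the hardware-versus-software point in the thread above, here is a minimal sketch of that fallback logic, assuming an FFmpeg build on PATH. The encoder names av1_nvenc and h264_nvenc are examples; which ones actually appear depends on your GPU, drivers, and FFmpeg version:

    import subprocess

    def have_encoder(name: str) -> bool:
        """Return True if this FFmpeg build lists the given encoder."""
        out = subprocess.run(["ffmpeg", "-hide_banner", "-encoders"],
                             capture_output=True, text=True).stdout
        return name in out

    # Prefer hardware AV1 if the build and GPU expose it (e.g. NVENC on
    # RTX 40-series cards); otherwise fall back to hardware H.264.
    codec = "av1_nvenc" if have_encoder("av1_nvenc") else "h264_nvenc"
    subprocess.run(["ffmpeg", "-i", "input.mp4", "-c:v", codec, "out.mp4"],
                   check=True)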

  • "It is a fantastic addition to AV1's capabilities, with the encoder becoming very popular among powerful video platforms such as YouTube"

    Does anyone use it other than Google's YouTube?

    • I think Netflix does too, although they will have to continue encoding their library with other codecs for a while, since few devices support AV1 hardware decoding yet.
    • Comment removed based on user account deletion
      • by brad0 ( 6833410 )
        They're not a software solution. They're fixed-function encoders like the other codecs already supported. Why are people such clueless idiots?
  • by ffkom ( 3519199 ) on Saturday September 24, 2022 @05:53PM (#62911071)
    ... is not really relevant; post comparisons with VP9, H.264, and HEVC encoders if you want to leave an impression.

    Also, we all know that any encoder advancements will not be used to enhance the video quality you get to watch, but to lower the bit-rate, and thus costs.
    • by brad0 ( 6833410 )
      It very much is relevant.
    • by godrik ( 1287354 ) on Saturday September 24, 2022 @07:24PM (#62911245)

      I do performance optimization for a living, and that comment is incorrect. Comparison to a previous version of the code is very relevant: it is how you know you improved on past performance, and it is directly useful to people who already use that particular encoder.

      Making a comparison to a past version is also easy to do and rarely subject to debate. Meanwhile, a comparison to other video encoders can be really difficult. For instance, you may not get the same image quality at the same compression ratio, or people might argue that you used the wrong options in the codec, or the wrong codebase for it.

      Comparing different compression libraries is a VERY different task from showing that you improved the performance of an existing one. (A like-for-like timing sketch follows this thread.)

    • "Also, we all know that any encoder advancements will not be used to enhance the video quality you get to watch, but to lower the bit-rate, and thus costs"

        What are you standing around for? Get back to work breaking open that rock to get the precious and scarce bits out of it! Wait a minute... what do I see here? YOU ARE STEALING BITS! We'll see about that! You won't be stealing bits while you are walled up in the old abandoned coke oven.
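
The like-for-like comparison described above is easy to script. A rough sketch, assuming two aomenc builds at placeholder paths and a test clip clip.y4m:

    import subprocess, time

    # Time two builds of the same encoder on the same input with
    # identical settings. The binary paths are placeholders.
    for binary in ("./aomenc-3.4", "./aomenc-3.5"):
        start = time.perf_counter()
        subprocess.run([binary, "--threads=16", "--cpu-used=6",
                        "-o", "/dev/null", "clip.y4m"], check=True)
        print(f"{binary}: {time.perf_counter() - start:.1f} s")

Because the input, settings, and codebase are held constant, any timing difference is attributable to the new version, which is exactly why same-encoder comparisons are so hard to argue with.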

  • by NewtonsLaw ( 409638 ) on Saturday September 24, 2022 @07:34PM (#62911261)

    That Tom's Hardware article states:

    "with its RTX 40-series GPUs supporting AV1 encoding for the first time - thanks to the brand-new 8th generation NVENC engine"

    Yet NVIDIA's own website states that the 4000 series only supports AV1 DECODING... see for yourself:

    Video Encode and Decode GPU Support Matrix [nvidia.com]

  • So they didn't improve the encoding algorithm per se; they just improved the multithreading so the work is spread better over multiple cores. So no real improvement on a single thread.
    • by godrik ( 1287354 )

      In practice, multithreaded performance is what users care about. There is rarely a good reason to run on a single thread if you can run multithreaded with little or no overhead.

      • Well, you're right, except that if it runs faster on a single thread (due to optimization), it'll run even faster when multithreaded. (A thread-scaling sketch follows.)
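
Following up on the multithreading thread above, a rough way to see where extra threads stop paying off on a given machine is to time the same encode at several thread counts. A sketch using FFmpeg's libaom-av1 wrapper, with clip.y4m as a placeholder input:

    import subprocess, time

    # Encode the same clip at increasing thread counts and report wall
    # time; "-f null -" discards output so only encode speed is measured.
    for threads in (1, 4, 8, 16, 32):
        start = time.perf_counter()
        subprocess.run([
            "ffmpeg", "-y", "-i", "clip.y4m",
            "-c:v", "libaom-av1", "-crf", "30", "-b:v", "0",
            "-cpu-used", "6", "-row-mt", "1",
            "-threads", str(threads),
            "-f", "null", "-",
        ], check=True)
        print(f"{threads:2d} threads: {time.perf_counter() - start:.1f} s")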
