AV1 Update Reduces CPU Encoding Times By Up To 34 Percent (tomshardware.com) 37
According to Phoronix, Google has released a new AOM-AV1 update -- version 3.5 -- that drastically improves encode times when streaming, rendering, or recording from the CPU. At its best, the update can improve encoding times by up to 34%. Tom's Hardware reports: It is a fantastic addition to AV1's capabilities, with the encoder becoming very popular among powerful video platforms such as YouTube. In addition, we are now seeing significant support for AV1 hardware acceleration on modern discrete GPUs, such as Intel's Arc Alchemist GPUs and, most importantly, Nvidia's RTX 40-series GPUs. Depending on the resolution, encoding times with the new update have improved by 20% to 30%. For example, at 1080p, encode times with 16 processing threads are reduced by 18% to 34%. At 4K, render times improved by 18% to 20% with 32 threads. Google was able to do this by adding frame-parallel encoding to heavily multi-threaded configurations. Google has also added several other improvements contributing to AV1's performance uplift in other areas, specifically in real-time encoding.
In other words, CPU utilization in programs such as OBS has been reduced, primarily for systems with 16 CPU threads. As a result, users can put those CPU resources toward other tasks or push video quality even higher without any additional performance cost. If you are editing video and rendering out in AV1, processing times will be vastly reduced if you have a CPU with 16 threads or more.
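For readers who want to try the new encoder, here is a minimal, illustrative sketch (not from the article) of a heavily multi-threaded libaom-av1 encode driven through ffmpeg's wrapper. File names and parameter values are placeholders, and option availability depends on your ffmpeg/libaom build.

```python
# Sketch: a heavily multi-threaded libaom-av1 encode via ffmpeg.
# Input/output names and settings are placeholders, not recommendations.
import subprocess

cmd = [
    "ffmpeg", "-i", "input.mkv",
    "-c:v", "libaom-av1",
    "-crf", "30", "-b:v", "0",   # constant-quality mode
    "-cpu-used", "6",            # speed/quality trade-off (higher = faster)
    "-threads", "16",            # worker threads, per the 16-thread case in TFA
    "-row-mt", "1",              # row-based multithreading
    "-tiles", "2x2",             # tile columns x rows, gives threads independent work
    "output_av1.mkv",
]
subprocess.run(cmd, check=True)
```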
Not worth it. (Score:4, Informative)
When it comes to streaming, you're better off using hardware h.264 encoding than AV1 software encoding. If you want to use AV1 for streaming, get a device with hardware acceleration for it.
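As a rough illustration of the trade-off described above (not an endorsement of specific settings), the sketch below contrasts a hardware H.264 encode via NVENC with a software AV1 encode. Encoder names are ffmpeg's, availability depends on your GPU and build, and file names and bitrates are made-up placeholders.

```python
# Sketch: hardware H.264 (NVENC) vs. software AV1 (libaom) for a real-time capture.
# All file names and bitrates are placeholders.
import subprocess

hardware_h264 = [
    "ffmpeg", "-i", "gameplay.mkv",
    "-c:v", "h264_nvenc", "-b:v", "6M",   # offloaded to the GPU's encode block
    "recording_h264.mp4",
]
software_av1 = [
    "ffmpeg", "-i", "gameplay.mkv",
    "-c:v", "libaom-av1", "-usage", "realtime", "-cpu-used", "8",
    "-b:v", "2M",                          # AV1 can hit similar quality at lower bitrate, but burns CPU
    "recording_av1.mkv",
]
subprocess.run(hardware_h264, check=True)   # low CPU cost
# subprocess.run(software_av1, check=True)  # high CPU cost without hardware AV1
```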
Re: (Score:3, Funny)
Oh sure... let me just pick up a hardware encoder at the Doesn't Fucking Exist store.
Re: (Score:2)
Re: (Score:1)
It's true if you don't have $50K to blow on some ASICs.
Re:Not worth it. (Score:5, Informative)
You can buy an Intel graphics card for $140 USD and stick it in a spare PCIe slot to offload AV1 encoding. Why would anybody blow $50k on an ASIC to do it?
Re: (Score:2)
That's merely offloading the task; it's not dedicated hardware, which would imply an ASIC codec.
Large companies (e.g. Google/YouTube) would make an ASIC for their servers (electricity is money). $50K is just the cost of getting engineering samples, but if they want several thousand then the cost per chip is much, much lower.
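A back-of-the-envelope illustration of that amortization point, with entirely made-up numbers:

```python
# Sketch: one-time engineering/NRE cost dominates at small volume but amortizes quickly.
# All figures here are hypothetical, for illustration only.
nre_cost = 50_000          # hypothetical engineering-sample / NRE cost in USD
unit_cost = 20             # hypothetical per-chip cost at volume
for volume in (1, 100, 5_000, 100_000):
    per_chip = unit_cost + nre_cost / volume
    print(f"{volume:>7} chips -> ${per_chip:,.2f} each")
```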
Re: (Score:1)
Re:Not worth it. (Score:4, Informative)
Re: (Score:2)
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Re:Not worth it. (Score:5, Informative)
There's hardware available for it. The Intel A380 can be stuck in a spare PCIe slot and used for hardware encode offloading, and the A310 might be an even cheaper choice for that. Next month, the latest Nvidia cards will also support it.
If you don't have hardware AV1 encoding, then use h.264 or something else that your GPU can handle with hardware encoding. Nobody wants the huge performance hit from trying to do real-time AV1 software encoding with OBS, as TFA suggests.
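For concreteness, here is a hedged sketch of the hardware-offload path described above, using ffmpeg's av1_qsv encoder on an Intel Arc card. The exact device and pixel-format flags can vary by system, so treat this as illustrative rather than a verified recipe.

```python
# Sketch: AV1 hardware encode on an Intel Arc card via ffmpeg's av1_qsv encoder.
# File names and bitrate are placeholders; device setup flags may differ per system.
import subprocess

cmd = [
    "ffmpeg", "-i", "input.mkv",
    "-vf", "format=nv12",        # QSV encoders expect NV12/P010-style input
    "-c:v", "av1_qsv",           # Intel Quick Sync AV1 encoder (Arc A310/A380 and up)
    "-b:v", "6M",
    "output_av1.mkv",
]
subprocess.run(cmd, check=True)
```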
Re: (Score:1)
The Intel A380 can be stuck in a spare PCIe slot and used for hardware encode offloading
No, this is just using GPU compute and offloading the processing. It's still software-based encoding... it just runs on a GPU.
Re: (Score:2)
Re: (Score:2)
Nonsense. Get a nice Ryzen CPU with cores to burn and all your AV1 streaming prayers will be answered.
Serious question about "powerful video platforms" (Score:2)
"It is a fantastic addition to AV1's capabilities, with the encoder becoming very popular among powerful video platforms such as YouTube"
Does anyone use it other than Google's YouTube?
Re: (Score:3)
Re: (Score:2)
Re: (Score:1)
Comparing only with one's own past results... (Score:3, Informative)
Also, we all know that any encoder advancements will not be used to enhance the video quality you get to watch, but to lower the bit-rate, and thus costs.
Re: (Score:1)
Re:Comparing only with one's own past results... (Score:5, Informative)
I do performance optimization for a living, and that comment is incorrect. Comparison to a previous version of the code is very relevant: it is how you know that you improved on past performance. It is also information that is directly relevant to people who already use that particular encoder.
Making a comparison to a past version is also very easy to do and rarely subject to debate. Meanwhile, a comparison to another video encoder can be really difficult. For instance, you may not get the same image quality at the same compression ratio. Or people might argue that you used the wrong options in the codec, or the wrong codebase for it.
Comparing different compression libraries is a VERY different task from showing that you improved the performance of an existing compression library.
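A sketch of the kind of same-codebase comparison described here: timing an identical encode against two builds of the same encoder. Paths, clip, and settings are placeholders; cross-encoder comparisons would additionally need matched quality measurements.

```python
# Sketch: time the same encode with two ffmpeg builds linked against different
# libaom versions. Binary paths, clip name, and settings are placeholders.
import subprocess
import time

def time_encode(ffmpeg_path: str) -> float:
    start = time.perf_counter()
    subprocess.run(
        [ffmpeg_path, "-y", "-i", "clip.mkv",
         "-c:v", "libaom-av1", "-crf", "30", "-b:v", "0",
         "-cpu-used", "6", "-threads", "16", "-row-mt", "1",
         "-f", "null", "-"],          # discard output, measure encode time only
        check=True,
    )
    return time.perf_counter() - start

old = time_encode("/opt/ffmpeg-libaom-3.4/bin/ffmpeg")  # hypothetical old build
new = time_encode("/opt/ffmpeg-libaom-3.5/bin/ffmpeg")  # hypothetical new build
print(f"speedup: {100 * (old - new) / old:.1f}%")
```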
Re: Comparing only with one's own past results... (Score:1)
"Also, we all know that any encoder advancements will not be used to enhance the video quality you get to watch, but to lower the bit-rate, and thus costs"
What are you standing around for? Get back to work breaking open that rock to get the precious and scarce bits out of it! Wait a minute...what do I see here? YOU ARE STEALING BITS! We'll see about that! You won't be stealing bits while you are walled up in the old abandoned coke oven.
WTF Tom's Hardware?? (Score:3)
That Tom's Hardware article states:
"with its RTX 40-series GPUs supporting AV1 encoding for the first time - thanks to the brand-new 8th generation NVENC engine"
Yet NVIDIA's own website states that the 4000 series only support AV1 DECODING... see for yourself:
Video Encode and Decode GPU Support Matrix [nvidia.com]
Re:WTF Tom's Hardware?? (Score:5, Informative)
Scroll to the bottom, junior. 4080 12GB, 4080 16GB and 4090 ... all have "YES" in the AV1 encode column.
Re: (Score:3)
Thanks for that... who knew it scrolled sideways as well :-) :-O
Re: WTF Tom's Hardware?? (Score:2)
"Scroll sideways" :-\
Re: (Score:2)
No markers, and the page takes up less than half of the horizontal space, yet it needs horizontal scrolling... WTF?
Ok.... (Score:2)
Re: (Score:3)
In practice, multi-threaded performance is what users care about. There is rarely a good reason to run on a single thread if you can run multi-threaded with little or no overhead.
Re: (Score:2)