Hulu Joins Netflix and Amazon In Promoting Royalty-free Video Codec AV1 (fiercecable.com) 134

theweatherelectric writes: Hulu has joined the Alliance for Open Media, which is developing an open, royalty-free video format called AV1. AV1 is targeting better performance than H.265 and, unlike H.265, will be licensed under royalty-free terms for all use cases. The top three over-the-top SVOD services (Netflix, Amazon, and Hulu) are now all members of the alliance. In joining the alliance, Hulu hopes "to accelerate development and facilitate friction-free adoption of new media technologies that benefit the streaming media industry and [its] viewers."
  • by Anonymous Coward

    Currently, AV1 encoding with common encoding tools is a very time-consuming process. As measured on a Lenovo T540p notebook with an i7-4800MQ and 8GB RAM running Ubuntu 14.04, it would take 8 hours and 42 minutes to encode a 40-second 1080p@24fps sequence (Tears of Steel teaser) with a target bitrate of 1.5Mbps.

    I wonder if GPUs can speed things up?

    • by K. S. Kyosuke ( 729550 ) on Tuesday July 04, 2017 @03:34PM (#54743307)
      That, and writing a non-prototype encoder, most likely.
      • AV1 still alpha (Score:4, Insightful)

        by DrYak ( 748999 ) on Tuesday July 04, 2017 @04:43PM (#54743569) Homepage

        That, and writing a non-prototype encoder, most likely.

        Yup, currently AV1 is still in alpha.

        It's still a playground in which to experiment by activating features which are currently being developed.
        (e.g., the Perceptual Vector Quantization (PVQ) and Asymmetric Numeral System (ANS) entropy coder that were developed at Xiph as part of Daala can be tested in AV1.)

        Wait until it hits 1.0; only then will developers start optimizing performance instead of chasing compression factors.

        • By the way, is it one of those standards, like DjVu, that are defined in terms of the decoding process? I'm not sure if AV1 belongs to that category, but such standards allow for a variety of approaches to how the encoded source can be generated, and there's in principle no fixed quality/speed curve for them, since you can come up with new ways of encoding in the future that still satisfy the decoder. But I admit that I know much less about video formats.
    • by 93 Escort Wagon ( 326346 ) on Tuesday July 04, 2017 @03:46PM (#54743339)

      For home use, I don't really see the point of using these very computationally expensive codecs - it's not like you can make better rips... just smaller ones, and disk space isn't expensive anymore. My hundred-or-so DVD/Blu-Ray collection was ripped to h.264 a number of years ago, and those still work just fine.

      However for a commercial service, it's a different argument. Not only do they have tens of thousands of items in their catalogs, but there's also bandwidth to think about. For them, the investment may make sense. However if it's equally expensive, hardware-wise, to decode the streams... then they have to worry whether their customers will be willing to make the investment.

      • No worries, dude. All these codecs are designed to be far easier to decode than they are to encode, so inexpensive real-time-decoding players for media consumption are feasible on day one (say, either streaming from the internet, or streaming from a shiny plastic disc spinning at a fixed rate).

        Normally you would produce content in an uncompressed format for maximum quality, and then compress it in non-real time. As time progresses, and Moore's law progresses (and the programmers coding the codec refine the a


        • by Whiteox ( 919863 )

          I'd like to see you watch anything using the HEVC (H.265) codec. It has to be the worst out there, and I wish it would go away.

      • by thegarbz ( 1787294 ) on Tuesday July 04, 2017 @05:28PM (#54743729)

        My hundred-or-so DVD/Blu-Ray collection was ripped to h.264 a number of years ago, and those still work just fine.

        So you compressed a high-quality source into a smaller file, but you say there's no point in potentially doing it with better quality? If you still have the original collection, then you could get a quality improvement.

        If you don't have the original, however, you should note that files aren't getting smaller, and that "not expensive" 4TB HDD will quickly fill up if you value your 4K content.
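
The "quickly fill up" point is easy to put numbers on. A quick sketch, with an assumed average bitrate of 50 Mbps (roughly 4K Blu-ray class; that figure is my assumption, not something from the thread):

```python
# How fast does 4K content fill a "not expensive" 4 TB drive?
BITRATE_MBPS = 50          # assumed average 4K bitrate (not from the thread)
MOVIE_SECONDS = 2 * 3600   # a two-hour film
DRIVE_GB = 4000            # 4 TB, in decimal gigabytes

movie_gb = BITRATE_MBPS * MOVIE_SECONDS / 8 / 1000  # Mbit -> MB -> GB
films_per_drive = DRIVE_GB / movie_gb

print(f"One film: {movie_gb:.0f} GB")            # 45 GB
print(f"Films per drive: {films_per_drive:.0f}") # 89
```

So under that assumption a 4 TB drive holds fewer than a hundred 4K films, versus several hundred at DVD/Blu-ray-rip sizes.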

      • For home use, I don't really see the point of using these very computationally expensive codecs - it's not like you can make better rips... just smaller ones, and disk space isn't expensive anymore. My hundred-or-so DVD/Blu-Ray collection was ripped to h.264 a number of years ago, and those still work just fine.

        However for a commercial service, it's a different argument. Not only do they have tens of thousands of items in their catalogs, but there's also bandwidth to think about. For them, the investment may make sense. However if it's equally expensive, hardware-wise, to decode the streams... then they have to worry whether their customers will be willing to make the investment.

        Don't see the point?? Who buys DVDs and Blu-rays these days to rip them? I never even bothered to get a Blu-ray player. Netflix, Amazon, and Hulu are streaming services. They presumably want to use this codec for streaming, in which case chewing up computational resources, which are plentiful on most PCs, is an acceptable tradeoff for better quality and, above all, probably less bandwidth consumption. This is especially important now that Trump's new FCC Chairman Ajit Pai is getting ready to stomp th

        • It's not acceptable. Computational cost is the main reason why I get an hour or more of extra browsing time out of my battery when using Safari, which generally uses the H.264 (MP4) codec when it's available in video streams, while Chrome is trying to push their computationally expensive VP9 codec down our throats (particularly on YouTube). This is the same shit all over again.
          • by AaronW ( 33736 )

            With hardware acceleration, AV1 is not that battery intensive. Just about everyone is adopting AV1 except Apple. What makes a big difference is whether the underlying hardware supports offloading AV1 encoding and decoding. All of the major hardware vendors will support AV1, including Intel, AMD, ARM, and Nvidia. Apple is the lone holdout. Most web browsers except Safari support AV1, which builds on VP9. VP9 is also not supported by Apple.

      • by Anonymous Coward

        The keyword here is royalty free... h.264 is encumbered by MPEG-LA

      • Don't worry, decoding is always orders of magnitude less intensive than encoding. A regular PC will probably be enough for full HD playback in the beginning.
        Later, hardware support will appear both for PC GPUs and for mobile SoCs (there are hardware manufacturers in the alliance), and it'll be feasible to play even on mobile devices.
        Meanwhile it can coexist with current codecs.
        Funnily, this is one of those cases in which a few companies (the Alliance for Open Media) ally to fight other companies (H.265's paten
    • by Anonymous Coward

      a Lenovo T540p notebook with an i7-4800MQ, 8GB RAM

      There's your problem. You have to use the proper tool for the job.

      8 hours and 42 minutes to encode a 1080p@24fps 40 second long sequence with a target bitrate of 1.5Mbps.

      For comparison, I'm using a PC with 32GB RAM and an AMD 8 core CPU (the old FX, not the new Ryzen)

      It takes approx. 6 hours to encode a 2 hour 30 minute video, 1080p@25fps, HEVC/H.265, with a bitrate of 5Mbps.
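
Putting the two anecdotes side by side makes the gap concrete (a back-of-the-envelope sketch; the machines differ, so the numbers are only indicative of how far the alpha AV1 encoder currently is from a mature HEVC encoder):

```python
# Encode time expressed as a multiple of real time, for the two
# anecdotes above (different hardware, so only a rough comparison).

def realtime_factor(encode_seconds: float, clip_seconds: float) -> float:
    """Seconds of encoding work per second of video."""
    return encode_seconds / clip_seconds

# Experimental AV1 encoder: 8 h 42 min for a 40 s clip
av1 = realtime_factor(8 * 3600 + 42 * 60, 40)

# HEVC/H.265: about 6 h for a 2 h 30 min video
hevc = realtime_factor(6 * 3600, 2 * 3600 + 30 * 60)

print(f"AV1 (alpha): {av1:.0f}x slower than real time")  # 783x
print(f"HEVC:        {hevc:.1f}x slower than real time")  # 2.4x
```

Roughly three hundred times slower, which is what you'd expect from an unoptimized research encoder versus a tuned production one.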

    • GPU : Yes (Score:5, Informative)

      by DrYak ( 748999 ) on Tuesday July 04, 2017 @04:34PM (#54743533) Homepage

      I wonder if GPUs can speed things up?

      Given that [aomedia.org] AMD, Nvidia, Intel, ARM, and Broadcom are also on board (besides content providers like Netflix, Amazon, Hulu, and Google), you can bet that yes, there are going to be GPU implementations.

      (And if you've followed the posts of Xiph, you know that they take GPUs into account from the beginning.)

      Also, there are already cloud-based solutions [bitmovin.com] that distribute the compression workload across a cluster.
      (The video is split into smaller segments, each segment is independently compressed by a separate job on the cluster, then the compressed streams are concatenated together.)
      And Bitmovin is already providing alpha support for AV1 as it is now, so they can already test their solution, and on the day, in a few months, when AV1 hits version 1.0, both they and their users will have pipelines that are already tested.
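
The split-compress-concatenate idea described above can be sketched in a few lines of Python. This is a toy illustration, not Bitmovin's actual pipeline: zlib stands in for the video encoder, a "segment" is just a slice of bytes, and a thread pool stands in for separate cluster jobs.

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

def encode_segment(segment: bytes) -> bytes:
    """Stand-in for a video encoder: compress one segment independently."""
    return zlib.compress(segment)

def parallel_encode(stream: bytes, segment_size: int) -> list[bytes]:
    """Split the stream into fixed-size segments, encode each in its own job."""
    segments = [stream[i:i + segment_size]
                for i in range(0, len(stream), segment_size)]
    with ThreadPoolExecutor() as pool:  # separate cluster jobs, in reality
        return list(pool.map(encode_segment, segments))

raw = b"frame data " * 10_000
encoded = parallel_encode(raw, segment_size=16_384)
# Each segment decodes independently; concatenating the decoded segments
# reproduces the original stream.
assert b"".join(zlib.decompress(seg) for seg in encoded) == raw
```

In a real video pipeline the cuts have to land on keyframe boundaries so each job can encode its segment without seeing its neighbours, and the concatenation happens at the container level rather than on raw bytes.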

      Actually, the only major player missing here is Apple.
      Probably because they are betting all their marbles on their own patented H.265/MPEG-H HEVC.
      They are among the owners of the patents, so using/licensing H.265 comes much cheaper for them.
      Which was the main reason for everybody else to drop H.265 and consider joining AOMedia for AV1: between the original patent pool, the competing pools that have formed around other sets of patents, and the patent trolls waiting to sue for their share, licensing H.265 is a much more expensive adventure than licensing H.264/MPEG-4 AVC was. So much so that H.265 licenses make up a significant part of the price of embedded ARM SoCs such as those used in cheap phones, ruining their competitiveness.

  • by Anonymous Coward

    but I'm going to wait for AV2

  • While nice... (Score:2, Interesting)

    by Anonymous Coward

    It will take about 10 years for it to become a viable standard, considering how many devices out there won't support it. I know that I won't rush out to replace my Smart TV that can now handle H.265 and H.264. Nor will I be re-encoding my videos until forced to, which would be around 15-20 years IF IF IF this 'new video codec' (heard this story before) becomes viable.

    • It will take about 10 years for it to become a viable standard.

      No, the AV1 bitstream format will be frozen later this year, browsers will add support for AV1 soon after that (Mozilla, for example, is already working on it [mozilla.org] in Firefox), and YouTube, which is the world's largest video site, intends to start using AV1 as soon as possible. AV1 will be adopted quickly.

      Considering how many devices out there won't support it.

      Many devices will be able to support it in software. My iPhone 7 doesn't "officially" support VP9 [wikipedia.org] but VP9 video plays back just fine in VLC for iOS.

      Nor will I be re-encoding my videos until forced

      You don't have to re-encode anything unless you want to.

    • by rtb61 ( 674572 )

      Let's see the forces opposed to your choice. All media companies distributing compressed digital content (https://en.wikipedia.org/wiki/High_Efficiency_Video_Coding - pretty fucking expensive at 25 million dollars, for something that is a straight mathematical model pretty much any company could have come up with; nothing new, just applied maths). Manufacturers also save the fee and, as a major bonus, get to replace all outdated equipment or, if the customer base is too cranky, supply a software update.

      So billio

    • It depends whether we can manage to use existing hardware to accelerate decode.

      My devices that can handle streaming video include a TiVo, PS3, PS4, and an Amazon fire stick. All of them are software upgradable. A lot of smart TVs are Android based, so they are as well. A company like Netflix has decent bandwidth savings if only a few percent of its users switch, and the switchover will be nearly invisible to them. The app will just query the hardware to find what codec is available.

      Obviously there's no
  • by Anonymous Coward

    What about Dirac? Invented for the exact same reason. Theora anyone? Same thing. VP1? Again.

    What's got me slightly pissed off is why the fuck these assholes all went "Nope, fuck off" to all of those in turn? Were they hoping to make enough money with locked down codecs at the time that they wanted the ability to enforce rights in codecs? Or just NIH?

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      What about Dirac? Invented for the exact same reason. Theora anyone? Same thing. VP1? Again.

      What's got me slightly pissed off is why the fuck these assholes all went "Nope, fuck off" to all of those in turn?

      It takes a long time to go from inventing the standard to producing a sufficiently competent encoder. Hell, look at MP3 encoding and how right now LAME is tons better than the first MP3 encoder. Yet...

      Were they hoping to make enough money with locked down codecs at the time that they wanted the abili

    • by Anonymous Coward

      Or maybe you don't know what you're talking about?

      AV1 is an extension of VP9, which traces back to VP1, and includes Opus for audio, developed by Xiph as a successor to Vorbis and Speex. Dirac is actually the only NIH project, developed by a single "company": the BBC.

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      The economics changed. There are enough large streaming services now that it's cheaper for each of them to work with the others developing a new format than to keep licensing the latest and greatest codecs from MPEG LA. Older codecs aren't up to the challenge of streaming 4K video over the shitty connections that pass for broadband in the States.

    • by Lennie ( 16154 )

      Did you check who is involved? These are the same people who worked on VP and Dirac, plus some companies that know how to do streaming.

      Mostly the same core companies that were working on Opus at the IETF; they started this work at the IETF as well. Not sure why they went their separate way for this, though.

    • by theweatherelectric ( 2007596 ) on Tuesday July 04, 2017 @07:37PM (#54744109)

      What's got me slightly pissed off is why the fuck these assholes all went "Nope, fuck off" to all of those in turn?

      They didn't. VP9 [wikipedia.org] is used, for example, by YouTube [googleblog.com], Netflix [medium.com], and Wikipedia. Watch a video on YouTube, right-click on it, and select "Stats for nerds". If your browser supports VP9, then chances are the video will be playing back in VP9.

      AV1 is the successor to VP9.

    • Bear in mind there are two factors that led H.264 to end up as the standard for the web:

      1. There was some controversy as to whether the rival codecs (WebM etc) being offered by Google et al were actually as good at similar bitrates. There wasn't a consensus, with a significant number of people saying WebM was slightly worse.

      2. Apple, at an early stage, threw its weight behind H.264 and refused to support WebM at all, including in the web browser. Meaning that every streaming company had to either suppor

  • I noticed that at the bottom of the article the reason behind this is basically to adopt technologies to improve streaming services. Could this movement also be used to bring in new royalty-free physical media? Call me a tin-foil-hat lover, but I do not like the fact that everything is going streaming. People are so dependent on having to move nothing more than a couple of fingers for entertainment. Well, I ask you, good Sir and Madam, what are you going to do when streaming services shut down? In 15 years whe
    • VHS tapes are reaching the point where they are degrading. You can still rip Blu-Ray and DVD.

      • But even then they are salvageable, even if a big chunk is damaged. Some media files won't play correctly if even 10% is damaged.

        A big scratch on the wrong side of a DVD? Might render it unplayable. A VHS tape that got eaten by a deck? Simple splice job...

        (thinking about delamination problems on laserdiscs, even worse)

        • by Bert64 ( 520050 )

          Which is why DRM is such a bad thing...
          Digital brings the benefit of perfect copies, you can backup the media and keep the original safe. A spliced VHS tape may be playable while damaged, but a digital backup would be perfect.
          When dealing with kids, or media that will be played/kept in hostile conditions, it's always sensible to make backups.
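
The "perfect copies" point above is easy to act on in practice: checksum each rip and verify the backup against it. A minimal sketch (the function names are mine, not from any particular tool):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a file in 1 MiB chunks so large rips don't need to fit in RAM."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def backup_is_intact(original: Path, backup: Path) -> bool:
    """A digital backup is bit-perfect exactly when the hashes match."""
    return sha256_of(original) == sha256_of(backup)
```

Unlike a spliced VHS tape, a mismatch here tells you immediately that the copy is no longer perfect and should be re-made from the original.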

  • by carlhaagen ( 1021273 ) on Tuesday July 04, 2017 @04:31PM (#54743521)
    I see a problem with Bitmovin's comparisons (linked in the article) not telling us which encoder was used for the H.264 and H.265 tests. This matters tremendously - there are shitty encoders producing bad H.264/265, and there are amazing encoders producing excellent H.264/265 at the same bitrate. It's like comparing a new audio codec to MP3 using Xing MP3 instead of LAME, and calling the test legit.
    • Most likely they are using the H.264/H.265 reference encoders. The reason most commercial encoders vary in quality is that they all use different shortcuts to achieve faster encode times.
    • This matters tremendously.

      Indeed. Especially if you are using GPU acceleration as well. But I wonder if that is part of the equation. NVENC produces garbage compared to FFmpeg's H.265 encoder, but I still use it because a 10x speed increase matters sometimes. I imagine it matters even more when you're serving half the internet's bits to customers.

  • Unless the principals have some nice, cheap silicon in their back pockets, AV1 likely won't go anywhere. AVC/H.264 prevailed over VC-1 in part due to reference hardware decoder designs, and HEVC/H.265 already has dedicated decoder chipsets that can fit in the next iteration of smartphones and STBs easily, freeing up GPUs for whiz-bang interface or more critical number-crunching work.
    • by Luthair ( 847766 ) on Tuesday July 04, 2017 @05:32PM (#54743741)
      If you look at the membership list, it's the who's who of software, hardware, streaming, and editing. It looks like it has a very real chance of happening unless some of those members are actively sabotaging the process.
      • by Anonymous Coward

        They are all champing at the bit to have everybody buy new hardware, because the hardware they already have won't run this new thing...

  • by Anonymous Coward

    This has nothing to do with any of these companies standing up for open access or freedoms. It is all about not wanting to pay royalties and keeping all the money for themselves.

    • It is all about not wanting to pay the royalties

      Yes. I, too, don't want to pay royalties just to work with a video file and transmit it over the internet. I don't see why video licensing needs to be different; I want it to be the same as HTML and PNG and JPEG and all the other royalty-free protocols and formats that make the web and the internet possible. We have that now with VP9 [wikipedia.org] and we will have it with AV1.

      In this instance their agenda matches our agenda. That's a good thing. Exploit it.

  • So do I have this right? Basically AV1 is VP10, but with two pieces taken from Daala (new symbol coding inspired by Daala, and the directional deringer)?

    I'm not sure what other pieces ended up in AV1. There must have been something from Thor in there.

  • This is great and all, especially if it winds up in the WebM wrapper. But can the Slashdot audience please inform the general plebs about the death of GIFs already? Gfycat and Imgur's .gifv are not GIFs. They are motion-compressed video files, forced into H.264 inside MP4 because of Apple product users. Please, god, can we come up with a new word? The tools people search for in 'how to make gifs' are still out there and I wish them to be quite DEAD.
    • by tepples ( 727027 )

      What's a better term for "low-definition silent video, often with a low and/or variable frame rate, no longer than 15 seconds"?

  • Irony (Score:4, Interesting)

    by hackel ( 10452 ) on Wednesday July 05, 2017 @02:20PM (#54748921) Journal

    There's something very ironic about these three companies joining an "open media" alliance, while they all rely on DRM *extensively*.
