FFmpeg Announces High-Performance VP8 Decoder

An anonymous reader writes "Three FFmpeg developers — Ronald Bultje, David Conrad, and x264 developer Jason Garrett-Glaser — have written the first independent, free implementation of a VP8 video decoder. Benchmarks show that it's as much as 65% faster than Google's official libvpx. The announcement also gives a taste of what went into the development process, as well as the optimization techniques used. Currently it's only fully optimized on x86, but ARM and PowerPC optimizations are next in line for development."
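As a rough way to reproduce the kind of comparison the summary mentions, one can time both decoders on the same file. This is only a sketch: sample.webm is a placeholder, it assumes an ffmpeg build that includes the new native decoder and a libvpx install that provides vpxdec, and wall-clock timing of a whole process is much cruder than the per-frame numbers in the announcement.

import subprocess
import time

SAMPLE = "sample.webm"  # placeholder input file

def time_cmd(cmd):
    """Run a command, discard its output, return elapsed wall-clock seconds."""
    start = time.time()
    subprocess.run(cmd, check=True,
                   stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    return time.time() - start

# ffmpeg: decode the stream and throw the frames away.
ffmpeg_secs = time_cmd(["ffmpeg", "-i", SAMPLE, "-f", "null", "-"])

# libvpx: decode without blitting the frames anywhere.
vpxdec_secs = time_cmd(["vpxdec", "--noblit", SAMPLE])

print(f"ffmpeg (native VP8 decoder): {ffmpeg_secs:.2f}s")
print(f"vpxdec (libvpx):             {vpxdec_secs:.2f}s")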

Comments Filter:
  • Spec' Writing Course (Score:5, Interesting)

    by Manip ( 656104 ) on Saturday July 24, 2010 @09:23AM (#33012944)

    As someone who spends most of their work day implementing someone else's specifications, I know exactly where they are coming from. I honestly cannot tell whether people are bad at writing specs because they're simply lazy or because they've never been trained to document their file formats completely.

    When I think back to my university days, we never really learned how to write a specification, and I wonder whether that wouldn't be a course worth teaching. Perhaps you'd have the students write a program that outputs a set of complex information in some format, and then have them write an end-to-end specification for both reading and writing that format.

    My favourite moments are when you realise that the current implementation not only doesn't follow the spec' but directly contradicts it (e.g. A "bool" that can be TRUE, FALSE, "", "null", or "nan").

    • Re: (Score:1, Informative)

      by Anonymous Coward

      e.g. A "bool" that can be TRUE, FALSE, "", "null", or "nan"

      Well, since TRUE and FALSE are uppercase (i.e. preprocessor definitions or constants), it's obvious that "bool" was not meant to be a distinct type in this hypothetical language, but rather a typedef for an integer type. Nothing a coding standard can't rectify.

    • Where I went to school we actually did have a course in business writing that included writing specifications, proposal requests, etc. I didn't enjoy it at the time but it has come in useful on many occasions.
      • Re: (Score:2, Insightful)

        by sdiz ( 224607 )

        "specifications" in business writing class is not technical specification.
        You don't describe how you convert colorspace, inverts the matrix, etc in them

    • by daveime ( 1253762 ) on Saturday July 24, 2010 @10:44AM (#33013368)

      Come on, everyone knows that a boolean can have the values TRUE, FALSE and FILE_NOT_FOUND.

      • Re: (Score:1, Funny)

        by Anonymous Coward

        "Oh, Fry, I love you more than the moon and the stars and the - POETIC IMAGE NUMBER 37 NOT FOUND."

      • s/FILE_NOT_FOUND/NULL
        <me action='self-whoosh' />
    • by daemonc ( 145175 )

      When I think back to my university days, we never really learned how to write a specification, and I wonder whether that wouldn't be a course worth teaching.

      At WVU we had Software Engineering, which was pretty much entirely about writing specs, and is required for all CS majors.

      Most people think we're just a party school (which, for the most part, is true), but the more I hear about other universities, the more I realize that our computer science and engineering programs are probably some of the best in the country.

    • A "bool" that can be TRUE, FALSE, "", "null", or "nan"

      That's why I like Perl so much. Anything can be a bool, and it's easy to understand: if it is "something" it is true; otherwise it is false (like 0, "0.0", "", undefined, or that NaN nonsense). It's the sort of thing that drives me crazy in places like PHP or JavaScript, where you suddenly need "===" operators or crazy tests for something that should be completely obvious. (There's a sketch of the defensive alternative at the end of this thread.)

      Of course, that makes offtopic, flamebait and whatnot all true.
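      A small Python sketch of the defensive alternative referred to above. The field name and the accepted spellings are hypothetical, but it shows why an explicit mapping beats relying on a language's truthiness rules when a spec allows a "bool" to be TRUE, FALSE, "", "null", or "nan".

      TRUTHY = {"true", "1", "yes"}
      FALSY = {"false", "0", "", "null", "nan"}

      def parse_flag(raw):
          """Normalise a sloppily specified boolean field to a real bool."""
          value = str(raw).strip().lower()
          if value in TRUTHY:
              return True
          if value in FALSY:
              return False
          raise ValueError(f"not a recognisable boolean: {raw!r}")

      # Plain truthiness gets several of these wrong:
      assert bool("null") is True          # just a non-empty string
      assert bool(float("nan")) is True    # NaN is truthy in Python
      assert parse_flag("NULL") is False   # explicit mapping does what was meant
      assert parse_flag("TRUE") is True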

  • We need WebM (Score:5, Informative)

    by ciaran_o_riordan ( 662132 ) on Saturday July 24, 2010 @09:35AM (#33013004) Homepage

    Abolishing software patents will take years. Most of the short-term goals are a waste of time, or a distraction by companies that don't really want to end the problem, but WebM is a project that would have a big impact, and has a good chance of succeeding. Great to hear that Xiph continues to support it!

    File formats and compatibility are the biggest problem caused by software patents. They're how monopolies get too powerful, and they're how companies with people-friendly terms get locked out of commercial software development. (Commerce isn't the only valid form of software development, but it's important for the sustainability of a project.)

  • by Anonymous Coward

    I usually rip my DVDs to ~1.2GiB Xvid avi files at native res using mencoder (not reencoding the audio), and have been doing this for many years. Does anyone know what combination of muxer and audio/video codecs is preferred nowadays? I'm thinking of using Matroska with Vorbis for audio but I'm completely lost as to what video codec to use. As for which tools to use, I find most of what I need in the Debian repositories but I'm open to suggestions.

    Also, I prefer quality over size, but over 1.2GiB for a 90-minute DVD is too much IMHO.

    • x264 in an avi container is the most popular. x264 is going to take a little more than double the CPU to play back. The quality is better than Xvid even for DVD source, but not dramatically better. It is substantially better for HD rips.

      If you use a weaker CPU for playback you are going to want to stick with Xvid. And if you are concerned about 1.2GB of disk space when 1TB HDDs are less than $100, you probably are.

      "I'm thinking of using Matroska with Vorbis for audio but I'm completely lost as to what video codec to us

      • Re: (Score:1, Insightful)

        by Anonymous Coward

        Using a modern audio codec like Vorbis is hardly "killing" the audio. Vorbis is generally transparent at around q3 and still quite respectable below that, and can thus offer savings ranging from "pretty good" (~1/2 with 192kbps AC-3) to "very significant" (~1/16 with raw PCM); there's some rough arithmetic in the sketch below.

        Also, H.264 in AVI is an abomination, like sex with other men or eating shrimp. If you really want to risk eternal suffering in the fiery depths of encoding hell, go right ahead, but don't say I didn't warn you.

        • "Using a modern audio codec like Vorbis is hardly "killing" the audio"

          It is, compared with keeping the source multi-channel audio.
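        To put rough numbers on the savings claimed a couple of posts up, assuming Vorbis -q3 lands near its nominal ~112 kbps and the PCM track is 48 kHz / 16-bit stereo (both figures are assumptions, not measurements):

        vorbis_q3 = 112                   # kbps, nominal bitrate for -q 3 stereo
        ac3 = 192                         # kbps, a common stereo AC-3 rate
        pcm = 48_000 * 16 * 2 // 1000     # kbps, raw 48 kHz / 16-bit / stereo

        print(f"vs 192 kbps AC-3: {vorbis_q3 / ac3:.2f}x the size")   # ~0.58, roughly half
        print(f"vs raw PCM: about 1/{pcm / vorbis_q3:.0f} the size")  # ~1/14, in the 1/16 ballpark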

    • Re: (Score:3, Insightful)

      by TheRaven64 ( 641858 )

      Also, I prefer quality over size, but over 1.2GiB for a 90-minute DVD is too much IMHO.

      Really? It's still a factor of 5-10 improvement over the DVD...

      When I got my first DVD drive, it went in a computer with a 20GB hard disk. For about half of what I paid for that disk (less once you factor in inflation), I can now buy a 1.5TB disk. Most DVDs aren't full; they only use 6-8GB of space, so that's enough for about 200 DVDs. At that price, why bother messing around with transcoders and recreating the menus - just store them as disk images and then you can transcode them later if you want.

      • by fandingo ( 1541045 ) on Saturday July 24, 2010 @11:01AM (#33013478)

        That's sort of what I do, but I would like to watch my DVDs on a dedicated device, which doesn't support ISOs.

        I have a 3TB RAID array that I'm just beginning to populate. I rip the full ISO, and then encode the videos (usually TV episodes) to H.264+AAC in MP4. I used to use mkv, but it doesn't have good device support. I use a UPnP server on my Linux box to share with my PS3, which works great. Also, MP4 (really m4v) is great for iDevices as well, so I have that flexibility if I want it.

        I encode with HandBrake, which is OK, although I'm not happy with the Linux support. Since it's so Mac-centric, there isn't any support for the most recent release of GNOME (so no distros released after March 2010 work), and I have to run a dev version. I want really high-quality encodes, and I get pretty much perfect quality; films run about 600-800MB/hr, while animation is all over the place but still looks good at 280-600MB/hr.

        I don't plan to delete the ISOs until my disk space is full. This way if technology changes, then I can still encode from source rather than from another encode.

        That being said, I think that h.264 will be around for many years.
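        A sketch of the ISO-to-MP4 step described above, driving HandBrakeCLI from Python. The paths, title number and quality value are placeholders, and the audio encoder name (faac here) varies between HandBrake versions.

        import subprocess

        def encode_episode(iso_path, title, out_path):
            """Encode one DVD title from a ripped ISO to H.264+AAC in an MP4/M4V."""
            subprocess.run([
                "HandBrakeCLI",
                "-i", iso_path,       # ripped DVD image
                "-t", str(title),     # DVD title number (one per episode)
                "-o", out_path,       # .m4v keeps the PS3 and iDevices happy
                "-e", "x264",         # H.264 video
                "-q", "20",           # constant quality; lower = better/bigger
                "-E", "faac",         # AAC audio (name depends on HandBrake version)
            ], check=True)

        encode_episode("show_disc1.iso", 1, "show_s01e01.m4v")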

      • Not only does this make sense, but sometimes later never comes.

        I rip DVDs straight to a drive array as plain isos. The original plan was to get around to re-encoding the video as h264 when the drives were full / I could be bothered. I don't buy new drives very often but the drive array is still growing faster than it is filling.

        The next hike is 2TB drives to replace some of the oldest in the array now that they are down to 100 quid, prompted more by errors starting to show up on an old drive than by a lack of space.

      • At that price, why bother messing around with transcoders and recreating the menus

        Well, I for one would rather the menus not be there. If I want them I'll use the physical DVD. The other reason for transcoding is to reduce the file size so that it streams over wireless. Those little media players are cheap and work great - but often people do not have a CAT5 cable going to their TV. And while wireless might be fast enough most of the time, a little interference and you will notice the player start to stutter. Lower bandwidth requirements result in more reliable streaming.
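        Back-of-the-envelope numbers for that point (all assumed, not measured): DVD MPEG-2 can peak near 10 Mbit/s, while a busy or interference-prone 802.11g link may only sustain a few Mbit/s, so a lower-bitrate transcode has far more headroom.

        dvd_peak = 10.0       # Mbit/s, worst-case DVD video bitrate (assumed)
        transcode = 2.0       # Mbit/s, a typical SD H.264 transcode (assumed)
        degraded_wifi = 6.0   # Mbit/s, pessimistic 802.11g throughput (assumed)

        print("DVD-rate stream fits the degraded link:", dvd_peak <= degraded_wifi)    # False
        print("Transcoded stream fits the degraded link:", transcode <= degraded_wifi) # True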

    • Re: (Score:1, Informative)

      by Anonymous Coward

      Use x264 to encode the video to H.264; the audio can be anything you like if you mux into Matroska. If you care about quality then forget about using Xvid. OK, it seemed great several years ago, but next to video encoded with x264 it is *pitifully* bad. You can use ffmpeg or mencoder to handle the cropping, scaling, muxing, encoding parameters, etc. ffmpeg is better in some ways because it can mux successfully into mkv or mp4, while mencoder is only really useful for avi or outputting raw video.

      If a movie looks g
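      A sketch of that workflow with ffmpeg: re-encode the video with x264, leave the original audio untouched, and mux into Matroska. Filenames and the crop values are placeholders, and the option names assume a reasonably recent ffmpeg build with libx264 enabled.

      import subprocess

      subprocess.run([
          "ffmpeg",
          "-i", "dvd_rip.vob",              # source video
          "-c:v", "libx264",                # H.264 video via x264
          "-crf", "20",                     # constant quality; lower = better/bigger
          "-preset", "slow",                # slower presets compress better
          "-vf", "crop=720:576:0:0",        # example crop; measure your own source
          "-c:a", "copy",                   # pass the AC-3/DTS track through untouched
          "movie.mkv",                      # Matroska container
      ], check=True)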

    • Re: (Score:1, Informative)

      by Anonymous Coward

      Personally, I've been going with H.264 and native audio (either AAC or DTS) in an MKV file. I usually throw in the subtitles and any director commentary tracks (downmixed to 96kbps stereo MP3 at 44.1kHz).

      I've been doing the same with my Blu-ray rips, but get frustrated because HandBrake and VLC can't handle the on-disc subtitles.
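      A sketch of the mux step that setup implies, using mkvmerge: the H.264 video, the untouched main audio track, the downmixed commentary and the subtitles all end up in one Matroska file. The input filenames are placeholders.

      import subprocess

      subprocess.run([
          "mkvmerge", "-o", "movie.mkv",
          "video.h264",                       # H.264 video stream
          "audio.dts",                        # main audio, passed through untouched
          "--track-name", "0:Commentary",     # label the next file's audio track
          "commentary.mp3",                   # 96 kbps stereo downmix
          "subs.srt",                         # subtitles
      ], check=True)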

    • by imroy ( 755 )

      Here's what I use:

      • MPEG-4 AVC video (a.k.a. "H.264")
      • MPEG-4 AAC audio
      • MPEG-4 container

      Notice a pattern there?

      For Debian users, add the debian-multimedia repository and install x264, faac, and gpac (for the MP4Box tool).
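      A sketch of the pipeline that package list implies: x264 for the video, faac for the audio, MP4Box to mux the result. Filenames are placeholders and the quality settings are just reasonable starting points.

      import subprocess

      def run(cmd):
          subprocess.run(cmd, check=True)

      run(["x264", "--crf", "20", "-o", "video.264", "input.y4m"])    # AVC video stream
      run(["faac", "-q", "120", "-o", "audio.aac", "audio.wav"])      # AAC audio
      run(["MP4Box", "-add", "video.264", "-add", "audio.aac", "movie.mp4"])  # MP4 mux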

    • Until I can play mkv files in a cheap DVD player, they're a non-starter for me. AVI might be rubbish, but it works.

      I also wanted to chop up an mkv file into pieces the other day, and there doesn't seem to be an equivalent of VirtualDub for this; I thought there would be something by now.

      Is matroska gaining any support in the mainstream world or is it just another niche format like ogg?

    • I usually rip my DVDs to ~1.2GiB Xvid avi files at native res using mencoder (not reencoding the audio), and have been doing this for many years. Does anyone know what combination of muxer and audio/video codecs is preferred nowadays?

      Speaking for myself, I use XviD for video, raw AC3 or perhaps Ogg Vorbis for audio, and mux everything together into an mkv file for best results.

      The big question I've faced is whether to use h264 or not for video. After considering this for a long time, I finally came to the conclusion
