BBC Lowers HDTV Bitrate; Users Notice

aws910 writes "According to an article on the BBC website, BBC HD lowered the bitrate of their broadcasts by almost 50% and are surprised that users noticed. From the article: 'The replacement encoders work at a bitrate of 9.7Mbps (megabits per second), while their predecessors worked at 16Mbps, the standard for other broadcasters.' The BBC claims 'We did extensive testing on the new encoders which showed that they could produce pictures at the same or even better quality than the old encoders ...' I got a good laugh off of this, but is it really possible to get better quality from a lower bitrate?"
This discussion has been archived. No new comments can be posted.

  • Focus group... (Score:2, Insightful)

    by gandhi_2 ( 1108023 )

    ...of blind retards.

    • by account_deleted ( 4530225 ) on Thursday December 17, 2009 @02:18PM (#30476256)
      Comment removed based on user account deletion
      • by Anonymous Coward on Thursday December 17, 2009 @02:38PM (#30476590)

        FTA: ""Even my wife can see a reduction in picture quality and she's got cataracts," wrote one. "

        They must have a pretty big screen if she can see that difference from the kitchen.

        • Re: (Score:3, Funny)

          by geekoid ( 135745 )

          That's just wrong and offensive. I cannot believe you would say something so neanderthal.

          This is the modern age, asshole, and we put TVs in the kitchen.

          No TV in the kitchen, sheesh.

    • Focus group of blind retards.

      "And do you know why? Because they're Scotts! Ha ha ha ha hah!"

      Sigh...

    • Re:Focus group... (Score:5, Informative)

      by jasonwc ( 939262 ) on Thursday December 17, 2009 @02:37PM (#30476580)

      Yes, it IS possible to get higher picture quality out of a lower bitrate, but not with all else equal. For example, you can get higher quality with CPU-intensive settings at H.264 Level 5.1 than you can at Level 4.1 (what Blu-Rays/HD DVDs use), at the same bitrate. You're giving up CPU cycles in decoding for lower video size. This is why x264 can produce near-transparent encodes of Blu-Ray movies at about half the size. x264 uses much more demanding settings.

      x264 at 20 Mbit with high-quality settings is far more demanding than a 40 Mbit H.264 stream from a Blu-Ray.
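
      For a rough illustration of the speed/size trade-off (a sketch only -- it assumes ffmpeg built with libx264 is on the PATH, and the file names are placeholders), something like this encodes the same clip twice with very different amounts of encoder effort:

          import subprocess

          SOURCE = "clip.mkv"  # hypothetical input file

          # Fast, low-effort encode: cheap motion search, few reference frames.
          subprocess.run(["ffmpeg", "-y", "-i", SOURCE,
                          "-c:v", "libx264", "-preset", "ultrafast", "-crf", "22",
                          "fast.mkv"], check=True)

          # High-effort encode at the same quality target: far more exhaustive
          # analysis, so the output is typically much smaller for the same look.
          subprocess.run(["ffmpeg", "-y", "-i", SOURCE,
                          "-c:v", "libx264", "-preset", "veryslow", "-crf", "22",
                          "slow.mkv"], check=True)

      The slower encode is also somewhat harder to decode (more reference frames and b-frames, among other things), which is exactly the trade-off described above.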

      • Re:Focus group... (Score:5, Interesting)

        by postbigbang ( 761081 ) on Thursday December 17, 2009 @02:42PM (#30476684)

        In the US, Comcast uses codec compression to squeeze HD on their cable systems. When people get to see native resolution at the TV store, then get the Comcast version when they plug in their shiny new HD TV, they wonder WTF? That the beeb would put their foot on the garden hose and expect no one to notice is ludicrous.

        I wish the FCC would get involved in the US to force cable companies to limit the number of channels supported and broadcast them in the highest sustainable resolution-- or tell their users the truth about what's happening and why. Maybe we can start to get rid of the excess junk channels.

        • Re:Focus group... (Score:4, Interesting)

          by kimvette ( 919543 ) on Thursday December 17, 2009 @02:51PM (#30476832) Homepage Journal

          Adding to that, Comcast's programming is 720p, with much of it upscaled. The Blu-Ray sources you see at the stores are often 1080p, or at least 1080i. You're comparing rotten wormy apples to nice juicy oranges, where Comcast's feeds are the rotten wormy apples.

        • Re: (Score:3, Informative)

          by Maxo-Texas ( 864189 )

          I left Comcast because their *digital* signals were worse than standard TV over the antenna.

          Dish has gone the other way-- their signal *looks* crisp, but there is a lot more blockiness than there used to be. I used to have blocky outbreaks perhaps 1 or 2 times in 40 hours of viewing. Now I get blockiness 1 or 2 times per 10 hours of viewing.

        • Re: (Score:3, Interesting)

          by Znork ( 31774 )

          to limit the number of channels supported

          Of course, as long as most storylines are low def, having a lot of channels might not be a bad idea; don't assume they'll necessarily kill off the channels you consider excess junk.

          Personally I can't say I care much about resolution. As long as the format is progressive, digital and anywhere near SD increases in resolution are about the last thing I'll notice in any normal viewing situation. Improved contrast, lack of compression artifacts, sound, content, basically

  • by Locke2005 ( 849178 ) on Thursday December 17, 2009 @02:12PM (#30476146)
    They also lowered their math standards. From 16Mbps to 9.7Mbps is about a 40% reduction, not "almost 50%".
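
    A quick sanity check of the arithmetic, using just the figures from the article:

        old, new = 16.0, 9.7          # Mbps
        print((old - new) / old)      # 0.394 -> about a 39-40% reduction
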
  • Yes (Score:2, Informative)

    by Anonymous Coward
    Yes, if more time and passes are spent encoding the video, lower bitrate CAN result in a higher quality video. However, this does not appear to be the case in this instance.
    • Re: (Score:3, Insightful)

      by Bakkster ( 1529253 )

      In other words, lower bitrate can be better, but only if you compare to shitty and inefficient compression.

      • Re:Yes (Score:5, Insightful)

        by Fantom42 ( 174630 ) on Thursday December 17, 2009 @02:38PM (#30476598)

        In other words, lower bitrate can be better, but only if you compare to shitty and inefficient compression.

        And by this you mean compression that is state of the art two minutes ago, vs. today. Seriously, this field is moving pretty fast, and what you call shitty and inefficient was not long ago the best people could do. A few years ago when I was messing with the x264-svn libraries, stuff would get tweaked daily.

        Not to mention there are other factors at play with regards to compression. A well-engineered system isn't necessarily going to go for the maximum compression for a given video quality. One has to look at other limitations, such as the decoding hardware, the method by which the video is being delivered, and even the viewing devices on the receiving end.

        What is disheartening about the article is that it looks like the BBC are just in denial mode, and not really taking the complaints seriously. "Standard viewing equipment"? Seriously, what exactly are they getting at with that comment? On top of that it looks like they are trying to blame the quality of the source material, which certainly muddies the picture, but the customers who are complaining would have been used to those variations in quality before the change; they wouldn't all suddenly notice them at the same time this equipment was rolled out.

        I have respect for them sticking to their guns, but not when they are doing it with such lame excuses. Then again, the BBC spokesperson and reporter may not be the most tech-savvy individuals, and it's likely some of the message here is lost in translation. Lossy codec indeed.

    • It would also be quite remarkable to see better quality compared to any other modern encoding while reducing bitrate by 50%.

  • Yes (Score:5, Informative)

    by Unoriginal Nick ( 620805 ) on Thursday December 17, 2009 @02:13PM (#30476170)

    I got a good laugh off of this, but is it really possible to get better quality from a lower bitrate?

    Sure, if you also switch to a better codec, such as using H.264 instead of MPEG-2. However, I don't think that's what's happening in this case.

    • Re:Yes (Score:5, Insightful)

      by natehoy ( 1608657 ) on Thursday December 17, 2009 @02:25PM (#30476368) Journal

      You can also get better compression by specifying a more sophisticated compression method within the same codec, for example, since many codecs support various levels of compression.

      Generally, "better" compression (fitting a higher resolution and/or framerate into a smaller size) requires a lot more power to encode and often some more power to decode. You can use less bitrate to get a quality signal there, but you need "smarter" coders and decoders at the respective ends of the transmission. So the BBC may have upgraded their compression engine to something that can do "better" compression, thereby fitting the same resolution and framerate into a 40% smaller stream. But their customers' television sets might not have the horsepower to decode it at full quality.

      That could easily explain why the BBC's testing went so well but their consumers (with varying brands of TV sets probably mostly tested for British use with the old compression) can't keep up and render an inferior picture.

      It's also possible that, by compressing the video stream into a denser compression method, signal loss is having a greater effect than it did with the old compression method. The viewers may be seeing artifacts that are the decoder's attempts to fill in the blanks. The old compression method might have allowed a certain amount of redundancy or error correction that the new one lacks, and the loss of part of the signal has a more visible effect on the new one.
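
      As a rough sketch of that kind of mismatch (hypothetical file name, and it assumes ffprobe is installed), you can ask what a stream actually declares about itself and compare it against what a fixed decoder is rated for:

          import json
          import subprocess

          # Ask ffprobe what the video stream declares about itself.
          probe = subprocess.run(
              ["ffprobe", "-v", "error", "-select_streams", "v:0",
               "-show_entries", "stream=codec_name,profile,level",
               "-of", "json", "broadcast.ts"],        # placeholder file name
              capture_output=True, text=True, check=True)

          stream = json.loads(probe.stdout)["streams"][0]
          print(stream["codec_name"], stream.get("profile"), stream.get("level"))

          # ffprobe reports H.264 levels as integers, e.g. 40 for Level 4.0.
          if stream.get("level", 0) > 40:
              print("This stream may be too demanding for a Level 4.0 decoder")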

      • Re: (Score:3, Insightful)

        by HateBreeder ( 656491 )

        > can't keep up and render an inferior picture.

          It's not like there's a halfway point here. This is the digital age - if it can't keep up you won't see anything!

        • Re: (Score:2, Informative)

          Despite what many people claim, when you get a garbled digital signal, most systems will give you garbled artifacts on the screen. This is because in any broadcast situation you expect a certain percentage of interference and have to design to deal with that. If you only ever displayed images on a TV when the signal was perfect, you would be amazed at how often your display would blank out.

          On my TV at home I have changed the settings to turn off the "blue screen/bad signal screen". The TV does its best
          • These are two different things:
            1. Partial decode due to missing data (i.e. a picture is comprised of blocks; you can have some of the blocks but not all of them - hence partial decode)
            2. A primitive decoder's inability to decipher/keep-real-time-rate-of an advanced encoding (i.e. trying to decompress something in realtime on a weak machine will at best result in frame drops -- or completely incompatible decoding schemes: my decoder supports only MPEG-2 but I've got an H.264 movie)

            You are talking about optio

        • Re: (Score:3, Informative)

          by zippthorne ( 748122 )

          Say what?

          f(x) = 1 - x^2/2 + x^4/24 - x^6/720 + ... (where the constant diminishes rapidly, and 0<x<1)

          If you know that the higher-order terms affect the result less and less, you can drop them and still get a "good enough" though less perfect answer. You can keep dropping terms until the error is unacceptable, or in the case of something where the actual value is not critical (i.e. a block of pixels), you can keep dropping terms to reach a target number of operations and hope that the answer is sufficiently precise.
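
          That series is just the Taylor expansion of cos(x), so the idea is easy to play with (plain Python, purely illustrative):

              import math

              def cos_approx(x, terms):
                  # Partial sum of cos(x) = sum_n (-1)^n * x^(2n) / (2n)!
                  return sum((-1) ** n * x ** (2 * n) / math.factorial(2 * n)
                             for n in range(terms))

              x = 0.9
              for terms in range(1, 6):
                  approx = cos_approx(x, terms)
                  print(terms, approx, abs(approx - math.cos(x)))
              # Each extra term shrinks the error; stop when it's "good enough".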

      • So what you're saying is although they tested it, they didn't test it. ;-)
    • I got a good laugh off of this, but is it really possible to get better quality from a lower bitrate?

      Sure, if you also switch to a better codec, such as using H.264 instead of MPEG-2. However, I don't think that's what's happening in this case

      Just to amplify what has been said here a few times: yes, it is possible, and not only by changing codecs. H.264 supports many optional features that are not implemented in all decoders, and these features can have an effect on quality. Use of a variable bitrate instead of a constant bitrate can also increase quality and decrease bandwidth needs, at the cost of requiring some bursting capability or buffering to accommodate the variation. Also, there are tricks that can be played with dark and light tone masking

    • by Speare ( 84249 )
      In addition, if your higher bitrate is clogging the net and stalling every few seconds, and the lower bitrate actually allows the audio and/or video to be played in real time, then many people would say that is an improvement in quality, although not necessarily on a frame-by-frame basis.
  • by godrik ( 1287354 ) on Thursday December 17, 2009 @02:14PM (#30476190)

    but is it really possible to get better quality from a lower bitrate?

    If you are changing the compression algorithm, of course it is possible. H.264 offers a lot of optional coding tools that many encoders don't use but that any compliant decoder will still recognize.

  • Yes, of course (Score:5, Informative)

    by TheRaven64 ( 641858 ) on Thursday December 17, 2009 @02:14PM (#30476196) Journal
    Any lossy compression works by throwing away bits of the picture that the viewer might not notice. You can lower the bitrate with better psychovisual and psychoacoustic models. You're still throwing away more information, but you're doing it in a way that the user is less likely to notice. This takes more CPU time on the compressor, a more optimised encoder, or a better algorithm.
    • Re:Yes, of course (Score:4, Interesting)

      by Locke2005 ( 849178 ) on Thursday December 17, 2009 @02:23PM (#30476346)
      Fractal encoding works well in that you can zoom way in on the fractal without noticing obvious compression artifacts. However, there is no straightforward algorithm for doing the compression; as far as I know, you have to brute-force every possibility to get optimal encoding -- not something you can effectively do in real time. But if you've got several days before the segment airs in which to encode it, you should be able to get better quality out of far fewer bits.
      • Re:Yes, of course (Score:4, Informative)

        by TheRaven64 ( 641858 ) on Thursday December 17, 2009 @02:38PM (#30476604) Journal
        Yup, fractal encoding is pretty impressive. I played with it a bit on a 386, when it took about ten minutes to compress a 320x240 image. I've not heard of any newer algorithms that improve matters much. More interesting is topological compression, which has most of the same advantages as storing a vector image (resolution independent) and a raster image (can come from a sampled source). You can extend these to video by modelling the video data as a 3D volume, rather than as a sequence of frames. The topological changes in the time dimension are usually quite gradual, and it's easy to trade spatial and temporal resolution. The really nice thing about this approach is that it's resolution independent in three dimensions, not just two, so it's easy to generate a signal that matches the display's refresh rate.
    • Re:Yes, of course (Score:5, Informative)

      by Andy Dodd ( 701 ) <atd7NO@SPAMcornell.edu> on Thursday December 17, 2009 @02:27PM (#30476408) Homepage

      LAME was a pretty good example of this for MP3 - Eventually it was able to achieve (somewhat) better quality at (somewhat) lower bitrates than the reference encoders.

      Vorbis, similarly, had the AoTUV tuning - This provided significant rate/distortion tradeoff improvements compared to a "vanilla" encoder, without changing the decoder.

      However, 40% reduction in bitrate with an increase in quality is very difficult unless the original encoder was CRAP. (Which is actually a definite possibility for a realtime hardware encoder.) Also, it's far more likely to have such improvements with H.264 or MPEG-4 ASP, not nearly as likely with MPEG-2, which had a far less flexible encoding scheme.

      • Re: (Score:3, Informative)

        by TheRaven64 ( 641858 )

        Which is actually a definite possibility for a realtime hardware encoder.

        Not just a realtime hardware encoder, but likely a first-generation encoder. Most compression standards are now designed with some headroom. When AAC was first introduced, Dolby provided two encoders, a consumer-grade and a professional encoder. The consumer-grade one was only slightly better than MP3, but ran much faster. The pro encoder was a lot slower but the quality was noticeably better. More recent encoders produce even better quality. A 40% decrease in bitrate is about what I'd expect going from a single-pass to a two-pass H.264 encoder, and it's entirely possible that a newer single-pass encoder can do the same sort of thing just by using a longer window now that RAM is a lot cheaper.

        • Re:Yes, of course (Score:4, Informative)

          by wagnerrp ( 1305589 ) on Thursday December 17, 2009 @05:10PM (#30479144)

          A 40% decrease in bitrate is about what I'd expect going from a single-pass to a two-pass H.264 encoder, and it's entirely possible that a newer single-pass encoder can do the same sort of thing just by using a longer window now that RAM is a lot cheaper.

          No, there is no difference in compressibility between a single pass and a two pass encoder. The two pass encoder simply allows you to set the quantizer so as to very accurately hit a target average bitrate.
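
          To make the distinction concrete, here is a toy sketch of two-pass rate control (nothing like a real encoder, and the numbers are made up): pass one measures how hard each frame is, pass two hands out a fixed bit budget in proportion, so the average lands exactly on target.

              # Pass 1 output: a made-up complexity figure per frame.
              frame_complexity = [3.0, 1.0, 1.0, 8.0, 2.0]
              target_avg_bits = 100_000          # desired average bits per frame

              budget = target_avg_bits * len(frame_complexity)
              total = sum(frame_complexity)

              # Pass 2: hard frames get more bits, easy frames fewer,
              # but the overall average stays exactly on the target.
              allocations = [budget * c / total for c in frame_complexity]

              print([round(a) for a in allocations])
              print(sum(allocations) / len(allocations))   # 100000.0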

    • Re: (Score:3, Funny)

      by trb ( 8509 )
      Lossless compression also works by throwing away data. In a simple case, if you have a still image and you store it in a file that's 1000x1000 pixels and 24 bits deep with 8 bits each of red, green, and blue, you store that uncompressed in 3 megabytes. 24 bits of color isn't infinite, it's a palette of 16.77 million colors. And you're not saving every micron of the image. You are dicing the image into 1000x1000. If you are taking a picture of a scene that's 10 meters by 10 meters, you are stuffing a sq
  • by jandrese ( 485 ) <kensama@vt.edu> on Thursday December 17, 2009 @02:15PM (#30476210) Homepage Journal
    It's not impossible to get better results out of lower bitrates, but you have to pay the penalty elsewhere, typically in encode/decode complexity.

    If your decode hardware is fixed (it's generic HDTV hardware), then there is much less room for improvement, and half the bitrate is an enormous drop. It's no surprise that the BBC viewers complained.
    • If your decode hardware is fixed (it's generic HDTV hardware), then there is much less room for improvement, and half the bitrate is an enormous drop.

      You're assuming they started with a halfway decent encoder to begin with. The difference between a good encoder and a crappy one is vastly more than 50%.

      Added complexity need not be involved (though it certainly can help). A better quantization table, for instance, wouldn't be any slower, and the reduced bitrate would speed up encoding/decoding.

      More relevant

  • by w0mprat ( 1317953 ) on Thursday December 17, 2009 @02:15PM (#30476212)
    Nitpick: So 39% is "almost 50%"?? I would have called that "almost 40%". Then again that is a /. summary.
    • Re: (Score:2, Interesting)

      by Andy Dodd ( 701 )

      Depends on your definition of "almost" and what precision you're rounding to.

    • Re: (Score:3, Informative)

      Nitpick: So 39% is "almost 50%"??

      Yes, it's almost 50% -- if you are, e.g., relating it to the nearest 25%. (Rounding it to the nearest 25%, it would be just plain 50%, not "almost 50%".)

      I would have called that "almost 40%".

      It's also almost 40% -- if you are, e.g., relating it to the nearest 10% (or 5% or 2%). And, in fact, 6.3/16 is also "almost 39.5%" if you are relating it to the nearest 0.5%, and "just over 39%" if you are relating it to the nearest 1%.

      "Almost" means you are giving an approximation (and the

    • When they say "news for nerds" they mean computer nerds, not math nerds.

  • by seven of five ( 578993 ) on Thursday December 17, 2009 @02:15PM (#30476222)
    They just remove the naughty bits.
  • by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Thursday December 17, 2009 @02:15PM (#30476224) Homepage

    Bitrate is only part of the equation -- the H.264 spec allows for a number of different ways to compress video, and it's up to the encoder to find out which is best for your video. Even in the same encoder, you can tweak dozens of settings in ways that dramatically change output quality -- usually a trade off between time and size.

    x264 has beaten every commercial encoder out there -- in some cases, on a level that would indeed render higher quality with half the bitrate.

    • by Kjella ( 173770 )

      x264 has beat every commercial encoder out there -- in some cases, on a level that would indeed render higher quality with half the bitrate.

      Last I checked x264 was just on par or slightly below some commercial encoders with a standard profile. But x264 tends to have a bunch of OCD encoders who don't quit until they've tweaked it for just the right grain settings and tweaks for a given show or movie, which is usually what gives it the edge.

      • by Silverlancer ( 786390 ) on Thursday December 17, 2009 @02:37PM (#30476574)
        The main change in the past year has been the psy optimizations that were added; before the psy optimizations, x264 was roughly on par with Mainconcept, one of the better commercial encoders. The psy optimizations--adaptive quantization and psy-RD (both on by default)--put x264 way over the top. Recently, the new MB-tree algorithm (also on by default) has boosted quality quite a bit as well. The main catch with psy optimizations is that they're nearly impossible to measure mathematically, and in fact, unless you disable them, they will make the "mathematical" measures of quality (mean squared error/PSNR) much worse.

        It's always nice when free software solutions trash the commercial alternatives.
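
        For anyone curious what the "mathematical" measure actually is: PSNR is just mean squared error dressed up in decibels. A minimal sketch with NumPy (fake frames, assuming 8-bit samples):

            import numpy as np

            def psnr(reference, encoded):
                # Peak signal-to-noise ratio for 8-bit frames. Higher means
                # numerically closer to the source, not necessarily nicer to watch.
                diff = reference.astype(np.float64) - encoded.astype(np.float64)
                mse = np.mean(diff ** 2)
                return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

            ref = np.random.randint(0, 256, (720, 1280), dtype=np.uint8)   # fake frame
            noise = np.random.randint(-5, 6, ref.shape)
            noisy = np.clip(ref.astype(np.int16) + noise, 0, 255).astype(np.uint8)
            print(psnr(ref, noisy))

        Psy optimizations deliberately spend bits on noise-like detail the eye wants, which drags this number down even when the picture subjectively looks better.
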
  • Crap HD Quality (Score:3, Interesting)

    by TooTechy ( 191509 ) on Thursday December 17, 2009 @02:19PM (#30476288)

    Try watching a football game here in the US and you will see what crap quality can be. The turf turns into squares of blur when the camera moves, then returns to blades of grass when the picture is stationary. As soon as you spot it you will hate it. If you don't see it then OK for you.

    I used to have a friend who could spot the two little circles in the top right of a movie in the theater telling the projectionist to change the reel. Once he saw them the movies were never the same again.

    • Re:Crap HD Quality (Score:5, Informative)

      by Brett Buck ( 811747 ) on Thursday December 17, 2009 @02:42PM (#30476672)

      I think you might want to talk to your cable company on that one. I know the effect you are seeing (it's by far the worst on local Public TV since they crammed 7 sub-channels into the same carrier), but network TV coverage of football in my area is pretty pristine for the most part. OTA is even better but cable is still awfully good.

            Of course, by "talk to your cable company", I mean "do nothing" because talking to the cable company is a complete waste of time.

            Brett

  • by Locke2005 ( 849178 ) on Thursday December 17, 2009 @02:19PM (#30476290)
    If you're watching a soap opera, you only need to see a few frames per week to follow the story. If you are watching a live sports event with a lot of action, most people will notice every dropped frame and compression artifact (I've noticed myself while watching the Olympics via satellite feed.) Methinks they did the testing on a relatively static video. Video compression works by (among other methods) creating a key frame, then sending diffs off that key frame for several frames. If every frame is completely different, compression does not work well.
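
    A bare-bones sketch of that keyframe-plus-diffs structure (a toy, not a real codec):

        # Each "frame" is just a list of pixel values here.
        frames = [
            [0, 0, 0, 0],    # key frame
            [0, 0, 1, 0],    # small change -> tiny diff
            [0, 0, 1, 0],    # no change    -> empty diff
            [9, 8, 7, 6],    # scene cut    -> diff as big as a key frame
        ]

        encoded = [("key", frames[0])]
        for prev, cur in zip(frames, frames[1:]):
            diff = [(i, v) for i, (p, v) in enumerate(zip(prev, cur)) if p != v]
            encoded.append(("delta", diff))

        for kind, payload in encoded:
            print(kind, payload)
        # Static scenes compress beautifully; fast action makes every delta huge.
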
  • Still a way to go... (Score:3, Informative)

    by TheRaven64 ( 641858 ) on Thursday December 17, 2009 @02:22PM (#30476330) Journal
    For reference, the BBC HD content on iPlayer is 3.5Mb/s for 720p (no higher quality available). 9.7Mb/s is less than three times as much, so it probably won't be long before the streaming and broadcast signals are the same quality.
  • by jdgeorge ( 18767 )

    BBC accountant: We provide the $ame or better picture quality with half the bitrate! Just think of the $aving$!

    BBC IT decision maker: I $ee what you're $aying.... The$e picture$ look $uper!

    Public: This looks like crap.

    BBC rep: (waves hand) The$e aren't the compre$$ion artifact$ you're looking for. We can go about our bu$ine$$. There are no complaint$.

    • by BUL2294 ( 1081735 ) on Thursday December 17, 2009 @02:40PM (#30476646)
      Public: Shou£dn't you be ta£king in our £anguage?
      • by Inda ( 580031 )
        Have you read an early story on BBC News? Full of spelling mistakes, wrong names and poor grammar. The sports stories are the worst.

        Find one, refresh every 30 seconds and watch the incompetence. I saw half a dozen edits in five minutes after the footy games last night.
        • You actually expect a sports reporter to watch a complete footy game without imbibing copious amounts of Ale? What country are you from? Sure, as they sober up, they correct their mistakes... doesn't everyone?
  • If they've switched from MPEG-2 to MPEG-4, then yes, you can get equal or better quality at a lower bitrate.

  • Test video (Score:5, Funny)

    by bugs2squash ( 1132591 ) on Thursday December 17, 2009 @02:28PM (#30476430)
    was the featureless black-screen video set to John Cage's 4'33". Results were far better at the lower bitrate. The absolute darkness was less blurry.
  • The article talks about bitrate, which implies no change in codec, so it remains MPEG-2. I'm assuming the BBC is available OTA, so unless they want everyone with an HD-ready TV to have to get a new receiver box they can't just change to H.264, etc. So in this context, the answer is no: using the same codec at a reduced bitrate can't produce better than the source. However, that assumes you are comparing to the original source. Take for example a standard DVD player which has an MPEG-2 compressed file at 4
    • BBC HD is a separate channel from normal BBC broadcasts, and uses H.264 rather than MPEG2.

      The standard definition channels remain MPEG2 for compatibility.

      • Are you sure about that? At least according to wiki they have two formats and OTA is still MPEG-2. The fact they may be doing both raises some other questions, like whether OTA is seeing this as well. Cutting back on the OTA bitrate makes zero sense, so maybe this is only talking about their sat/cable distribution?
    • Re: (Score:3, Informative)

      by lga ( 172042 ) *
      They do expect everyone to get a new receiver. Freeview HD has just started its rollout, although no receivers will be available until next month. It uses H.264 and DVB-T2.
  • In her blog post, Ms Nagler said that the service was created to be at its best for "typical viewing set-ups" and that user groups with standard equipment were happy with the service.

    Is she saying that they've optimized their HD for people without HD screens, or am I just confused?

    It really does sound like they're trying to sell something which isn't HD, but gets sold as if it is. Strange. "Now, to serve you better, we are open fewer hours."

    Cheers

  • by Edgewize ( 262271 ) on Thursday December 17, 2009 @02:37PM (#30476578)

    Lossy compression formats depend on an understanding of human perception. Nobody has a perfect model of the human brain, and nobody has a perfect algorithm for deciding what data to keep and what data to throw away.

    If you have a better model of human perception than your competitors, then your encoder will yield higher quality output. If you spend 50% of your bits on things that nobody will notice, and I spend 25% of my bits on things that nobody will notice, then my 650kbps stream is going to look better than your 900kbps stream.

    LAME did not win out as the MP3 encoder of choice just because it is free. It won out because its psychoacoustic model yields better-sounding MP3s at 128kbps than its competitors managed at 160kbps or even 192kbps.
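
    Putting rough numbers on the example above (the percentages are only illustrative):

        yours = 900 * (1 - 0.50)   # 900 kbps, half spent on imperceptible detail
        mine  = 650 * (1 - 0.25)   # 650 kbps, a quarter wasted
        print(yours, mine)         # 450.0 vs 487.5 -- the smaller stream wins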

  • I suspect the move is connected with Freeview (a non-profit organisation, of which BBC is a member, that runs the free digital broadcasts) and the digital switchover. The BBC are probably thinking that if they cut their HD bitrate there can be more HD channels (and I assume more people would be able to get a good-enough signal). Or it just costs less.

  • by jameskojiro ( 705701 ) on Thursday December 17, 2009 @03:47PM (#30477772) Journal

    Now when we go to watch the Christmas special it will look like cardboard sets that wibble and wobble. The TARDIS will look utterly horrible and the Doctor will revert to a bloke from Liverpool with big eyes, big teeth, curly hair and a long scarf.

    They also lowered the audio rate down to 16Kbps, so that rich orchestral music will sound like it came out of a cheap 1970's Moog.

    Great, just when they updated the look of the show this will undo all of their work and it will look like the viewers were taken back to the 80's in an actual TARDIS.

    Bravo BBC, Just Bravo......

  • better quality... (Score:3, Insightful)

    by bkr1_2k ( 237627 ) on Thursday December 17, 2009 @04:00PM (#30478012)

    Depending upon the configuration settings (frames per second, bitrate, I-frame/P-frame structure, etc.) it is most certainly possible to have a lower bitrate setting with better quality video than that of a higher bitrate setting. For example, if you drop the frame rate on a lower bitrate you often increase the quality of the video. So theoretically you can easily get the same quality at, say, 5 Mbps and 15 fps as you can at 10 Mbps and 30 fps. I don't have specific numbers, but subjectively (and empirically) it's quite possible.

    There are definitely things that do make a difference here though, such as motion or high speed camera work. These types of video often suffer more noticeably than others so anyone watching sports, for example, will see the differences in quality more readily than someone watching a soap opera.
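
    One way to see why the 5 Mbps/15 fps figure isn't crazy: the per-frame bit budget comes out the same under both settings.

        print(5_000_000 / 15)    # ~333,333 bits per frame
        print(10_000_000 / 30)   # ~333,333 bits per frame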
