BBC Lowers HDTV Bitrate; Users Notice

aws910 writes "According to an article on the BBC website, BBC HD lowered the bitrate of their broadcasts by almost 50% and are surprised that users noticed. From the article: 'The replacement encoders work at a bitrate of 9.7Mbps (megabits per second), while their predecessors worked at 16Mbps, the standard for other broadcasters.' The BBC claims 'We did extensive testing on the new encoders which showed that they could produce pictures at the same or even better quality than the old encoders ...' I got a good laugh off of this, but is it really possible to get better quality from a lower bitrate?"
Comments:
  • Yes (Score:2, Informative)

    by Anonymous Coward on Thursday December 17, 2009 @02:12PM (#30476148)
    Yes. If more time and more encoding passes are spent on the video, a lower bitrate CAN result in higher-quality video. However, that does not appear to be the case in this instance.
  • Yes (Score:5, Informative)

    by Unoriginal Nick ( 620805 ) on Thursday December 17, 2009 @02:13PM (#30476170)

    I got a good laugh off of this, but is it really possible to get better quality from a lower bitrate?

    Sure, if you also switch to a better codec, such as using H.264 instead of MPEG-2. However, I don't think that's what's happening in this case.

  • by Anonymous Coward on Thursday December 17, 2009 @02:13PM (#30476176)

    So they're starting to act like Comcast cable with their compressed HD.

  • by godrik ( 1287354 ) on Thursday December 17, 2009 @02:14PM (#30476190)

    but is it really possible to get better quality from a lower bitrate?

    If you are changing the compression algorithm, of course it is possible. The H.264 standard defines many compression features that a given encoder may not use, but that any compliant decoder will still recognize.

  • Yes, of course (Score:5, Informative)

    by TheRaven64 ( 641858 ) on Thursday December 17, 2009 @02:14PM (#30476196) Journal
    Any lossy compression works by throwing away bits of the picture that the viewer might not notice. You can lower the bitrate with better psychovisual and psychoacoustic models. You're still throwing away more information, but you're doing it in a way that the user is less likely to notice. This takes more CPU time on the compressor, a more optimised encoder, or a better algorithm.
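
    A minimal sketch of that idea in Python (NumPy and SciPy assumed; a real encoder's psychovisual model weights the quantizer by frequency, while this toy uses a flat one): transform a block, round away coefficient detail, and reconstruct. A coarser quantizer means fewer bits and more degradation.

        import numpy as np
        from scipy.fft import dctn, idctn

        # Toy transform coding: DCT an 8x8 block, divide by a quantizer and
        # round (this is where information is thrown away), then reconstruct.
        rng = np.random.default_rng(0)
        block = rng.integers(0, 256, size=(8, 8)).astype(float)

        for q in (2, 16, 64):  # coarser quantizer = lower bitrate, more loss
            coeffs = dctn(block, norm="ortho")
            quantized = np.round(coeffs / q)  # many coefficients collapse to 0
            restored = idctn(quantized * q, norm="ortho")
            err = np.abs(block - restored).mean()
            nonzero = np.count_nonzero(quantized)
            print(f"q={q}: {nonzero}/64 nonzero coefficients, mean error {err:.1f}")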
  • by w0mprat ( 1317953 ) on Thursday December 17, 2009 @02:15PM (#30476212)
    Nitpick: so 39% is "almost 50%"? I would have called that "almost 40%". Then again, that is a /. summary.
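
    For what it's worth, the actual figure is easy to check (a throwaway Python snippet):

        # Reduction from 16 Mbps to 9.7 Mbps, as a percentage.
        old, new = 16.0, 9.7
        print(f"{(old - new) / old:.1%}")  # 39.4%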
  • by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Thursday December 17, 2009 @02:15PM (#30476224) Homepage

    Bitrate is only part of the equation -- the H.264 spec allows for a number of different ways to compress video, and it's up to the encoder to find out which is best for your video. Even in the same encoder, you can tweak dozens of settings in ways that dramatically change output quality -- usually a trade off between time and size.

    x264 has beaten every commercial encoder out there -- in some cases, by a margin that would indeed deliver higher quality at half the bitrate.

  • Re:Yes (Score:1, Informative)

    by Anonymous Coward on Thursday December 17, 2009 @02:18PM (#30476252)

    Yes, and double yes, even within the same codec. There is a LOT that an encoder can do to improve the quality of video (especially with more advanced codecs).

    Just check out the differences between, say, x264 and Apple's encoder. Both are H.264, but x264 blows Apple's encoder clean out of the water.

    9.7 Mbps is still a pretty large bandwidth. An encoder like x264 could do quite a bit with that bandwidth. At 16 Mbps, though, almost any encoder from MPEG-2 onwards could produce some pretty clean-looking pictures.

  • by gandhi_2 ( 1108023 ) on Thursday December 17, 2009 @02:19PM (#30476274) Homepage

    Technically speaking, they suck at "maths".

  • by Anonymous Coward on Thursday December 17, 2009 @02:19PM (#30476278)

    I'm far from an expert, but my understanding is that to a limited extent, you can make a trade-off between the bitrate and encoding/decoding time. H.264/MPEG-4 AVC is superior to older codecs, generally having both better visual quality and a lower bitrate, but it requires much more time to encode and requires more powerful hardware to decode the stream.

    But my very loose understanding is that all they did was lower the bitrate and maybe conducted a test to see if some random idiots could tell the difference with ideal samples.

  • by Locke2005 ( 849178 ) on Thursday December 17, 2009 @02:19PM (#30476290)
    If you're watching a soap opera, you only need to see a few frames per week to follow the story. If you are watching a live sports event with a lot of action, most people will notice every dropped frame and compression artifact (I've noticed myself while watching the Olympics via satellite feed.) Methinks they did the testing on a relatively static video. Video compression works by (among other methods) creating a key frame, then sending diffs off that key frame for several frames. If every frame is completely different, compression does not work well.
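
    A toy sketch of that keyframe-plus-diff idea in Python (hypothetical frame data; real codecs use motion compensation rather than per-pixel diffs): static video encodes to almost nothing, while video where every frame changes gains little over the keyframes alone.

        # Frames are flat lists of pixel values; every KEYFRAME_INTERVAL-th
        # frame is stored whole, the rest as (index, value) diffs against
        # the previous frame.
        KEYFRAME_INTERVAL = 5

        def encode(frames):
            encoded, prev = [], None
            for i, frame in enumerate(frames):
                if i % KEYFRAME_INTERVAL == 0 or prev is None:
                    encoded.append(("key", list(frame)))
                else:
                    diff = [(j, v) for j, (u, v) in enumerate(zip(prev, frame)) if u != v]
                    encoded.append(("diff", diff))
                prev = frame
            return encoded

        static = [[10] * 100 for _ in range(10)]  # soap opera: nothing moves
        noisy = [[(i * j) % 251 for j in range(100)] for i in range(10)]  # fast action

        for name, video in (("static", static), ("noisy", noisy)):
            size = sum(len(payload) for _, payload in encode(video))
            print(name, size)  # static comes out far smaller than noisy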
  • Still a way to go... (Score:3, Informative)

    by TheRaven64 ( 641858 ) on Thursday December 17, 2009 @02:22PM (#30476330) Journal
    For reference, the BBC HD content on iPlayer is 3.5Mb/s for 720p (no higher quality available). 9.7Mb/s is less than three times as much, so it probably won't be long before the streaming and broadcast signals are the same quality.
  • Re:Yes, of course (Score:5, Informative)

    by Andy Dodd ( 701 ) <atd7NO@SPAMcornell.edu> on Thursday December 17, 2009 @02:27PM (#30476408) Homepage

    LAME was a pretty good example of this for MP3 - Eventually it was able to achieve (somewhat) better quality at (somewhat) lower bitrates than the reference encoders.

    Vorbis, similarly, had the AoTUV tuning - This provided significant rate/distortion tradeoff improvements compared to a "vanilla" encoder, without changing the decoder.

    However, 40% reduction in bitrate with an increase in quality is very difficult unless the original encoder was CRAP. (Which is actually a definite possibility for a realtime hardware encoder.) Also, it's far more likely to have such improvements with H.264 or MPEG-4 ASP, not nearly as likely with MPEG-2, which had a far less flexible encoding scheme.

  • by DragonWriter ( 970822 ) on Thursday December 17, 2009 @02:36PM (#30476564)

    Nitpick: So 39% is "almost 50%"??

    Yes, it's almost 50% -- if you are, e.g., relating it to the nearest 25%. (Rounding it to the nearest 25%, it would be just plain 50%, not "almost 50%".)

    I would have called that "almost 40%".

    It's also almost 40% -- if you are, e.g., relating it to the nearest 10% (or 5% or 2%). And, in fact, 6.3/16 is also "almost 39.5%" if you are relating it to the nearest 0.5%, and "just over 39%" if you are relating it to the nearest 1%.

    "Almost" means you are giving an approximation (and the direction the value differs from the approximation), not an exact figure. There are infinite number of possible approximations for any given exact value. That something could be described as "almost 40%" does not mean it cannot also be described as "almost 50%" without any "rounding error", since "almost" does not specify the precision of the approximation being used.

  • Re:Focus group... (Score:5, Informative)

    by jasonwc ( 939262 ) on Thursday December 17, 2009 @02:37PM (#30476580)

    Yes, it IS possible to get higher picture quality out of a lower bitrate, but not with all else equal. For example, at the same bitrate you can get higher quality with CPU-intensive settings at H.264 Level 5.1 than at Level 4.1 (what Blu-rays/HD DVDs use). You're giving up CPU cycles in decoding for lower video size. This is why x264 can produce near-transparent encodes of Blu-ray movies at about half the size: x264 uses much more demanding settings.

    x264 at 20 Mbit with high-quality settings is far more demanding than a 40 Mbit H.264 stream from a Blu-ray.
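
    As a concrete illustration (assuming an ffmpeg build with libx264; the filenames are placeholders), the same source at the same quality target can be encoded with wildly different CPU budgets:

        import subprocess

        # Same codec, same rate-control target; the slower preset spends far
        # more CPU searching for a smaller/better encode.
        for preset, out in (("ultrafast", "fast.mkv"), ("veryslow", "slow.mkv")):
            subprocess.run(
                ["ffmpeg", "-i", "source.mkv",
                 "-c:v", "libx264", "-preset", preset, "-crf", "20", out],
                check=True,
            )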

  • Re:Yes, of course (Score:4, Informative)

    by TheRaven64 ( 641858 ) on Thursday December 17, 2009 @02:38PM (#30476604) Journal
    Yup, fractal encoding is pretty impressive. I played with it a bit on a 386, when it took about ten minutes to compress a 320x240 image. I've not heard of any newer algorithms that improve matters much. More interesting is topological compression, which has most of the same advantages as storing a vector image (resolution independent) and a raster image (can come from a sampled source). You can extend these to video by modelling the video data as a 3D volume, rather than as a sequence of frames. The topological changes in the time dimension are usually quite gradual, and it's easy to trade spatial and temporal resolution. The really nice thing about this approach is that it's resolution independent in three dimensions, not just two, so it's easy to generate a signal that matches the display's refresh rate.
  • by Jackie_Chan_Fan ( 730745 ) on Thursday December 17, 2009 @02:40PM (#30476638)

    I'll toss FiOS under the bus too. Verizon's HD varies greatly. I'm not sure if it's the channel companies themselves or Verizon doing it...

    Either way, I hate watching fast motion movies or tv shows where the bitrate is too low.

    Try watching "How It's Made" on Discovery HD and watch how compressed things look as fast-moving manufactured parts pass through machinery.

    Same for HBO films etc.

  • Re:Crap HD Quality (Score:5, Informative)

    by Brett Buck ( 811747 ) on Thursday December 17, 2009 @02:42PM (#30476672)

    I think you might want to talk to your cable company on that one. I know the effect you are seeing (it's by far the worst on local public TV since they crammed 7 sub-channels into the same carrier), but network TV coverage of football in my area is pretty pristine for the most part. OTA is even better, but cable is still awfully good.

          Of course, by "talk to your cable company", I mean "do nothing" because talking to the cable company is a complete waste of time.

          Brett

  • Re:Focus group... (Score:5, Informative)

    by Anonymous Coward on Thursday December 17, 2009 @02:44PM (#30476712)

    1) The alleged wife in the quote is purported to have cataracts. Cataracts typically reduce visual acuity due to the cloudiness they impart to the lens of the eye. How does a reduction of visual acuity translate to "just another racist characterization of women being incompetent with technology"?

    2) If the quote had been '"Even my husband can see a reduction in picture quality and he's got cataracts," wrote one.' would you have bothered to make your little rant post?

    P.S. The term you were looking for is "sexist" not "racist".

  • Re:Yes, of course (Score:3, Informative)

    by TheRaven64 ( 641858 ) on Thursday December 17, 2009 @02:46PM (#30476750) Journal

    Which is actually a definite possibility for a realtime hardware encoder.

    Not just a realtime hardware encoder, but likely a first-generation encoder. Most compression standards are now designed with some headroom. When AAC was first introduced, Dolby provided two encoders, a consumer-grade and a professional encoder. The consumer-grade one was only slightly better than MP3, but ran much faster. The pro encoder was a lot slower but the quality was noticeably better. More recent encoders produce even better quality. A 40% decrease in bitrate is about what I'd expect going from a single-pass to a two-pass H.264 encoder, and it's entirely possible that a newer single-pass encoder can do the same sort of thing just by using a longer window now that RAM is a lot cheaper.

    Also, it's far more likely to have such improvements with H.264 or MPEG-4 ASP, not nearly as likely with MPEG-2, which had a far less flexible encoding scheme

    BBC HD uses H.264. It's rebroadcast after transcoding to MPEG-2 if you have Virgin Media cable, because their decoder boxes, unlike the Freeview boxes, can't handle H.264.

  • by TheRaven64 ( 641858 ) on Thursday December 17, 2009 @02:47PM (#30476766) Journal
    BBC HD also uses H.264 for terrestrial and satellite broadcasts. It's only if you have Virgin Media cable that you get the stream transcoded to MPEG-2.
  • Re:Yes (Score:2, Informative)

    by GasparGMSwordsman ( 753396 ) on Thursday December 17, 2009 @02:47PM (#30476778)
    Despite what many people claim, when you get a garbled digital signal, most systems will give you garbled artifacts on the screen. This is because in any broadcast situation you expect a certain percentage of interference and have to design to deal with that. If you only ever displayed images on a tv if the signal was perfect, you would be amazed at how often your display would blank out.

    On my TV at home I have changed the settings to turn off the "blue screen/bad signal screen". The TV does its best to figure out what it is receiving. I can still lose the signal if there is enough interference, but for the most part I just get warped images and garbled sound if something happens. (I have a very nice HD TV with a built-in tuner.) I am at the very edge of two stations in my area, and on both of those I have to fiddle with the antenna to have them come in clear. (Plus my cat knocks the antenna onto the floor pretty often...)
  • Re:Focus group... (Score:3, Informative)

    by Maxo-Texas ( 864189 ) on Thursday December 17, 2009 @04:00PM (#30478010)

    I left Comcast because their *digital* signals were worse than standard TV over the antenna.

    Dish has gone the other way-- their signal *looks* crisp, but there is a lot more blockiness than there used to be. I used to have blocky outbreaks perhaps 1 or 2 times in 40 hours of viewing. Now I get blockiness 1 or 2 times per 10 hours of viewing.

  • Re:Focus group... (Score:4, Informative)

    by bkr1_2k ( 237627 ) on Thursday December 17, 2009 @04:04PM (#30478080)

    Theoretically, perhaps. In reality either one could look better given other factors.

  • Re:Focus group... (Score:4, Informative)

    by PIBM ( 588930 ) on Thursday December 17, 2009 @04:19PM (#30478324) Homepage

    Actually, at 8 bits per channel (not including the 5.1 sound), a 1920x1080 frame is 16,588,800 bits per channel per FRAME, not per second, so three channels at 60 FPS works out to roughly 3 Gb/s for the video alone, and that's why we need to use a compressed distribution method...
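
    The arithmetic, for anyone who wants to check it (plain Python; 1080p60, 8 bits per channel, three channels, no chroma subsampling):

        width, height, bpc, channels, fps = 1920, 1080, 8, 3, 60

        per_channel_frame = width * height * bpc   # 16,588,800 bits
        per_frame = per_channel_frame * channels   # 49,766,400 bits
        per_second = per_frame * fps               # 2,985,984,000 bits

        print(f"{per_second / 1e9:.2f} Gb/s")      # 2.99 Gb/s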

  • by clone53421 ( 1310749 ) on Thursday December 17, 2009 @04:25PM (#30478416) Journal

    One generally accepted practice is to put the punctuation inside the quote if the punctuation is part of the quotation, and outside the quote otherwise. According to that rule of thumb, his use of punctuation was correct.

  • Re:Focus group... (Score:3, Informative)

    by jasonwc ( 939262 ) on Thursday December 17, 2009 @04:56PM (#30478904)

    These are the reference frame limits in Level 4.1:

      Resolution | max ref frames
      -----------|---------------
      1280x544   | 12
      1280x720   | 9
      1920x800   | 5
      1920x816   | 5
      1920x1080  | 4

    If none of the resolutions above match your source, use the following equation to work it out for yourself:

    max ref frames = floor(8388608 / (width x height))
    However, I've seen Level 5.1 encodes with 16 ref frames at full 1920x1200.
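
    That rule of thumb is trivial to script (plain Python; the 8388608 constant is the poster's, and appears to correspond to the Level 4.1 decoded-picture-buffer limit expressed in luma samples):

        # max ref frames = floor(8388608 / (width x height))
        def max_ref_frames(width, height):
            return 8388608 // (width * height)

        for w, h in ((1280, 544), (1280, 720), (1920, 800), (1920, 816), (1920, 1080)):
            print(f"{w}x{h}: {max_ref_frames(w, h)}")  # 12, 9, 5, 5, 4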

  • Re:Focus group... (Score:3, Informative)

    by Culture20 ( 968837 ) on Thursday December 17, 2009 @04:59PM (#30478950)

    My cranium nearly exploded while attempting to parse

    "3yo lesbian, father of seven"

    Father of 7, then transgendered 3 years ago?

  • Re:Yes, of course (Score:4, Informative)

    by wagnerrp ( 1305589 ) on Thursday December 17, 2009 @05:10PM (#30479144)

    A 40% decrease in bitrate is about what I'd expect going from a single-pass to a two-pass H.264 encoder, and it's entirely possible that a newer single-pass encoder can do the same sort of thing just by using a longer window now that RAM is a lot cheaper.

    No, there is no difference in compressibility between a single-pass and a two-pass encoder. The two-pass encoder simply allows you to set the quantizer so as to hit a target average bitrate very accurately.
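
    A toy illustration of what the first pass buys you (hypothetical per-frame complexity numbers; real encoders are far more sophisticated): the second pass distributes a fixed bit budget in proportion to measured complexity, so the clip lands exactly on the target average.

        # First pass: measure each frame's complexity. Second pass: allocate
        # the fixed bit budget in proportion, hitting the target exactly.
        def two_pass_allocate(complexities, total_bits):
            total = sum(complexities)
            return [total_bits * c / total for c in complexities]

        complexities = [1.0, 1.0, 8.0, 2.0]  # e.g. a scene cut at frame 3
        budget = 1200                        # bits for the whole clip
        print(two_pass_allocate(complexities, budget))  # [100.0, 100.0, 800.0, 200.0]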

  • by lga ( 172042 ) * on Thursday December 17, 2009 @06:34PM (#30480440) Journal
    They do expect everyone to get a new receiver. Freeview HD has just started its rollout, although no receivers will be available until next month. It uses H.264 and DVB-T2.
  • Re:Yes (Score:3, Informative)

    by zippthorne ( 748122 ) on Friday December 18, 2009 @12:45AM (#30483670) Journal

    Say what?

    f(x) = 1 - x^2/2 + x^4/24 - x^6/720 + ... (the Maclaurin series for cos(x); the terms diminish rapidly for 0 < x < 1)

    If you know that the HOT (higher-order terms) affect the result less and less, you can drop them and still get a "good enough", though less perfect, answer. You can keep dropping terms until the error is unacceptable, or in the case of something where the actual value is not critical (e.g. a block of pixels), you can keep dropping terms to reach a target number of operations and hope that the answer is sufficiently precise.

    Now, in a video codec, it's probably a vector function, and it's probably not polynomial either (although any more complex functions will still be approximated with polynomials, whose number of terms would be chosen for performance reasons...). The point being that there are lots of opportunities to drop terms and save cycles (and as the compressed data itself is likely an array of coefficients, there are also opportunities to drop terms and save space at the other end) which result in lower quality output as a tradeoff.

    Frankly, I'm not entirely sure whether "drop quality" ought to be preferred over "drop frames", but they're both choices on the spectrum of "making do with whatcha got."
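
    The series above is just the Maclaurin expansion of cos(x); a quick plain-Python check of how fast the error falls as you keep more terms:

        import math

        def cos_series(x, terms):
            # Maclaurin series for cos(x), truncated to the given number of terms.
            return sum((-1) ** n * x ** (2 * n) / math.factorial(2 * n)
                       for n in range(terms))

        x = 0.9
        for terms in (2, 3, 4):
            approx = cos_series(x, terms)
            print(terms, approx, abs(approx - math.cos(x)))  # error shrinks fast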
