
BBC Lowers HDTV Bitrate; Users Notice

aws910 writes "According to an article on the BBC website, BBC HD lowered the bitrate of their broadcasts by almost 50% and are surprised that users noticed. From the article: 'The replacement encoders work at a bitrate of 9.7Mbps (megabits per second), while their predecessors worked at 16Mbps, the standard for other broadcasters.' The BBC claims 'We did extensive testing on the new encoders which showed that they could produce pictures at the same or even better quality than the old encoders ...' I got a good laugh off of this, but is it really possible to get better quality from a lower bitrate?"
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Focus group... (Score:2, Insightful)

    by gandhi_2 ( 1108023 ) on Thursday December 17, 2009 @02:11PM (#30476142) Homepage

    ...of blind retards.

  • by Locke2005 ( 849178 ) on Thursday December 17, 2009 @02:12PM (#30476146)
They also lowered their math standards. From 16 Mbps to 9.7 Mbps is a 40% reduction, not "almost 50%".
  • by secondsun ( 195377 ) <secondsun@gmail.com> on Thursday December 17, 2009 @02:13PM (#30476186) Journal

    To one significant figure, they are.
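For the record, the figure is easy to check (a quick sketch using the article's numbers):

```python
# The reduction the article describes, computed directly:
old_mbps, new_mbps = 16.0, 9.7
reduction = (old_mbps - new_mbps) / old_mbps
print(f"{reduction:.1%}")  # 39.4% -- closer to 40% than to "almost 50%"
```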

  • by jandrese ( 485 ) <kensama@vt.edu> on Thursday December 17, 2009 @02:15PM (#30476210) Homepage Journal
    It's not impossible to get better results out of lower bitrates, but you have to pay the penalty elsewhere, typically in encode/decode complexity.

    If your decode hardware is fixed (it's generic HDTV hardware), then there is much less room for improvement, and half the bitrate is an enormous drop. It's no surprise that the BBC viewers complained.
  • Re:Yes (Score:3, Insightful)

    by Bakkster ( 1529253 ) <Bakkster.man@NOspam.gmail.com> on Thursday December 17, 2009 @02:16PM (#30476232)

    In other words, lower bitrate can be better, but only if you compare to shitty and inefficient compression.

  • Re:Yes (Score:5, Insightful)

    by natehoy ( 1608657 ) on Thursday December 17, 2009 @02:25PM (#30476368) Journal

You can also get better compression within the same codec by selecting a more sophisticated encoding profile, since many codecs support multiple levels of compression.

    Generally, "better" compression (fitting a higher resolution and/or framerate into a smaller size) requires a lot more power to encode and often some more power to decode. You can use less bitrate to get a quality signal there, but you need "smarter" coders and decoders at the respective ends of the transmission. So the BBC may have upgraded their compression engine to something that can do "better" compression, thereby fitting the same resolution and framerate into a 40% smaller stream. But their customers' television sets might not have the horsepower to decode it at full quality.

    That could easily explain why the BBC's testing went so well but their consumers (with varying brands of TV sets probably mostly tested for British use with the old compression) can't keep up and render an inferior picture.

    It's also possible that, by compressing the video stream into a denser compression method, signal loss is having a greater effect than it did with the old compression method. The viewers may be seeing artifacts that are the decoder's attempts to fill in the blanks. The old compression method might have allowed a certain amount of redundancy or error correction that the new one lacks, and the loss of part of the signal has a more visible effect on the new one.

  • Re:Yes (Score:3, Insightful)

    by HateBreeder ( 656491 ) on Thursday December 17, 2009 @02:35PM (#30476548)

    > can't keep up and render an inferior picture.

It's not like there's a halfway point here. This is the digital age - if it can't keep up, you won't see anything!

  • by Edgewize ( 262271 ) on Thursday December 17, 2009 @02:37PM (#30476578)

    Lossy compression formats depend on an understanding of human perception. Nobody has a perfect model of the human brain, and nobody has a perfect algorithm for deciding what data to keep and what data to throw away.

    If you have a better model of human perception than your competitors, then your encoder will yield higher quality output. If you spend 50% of your bits on things that nobody will notice, and I spend 25% of my bits on things that nobody will notice, then my 650kbps stream is going to look better than your 900kbps stream.

    LAME did not win out as the MP3 encoder of choice just because it is free. It won out because its psychoacoustic model yields better-sounding MP3s at 128kbps than its competitors managed at 160kbps or even 192kbps.
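The parent's point can be put as a toy model (the 25% / 50% waste figures and the 650/900 kbps bitrates are the comment's hypotheticals, not measurements):

```python
# Toy model: an encoder that wastes fewer bits on imperceptible detail
# delivers more *useful* bits even at a lower total bitrate.
def useful_bits(bitrate_kbps, wasted_fraction):
    """Bits per second that actually contribute to perceived quality."""
    return bitrate_kbps * (1 - wasted_fraction)

print(useful_bits(650, 0.25))  # 487.5 kbps of perceptually useful data
print(useful_bits(900, 0.50))  # 450.0 kbps -- the higher bitrate carries less
```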

  • Re:Yes (Score:5, Insightful)

    by Fantom42 ( 174630 ) on Thursday December 17, 2009 @02:38PM (#30476598)

    In other words, lower bitrate can be better, but only if you compare to shitty and inefficient compression.

    And by this you mean compression that is state of the art two minutes ago, vs. today. Seriously, this field is moving pretty fast, and what you call shitty and inefficient was not long ago the best people could do. A few years ago when I was messing with the x264-svn libraries, stuff would get tweaked daily.

    Not to mention there are other factors at play with regard to compression. A well-engineered system isn't necessarily going to go for the maximum compression rate for video quality. One has to look at other limitations, such as the decoding hardware, the method by which the video is being delivered, and even the viewing devices on the receiving end.

    What is disheartening about the article is that it looks like the BBC are just in denial mode, and not really taking the complaints seriously. "Standard viewing equipment"? Seriously, what exactly are they getting at with that comment? On top of that, it looks like they are trying to blame the quality of the source material, which certainly muddies the picture; but the customers who are complaining would have been used to those variations in quality before the change, and would not all have suddenly noticed them at the same time this equipment was rolled out.

    I have respect for them sticking to their guns, but not when they are doing it with such lame excuses. Then again, the BBC spokesperson and reporter may not be the most tech-savvy individuals, and it's likely some of the message here is lost in translation. Lossy codec indeed.

  • by tepples ( 727027 ) <tepplesNO@SPAMgmail.com> on Thursday December 17, 2009 @02:41PM (#30476654) Homepage Journal
    But then iPlayer appears to use H.264, which allows for more efficient encoding than the MPEG-2 codec used for digital TV broadcasts.
  • Re:Focus group... (Score:2, Insightful)

    by Anonymous Coward on Thursday December 17, 2009 @02:50PM (#30476814)

    As a 43 year old father of a special needs child I find your comment retarded (And I deplore the use of the word). The salient point was that the wife of the commenter in the article has cataracts, not that she is female. If the comment had been "Even my husband can see a reduction in picture quality and he's got cataracts" it would have been no less salient.

  • Re:Focus group... (Score:3, Insightful)

    by brkello ( 642429 ) on Thursday December 17, 2009 @02:56PM (#30476920)
    What a weird post. Would you find it less offensive if you weren't a C programmer? It may be a stereotype, but it is there for a reason. This hits home with my mom who says she can't tell the difference between standard def and high def television. Does that mean all women can't? Nope. But it was an amusing quote...loosen up. Stop looking for things to be offended about.
  • by clone53421 ( 1310749 ) on Thursday December 17, 2009 @03:32PM (#30477522) Journal

    Calm down; that’s how the Brits shorten “Mathematics”.

    It even makes sense, after a fashion: I shortened the plural word “British” to “Brits”. Why not shorten the plural word “Mathematics” to “Maths”?

  • Re:Yes, of course (Score:4, Insightful)

    by RivieraKid ( 994682 ) on Thursday December 17, 2009 @03:33PM (#30477536)

    You do understand, though, that the lost information in your example is lost at the capture stage, not the compression stage, don't you?

    Lossless compression is just that - lossless. Try compressing your copy of notepad.exe with WinZip, extract it and tell me if it still works. That's lossless compression. The result of compression then decompression is bitwise identical to the original. It has nothing to do with whether the original data is an accurate representation of what it claims to be.
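A minimal round-trip sketch of the same test in Python (zlib stands in for WinZip's DEFLATE, and a synthetic byte string stands in for notepad.exe):

```python
import zlib

# Lossless round trip: decompression returns bytes bit-for-bit
# identical to the input, whatever those bytes happen to represent.
original = b"MZ" + bytes(range(256)) * 16 + b"notepad" * 100
compressed = zlib.compress(original, 9)
restored = zlib.decompress(compressed)

print(restored == original)             # True: bitwise identical
print(len(compressed) < len(original))  # True: and smaller, for this input
```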

  • by RivieraKid ( 994682 ) on Thursday December 17, 2009 @03:37PM (#30477598)

    While I agree in principle with you, there comes a time when "almost" is simply hyperbole.

    When the difference between the actual percentage and the "almost" percentage is a quarter of the original figure, that's just plain silly. (There's a 10-percentage-point difference between 40% and 50%, which happens to be 25% of the actual value.)
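Spelled out, as a trivial check of those figures:

```python
# The gap between the claimed "almost 50%" and the actual ~40% reduction,
# measured relative to the actual figure:
actual, claimed = 40, 50  # percent
overshoot = (claimed - actual) / actual
print(overshoot)  # 0.25 -- the claim overstates the cut by a quarter
```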

  • by madsenj37 ( 612413 ) on Thursday December 17, 2009 @03:38PM (#30477622)
    You must use the incorrect term 'aluminium' as well.
  • Re:Focus group... (Score:1, Insightful)

    by Anonymous Coward on Thursday December 17, 2009 @03:47PM (#30477794)
    Man is God.
  • better quality... (Score:3, Insightful)

    by bkr1_2k ( 237627 ) on Thursday December 17, 2009 @04:00PM (#30478012)

    Depending upon the configuration settings (frames per second, bit rate, I-frame/P-frame structure, etc.), it is most certainly possible to have a lower bit rate setting with better quality video than that of a higher bit rate setting. For example, if you drop the frame rate on a lower bitrate, you often increase the quality of the video. So theoretically you can easily get the same quality at, say, 5 Mbps at 15 fps as you can at 10 Mbps at 30 fps. I don't have specific numbers, but subjectively (and empirically) it's quite possible.

    There are definitely things that do make a difference here though, such as motion or high speed camera work. These types of video often suffer more noticeably than others so anyone watching sports, for example, will see the differences in quality more readily than someone watching a soap opera.
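The bits-per-frame arithmetic behind that example can be sketched directly (the 5 Mbps/15 fps and 10 Mbps/30 fps figures are the comment's hypotheticals):

```python
# Halving both the bitrate and the frame rate leaves the same
# bit budget available for each individual frame.
def bits_per_frame(bitrate_mbps, fps):
    return bitrate_mbps * 1_000_000 / fps

print(bits_per_frame(5, 15))   # ~333,333 bits per frame
print(bits_per_frame(10, 30))  # the same per-frame budget
```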

  • Re:Focus group... (Score:3, Insightful)

    by jasonwc ( 939262 ) on Thursday December 17, 2009 @04:49PM (#30478786)

    No, those aren't the only differences. For one, you are limited in the number of reference frames you can use at a given resolution at 4.1. For example, at full 1920x1080 I don't think you can use more than 4 or 5 reference frames at 4.1, but I've seen 5.1 encodes that use 16 reference frames for animated films that achieve very high compression ratios while maintaining transparency.

    4.1's maximum allowed bitrate is not the constraint. Doom9 or Wikipedia can provide much more detailed information about the differences between levels, but I know from HDBits that the number of reference frames is one of the big differences.

  • Re:Focus group... (Score:3, Insightful)

    by MaxQuordlepleen ( 236397 ) <el_duggio@hotmail.com> on Thursday December 17, 2009 @08:34PM (#30481764) Homepage

    I deplore the use of the word

    Really? I don't get that. The reason that people use the word as an insult is because it is a word used to describe people who have intellectual challenges. It doesn't matter what you do, the euphemism treadmill [wikipedia.org] will get you in the end. As evidence I submit this anecdote: I was spending some time with a couple of young gentlemen (kids of a friend of mine) who are 9 and 11 years old. Big kid was teasing little kid, so big kid said "you're a special needs kid!". Does this help you see the futility of rotating in new acceptable terms every couple of decades?
