BBC Lowers HDTV Bitrate; Users Notice 412
aws910 writes "According to an article on the BBC website, BBC HD lowered the bitrate of their broadcasts by almost 50% and are surprised that users noticed. From the article: 'The replacement encoders work at a bitrate of 9.7Mbps (megabits per second), while their predecessors worked at 16Mbps, the standard for other broadcasters.' The BBC claims 'We did extensive testing on the new encoders which showed that they could produce pictures at the same or even better quality than the old encoders ...' I got a good laugh off of this, but is it really possible to get better quality from a lower bitrate?"
Crap HD Quality (Score:3, Interesting)
Try watching a football game here in the US and you will see what crap quality can be. The turf turns into squares of blur when the camera moves, then returns to blades of grass when the picture is stationary. As soon as you spot it you will hate it. If you don't see it then OK for you.
I used to have a friend who could spot the two little circles in the top right of a movie in the theater telling the projectionist to change the reel. Once he saw them the movies were never the same again.
Re:Summary rounding error (Score:2, Interesting)
Depends on your definition of "almost" and what precision you're rounding to.
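For what it's worth, the arithmetic is easy to check using the figures quoted in the article (16 Mbps old, 9.7 Mbps new):

```python
# Checking the summary's "almost 50%" claim against the article's own numbers.
old_mbps = 16.0   # previous encoders, per the article
new_mbps = 9.7    # replacement encoders, per the article

reduction = (old_mbps - new_mbps) / old_mbps
print(f"Bitrate cut: {reduction:.1%}")  # prints "Bitrate cut: 39.4%"
```

So the actual cut is about 39%, which rounds to "almost 40%", not "almost 50%".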
Re:It is absolutely possible (Score:5, Interesting)
It's always nice when free software solutions trash the commercial alternatives.
Re:Focus group... (Score:5, Interesting)
In the US, Comcast uses codex compression to squeeze HD on their cable systems. When people get to see native resolution at the TV store, then get the Comcast version when they plug in their shiny new HD TV, they wonder WTF? That the beeb would put their foot on the garden hose and expect no one to notice is ludicrous.
I wish the FCC would get involved in the US to force cable companies to limit the number of channels supported and broadcast them in the highest sustainable resolution-- or tell their users the truth about what's happening and why. Maybe we can start to get rid of the excess junk channels.
Re:Focus group... (Score:4, Interesting)
Adding to that, Comcast's programming is 720p, with much of it upscaled. The Blu-ray source you see at the stores is often 1080p, or at least 1080i. You're comparing rotten wormy apples to nice juicy oranges, where Comcast's feeds are the rotten wormy apples.
Re:Focus group... (Score:4, Interesting)
A codec is a compressor/decompressor piece of code used in one of two modes-- lossy or lossless stream compression-- usually (but not always) of audio/video information. The eye and ear can detect certain types of lossy compression artifacts, and some people are better at detecting them than others. Generally, more compression yields more information loss, which shows up as lower-quality video (jaggies, weird frame transitions, noise, fewer colors) or distorted sound of various kinds. But more compression also means less bandwidth used, so that more streams can be handled in a given chunk of bandwidth.
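To illustrate the lossy/lossless distinction the parent describes, here's a toy sketch using zlib for the lossless case and crude quantization as a stand-in for what a lossy codec's quantization stage does (this is not how a real video codec works, just the principle):

```python
import random
import zlib

# Toy "video" data: 4 KiB of pseudo-random 8-bit luma samples.
random.seed(0)
samples = bytes(random.randrange(256) for _ in range(4096))

# Lossless: zlib round-trips the data exactly, but random data barely shrinks.
lossless = zlib.compress(samples, 9)
assert zlib.decompress(lossless) == samples

# Lossy (toy stand-in for a codec's quantizer): keep only 16 brightness levels.
# The result compresses far better, but the discarded detail is gone for good.
quantized = bytes((b // 16) * 16 for b in samples)
lossy = zlib.compress(quantized, 9)

print(len(samples), len(lossless), len(lossy))
```

The lossy path always wins on size for the same entropy coder; the whole argument in this thread is about how much of that discarded detail viewers can actually see.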
In the US, the current max HDTV resolution is 1920x1080 pixels, and broadcasters typically allocate around 16 megabits/sec of compressed bandwidth to carry it. There are two scan types, interlaced and progressive. Interlaced draws alternating fields of odd and even lines, while progressive draws whole frames at once (a simple explanation); progressive is preferred but requires more capable electronics to produce. The 1080p HD picture is preferred. An interim size, 720p, is often what cable companies send down the wires to your set. "Native resolution" refers to the uncompressed stream, or one passed through a lossless compressor (meaning that the decompressor can re-interpret the compressed stream to reproduce the original image 100%).
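A quick back-of-envelope calculation shows why the 16 Mbps broadcast rate already implies heavy compression (30 fps and 24 bits/pixel are assumptions for round numbers; real broadcast chains use chroma subsampling and other frame rates):

```python
# Uncompressed 1080p30 at 24 bits/pixel vs. the ~16 Mbps broadcast rate
# quoted in the story. Frame rate and bit depth are illustrative assumptions.
width, height = 1920, 1080
bits_per_pixel = 24
fps = 30

raw_bps = width * height * bits_per_pixel * fps
broadcast_bps = 16e6

print(f"Uncompressed: {raw_bps / 1e6:.0f} Mbps")               # ~1493 Mbps
print(f"Compression ratio needed: {raw_bps / broadcast_bps:.0f}:1")  # ~93:1
```

In other words, even the "good" 16 Mbps feed is already compressed roughly 90:1; the BBC's change pushes that past 150:1.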
Re:Focus group... (Score:3, Interesting)
Codex is the plural of 'codec'. It could also be stated 'codecs'. It's an abbreviation of COmpressor/DECompressor, in the plural.
Saying "Comcast uses codex compression" without specifying any particular type of compression/decompression is rather awkward. "Comcast uses compression" is no less accurate, unless you specifically wanted to distinguish the type of compression that uses a compressor/decompressor from the type of compression that doesn't, if that's even possible.
Besides which, I don't believe you that codex is the plural of codec.
Re:Focus group... (Score:3, Interesting)
In the UK, for tax purposes, the "father" in a lesbian relationship is the one that didn't get pregnant with the child in question.
Re:Focus group... (Score:3, Interesting)
to limit the number of channels supported
Of course, as long as most storylines are low def, having a lot of channels might not be a bad idea; don't assume they'll necessarily kill off the channels you consider excess junk.
Personally I can't say I care much about resolution. As long as the format is progressive, digital, and anywhere near SD, increases in resolution are about the last thing I'll notice in any normal viewing situation. Improved contrast, lack of compression artifacts, sound, content-- basically anything is more important than actual resolution. Having run a couple of blind tests on myself when deciding what quality to use for ripping my DVDs, I could barely tell the difference between 720p and lower progressive resolutions, and trading resolution for bitrate spent on artifact reduction was a positive tradeoff down to around 360p, as long as we're talking actual moving pictures and normal viewing (paused frames and a 3-inch viewing distance are another thing, of course).
I can live with the trade-off. For the Borg with high quality ocular implants, or those with extra eagle in their ancestry (or dashboard), and for the half dozen movies per century worth the extra space, there's non-spectrum-limited media like Blu-ray. But given the choice between very high definition crap, or twice the amount of crap, I'll take the extra helping of crap.
I agree that they certainly should tell their users, though, and preferably be required to demonstrate carefully and repeatedly the exact differences in quality so people can decide for themselves. Perhaps they could even be required to rebroadcast lower-quality channels at higher quality during non-prime-time hours when bandwidth might be available.