18% of Consumers Can't Tell HD From SD 603
An anonymous reader writes "Thinking about upgrading to an HDTV this holiday season? The prices might be great, but some people won't be appreciating the technology as much as everyone else. A report by Leichtman Research Group is claiming that 18% of consumers who are watching standard definition channels on a HDTV think that the feed is in hi-def." (Here's the original story at PC World.)
Closer to 75% in my experience (Score:5, Informative)
There's an ongoing battle in my family between keying in the "standard definition" version of channels and the "high definition". They all think I'm this weird limey geek (I'm the only English person in the family) who's obsessed with it. They're right of course. You should've seen the argument when I blocked the SD channels *grin*.
The fact is, most people really don't care so long as the picture is reasonably sharp and the sound is reasonably good. Standard definition is perfectly watchable to the average viewer; HDTV is still seen as just another buzzword. The majority of people with newer HDTVs are watching them with the coaxial cable stuffed into the antenna port, in SD, and they're none the wiser.
Many variables (Score:5, Informative)
Including:
- type of screen - plasma vs LCD, SD would be more noticeable on the latter IMHO.
- 720p, 1080i or 1080p? All are technically "HD".
- distance from screen - it is well established that HD only improves your experience if you sit close enough that your eyes can actually resolve that level of detail.
- quality of signal - I have seen "HD" signals which were so compressed and crappy they looked worse than well-encoded SD signals. Similarly, many "HD" broadcasts are just re-encoded from non-HD content.
My gf routinely has the SD, rather than HD, version of various TV channels on, because from her point of view there is evidently no discernible difference. This is a 42" plasma from about 4 metres away.
In any event, this just highlights that, as with all audio-visual products, how it actually looks/sounds to you is far more important than its specs. IMHO you are much better off with a good 720p plasma (Pana or Pioneer) than a mediocre 1080p LCD, for example - you will get better colour, much less ghosting, and (if set up correctly) a more faithful reproduction of the source material rather than a sharpened, cartoon-y looking version like many LCDs produce.
In addition, your expected use is critical - movies and sport tend to suggest a plasma will suit your needs, whereas lots of normal broadcast TV/desktop-type computer use might be better suited to an LCD.
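The viewing-distance point above can be put into numbers. A back-of-the-envelope sketch (the one-arcminute acuity figure is a common rule of thumb, not something from the post itself):

```python
import math

def max_hd_viewing_distance(diagonal_inches, width_px=1920, height_px=1080,
                            acuity_arcmin=1.0):
    """Farthest distance (metres) at which individual pixels are still
    resolvable, assuming ~1 arcminute of visual acuity (a common rule
    of thumb, not an exact physiological constant)."""
    aspect = width_px / height_px
    diag_m = diagonal_inches * 0.0254
    # Screen width in metres for a panel of the given aspect ratio.
    width_m = diag_m * aspect / math.sqrt(aspect ** 2 + 1)
    pixel_m = width_m / width_px
    # One pixel must subtend at least `acuity_arcmin` minutes of arc.
    theta = math.radians(acuity_arcmin / 60)
    return pixel_m / math.tan(theta)

print(f"{max_hd_viewing_distance(42):.2f} m")
```

For a 42" 1080p panel this comes out at roughly 1.7 metres, so at 4 metres away the extra detail of HD really is below what the eye can resolve, which matches the anecdote above.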
Re:Frame rate (Score:5, Informative)
They always shoot (or at least play) films at 25/30fps, and that irritates me no end. They basically look quite jerky when you know what to look for.
Re:Are they nuts? (Score:5, Informative)
There's no such thing as HD rabbit ears, or an HD antenna*. Antenna manufacturers like to pretend that you need special equipment, but US DTV is broadcast on a subset of the frequencies used for OTA NTSC. Any existing antenna will work fine.
* You might handle multipath differently, and the UHF range is a little smaller, but that's about it.
Re:Are they nuts? (Score:5, Informative)
High-definition TV is not the same thing as the digital TV switchover mandated throughout the US, although the switch does make it possible to transmit HDTV over the air. This is another common misunderstanding.
Motion blur (Score:5, Informative)
Does any framerate greater than your monitor's refresh rate matter?
Yes. If your engine can render at 120 fps, it can render the scene twice and combine the two images to add motion blur. This makes fast motions, such as projectile motions and the constant quick pans of any first-person game, look more realistic. It's also why film looks acceptable despite 24 fps.
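A crude sketch of the two-subframe blending described above (hypothetical frames as 1-D numpy arrays; a real engine would do this on the GPU, but the principle is the same):

```python
import numpy as np

def blend_subframes(frame_a, frame_b):
    """Naive motion blur: average two sub-frames rendered half a
    display-frame apart in simulated time. A moving object ends up
    smeared across both positions, approximating camera shutter blur."""
    return (frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2

# A bright 1-pixel "projectile" moving right by one pixel per sub-frame.
a = np.zeros(8); a[3] = 1.0
b = np.zeros(8); b[4] = 1.0
blurred = blend_subframes(a, b)  # energy split evenly across pixels 3 and 4
```

Note that the total brightness is conserved; the motion is just spread across the positions the object occupied during the frame, which is exactly what a film camera's shutter does.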
Re:Frame rate (Score:2, Informative)
Yes - the 60-120FPS thing is a limitation of our current display tech. However, OLED at least potentially has the ability to display blindingly fast frame rates (never mind the blurring aspect, as OLED has a response time in the microseconds, about 500x faster than LCD).
OLED can turn pixels on and off in microseconds. So can DLP, which uses pulse-width modulation on each micromirror to create shades of gray. Yet there really isn't a lot of 120 fps material to display, unless these 120 Hz TVs are doing motion-vector interpolation on the 24 to 60 fps input signals.
Re:Many variables (Score:1, Informative)
Actually, I often tune the SD version of a channel rather than the HD version for another reason - the upscaler in my TV (a Sony Bravia) is better than the one that the TV stations seem to be using here in Australia. Nearly everything broadcast on the HD channels is upscaled from the SD feed, and usually badly. Only big name shows, usually from the US, are broadcast in proper HD.
The other bother is that the HD channels are still so proud of their existence that every second ad shown on them is for the HD channel itself. Another one doesn't bother to broadcast things like news, instead playing elevator music and showing a TV guide for what you could be watching instead.
Re:Frame rate (Score:3, Informative)
I totally agree. The linked video was useless as a comparison, because the video itself was running at 24fps. I did notice a difference with the bouncing ball (the 2nd ball is the relevant one; the first is just camera technique, and you could make it look similarly blurry at 60fps if you wanted), but I could not see any difference in the ut2k4 side-by-side. Nor should I have been able to, given that the comparison video itself was capped at 24fps.
For gaming I do think having a higher fps (e.g. 60) is better. This is because the frame rate isn't constant but depends on how much work the graphics card is doing. The frame rate drops when you have water, reflections, rockets exploding all around you, etc. If you get 60fps normally, this means you have some headroom for your frame rate to drop without looking jerky.
As for movies and TV, I'm sorry that the OP can't enjoy them like normal people (it's possible he isn't making up his complaint and simply has extraordinarily good vision). They have, to my knowledge, always been screened at 24fps or thereabouts, and will always be broadcast at a similar rate. Any extra frame rate is just wasted.
Re:Frame rate (Score:3, Informative)
You're quite right that the 60fps version had a few glitches in it (thanks to bad video quality). It's surprisingly hard to get smooth and fast video on the PC for various reasons, unless you know exactly which codecs to use.
An earlier post mentioned this comparison which is probably better:
http://www.avsforum.com/avs-vb/showthread.php?t=1069482 [avsforum.com]
Re:Truly (Score:3, Informative)
Of all of that, Nova is the only thing I can think of that's worth the HD treatment. Even then, the niftiness of Nova is limited to broadcast, where you can be assured that the original absurdly high bitrate is reaching your TV set.
The rest are more than adequately displayed in 480p widescreen. Dr Who isn't even something they want to produce in HD. They're worried the detail resolution will show everyone how crappy their props are.
HD is a fix for an artificial problem: namely crappy SD digital broadcasts. Beating a crap SD digital cable signal is easy. Beating a quality 480p SD source (transcoded or not) is a little harder.
Re:Frame rate (Score:5, Informative)
I wonder why the people who complain about 75 Hz CRT monitors being flickery are perfectly willing to work in 50/60 Hz lamp flicker.
1) They aren't staring at the lamp for 8 hours a day.
2) Incandescent bulbs don't actually flicker on/off; they just deviate a little. Think about how it works: when the current changes direction and the power drops off, the light-emitting filament starts to cool down, but it stays glowing plenty long enough to still be at nearly full brightness when the power comes back up the other side. So instead of 100%-0%-100%-0% it's more a slightly wiggling 100%-95%-100%-95%, and few humans can see this slight brightness wobble.
3) As for fluorescents, the older ones actually WERE horrible, and people OFTEN complained of headaches after working under them. Modern fluorescents though, with modern ballast technology, cycle much faster and are much less of a problem for people.
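The incandescent argument can be checked numerically with a first-order thermal model. A sketch, where the 50 ms thermal time constant is illustrative rather than a measured value, and "brightness" is crudely equated with the low-passed power:

```python
import math

def filament_ripple(mains_hz=60, tau_s=0.05, steps_per_cycle=1000, cycles=20):
    """Simulate filament 'brightness' as a first-order low-pass of the
    instantaneous electrical power sin^2(2*pi*f*t), which ripples at
    twice the mains frequency. Returns peak-to-peak ripple as a
    fraction of the mean level."""
    dt = 1.0 / (mains_hz * steps_per_cycle)
    level = 0.5  # start at the mean power to skip most of the warm-up
    history = []
    for i in range(steps_per_cycle * cycles):
        t = i * dt
        p = math.sin(2 * math.pi * mains_hz * t) ** 2
        # Thermal mass: the filament relaxes toward the driving power.
        level += (p - level) * (dt / tau_s)
        history.append(level)
    tail = history[len(history) // 2:]  # discard the remaining transient
    return (max(tail) - min(tail)) / (sum(tail) / len(tail))

print(f"ripple = {filament_ripple():.1%} of mean")
```

With these assumed numbers the ripple comes out at only a few percent of the mean, which is the "100%-95%" wobble described above rather than a full on/off flicker.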
Re:Are they nuts? (Score:2, Informative)
Have you talked to your friendly RF-Guy lately? That trade is a black art.
Worthless So-Called Science (Score:5, Informative)
How can anyone take seriously a study that supposedly examines visual perception by talking to people over the phone? They learned nothing except that some people answer questions over the phone a certain way. That study design invites forced responses: answers produced only because the question was asked, which reveal nothing about any perceptual ability, bias, or preference.
Re:Many variables (Score:5, Informative)
The primary reason I like the digital channels is that they are true 16:9 widescreen.
That's an American thing, where the broadcasters decided not to standardise on 16:9 or DVB until they could bundle it with HD.
In the UK (and probably the rest of Europe, though I'm not sure) 16:9 SD DVB-T has been broadcast since 1998, and all new sets have been able to receive it for some years now.
The difference between a SD DVD and a HD-DVD is striking at first, but within 5 minutes of a film starting, I stop caring.
Re:Many variables (Score:3, Informative)
What country do YOU live in!?
I've been watching digital TV exclusively for years, and have yet to see ANYTHING that was not broadcast in its Original Aspect Ratio.
Reruns of shows from before 16:9 was common always throw up the black bars.
Even a relatively high-res show like NBC's "Poker After Dark" gets a 4:3 picture with network logos on the sides. Nobody EVER stretches the screen out.
If you're watching the same broadcasts I am, I think you might have the settings on your TV misconfigured to fit images to the screen for you or something.
Re:Many variables (Score:3, Informative)
The DLP rainbow is only perceptible to some people. Sounds like you're one of the unlucky few.
I don't see it, but I don't buy DLP sets because I like having people over frequently to watch TV, and some of my guests might be sensitive to the effect.
Re:Motion blur (Score:5, Informative)
Games don't do motion blur by just blurring two frames over each other (which would look rather awful), but by recording each pixel's velocity vector and blurring that pixel along it as a post-processing effect, i.e. you need only a single frame and a bit more GPU power. Not all new games do that, but quite a few do.
However, there are TVs that interpolate in-between frames, like Sony's 200 Hz Motionflow, which takes a regular 25 Hz input signal and then calculates the in-betweens to fill it up to 200 Hz. There is similar stuff from other companies too.
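A toy version of the velocity-buffer blur described above, on a 1-D "image" in numpy (real engines do this in a fragment shader with the velocity stored in a render target; this is just a sketch of the sampling idea):

```python
import numpy as np

def velocity_blur(image, velocity, samples=8):
    """Post-process motion blur from a per-pixel velocity buffer:
    each output pixel averages colour samples taken along its own
    velocity vector, using only the current frame."""
    n = len(image)
    out = np.zeros_like(image, dtype=np.float32)
    for x in range(n):
        total = 0.0
        for s in range(samples):
            # Sample positions spread symmetrically along the velocity vector.
            offset = velocity[x] * (s / (samples - 1) - 0.5)
            xi = int(round(x + offset))
            if 0 <= xi < n:
                total += image[xi]
        out[x] = total / samples
    return out

img = np.zeros(16, dtype=np.float32); img[8] = 1.0  # a bright moving dot
vel = np.full(16, 4.0)                              # 4 px/frame everywhere
blurred = velocity_blur(img, vel)
```

The dot's energy gets smeared along its direction of travel while the total brightness is preserved, all from a single rendered frame plus the velocity data.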
Re:Frame rate (Score:3, Informative)
Lamp flicker is at 100/120 Hz (two power "boosts" during each sine cycle double the frequency).
Re:Frame rate (Score:3, Informative)
AC power is almost universally a sine wave (or close) with a frequency of 50 or 60 Hz (i.e. cycles per second). Each cycle contains two peaks, one negative and one positive. As a light bulb works equally well irrespective of the direction of current, you get a 100 or 120 Hz cycle in the power output.
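The doubling falls straight out of the identity sin²(ωt) = (1 − cos 2ωt)/2, and a quick FFT check confirms it (a sketch using numpy):

```python
import numpy as np

# One second of 60 Hz mains current, finely sampled.
fs = 6000                       # samples per second
t = np.arange(fs) / fs
current = np.sin(2 * np.pi * 60 * t)
power = current ** 2            # instantaneous power in a resistive load

# Find the dominant non-DC frequency in the power waveform.
spectrum = np.abs(np.fft.rfft(power - power.mean()))
freqs = np.fft.rfftfreq(fs, 1 / fs)
peak_hz = freqs[spectrum.argmax()]  # → 120.0
```

The 60 Hz current produces a power waveform whose ripple sits entirely at 120 Hz, exactly as the post says.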