18% of Consumers Can't Tell HD From SD
An anonymous reader writes "Thinking about upgrading to an HDTV this holiday season? The prices might be great, but some people won't be appreciating the technology as much as everyone else. A report by Leichtman Research Group claims that 18% of consumers who are watching standard-definition channels on an HDTV think that the feed is in hi-def." (Here's the original story at PC World.)
Re:Are they nuts? (Score:3, Interesting)
I have an HDTV and can tell the difference, but I don't care. I'm not willing to pay the price difference for HD TV shows. My HDTV isn't going to waste, though; I do use it for high-def gaming.
Frame rate (Score:5, Interesting)
Perhaps even more irritating than this is that some people can't distinguish between 30 and 60 FPS (or at least don't care), when of course there is a massive difference. The latter is much smoother for all kinds of programmes and games. 120 FPS would of course be even better...
It would be more interesting if... (Score:5, Interesting)
I'd be more interested in a comparison between upscaled SD and HD. That is, an upscaled DVD (even the Xbox 360's upscaling would do... no need to go fancy) vs. a 720p source. I bet that 18% would become much, much higher. I have two TVs of exactly the same size and resolution, and I tried putting them side by side... aside from the annoying 4:3 ratio that most DVDs are in, it's freakishly hard to tell the difference on anything below 40-45 inches (at a reasonable distance, of course; it's easy if you have your face in the TV).
The biggest reason SD looks so awful after seeing HD is that the built-in upscalers of most HDTVs are completely horrible, and make SD sources look far worse than they should.
Re:Frame rate (Score:3, Interesting)
Re:Frame rate (Score:2, Interesting)
Re:Frame rate (Score:4, Interesting)
In a very roundabout way, yes.
If a graphics card can barely average 60 FPS (or whatever your monitor's refresh rate is - my ThinkPad runs its LCD at 50 Hz), then it's going to have dips well below 60 FPS.
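A quick sketch of why an average frame rate hides those dips (the frame times here are made up for illustration, not measured from any real game):

```python
# Hypothetical per-frame render times in milliseconds. The average works
# out to 60 FPS, but several individual frames blow well past the
# 16.7 ms budget -- those are the stutters you actually notice.
frame_times_ms = [12, 14, 13, 33, 12, 14, 33, 12, 14, 9.7]

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
worst_fps = 1000 / max(frame_times_ms)

print(f"average: {avg_fps:.0f} FPS")        # looks fine on a benchmark chart
print(f"worst frame: {worst_fps:.0f} FPS")  # what you perceive during the dip
```

This is why per-frame time (or a 1% low figure) tells you more than an average FPS number.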
Re:Its worth noting (Score:5, Interesting)
It is still important. I can most definitely tell if what I'm watching is from a crappy VHS or from a DVD. That was obvious the first time I ever saw a movie on DVD: walked into a room, saw people watching a DVD movie, and was like "Wow... so that's a DVD movie, eh?". An HD source vs. an SD source (to be fair, I'm talking about a movie or TV show; other kinds of content will be easier) gets a lot trickier.
I remember the last time I brought my Xbox 360 to a family member's place. All of their TV's HDMI connectors were taken (which is what I normally use), so I brought the component cables (which can do 720p just fine). Since I had never used component, the console went back to the defaults: 4:3, 480 lines. After playing a few hours, I started noticing something weird... the aspect ratio (the game I was playing didn't make it totally obvious like most would). So I went into the config to set it back to 16:9, when I noticed... 480 resolution? The hell? Switched it back up to 720p... There was a difference, but it wasn't all that obvious (no, it wasn't one of those 520p games that they upscale).
I'm sure I'm not in the majority and that most people would have been able to tell much faster, but the point still stands: for a large number of people it's fairly irrelevant whether you give them HD for a month or a year. As long as there are no artefacts in the picture (like VHS), how many pixels you pump into Sex and the City won't matter.
Re:Frame rate (Score:5, Interesting)
Re:Frame rate (Score:3, Interesting)
We just bought a new TV not too long ago (a Sony Bravia) with the option of setting the refresh rate to 120 Hz. It makes an amazing difference watching sports or anything with fast motion, but it makes regular TV shows look very eerie - almost cheap. I don't know how anyone could fail to tell the difference with fast motion; maybe if you were watching the fireplace channel...
Re:It would be more interesting if... (Score:5, Interesting)
UsersAreMorons is an inappropriate tag (Score:4, Interesting)
Please use usersAreBlind instead ;-)
In all seriousness though, blaming people for being unable to tell the difference between SD and HD isn't a positive thing. The irony is that if they can't tell the difference, they get to save themselves a whole lot of money. Though personally I'd rather have decent eyesight and make the choice of SD vs. HD based on whether I think it's worth it. I can tell the difference, and I'll be sticking with SD until HD is much cheaper, by the way.
Content Quality versus Visual Quality (Score:5, Interesting)
Humans are often easily distracted creatures, as demonstrated by numerous examples of highly successful ad campaigns over the years. As long as you present the audience with enough interesting or flashy content, the quality of the medium becomes less relevant.
The solution to speeding up HD adoption is to make the content itself less interesting. The viewers will have no choice but to start taking notice of external annoyances like picture quality.
Statistically speaking... (Score:3, Interesting)
Re:Frame rate (Score:3, Interesting)
That may have something to do with how LCD displays have poor response times (causing blurring - a separate issue from frame rate). Alternatively, perhaps the programmes you view were shot at 30 or 60 FPS, so they weren't meant for 120 Hz TVs anyway.
OLED technology should fix both issues in the future, as they have incredible response times, and probably excellent frame rate potential.
Re:Are they nuts? (Score:3, Interesting)
Nope. In fact, I suspect their findings are too low. MOST people who buy an HDTV and have my company install it can't tell the difference between an SD broadcast and an HD broadcast, because they sit 15 feet away from that 50" plasma above their fireplace.
You need to sit 6-8 feet from a 42-50" display to really see the difference; any farther than that and your eyes can't resolve the resolution.
It's even worse if your HD signal is a crappy one like Comcast's. The Comcast local PBS HD QAM channel looks like hell compared to the signal I get over an antenna. It's night and day here; they compress the OTA channels that hard. Discovery HD looks like crap as well...
18% at a proper viewing distance I can understand, but that number will grow dramatically the farther they sit from the TV. Most "trendy" homes have the TV way too far from the seating, and putting it above the fireplace is just plain stupid.
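The 6-8 foot figure lines up with a back-of-envelope visual-acuity calculation. This sketch assumes ~1 arcminute of acuity for 20/20 vision and a flat 16:9 panel; the numbers are illustrative, not measurements:

```python
import math

# At what distance do 1080p pixels on a 50" screen stop being
# resolvable, assuming 20/20 vision resolves about 1 arcminute?
diagonal_in = 50.0
cols = 1920            # horizontal pixels of a 1080p panel
aspect = 16 / 9

# Screen width from the diagonal and aspect ratio, then pixel pitch.
width_in = diagonal_in * aspect / math.hypot(aspect, 1)
pixel_pitch_in = width_in / cols

# Small-angle approximation: distance at which one pixel subtends
# one arcminute of visual angle.
one_arcmin_rad = math.radians(1 / 60)
max_distance_ft = pixel_pitch_in / one_arcmin_rad / 12

print(f"pixel pitch: {pixel_pitch_in:.4f} in")
print(f"beyond ~{max_distance_ft:.1f} ft, 1080p pixels blur together")
```

It comes out to roughly 6.5 feet for a 50" 1080p set, which is why a 15-foot viewing distance throws away most of the extra resolution.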
Re:Many variables (Score:4, Interesting)
Last I heard, the only things actually broadcast in HD are the World Series and the Super Bowl, so yeah, I'd say that if you're watching a lot of broadcast TV but not much sports, you're just as well off getting a 720p set anyway.
Incidentally, I spend a lot of time answering questions about TVs; selling them is part of my job. Funny thing, though: all of our display-model HDTVs are playing a looped DVD over split coax that isn't even terminated on the unused outlets... people will stand there oohing and ahhing over how great the picture is, despite the fact that it is absolutely not HD in any way, shape, or form. Makes it pretty hard to convince people Sony sets are worth more than Olevia ones, too.
This headline comes as so little a surprise to me that I have trouble believing anyone even doubts it.
I can't see a difference ... (Score:5, Interesting)
Re:Frame rate (Score:5, Interesting)
Some displays will also use interpolation to "create" frames rather than simply repeating each frame for a set period of time. This technology, IMHO, isn't quite up to snuff and gives films/shows a somewhat odd synthetic appearance. Keep in mind that this tech is separate from the 5:5 pulldown described above.
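The interpolation idea can be sketched in its crudest form - a straight linear blend between two frames. Real sets estimate motion vectors, which is far more involved; the pixel values here are toy data:

```python
# Crude sketch of frame interpolation: invent an in-between frame by
# linearly blending two consecutive frames. Blending smears moving edges
# instead of tracking them, which hints at why naive interpolation can
# look synthetic.
def interpolate(frame_a, frame_b, t):
    """Blend two frames (flat lists of pixel values); t=0.5 is the midpoint."""
    return [round((1 - t) * a + t * b) for a, b in zip(frame_a, frame_b)]

frame1 = [0, 100, 200]   # toy 3-pixel "frames"
frame2 = [50, 150, 250]

mid = interpolate(frame1, frame2, 0.5)
print(mid)  # the invented in-between frame: [25, 125, 225]
```

Motion-compensated interpolation instead shifts blocks of pixels along estimated motion vectors before blending, which is where both the smoothness and the occasional artifacts come from.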
Re:Closer to 75% in my experience (Score:4, Interesting)
I wouldn't be surprised to hear that they are less concerned with the quality difference between the SD and HD stations than they are with how much slower their channel surfing is with the HD versions.
Re:Many variables (Score:5, Interesting)
I waited until this was consistently, noticeably no longer the case before buying a plasma. I still would not buy an LCD, although the higher-end Sony 1080p models are starting to look pretty amazing when set up with optimal source material.
I also had a decent Sony CRT, which I gave to my parents when I got a Panasonic plasma. Although I thought after a while that maybe the plasma wasn't *that* much better, I have since gone back and re-watched the Sony, and frankly the plasma blows it back into last century, where it belongs. You just cannot beat the clarity (not to mention size and response time) of plasmas, IMHO.
Re:Are they nuts? (Score:3, Interesting)
video quality is really not that important (Score:5, Interesting)
In my experience, people tend to care more about things other than the video resolution when watching TV. Like, say, the plot, or the character development.
Watching hockey, on the other hand, I can understand why people would want to see the puck better, but in the general case I think no one gives a *** about resolution.
If it's a good movie I'll happily watch it at 320, blurry, at 15 FPS, if that's all I can get.
Frankly, when it comes down to it, the sound quality matters more than the video.
If you can't hear what the actors are saying you may as well turn it off, but if you can basically get the idea of what's going on, video isn't that critical.
Maybe I just have low standards.
One important detail (Score:5, Interesting)
There is something that they aren't accounting for: people (especially less tech-savvy people) not realizing that they aren't watching HD. They just assume that if it's on a newer plasma/LCD, then it's HD.
For example, I have a relative who was watching football today on my cousin's plasma. He of course tuned to the channel he gets at home (CBS), the non-HD version, simply because he had no idea that Verizon offers HD versions of pretty much all basic cable just by going to channels above 500 in my area.
At some point, it occurred to me that the picture didn't quite look up to snuff, so I asked him what channel he was on (since often SD is broadcast on HD channels because the original signal was SD). He said 7. I said, "A-ha! You should switch to the HD version of this channel!"
He was confused, but told me to go for it. He was *amazed* at the difference in clarity. He claimed it looked like he was down on the field.
Not being able to tell the difference is very different from not knowing there is a difference available.
I would wager that if you put two screens side by side, one showing the signal in true HD and the other in SD, anyone without vision problems could tell the difference.
Re:Frame rate (Score:3, Interesting)
Clearly the person who put the video together needs to work on their marketing skills, however. When you're trying to convince people to agree with you, declaring that everyone else would "have to be blind" to disagree doesn't help. Even if I had noticed a difference, I'd be tempted to say I didn't, just to spite them.
Re:It would be more interesting if... (Score:5, Interesting)
I have a DVD player with a Realta HQV video processor, and it really does a great job. In order to visually benefit from something like Blu-ray over a top-notch scaler, you have to have a pristine master and a high-quality large screen (1080p, at least 50"). It is difficult to get that good a master from older film or most video. That is great news, since the vast majority of my current DVD collection will remain satisfactory for a long time.
But new films mastered in HBR sound formats and 1080p on a good screen are enough better in both sound and appearance that I have stopped buying DVDs. I am renting until an acceptable BD player becomes available, at which time I will start buying Blu-ray discs.
Yep (Score:3, Interesting)
It's strange, but I work with stereoscopic video and have noticed that even 640x480 in stereo 3D looks a lot sharper than 1920x1080 mono.
It is a psycho-visual effect, for sure. But it is real.
IMHO - forget about HD and use the bandwidth for 3D.
Re:Are they nuts? (Score:3, Interesting)
You need to sit 6-8 feet from a 42-50" display to really see the difference; any farther than that and your eyes can't resolve the resolution.
Oh, my, you must be blind.
I sit over 10 feet away from my 38" TV (an honest-to-goodness picture-tube set), and everyone who has seen the TV can tell the difference. Its measured resolution is about 1400x900, so almost all HD is at or close to full resolution.
Meanwhile, SD is at most 720x480, and usually a lot less than that. It's easy to tell the difference.
Now, the difference between 1920x1080 and 1280x720 is something that you can't really tell without a large display with the ability to fully resolve 1920x1080.
Re:Are they nuts? (Score:2, Interesting)
What this tells me... (Score:5, Interesting)
Isn't that people are stupid, but that the HD content we currently have isn't exactly HD. Even the snazziest Blu-ray displays in places like the Sony Store or any big electronics retailer seem to have really nice-looking visuals, but they also seem to have a big problem not only with interlacing (?! Isn't this 1080p?!) but also with video compression artifacts. In many cases, when I look at the TVs on display, I can't usually tell that what I'm looking at is HD unless the video's been specifically tailored to show off the resolution. TV broadcasts (the few that are HD around here), Blu-ray movies (especially live action) - it doesn't matter. It all looks quite muddy, and I'm often distracted by the blocking and ringing artifacts, just as I was when DVD was first released.
I don't have an HDTV or an HD player, myself, so I'm not intimately familiar with how current movies are being compressed on the disc, but... Don't they have any room to turn up the bitrate a little? I mean, sure, it's not reasonable to expect an uncompressed image (though I'd really like it), but seriously, the video compression quality sucks.
You can have as high a resolution as you want, but when the artifacts are large enough to notice casually, you've defeated the purpose of that resolution; I would rather have a cleaner lower-definition source than that.
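Some rough arithmetic shows how hard HD video has to be compressed; the ~30 Mbps Blu-ray figure below is a ballpark assumption, and actual streams vary by title:

```python
# Uncompressed 1080p24 at 8 bits per channel vs. an assumed ~30 Mbps
# Blu-ray video stream. The ratio shows how much information the codec
# has to throw away, which is where visible artifacts come from when
# the bitrate is too low for the content.
width, height, fps = 1920, 1080, 24
bits_per_pixel = 24                 # 8 bits x 3 color channels

raw_mbps = width * height * bits_per_pixel * fps / 1e6
bluray_mbps = 30                    # ballpark; varies by title and codec

ratio = raw_mbps / bluray_mbps
print(f"uncompressed: {raw_mbps:.0f} Mbps")
print(f"compression ratio: ~{ratio:.0f}:1")
```

Even at Blu-ray bitrates the encoder is discarding roughly 97% of the raw data, so scenes that compress poorly (film grain, fast motion, water) are where the blocking and ringing show up first.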
Eh, I can tell but so what? (Score:5, Interesting)
When we still had the SD DVR and I had to stretch Stargate Atlantis (meaning the effective resolution was sub-SD) to fill the screen, I got tweaked more than a little. But other than that (which doesn't happen anymore with the non-4:3-aware HD DVR), I can honestly say that I don't much care. Yeah, I can pause Law & Order and count the strands of Elizabeth Rohm's hair, or stop Atlantis and count the stubble in John Sheppard's beard - but so what?
I'm here to watch the criminals get caught or the Wraith be foiled again, not to stroke my e-penis over how awesome my screen's picture is. Unless the picture is suffering horrible aberrations or the audio is like a 64 kbps MP3, those don't really impede the story.
In conclusion: It's absolutely astonishing how many details your brain can paint in or interpolate if you let it.
Re:It would be more interesting if... (Score:2, Interesting)
I have a DVD player with a Realta HQV video processor, and it really does a great job. In order to visually benefit from something like Blu-ray over a top-notch scaler, you have to have a pristine master and a high-quality large screen (1080p, at least 50"). It is difficult to get that good a master from older film or most video. That is great news, since the vast majority of my current DVD collection will remain satisfactory for a long time.
I don't have a player with a Realta HQV processor -- just an upconverting PlayStation 3 -- but I do work in the video industry and I've seen the Teranex in action and in my opinion there's a significant difference between Blu-ray and "a top-notch scaler." I watch at home on a Sony 47-inch XBR4, and the difference is substantial.
As far as "older film" goes, I have The Wild Bunch, The Searchers, and Black Narcissus, and they crush the older DVDs like the proverbial grapes. Just marvelous viewing. It's not just a question of resolution but also color fidelity. (And, of course, the Blu-ray masters may well be the newest transfers, which makes any comparison to older discs a little unfair. But still -- I look at enough video on enough different screens in enough different high-end facilities to know what I'm seeing.)
I know a lot of people claim there's not a big difference between good SD upconversion and true HD, but these people either sit too far from their screens or just don't know what they're looking for. (Some of them may be the same people who keep their LCD screens set to "Vivid" and complain about grainy pictures.) As a cinephile who'd like to see more titles beyond the frat-boy demographic become available in the format for purely selfish reasons, it frustrates me a little when people argue that HD doesn't represent a significant quality gain over SD.
Re:Are they nuts? (Score:3, Interesting)
Only if you have more than one TV.
Keep in mind that a signal amplifier amplifies the noise just as much (sometimes more) as the signal you're interested in. You don't really need one if you're not splitting the signal downstream.
That's a little too broad of a statement, and is of course not true in many situations. You also need one if you don't live near the transmitter. I have a good antenna (I forget the brand and model) and a good masthead amplifier. The signal is so weak I get about 2/3 of the channels that I know are out there, even with the amplifier. I have a Samsung DTB-H260F receiver, which is a reasonably good unit. I have all RG6 quad shield cable. It's still not good enough. The picture drops out a lot on some of the channels, and just enough to be annoying on the others. It's the same on the other TV in the bedroom too. It doesn't matter if I connect a single TV directly to the masthead antenna's output, either.
Why? I'm over 50 miles from the transmitters, I have huge tall trees throughout my neighborhood and there's a river valley between me and the transmitters that cuts the signal in half according to the online signal strength maps.
The next thing I'm going to try is putting the antenna up higher on a mast. 1080 on a 108" projected image is worth the effort.
But my main point was that blanket statements about when you need a decent antenna or an amplifier are often going to be false, because these are not one-size-fits-all things. ATSC reception demands a consistent, strong signal, but conversely seems to have low tolerance for multipath, which is a problem you can incur with a "too good" antenna or amplifier.
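The earlier point that an amplifier can't clean up a signal can be sketched numerically: gain scales signal and noise alike, and the amp adds noise of its own. All figures here are illustrative, not real device specs:

```python
import math

# Why amplification never improves signal-to-noise ratio at the amp
# itself: both signal and noise get the same gain, plus the amp
# contributes its own noise.
def snr_db(signal, noise):
    """SNR in decibels, from signal and noise power."""
    return 10 * math.log10(signal / noise)

signal, noise = 1.0, 0.1      # arbitrary power units: 10 dB SNR in
gain = 100.0                  # a 20 dB amplifier
amp_noise = 2.0               # noise power the amp itself adds

out_signal = signal * gain
out_noise = noise * gain + amp_noise

print(f"SNR in:  {snr_db(signal, noise):.1f} dB")
print(f"SNR out: {snr_db(out_signal, out_noise):.1f} dB")  # always <= SNR in
```

Where a masthead amp does help is in boosting the signal before long cable runs and splitters add their own loss and noise, which is exactly the multi-TV case being debated above.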
Re:One important detail (Score:3, Interesting)
Who cares? Do you want to go out creating more videophiles? Doesn't this world have enough audiophiles for you? Side by side, I can tell the difference, but most content on television is not actually improved by increased bitrates or resolutions. Sitcoms and dramas in particular - you don't need to see the flaws in the actors' and actresses' faces; in fact, they distract. The only time HD matters is for sports and special-effects-laden movies.
If you showed a relative what he's missing out on, did you do him a favor? Maybe he's got a 10+ year-old TV with analog cable (or broadcast) and can't afford a whiz-bang LCD (let alone plasma) or digital cable - which describes a lot of America, by the way. People are often much happier before they find out there are others doing better than they are.
I know the difference between HD and SD, and I can afford the upgrade, but I choose not to. I have a 32" Trinitron, and it does what I need (AppleTV for transcoded DVDs, DVDs, Wii). There's actually nothing I care about that a 720p (or higher) would improve, aside from power consumption and weight.
Re:Are they nuts? (Score:5, Interesting)
Time Warner in Charlotte, NC advertises "Free HD, for only $9.95 more a month, while our competitors (satellite) charge more than $100 a year for the same service."
Given that my brain hurts whenever I get close to figuring out how $9.95 a month is "free", and given that my soul hurts for the fools who would be proud of paying $9.95 a month compared to $100 per year, I'm not amenable to explanations of why I should consider $9.95 a month to mean "free".
Re:Many variables (Score:2, Interesting)
I've found that, as is often the case, the source is the key. My 42" plasma is NOT HD-enabled, so I rely on digital SD broadcasts (satellite and terrestrial) and DVD as my sources. DVD looks great, as you'd expect, but the real difference shows when viewing broadcast TV. It is SO obvious which programs have been shot in HD and which haven't.
The bottom line is, for most people, programs played on an SD set are more than acceptable provided the source was shot in HD, and this may explain the results of the poll in TFA. I'm sure this is true for CRTs as well as flat screens.
So the answer might simply be that all programs should be shot in HD; then the viewer experience will always be good, regardless of their equipment.
Comment removed (Score:3, Interesting)
Re:Are they nuts? (Score:3, Interesting)
Amplifiers worked great with NTSC because they made the "sync" signal stronger and allowed the TV's tuner to lock onto the station. While this process also added noise, the human brain has millions of years of development that allow it to "see through" the noise and extract an image. (For example, I was watching CSI - it was a blurry image, but my brain could still see the hot blonde in the white noise.)
Digital receivers don't like noise, so adding the amp often makes things worse. A computer, unlike our brains, can't deal with it. Instead of extracting a hot blonde, it just gives up.
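The graceful-degradation-versus-cliff behavior described above can be captured in a toy model; the threshold and quality curve are made up for illustration, not real ATSC figures:

```python
# Toy model of the "digital cliff": analog picture quality degrades
# gradually as SNR drops, while a digital receiver decodes essentially
# perfectly above some threshold and not at all below it.
def analog_quality(snr_db):
    """Perceived quality (0-100%), scaling roughly with SNR."""
    return max(0, min(100, snr_db * 2.5))

def digital_quality(snr_db, threshold_db=15):
    """All-or-nothing: perfect above the decode threshold, dead below."""
    return 100 if snr_db >= threshold_db else 0

for snr in (30, 20, 16, 14, 10):
    print(f"{snr:2d} dB: analog {analog_quality(snr):5.1f}%, "
          f"digital {digital_quality(snr):3d}%")
```

An amplifier that nudges a noisy signal just past the threshold looks like a miracle; one that pushes it just below looks like a total failure, with nothing in between.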
You have to wonder how this was done (Score:3, Interesting)
As stated, it implies that 18% of consumers can't distinguish SD from HD in a direct A/B comparison. I find this frankly unbelievable.
On the other hand, I would not be at all surprised if 18% of consumers, particularly those who don't normally watch HD, might be unable to recognize HD when either SD or HD is shown on an unfamiliar monitor without the opportunity to make a direct A/B comparison.
Another question is whether they were actually being asked to distinguish 480i SD from 720p or 1080i/p HD, or whether the "SD" was really 480p ED. On anything other than a very large-screen monitor, the distinction between ED and HD is fairly subtle. Actually, I expect the percentage of people unable to tell whether a picture is ED or HD would be considerably greater than 18%.
Is it confusion about the signal or the TV? (Score:3, Interesting)
I run into a lot of non-tech people who have difficulty understanding the difference between an HD TV and an HD signal. Such people would probably answer the question they thought they were asked, by correctly identifying the TV as being high definition, without ever really understanding that an HD TV can display both HD and SD content.
Yes, the researchers probably explained the difference to the respondents during the course of the study, but many such people still don't understand the difference between HD & SD signals even after you explain it to them.