Television Media

18% of Consumers Can't Tell HD From SD 603

An anonymous reader writes "Thinking about upgrading to an HDTV this holiday season? The prices might be great, but some people won't appreciate the technology as much as everyone else. A report by Leichtman Research Group claims that 18% of consumers who are watching standard-definition channels on an HDTV think the feed is in hi-def." (Here's the original story at PC World.)
  • It's worth noting (Score:5, Insightful)

    by Bazar ( 778572 ) on Thursday November 27, 2008 @09:43PM (#25912481)

    The links don't say that 18% can't tell the difference.

    Just that 18% can't tell whether what they're seeing is HD.

    An analogy would be playing MP3s and asking people whether a track was encoded at 320 kbps or 64 kbps.

    Most people won't be able to tell the encoding rate just by hearing it, but if you play two different versions side by side they should be able to pick out the difference.

    They probably can tell the difference, but they can't spot HD just by looking at it.

    Give them HD content for a month, however, and they'll quickly learn.
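
    A minimal sketch of the kind of blind A/B trial the analogy describes, in Python; the file names and the play_clip() call are hypothetical stand-ins for a real player:

        import random

        def run_trial(clip_a, clip_b, trials=10):
            """Play two encodings in random order; count correct identifications."""
            correct = 0
            for _ in range(trials):
                pair = [clip_a, clip_b]
                random.shuffle(pair)
                for label, clip in zip("AB", pair):
                    print(f"Playing clip {label}...")
                    # play_clip(clip)  # hypothetical playback helper
                guess = input("Which was the 320 kbps version, A or B? ").strip().upper()
                if guess in ("A", "B") and pair["AB".index(guess)] == clip_a:
                    correct += 1
            print(f"{correct}/{trials} correct; ~{trials // 2} expected by guessing.")

        run_trial("song_320kbps.mp3", "song_64kbps.mp3")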

  • by TimHunter ( 174406 ) on Thursday November 27, 2008 @09:48PM (#25912509)

    20 year old eyes are much better than 50 year old eyes. I wonder how many of the 18% are older folks? I'm 55 and I'm hard-pressed to distinguish between SD and HD.

  • This means 82% can (Score:5, Insightful)

    by cpct0 ( 558171 ) <slashdot@mi c h e l d o n a i s.com> on Thursday November 27, 2008 @09:51PM (#25912531) Homepage Journal

    Of course the psychology of words will make you believe this is horrible, when in fact, 82% can tell the difference!

    Then, as others have said, properly upscaled good-resolution SD is very potent. What's crap is the digital signal we're being fed.

    A story from my own experience: I used to watch the Paramount channel for ST:V a few years ago (god I'm old), and it was the only digital channel I had. Sometimes I couldn't watch a show right away, so I time-shifted it to VHS in EP (that's the 8-hours-per-cassette mode, young folks ;) ), and even then, with the quality degraded, I could still see the digital compression artifacts when scenes changed, or during space blacks! Now that my boobtube provider is squeezing roughly three times as many channels into the same QAM, quality is even worse than before.

  • $1000 Better... (Score:5, Insightful)

    by wzinc ( 612701 ) on Thursday November 27, 2008 @10:11PM (#25912635)

    Is HD better than SD? Yes. Is it worth the extra $1000 you have to spend on everything to get HD? IMHO, no, but I know others feel differently.

  • Re:Many variables (Score:5, Insightful)

    by AKAImBatman ( 238306 ) * <[moc.liamg] [ta] [namtabmiaka]> on Thursday November 27, 2008 @10:16PM (#25912667) Homepage Journal

    My gf routinely has the SD, rather than HD, version of various TV channels on because, evidently, from her point of view there is no discernible difference. This is a 42" plasma from about 4 metres away.

    I can tell the difference, and I don't care too much about the quality improvement. The primary reason I like the digital channels is that they are true 16:9 widescreen. Opening up the edges of the scene makes a much bigger difference than the horizontal resolution, as far as I'm concerned.

    Of course, that only applies to regular television shows. Camera operators have been trained for decades to keep the camera tight on the subjects. Thus the extra detail is not needed. If you're talking about a complex scene like sports, however, all bets are off. I don't usually watch football (save for the Superbowl), but even a blind man can tell that an HD picture shows you more of the action than an SD picture. :-)

    BTW, one reason why many people can't tell the difference is that LCD and plasma screens are already WAY sharper than the CRTs people used to watch. As a result, even an SD signal looks a lot better. (Unless you're playing video games. Then SD looks worse.)

  • Truly (Score:2, Insightful)

    by smittyoneeach ( 243267 ) * on Thursday November 27, 2008 @10:27PM (#25912731) Homepage Journal
    The difference between dreck and HDTV-dreck is a difference that makes little difference.
  • Re:Many variables (Score:3, Insightful)

    by Skippy_kangaroo ( 850507 ) on Thursday November 27, 2008 @10:48PM (#25912885)

    the LCD or Plasma screens are already WAY sharper than the CRTs people used to watch
    Depends on how long ago.

    Many LCDs and plasmas are not as good as the latest generation of CRTs.

    The picture I get on my 100Hz widescreen SD CRT is still well ahead of many LCD or plasma sets I've seen. The response time of the LCDs is one big difference - they don't deal with fast motion well. (A stationary shot of a sportsground is OK, but as soon as they pan, the grass goes all blurry.) What many people may really be noticing is the difference between a digital TV signal and an analogue one. That was a big change in quality for me.

    Sure, the top-end flat-screen TVs might be ahead of the best CRTs, but I think the average CRT is still ahead of the majority of flat-screens being snapped up by budget-conscious consumers. A digital signal makes a big difference; after that, not so much.

  • by EmbeddedJanitor ( 597831 ) on Thursday November 27, 2008 @10:48PM (#25912889)
    Once you get to a certain level of quality/performance, it is quite hard for anyone but the technophiles to appreciate any improvement.

    Is HD really that much better than SD? Is a dual core really that much better than a single core? Is 100 Mbit/s really better than 20 Mbit/s? Is a $5000 hi-fi really better than a $200 one?

    Once people have something that is "good enough", they don't value an improvement. This is vexing for companies trying to push consumers to the next level.

  • by w0mprat ( 1317953 ) on Thursday November 27, 2008 @10:52PM (#25912915)
    Generally SD looks noticeably better when upscaled on a respectable HDTV, especially when the person has upgraded from a CRT, an old rear-projection set, or some older, not-so-good panel TV. Also, a current HDTV will have superior colour and/or contrast (often artificially boosted) compared to the older SD screen.

    These factors would account for a good fraction of the statistic, with the rest accounted for by the Idiot Factor - or, to be fair, by the fact that many people have slightly-off eyesight, or may just be sitting too far away.
  • by Cowclops ( 630818 ) on Thursday November 27, 2008 @11:00PM (#25912947)

    And in other news: 82% of people CAN tell the difference between SD and HD.

    www.cowclops.net/resolutionchart1.png

    You want your optimal viewing distance to be on the line for whichever format you watch the most; that's about where you'd notice the quality difference between it and the next-worst format. If you have a TV smaller than 42" or so, or you're sitting very far away for your screen size, you won't be able to tell the difference.

    And yes, I'm going to post this on every "stupid people can't tell SD from HD" story until people stop asserting that HD isn't much of an improvement over SD. I use a 720p projector with a 65" screen that I sit 10 feet away from, and Transformers on HD-DVD looks CONSIDERABLY better than Transformers on DVD.
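
    For those who want the arithmetic behind charts like that, here's a back-of-envelope version in Python: the farthest distance at which a 20/20 eye (one arc-minute of angular resolution) can still resolve individual pixel rows. The numbers are illustrative assumptions, not values taken from the chart above:

        import math

        ARC_MINUTE = math.radians(1 / 60)  # ~0.000291 rad; 20/20 acuity

        def max_resolving_distance(diagonal_in, vertical_pixels, aspect=16/9):
            """Distance (m) beyond which pixel rows blur together for a 20/20 eye."""
            height_m = diagonal_in * 0.0254 / math.hypot(aspect, 1)
            pixel_pitch = height_m / vertical_pixels
            return pixel_pitch / math.tan(ARC_MINUTE)

        for res in (480, 720, 1080):
            print(f'65" at {res}p: fully resolvable within ~{max_resolving_distance(65, res):.1f} m')
        # 480p: ~5.8 m, 720p: ~3.9 m, 1080p: ~2.6 m. At 10 feet (~3 m), a 65"
        # 720p image is still within resolving range, consistent with the setup above.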

  • by Latent Heat ( 558884 ) on Thursday November 27, 2008 @11:12PM (#25913029)
    How much of the effect you're seeing is compression/codec artifacts, and how much is the LCD display (if that's what you're using)?

    People are catching on to the "sample-and-hold" effect: even the fastest-response-time LCDs produce loads of motion blur because they hold the image rather than scan-strobing it the way a traditional video monitor does. Google "LCD motion blur sample and hold" to see what people say about this.
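
    A rough sketch of the magnitude involved, in Python: with sample-and-hold, an eye tracking a moving object smears each held frame across the retina by roughly velocity times hold time (the pan speed below is an illustrative assumption):

        def smear_px(velocity_px_per_s, refresh_hz, persistence=1.0):
            """Perceived blur width (pixels) for an eye-tracked moving object.
            persistence=1.0 models full sample-and-hold (typical LCD); a
            scanning/strobed display like a CRT is closer to ~0.1."""
            return velocity_px_per_s * persistence / refresh_hz

        pan = 1800  # px/s: a brisk camera pan across a 1920-pixel-wide frame
        print(f"60 Hz sample-and-hold LCD: ~{smear_px(pan, 60):.0f} px of smear")
        print(f"CRT-like strobed display : ~{smear_px(pan, 60, 0.1):.0f} px of smear")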

  • So what... (Score:4, Insightful)

    by danwesnor ( 896499 ) on Thursday November 27, 2008 @11:16PM (#25913049)
    About 70% of consumers think that hooking up an HDTV to an SDTV cable box makes it HDTV. And 99% of consumers don't realize that the big box stores have a nasty habit of piping SDTV into the cheaper HDTVs while the expensive boxes get the real deal.
    They can't tell, but I see a world of difference, and that's all that matters.
  • Re:Are they nuts? (Score:4, Insightful)

    by rnaiguy ( 1304181 ) on Thursday November 27, 2008 @11:43PM (#25913217)
    Some of the higher priced ones (~$30) are worth it for a signal amplifier.
  • 18%? (Score:4, Insightful)

    by iceT ( 68610 ) on Thursday November 27, 2008 @11:43PM (#25913221)

    Who gives a shit? 18% of people probably still think the world is flat. I bet a lot of those people answered that way because they're resisting buying a new TV... or, like my Mom, bought a new SD TV 4 years ago and really doesn't want to buy another one yet.

    Where's this story: "82% of people think HD television is better than SD television"? If that's not newsworthy, why is this?

  • by Anonymous Coward on Thursday November 27, 2008 @11:57PM (#25913299)

    You wrote:

    This is a 42" plasma from about 4 metres away.

    You've told us this is a 42" (1.07m) screen. I'll assume this is the diagonal measure and the screen is 16:9, rather than 16:10. For a 1.07m diagonal 16:9 screen, the width is 0.93m and the height is 0.52m (* = at 16:10, this would be 0.91m by 0.57m).

    Now refer to: http://en.wikipedia.org/wiki/Visual_acuity [wikipedia.org] and http://en.wikipedia.org/wiki/Optical_resolution [wikipedia.org], which tell us:

    20/20 is the visual acuity needed to discriminate two points separated by 1 arc minute--about 1/16 of an inch at 20 feet.

    Someone with 20/20 vision standing 4m from a screen can only discern pixels that are at least 1.16 mm apart. So to distinguish a black-and-white stripe pattern from solid gray at that distance, you would need a screen at least 0.56m tall at 480p, or at least 1.26m tall at 1080p.

    Since the screen height is only 0.52m, we conclude that someone with 20/20 vision should see 480p and 1080p stripe patterns as solid gray from 4m away.

    *** Disclaimers: (1) I know your 4m number is just an approximate guess; it could be off by enough that someone with 20/20 vision could actually see the stripe pattern. (2) Manufacturers fudge their numbers on screen size and physical resolution; plasma cells are a strange beast. (3) If the screen is 16:10 then the height is 0.57m, which would mean someone with 20/20 vision could see a stripe pattern at 4m if its native resolution were 480p; however, the set's resampling mode may cause it to display gray lines in 1080p, or it might display some double-height lines and some single-height lines. Ultimately the conversion process biases the experiment too much for it to have any merit in the real world.
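
    The same arithmetic, condensed into a short Python sketch (same assumptions as above: 42" 16:9 diagonal, 4 m viewing distance, one arc-minute acuity):

        import math

        ARC_MINUTE = math.radians(1 / 60)

        def screen_dims(diagonal_in, aspect=16/9):
            """(width_m, height_m) of a screen, given its diagonal in inches."""
            d = diagonal_in * 0.0254
            return d * aspect / math.hypot(aspect, 1), d / math.hypot(aspect, 1)

        w, h = screen_dims(42)                    # ~0.93 m x ~0.52 m
        pitch_limit = 4.0 * math.tan(ARC_MINUTE)  # ~1.16 mm resolvable at 4 m
        print(f"screen {w:.2f} x {h:.2f} m; finest resolvable pitch {pitch_limit * 1000:.2f} mm")
        for lines in (480, 1080):
            needed = lines * pitch_limit          # screen height needed to resolve stripes
            print(f"{lines}p stripes need {needed:.2f} m of height -> "
                  f"{'resolvable' if h >= needed else 'solid gray'}")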

  • Re:Frame rate (Score:5, Insightful)

    by sumdumass ( 711423 ) on Friday November 28, 2008 @12:07AM (#25913353) Journal

    There is even better evidence that HD providers are compressing the channels, and that the HD streams people are watching aren't actually HD-quality representations of the original content.

    I don't have cable TV myself, but a friend who does remarked on how sharp my TV was when watching a Blu-ray disc. Even the over-the-air TV stations were coming in clearer than his cable HD on most channels. I took the disc over to his house and we hooked it up; the comparison was striking. The HD channels he had (some basic HD package from his cable provider) looked like old standard DivX videos at 340 lines or something. All the blacks and fields of the same colour were blotchy and blocky, there was considerable lag between scenes, and so on. When we connected the Blu-ray player and watched Narnia or something stupid like that, the picture was every bit as sharp as on mine, even though we had different TVs.

    Gamers and so on might be able to tell the difference in a lot of this, but I think most cable/satellite HD content isn't actually HD in its delivery, so most people haven't experienced real HD long enough to know the difference.

  • Re:Truly (Score:2, Insightful)

    by Anonymous Coward on Friday November 28, 2008 @12:23AM (#25913423)

    Oh come on, not the whole "all TV is shit" argument again. Amongst the good, quality programming that I watch, there's:

    • Doctor Who
    • Battlestar Galactica
    • House
    • Daily Show / Colbert
    • Corner Gas
    • Mythbusters
    • Curb Your Enthusiasm
    • Nova
    • The Nature of Things
    • Dirty Jobs

    And others. Are you telling me that all of these shows are shit?

  • by sam.haskins ( 1106069 ) on Friday November 28, 2008 @12:42AM (#25913511)
    Honestly, I could understand it if people were watching on CRTs, which can display any resolution natively, but I really cannot abide anything displayed at a non-native resolution. I seriously cannot understand how you could stand to watch even 720p video on a 1080i display. I can't stand it when people at work run their 1280x1024 LCDs at 800x600, and I can't stand wrongly scaled video either.
  • Re:Many variables (Score:5, Insightful)

    by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Friday November 28, 2008 @01:04AM (#25913615) Journal

    The primary reason I like the digital channels is that they are true 16:9 widescreen. Opening up the edges of the scene makes a much bigger difference than the horizontal resolution, as far as I'm concerned.

    Except, frustratingly, they're often not.

    Here's what usually happens: no one wants to put those vertical bars up. So when showing a 4:3 show on your 16:9 screen, they usually scale it to fill the width -- which looks awful (everything stretched wide). This is true whether it's an SD feed scaled up, an HD version of a movie that was simply shot in 4:3, or even an SD clip in an otherwise widescreen show.

    Worse are the widescreen shows broadcast letterboxed as 4:3 SD -- then you've got a little widescreen box right in the middle of your bigger widescreen TV.

    It's maddening.

    I'm going to say that, once again, broadcast TV fails. Why would I want to watch the show all censored, with ads every 5 minutes (and some in the middle of the show), compressed to hell, and now they even fuck up the aspect ratio, when I can just head over to my nearest torrent site^W^WNetflix queue and get a much higher quality version that just works on my computer?
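
    For what it's worth, a quick Python sketch of what broadcasters avoid by stretching - the pillarbox bars that a distortion-free 4:3-on-16:9 fit would require (the panel size is an illustrative assumption):

        def pillarbox(src_aspect, screen_w, screen_h):
            """Width of the picture and of each side bar for a distortion-free fit."""
            video_w = screen_h * src_aspect        # height-limited fit: 4:3 inside 16:9
            return video_w, (screen_w - video_w) / 2

        video_w, bar = pillarbox(4 / 3, 1920, 1080)
        print(f"Pillarboxed: {video_w:.0f} px wide picture, {bar:.0f} px bar each side")
        stretch = (1920 / 1080) / (4 / 3)          # scaling to fill instead
        print(f"Stretched: everything {stretch - 1:.0%} wider than it should be")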

  • Re:Are they nuts? (Score:3, Insightful)

    by vux984 ( 928602 ) on Friday November 28, 2008 @01:15AM (#25913673)

    Mine does, with Shaw Cable, and I have the Shaw HD PVR box. Half a dozen HD channels or so are included with the package, and by subscribing to moviecentral or superchannel I qualify for the HD feeds on those.

    But there are SEVERAL channels I currently get in SD -- like A&E, TSN, etc. -- for which I'd have to subscribe to an extra package upgrade to get the HD version, which I think is pure money-grabbing B.S.

  • Re:Motion blur (Score:3, Insightful)

    by SirJorgelOfBorgel ( 897488 ) on Friday November 28, 2008 @01:39AM (#25913779)

    It's also why film looks acceptable despite 24 fps.

    I'm sorry, what? Granted, I'm only using a very expensive, very large LCD instead of the plasma everybody here seems delirious about, and my eyes aren't 20/20. However, if you watch any recent movie with even minor action in it at 24 fps on a sufficiently large screen (let's say starting somewhere around 40", from not too far away) and you are not completely annoyed by the low frame rate when the image pans, you definitely need to get your eyes checked (or perhaps it's that your brain can't handle fast motion? who knows). And no, it's not my TV; I see it on many brands in many sizes, of varying reputed quality.

    It's like those old DOS games whose authors hadn't yet figured out how to program the video card's registers to pan smoothly, and instead moved X pixels per screen update (if that doesn't make sense to you, don't worry about it) -- only now it's a hell of a lot more pixels.

    And I may not be a salesman or a researcher, but if you can't see the difference between HD and SD (barring a really crappy cable provider), you also need to get your eyes checked. Not a single person who has ever watched a movie or other HD content in my apartment has left without saying something along the lines of "WOW! WTF?", and this is in a PAL country, so the difference is generally smaller than for those of you watching NTSC (though granted, with a really good cable -- and I'm talking about line quality here -- analogue PAL can actually look halfway decent).

    It's like people not being able to discern the best of today's movie CGI from things that were actually filmed... Are you even looking at the same thing I am?
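
    The panning complaint in rough numbers, as a Python sketch (the pan duration and frame width are illustrative assumptions):

        def step_px(frame_width_px, pan_seconds, fps):
            """Pixels the image jumps between frames during a full-width pan."""
            return frame_width_px / (pan_seconds * fps)

        for fps in (24, 60):
            print(f"{fps} fps, 5 s pan across 1920 px: ~{step_px(1920, 5, fps):.0f} px/frame")
        # ~16 px jumps at 24 fps versus ~6 px at 60 fps - the stepping described above.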

  • Re:Truly (Score:1, Insightful)

    by Anonymous Coward on Friday November 28, 2008 @02:28AM (#25913973)

    I do watch many Discovery shows where HD really shines (nature, megabuilders-type shows, etc.), in addition to sports; I think the sports and concerts on HDNet are great. That said, I appreciate the difference in shows like Dexter, Heroes, Galactica, etc. SD may be "adequate", but these shows have visuals that HD improves to the point that I enjoy them more. It's more pronounced in films than television, as the bigger budgets and better locations are more interesting to look at. I don't think many care whether or not Judge Judy is in HD; if you watch loads of trash TV, there really isn't a difference. I personally don't watch much TV, but the shows I do watch are exclusively in HD.

    I think some of the motivation for the "adequate" argument mirrors what you see in console wars. People who don't have one of the consoles put the other down to justify not having it and to feel good about what they do have. So you get people saying how the other one sucks; then they get it and immediately change their tune. All along they weren't really saying anything substantive about the products; they were trying to feel good about not having the money to own both. Obviously old equipment is adequate for enjoyment, and loads of people are still playing PS2 games. The same follows for television and DVD, but the argument itself is disingenuous, because people are really arguing about the COST of the difference in enjoyment from SD to HD (when they aren't misinformed or suffering from bad vision, anyway).

  • Re:Motion blur (Score:3, Insightful)

    by neomunk ( 913773 ) on Friday November 28, 2008 @02:33AM (#25913993)

    24 fps is the standard used in movie theaters, I think that's what the GP was talking about.

  • by crazybilly ( 947714 ) on Friday November 28, 2008 @03:28AM (#25914167) Homepage Journal
    Mod parent up. This is EXACTLY the issue. People (i.e. my in-laws) buy an HD-capable LCD TV. They bring it home. They unhook the old TV. They hook the new TV up the same way. They tune to the same stations. Everything looks weird. But they're getting HD, right?

    Who knows? Not them, that's for sure. They just want to watch football. And if the picture's coming in and the TV's one of them HD TVs, well, it must be HD.

    18% doesn't surprise me one bit -- if we're talking about side-by-side testing, I'd believe that. But if we're talking about asking people whether the show they're watching is in HD and then finding out whether it really is, I'd be willing to guess that the number of people watching SD on an HD-capable TV is more like 25-35%.

  • by CelticLo ( 575344 ) on Friday November 28, 2008 @04:39AM (#25914363)
    I have an HD cable PVR and a 720p HDTV. My SD feed is greatly improved by the HD PVR, solely because it's a better-specced box than the standard cable box provided by Virgin Media in the UK.

    I believe it's the same here in the UK with the Sky (Fox) satellite boxes; the "HD Ready PVR" has far better decoders than the standard sat receivers, even on the SD channels, as well as the superior digital HDMI output. [Shame it's encumbered by the usual DRM]

    So a phone poll asking users would be flawed if this is the case in the poll's geographical catchment area.
  • Re:Yep (Score:3, Insightful)

    by theaveng ( 1243528 ) on Friday November 28, 2008 @08:58AM (#25915285)

    Even though 1080i at 30 fps carries about 3 times more pixels per second than 480p at 60 fps, the addition of depth perception provides far more information to the human brain. Which would you rather watch -- a flat video of the Victoria's Secret fashion show, or a "deep" video where the curves stand out and look touchable?

    I know not what course others may take, but as for me, Give Me 3D.
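
    A quick Python check of that "about 3 times" figure, counting pixels delivered per second (an interlaced field carries half the frame's lines):

        def px_per_second(w, h, images_per_s, interlaced=False):
            lines = h // 2 if interlaced else h   # a field carries half the lines
            return w * lines * images_per_s

        hd = px_per_second(1920, 1080, 60, interlaced=True)  # 1080i: 60 fields/s (30 fps)
        sd = px_per_second(720, 480, 60)                     # 480p: 60 full frames/s
        print(f"1080i: {hd / 1e6:.1f} Mpx/s, 480p: {sd / 1e6:.1f} Mpx/s, ratio {hd / sd:.1f}x")
        # -> 62.2 Mpx/s vs 20.7 Mpx/s, ratio 3.0x, matching the claim above.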

  • Re:Many variables (Score:3, Insightful)

    by ciderVisor ( 1318765 ) on Friday November 28, 2008 @10:08AM (#25915613)

    The difference between an SD DVD and an HD-DVD is striking at first, but within 5 minutes of a film starting, I stop caring.

    A hugely important point. I still watch VHS movies on my CRT TV and still find them amazingly immersive. So long as you can make out the images and sound without too many distracting artifacts, all that matters is that you enjoy the movie. That's why Blu-ray and HDTV haven't become 'must-have' tech. I saw the Blu-ray remastered version of 2001 at a friend's house a couple of weeks ago and it was truly stunning -- you could see every hair on each character's head. At the same time, though, I'm just as happy watching it at home on my old VHS/CRT combo and getting into the story.

  • Damned lies... (Score:3, Insightful)

    by Ginger Unicorn ( 952287 ) on Friday November 28, 2008 @10:47AM (#25915861)
    So what the article should say is that 82% of consumers CAN tell the difference between HD and SD. I wonder what percentage of those surveyed had bad eyesight? I'm pretty sure it's more than 18% of them. This is a non-story.
  • by reallocate ( 142797 ) on Friday November 28, 2008 @11:25AM (#25916145)

    It's depressing that so many folks here are using this survey to blast people as morons. Depressing, but not terribly surprising.

    Very, very, very few customers looking to buy a new TV are going to have a clue about things like FPS or pixels or whatever. There's no reason why they should.

    People will judge the quality of a TV's display by looking at it. It seems obvious that, given the variations in our eyesight, a lot of people aren't going to notice the difference between SD and HD, just as a lot of people can't notice the difference between sound reproduced on an audiophile's high-end dream and a $200 box.

    It's not important and, frankly, most people don't care about HDTV. If the programming isn't worth watching, who cares about anything else?
