Television Media

18% of Consumers Can't Tell HD From SD

An anonymous reader writes "Thinking about upgrading to an HDTV this holiday season? The prices might be great, but some people won't be appreciating the technology as much as everyone else. A report by Leichtman Research Group is claiming that 18% of consumers who are watching standard definition channels on an HDTV think that the feed is in hi-def." (Here's the original story at PC World.)
This discussion has been archived. No new comments can be posted.


  • by Anonymous Coward on Thursday November 27, 2008 @09:36PM (#25912447)

    I'm half blind, and SD makes me want to gouge my eyes out after watching HD.

    • by compro01 ( 777531 ) on Thursday November 27, 2008 @09:39PM (#25912461)

      No, they just used Comcast "HD" for the tests.

    • Re: (Score:3, Interesting)

      I have an HDTV and can tell the difference but don't care. I am not willing to pay the price difference for HD TV shows. My HDTV isn't going to waste, though; I do use it for high-def gaming.

      • Many variables (Score:5, Informative)

        by caitsith01 ( 606117 ) on Thursday November 27, 2008 @10:05PM (#25912611) Journal


        - type of screen - plasma vs LCD, SD would be more noticeable on the latter IMHO.

        - 720p, 1080i or 1080p? All are technically "HD".

        - distance from screen - it is well established that HD only improves your experience if you are close enough to overcome your eyes' limited ability to resolve that level of detail.

        - quality of signal - I have seen "HD" signals which were so compressed and crappy they looked worse than well-encoded SD signals. Similarly, many "HD" broadcasts are just re-encoded from non-HD content.

        My gf routinely has the SD, rather than HD, version of various TV channels on because evidently from her point of view there is no discernible difference. This is a 42" plasma from about 4 metres away.

        In any event, this just highlights that, as with all audio-visual products, how it actually looks/sounds to you is far more important than its specs. IMHO you are much better off with a good 720p plasma (Pana or Pioneer) than a mediocre 1080p LCD, for example - you will get better colour, much less ghosting, and (if set up correctly) a more faithful reproduction of the source material rather than a sharpened, cartoon-y looking version like many LCDs produce.

        In addition, your expected use is critical - movies and sport tend to suggest a plasma will suit your needs, whereas lots of normal broadcast TV/desktop-type computer use might be better suited to an LCD.
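The distance-from-screen point above can be turned into a rough calculation. A sketch, assuming the common ~1 arcminute rule of thumb for visual acuity; the function name and figures are illustrative, not from the thread:

```python
import math

def max_useful_distance_m(diagonal_in, horiz_px, aspect=16 / 9,
                          acuity_arcmin=1.0):
    """Farthest viewing distance (metres) at which an eye with the given
    acuity can still resolve individual pixels on the screen."""
    # Screen width from the diagonal and the aspect ratio.
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_pitch_m = (width_in / horiz_px) * 0.0254  # inches -> metres
    # One pixel subtends the acuity angle at exactly this distance.
    return pixel_pitch_m / math.radians(acuity_arcmin / 60)

# On a 42" screen, 1080-class detail stops being resolvable past ~1.7 m,
# while SD-class detail (720 px wide) survives out past 4 m -- which is
# why, 4 metres from a 42" plasma, HD and SD can look the same.
for px in (1920, 1280, 720):
    print(f"{px:>4} px wide: ~{max_useful_distance_m(42, px):.1f} m")
```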

        • Re:Many variables (Score:5, Insightful)

          by AKAImBatman ( 238306 ) * <akaimbatman@gm a i l . com> on Thursday November 27, 2008 @10:16PM (#25912667) Homepage Journal

          My gf routinely has the SD, rather than HD, version of various TV channels on because evidently from her point of view there is no discernible difference. This is a 42" plasma from about 4 metres away.

          I can tell the difference, and I don't care too much about the quality improvement. The primary reason I like the digital channels is that they are true 16:9 widescreen. Opening up the edges of the scene makes a much bigger difference than the horizontal resolution, as far as I'm concerned.

          Of course, that only applies to regular television shows. Camera operators have been trained for decades to keep the camera tight on the subjects. Thus the extra detail is not needed. If you're talking about a complex scene like sports, however, all bets are off. I don't usually watch football (save for the Superbowl), but even a blind man can tell that an HD picture shows you more of the action than an SD picture. :-)

          BTW, one reason why many people can't tell the difference is that the LCD or plasma screens are already WAY sharper than the CRTs people used to watch. As a result, even an SD signal looks a lot better. (Unless you're playing video games. Then SD looks worse.)

          • Re: (Score:3, Insightful)

            the LCD or Plasma screens are already WAY sharper than the CRTs people used to watch
            Depends on how long ago.

            Many LCDs and plasmas are not as good as latest generation CRTs.

            The picture I get on my 100Hz widescreen SD CRT is still well ahead of many LCD or Plasma sets I've seen. The response time of the LCDs is one big difference - they don't deal with fast motion well. (A stationary shot of a sportsground is OK, but as soon as they pan the grass goes all blurry.) What many people might confuse is watching a

            • Re:Many variables (Score:5, Interesting)

              by caitsith01 ( 606117 ) on Thursday November 27, 2008 @11:16PM (#25913051) Journal

              Sure, the top-end flat-screen TVs might be ahead of the best CRTs, but I think the average CRT is still ahead of the majority of flat-screens being snapped up by budget-conscious consumers. A digital signal makes a big difference; after that, not so much.

              I waited until this was consistently, noticeably no longer the case before buying a plasma. I still would not buy an LCD, although the higher-end Sony 1080p models are starting to look pretty amazing when set up with optimal source material.

              I also had a decent Sony CRT, which I gave to my parents when I got a Panasonic plasma. Although I thought after a while that maybe the plasma wasn't *that* much better, I have since been and re-watched the Sony, and frankly the plasma blows it back into last century, where it belongs. You just cannot beat the clarity (not to mention size and response time) of plasmas IMHO.

          • Re:Many variables (Score:5, Insightful)

            by SanityInAnarchy ( 655584 ) <> on Friday November 28, 2008 @01:04AM (#25913615) Journal

            The primary reason I like the digital channels is that they are true 16:9 widescreen. Opening up the edges of the scene makes a much bigger difference than the horizontal resolution, as far as I'm concerned.

            Except, frustratingly, they're often not.

            Here's what usually happens: No one wants to put those vertical bars up. So when showing a 4:3 show on your 16:9 screen, they usually scale it -- which looks awful (squashed). This is true whether it's an SD feed scaled up, an HD version of a movie that was simply shot in 4:3, or even an SD clip in an otherwise widescreen show.

            Worse are the widescreen shows broadcast as 4:3 SD -- then you've got a little widescreen box right in the middle of your bigger widescreen TV.

            It's maddening.

            I'm going to say that, once again, broadcast TV fails. Why would I want to watch the show all censored, with ads every 5 minutes (and some in the middle of the show), compressed to hell, and now they even fuck up the aspect ratio, when I can just head over to my nearest torrent site^W^WNetflix queue and get a much higher quality version that just works on my computer?
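The scaling-versus-bars complaint above is simple geometry: 4:3 content on a 16:9 panel either gets pillarbox bars or gets stretched by a third. A quick sketch (the frame sizes are the usual broadcast ones, not from the post):

```python
def pillarbox(frame_w, frame_h, content_aspect):
    """Displayed content width and per-side bar width when content of a
    given aspect ratio is shown unstretched in a wider frame."""
    content_w = round(frame_h * content_aspect)
    bar_w = (frame_w - content_w) // 2
    return content_w, bar_w

# 4:3 content on a 1920x1080 panel: 1440 px of picture, 240 px bars per side.
print(pillarbox(1920, 1080, 4 / 3))  # (1440, 240)

# The alternative -- stretching -- widens every pixel by (16/9) / (4/3),
# i.e. a 33% horizontal distortion, which is what makes faces look wrong.
```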

            • Re: (Score:3, Informative)

              by Golias ( 176380 )

              What country do YOU live in!?

              I've been watching digital TV exclusively for years, and have yet to see ANYTHING that was not broadcast in its Original Aspect Ratio.

              Reruns of shows from before 16:9 was common always throw up the black bars.

              Even relatively high-res shows like NBC's "Poker After Dark" get a 4:3 picture with network logos on the sides. Nobody EVER stretches the screen out.

              If you're watching the same broadcasts I am, I think you might have the settings on your TV misconfigured to fit images to

          • Re:Many variables (Score:5, Informative)

            by slim ( 1652 ) <{john} {at} {}> on Friday November 28, 2008 @04:59AM (#25914439) Homepage

            The primary reason I like the digital channels is that they are true 16:9 widescreen.

            That's an American thing, where the broadcasters decided not to standardise on 16:9 or DVB until they could bundle it with HD.

            In the UK (and probably the rest of Europe - not sure) 16:9 SD DVB-T has been broadcast since 1998, and all new sets (for some years now) can receive it.

            The difference between an SD DVD and an HD-DVD is striking at first, but within 5 minutes of a film starting, I stop caring.

            • Re: (Score:3, Insightful)

              by ciderVisor ( 1318765 )

              The difference between an SD DVD and an HD-DVD is striking at first, but within 5 minutes of a film starting, I stop caring.

              A hugely important point. I still watch VHS movies on my CRT TV and still find them amazingly immersive. So long as you can make out the images and sound without too many distracting artifacts, then all that matters is that you enjoy the movie. That's why Blu-ray and HDTV haven't become 'must-have' tech. I saw the Blu-ray remastered version of 2001 at a friend's house a couple of weeks ago and it was truly stunning - you could see every hair on each character's head. At the same time, though, I'm just as hap

        • Re:Many variables (Score:4, Interesting)

          by Miseph ( 979059 ) on Thursday November 27, 2008 @10:46PM (#25912869) Journal

          Last I heard, the only things actually broadcast in HD are the World Series and the Super Bowl, so yeah, I'd say that if you're watching a lot of broadcast TV but not much sports, you're just as well off getting a 720p set anyway.

          Incidentally, I spend a lot of time answering questions about TVs; selling them is part of my job. Funny thing though, all of our display-model HDTVs are playing a looped DVD over split coax that isn't even terminated on the unused outlets... people will stand there oohing and ahhing over how great the picture is despite the fact that it is absolutely not HD in any way, shape, or form. Makes it pretty hard to convince people Sony sets are worth more than Olevia ones, too.

          This headline comes as so little a surprise to me that I have trouble believing anyone even doubts it.

        • Yep (Score:3, Interesting)

          by maz2331 ( 1104901 )

          It's strange, but I work with stereoscopic video and have noticed that even 640×480 in stereo 3D looks a lot sharper than 1920×1080 mono.

          It is a psycho-visual effect, for sure. But it is real.

          IMHO - forget about HD and use the bandwidth for 3D.

          • Re: (Score:3, Insightful)

            by theaveng ( 1243528 )

            Even though 1080i at 30fps has about 3 times the pixel rate of 480p at 60fps, the addition of depth perception provides far more information to the human brain. Which would you rather watch? - A flat video of the Victoria's Secret fashion show? - A "deep" video where the curves stand out and look touchable?

            I know not what course others may take, but as for me, Give Me 3D.
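The parent's "about 3 times" figure checks out if you compare pixel rates (pixels per second) rather than per-frame counts, treating 1080i's 60 interlaced fields as 30 full frames:

```python
# Pixel throughput of each format, in pixels per second.
rate_1080i = 1920 * 1080 * 30  # 30 full frames/s (60 interlaced fields)
rate_480p = 720 * 480 * 60     # 60 progressive frames/s

print(rate_1080i, rate_480p, rate_1080i / rate_480p)  # ratio is 3.0
```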

      • Re: (Score:3, Interesting)

        by hairyfeet ( 841228 )

        And you have just hit on the problem with HD. For most folks DVD has reached the "good enough" point and there simply isn't enough of a difference for them to justify the added expense. I know that even though I watch all of my TV and DVD shows on a monitor capable of 1600×1200 there just isn't a big enough difference to my eyes to be worth the extra trouble of going HD. I just keep my CRT set to 1024×768 and am quite happy with the picture. And DVDs have gotten cheap enough that the price difference of B

    • Re: (Score:3, Interesting)

      by Lumpy ( 12016 )

      Nope. In fact, I highly suspect their findings are too low. MOST people that buy an HDTV and have my company install it can't tell the difference between an SD broadcast and an HD broadcast, because they sit 15 feet away from that 50" plasma above their fireplace.

      You need to sit 6-8 feet from a 42-50" display to really see the difference; any farther than that and your eyes can't resolve the resolution.

      It's even worse if your HD signal is a crappy signal like Comcast. The Comcast local PBS HD QAM channel looks like hell.

      • Re: (Score:3, Interesting)

        by nabsltd ( 1313397 )

        you need to sit 6-8 feet from a 42-50" display to really see the difference, more than that and your eyes cant see the resolution.

        Oh, my, you must be blind.

        I sit over 10 feet away from my 38" TV (an honest-to-goodness picture tube set), and everyone who has seen the TV can tell the difference. Its measured resolution is about 1400x900, so almost all HD is at or close to full resolution.

        Meanwhile, SD is at most 720x480, and usually a lot less than that. It's easy to tell the difference.

        Now, the difference between 1920x1080 and 1280x720 is something that you can't really tell without a large display with the ability to fully resolve 19

  • Frame rate (Score:5, Interesting)

    by Twinbee ( 767046 ) on Thursday November 27, 2008 @09:42PM (#25912471) Homepage

    Perhaps even more irritating than this, is how some people can't distinguish between 30 and 60 FPS (or at least don't care), when of course there is a massive difference. The latter is much smoother for all kinds of programmes and games. 120 FPS of course would be even better...

    • Re: (Score:3, Interesting)

      by Tubal-Cain ( 1289912 ) *
      Does any framerate greater than your monitor's refresh rate matter?
      • Re:Frame rate (Score:4, Interesting)

        by bhtooefr ( 649901 ) <bhtooefr@bhtooefr . o rg> on Thursday November 27, 2008 @09:49PM (#25912513) Homepage Journal

        In a very roundabout way, yes.

        If a graphics card can barely average 60 FPS (or whatever your monitor's refresh rate is - my ThinkPad runs its LCD at 50 Hz,) then it's going to have dips well below 60 FPS.

      • Re: (Score:2, Funny)

        by tehniobium ( 1042240 )
        Yeah, it's cool :)

        Also gamers compare FPS instead of comparing penis size. Surely that's a good thing, even if both are reasonably pointless?
      • Motion blur (Score:5, Informative)

        by tepples ( 727027 ) <> on Thursday November 27, 2008 @10:48PM (#25912883) Homepage Journal

        Does any framerate greater than your monitor's refresh rate matter?

        Yes. If your engine can render at 120 fps, it can render the scene twice and combine the two images to add motion blur. This makes fast motions, such as projectile motions and the constant quick pans of any first-person game, look more realistic. It's also why film looks acceptable despite 24 fps.
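The render-twice-and-combine idea can be sketched as averaging two consecutively rendered frames into one displayed frame (a toy 1-D grayscale example; a real engine would do this per pixel on the GPU):

```python
import numpy as np

def motion_blur(frame_a, frame_b):
    """Blend two rendered frames into one output frame, approximating
    the smear a camera's open shutter records during fast motion."""
    return ((frame_a.astype(np.float64) + frame_b) / 2).astype(np.uint8)

# A bright 'projectile' at x=2 in one frame and x=3 in the next becomes
# a half-bright smear covering both positions in the blended output.
a = np.zeros(6, dtype=np.uint8); a[2] = 255
b = np.zeros(6, dtype=np.uint8); b[3] = 255
print(motion_blur(a, b))  # [  0   0 127 127   0   0]
```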

        • by spaceturtle ( 687994 ) on Thursday November 27, 2008 @11:08PM (#25913005)
          Here is a link that discusses this further. They mention that a human can see an object that is displayed for one 500th of a second, if it is bright enough. In RL your eyes do the motion blur for you. This is also similar to how anti-aliasing works, which in its basic form is rendering the frames at a higher resolution than the monitor can display and then downsizing the picture by averaging the pixels.
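The "render higher, then downsize and average" description of basic anti-aliasing is literally block averaging. A minimal sketch, assuming a 2x supersample in each axis:

```python
import numpy as np

def downsample_2x(img):
    """Average each 2x2 block of a (2H, 2W) supersampled image into one
    output pixel -- the averaging step of basic supersampling AA."""
    h, w = img.shape[0] // 2, img.shape[1] // 2
    return img.reshape(h, 2, w, 2).mean(axis=(1, 3))

# A hard diagonal edge rendered at 2x resolution comes out with
# intermediate gray values (0.75) softening the staircase.
edge = np.triu(np.ones((4, 4)))
print(downsample_2x(edge))
```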
        • Re: (Score:3, Insightful)

          It's also why film looks acceptable despite 24 fps.

          I'm sorry, what? Granted, I'm only using a very expensive, very large LCD instead of the plasma everybody here seems delirious about, and my eyes aren't 20/20. However, if you are watching any recent movie with even minor action in it at 24 fps on a sufficiently large screen (let's say starting at somewhere around 40", from not too far away) and you are not completely annoyed with the low fps when the image is panning, you definitely need to get your eyes checked out (or perhaps it's your brain can't handle t

          • Re: (Score:3, Insightful)

            by neomunk ( 913773 )

            24 fps is the standard used in movie theaters, I think that's what the GP was talking about.

    • Re: (Score:2, Interesting)

      by Xorlium ( 1409453 )
      Hehe, I can definitely distinguish between HD and SD, but I definitely can't distinguish between 30fps and 60fps :) I don't see frames, I see things moving!
      • Re:Frame rate (Score:5, Informative)

        by Twinbee ( 767046 ) on Thursday November 27, 2008 @10:06PM (#25912617) Homepage
        Try this - you may notice the difference after all. Honestly, it's not *that* hard to spot: Vid comparison of 24fps versus 60fps

        They always shoot (or at least play) films at 25/30fps, and that irritates me no end. They basically look quite jerky when you know what to look for.
        • Re: (Score:3, Interesting)

          by Daetrin ( 576516 )
          Well, except for a couple of the really quick turns, I couldn't really tell the difference at all between the two. For the comparison of the balls bouncing back and forth they clearly looked different, but I couldn't tell which I actually preferred. The 60 fps one was obviously sharper, but it seemed to be flickering between positions or something and looked somewhat odd.

          Clearly the person who put the video together needs to work on their marketing skills however. When trying to convince people at agree wit
          • Re: (Score:3, Informative)

            by Twinbee ( 767046 )

            You're quite right that the 60fps version had a few glitches in it (thanks to bad video quality). It's surprisingly hard to get smooth and fast video on the PC for various reasons, unless you know what exact codecs to use.

            An earlier post mentioned this comparison which is probably better:

    • Re:Frame rate (Score:5, Interesting)

      by datapharmer ( 1099455 ) on Thursday November 27, 2008 @09:50PM (#25912521) Homepage
      There is good evidence that humans adapt to the frame rate, and that for a long time 30 FPS was enough not to notice the flicker... this is an ongoing question, but some people still don't notice, while others (such as gamers) may be more apt to.
      • Re:Frame rate (Score:5, Insightful)

        by sumdumass ( 711423 ) on Friday November 28, 2008 @12:07AM (#25913353) Journal

        There is even better evidence that the HD providers are compressing the channels and the HD streams they are watching aren't actually HD quality representations of the original content.

        I don't have cable TV myself, but a friend who does remarked at how sharp my TV was when watching a Blu-ray disc. Even the over-the-air TV stations were coming in clearer than his cable HD was on most channels. I took the disc over to his house and we hooked it up; the comparisons were amazingly different. The HD channels he had (some basic HD package with his cable provider) looked like watching older DivX-standard videos at 340 size or something. All the blacks and fields of the same color were blotchy and blocky, there was a considerable lag between scenes, and so on. When we connected the Blu-ray and watched Narnia or something stupid like that, the picture was every bit as sharp as mine even though we had separate TVs.

        Gamers and so on might be able to tell the difference in a lot of this, but I think that most cable/satellite HD content isn't actually HD in its delivery, so most people also haven't experienced real HD long enough to know the difference.

    • Re: (Score:3, Interesting)

      by s.bots ( 1099921 )

      We just bought a new TV not too long ago (Sony Bravia) with the option of setting the refresh rate to 120Hz. It makes an amazing difference watching sports or anything with fast motion, but makes regular TV shows very eerie - almost cheap looking. I don't know how anyone could not tell the difference with fast motion; maybe if you were watching the fireplace channel...

  • Its worth noting (Score:5, Insightful)

    by Bazar ( 778572 ) on Thursday November 27, 2008 @09:43PM (#25912481)

    The links don't say that 18% can't tell the difference

    Just that 18% can't tell if what they're seeing is HD

    An analogy would be playing mp3's, and asking people if it was 320kbps, or 64kbps.

    Most people won't be able to tell the encoding rate just by hearing it, but if you play two different versions side by side they should be able to pick out the difference.

    They probably can tell the difference, but they can't spot HD just by looking at it.

    Give them HD content for a month, however, and they'll quickly learn.

    • Re:Its worth noting (Score:5, Interesting)

      by Shados ( 741919 ) on Thursday November 27, 2008 @09:50PM (#25912517)

      It is still important. I can most definitely tell if what I'm watching is from a crappy VHS or from a DVD. That was obvious the first time I ever saw a movie on DVD: walked into a room, saw people watching a DVD movie, and was like "that's a DVD movie, eh?". An HD source vs an SD source (to be fair, I'm talking about a movie or TV show... other kinds of content will be easier) gets a lot trickier.

      I remember last time I brought my Xbox 360 to a family member's place. All of their TV's HDMI connectors were taken (which is what I normally use), so I brought the component cables (which can do 720p just fine). Since I had never used component, the console went back to default: 4:3, 480 lines. After playing a few hours, I started noticing something weird.... the ratio (the game i was playing didn't make it totally obvious like most would). So I went in the config to set it back to 16:9, when I noticed... 480 resolution? The hell? Switched it back up to 720p... There was a difference, but it wasn't all that obvious (no, it wasn't one of those 520p games that they upscale).

      I'm sure I'm not in the majority and that most people would have been able to tell much faster, but the point still stands: for a large number of people it's fairly irrelevant whether you give them HD for a month or a year. As long as there's no artifact in the picture (like VHS), how many pixels you pump into Sex and the City won't matter.

    Add to this seemingly complex insight the variable of eyesight. If your eyes are so bad you always see in SD, it doesn't matter if you have HD. ;)

      Having said that, I think we nerds need to be mindful that not everyone pays attention to minute details. We are inherently trained to pick up details down to the pixel, because the difference between a period (.) and a comma (,) can be the difference between a syntax error and a successful compile.

  • by Shados ( 741919 ) on Thursday November 27, 2008 @09:43PM (#25912483)

    I'd be more interested in a comparison between upscaled SD and HD. That is, an upscaled DVD (even the Xbox 360 upscale, no need to go fancy), vs a 720p source. I bet that 18% would become much, much higher... I have 2 TVs of exactly the same size and resolution, and I tried putting them side by side... aside from the annoying 4:3 ratio that most DVDs are in, it's freakishly hard to tell the difference on anything below 40-45 inches (at a reasonable distance... of course it's easy if you have your face in the TV).

    The biggest reason SD "looks so awful" next to HD is that the built-in upscalers of most HDTVs are completely horrible, and make SD sources look faaaaaar worse than they should.

    • by ogminlo ( 941711 ) on Thursday November 27, 2008 @09:56PM (#25912551)
      I'm a broadcast engineer, and when we brought in a Teranex VC100 (broadcast version of the HQV chipset) to test how it compared to real HD we were stunned to discover that even our snobbiest and best-trained eyes could hardly tell the difference between upscaled anamorphic 480i and true 1080i. The testing was performed on a $60K Sony BVM series HD broadcast monitor. Granted, it was not trying to make 1080p and both were 29.97 fps, but the results were still very impressive. We were hoping to see it fail so we could justify a bunch of HD equipment, but the Teranex did too good a job. There is a consumer version of this chip- the Realta HQV. Higher-end home theater gear uses it to scale HDMI up to 1080p. Upconvert a 16:9 DVD with an HQV device, and you get 99% of the quality of Blu-Ray.
      • by the eric conspiracy ( 20178 ) on Thursday November 27, 2008 @11:56PM (#25913291)

        I have a DVD player with a Realta HQV video processor and it really does a great job. In order to visually benefit from something like Blu-ray over a top-notch scaler you have to have a pristine master and a high-quality large screen (1080p, at least 50"). It is difficult to get that good a master from older film or most video. That is great news, since the vast majority of my current DVD collection will remain satisfactory for a long time.

        But - new films mastered in HBR sound formats and 1080p on a good screen are enough better in both sound and appearance that I have stopped buying DVDs. I am renting until an acceptable BD player becomes available, at which time I will start buying Blu-ray discs.

  • by Zerbey ( 15536 ) * on Thursday November 27, 2008 @09:46PM (#25912495) Homepage Journal

    There's an ongoing battle in my family between keying in the "standard definition" version of channels and the "high definition". They all think I'm this weird limey geek (I'm the only English person in the family) who's obsessed with it. They're right of course. You should've seen the argument when I blocked the SD channels *grin*.

    The fact is, most people really don't care so long as the TV is reasonably sharp and the sound is reasonably good. Standard definition is perfectly watchable to the average user, HDTV is still seen as just another buzz word. The majority of people with newer HDTVs are watching them with the coaxial cable stuffed into the antenna port in SD, and they're none the wiser.

  • by TimHunter ( 174406 ) on Thursday November 27, 2008 @09:48PM (#25912509)

    20 year old eyes are much better than 50 year old eyes. I wonder how many of the 18% are older folks? I'm 55 and I'm hard-pressed to distinguish between SD and HD.

  • This means 82% can (Score:5, Insightful)

    by cpct0 ( 558171 ) <slashdot@mic[ ] ['hel' in gap]> on Thursday November 27, 2008 @09:51PM (#25912531) Homepage Journal

    Of course the psychology of words will make you believe this is horrible, when in fact, 82% can tell the difference!

    Then, as said elsewhere, a properly upscaled good-resolution SD signal is very potent. What is crap is the digital signals we're being fed.

    A story that happened to me: I used to watch the Paramount channel for ST:V a few years ago (god I'm old), and this was the only digital channel I used to have. Sometimes I couldn't watch some shows immediately, so I time-shifted them on a VHS, in EP (that's the 8-hours-per-cassette mode, young folks ;) ), and even then, with quality degraded, I could still see the digital artifacts when scenes changed, or during space-blacks! Now that my boobtube provider is putting approximately 3 times the amount of channels into the same QAM, quality is even worse than before.

  • by Anonymous Coward

    That's depressing. I mean, how hard is it really? One's as big as a postage stamp and goes in your camera, and the other goes in your computer ... oh, wait ...

  • by syousef ( 465911 ) on Thursday November 27, 2008 @10:01PM (#25912579) Journal

    Please use usersAreBlind instead ;-)

    In all seriousness though, blaming people for being unable to tell the difference between SD and HD isn't a positive thing. The irony is that if they can't tell the difference, they get to save themselves a whole lot of money. Though personally I'd rather have decent eyesight and make the choice of SD vs HD based on whether I think it's worth it. I can tell the difference, and I'll be sticking with SD until HD is much cheaper, by the way.

  • by Bones3D_mac ( 324952 ) on Thursday November 27, 2008 @10:03PM (#25912591)

    Humans are often easily distracted creatures, as demonstrated by numerous examples of highly successful ad campaigns over the years. As long as you present the audience with enough interesting or flashy content, the quality of the medium becomes less relevant.

    The solution to speeding up HD adoption is to make the content itself less interesting. The viewers will have no choice but to start taking notice of external annoyances like picture quality.

  • $1000 Better... (Score:5, Insightful)

    by wzinc ( 612701 ) on Thursday November 27, 2008 @10:11PM (#25912635)

    Is HD better than SD? Yes. Is it worth the $1000 extra you have to spend on everything to get HD? IMHO, no, but I know others feel differently.

  • by Doc Ruby ( 173196 ) on Thursday November 27, 2008 @10:18PM (#25912679) Homepage Journal

    I'm impressed that the 18% number isn't higher. I mean, come on. The bottom 18% of your high school class were "F" students. And that was when someone was regularly feeding them info, telling them how to tell what was going on, regularly testing them. These people are morons. 82% noticing it's HD is pretty impressive.

  • by actionbastard ( 1206160 ) on Thursday November 27, 2008 @10:22PM (#25912703)
    Nearly four out of five viewers can tell the difference. This correlates well with other studies showing that four out of five respondents answer surveys.
  • by Godji ( 957148 ) on Thursday November 27, 2008 @10:33PM (#25912769) Homepage
    80% of consumers can't tell 192kbps MP3 from FLAC. 70% of consumers can't tell IE from Firefox. 60% of consumers can't tell their head from their ass. Your point?

    Of course I've pulled these numbers out of my ass, where I pull 63% of all statistics I post on Slashdot.
  • by w0mprat ( 1317953 ) on Thursday November 27, 2008 @10:52PM (#25912915)
    Generally, SD looks noticeably better when upscaled on a respectable HDTV, especially when the person has upgraded from a CRT, old rear projection, or some older not-so-good panel TV. Also, a current HDTV will have superior colour and/or contrast (often artificially boosted) compared to the older SD screen.

    These factors would account for a good fraction of the statistic, the rest being accounted for by the Idiot Factor - or, to be fair, by the fact that many people have slightly off eyesight, or may just be sitting too far away.
  • by wylderide ( 933251 ) on Thursday November 27, 2008 @10:52PM (#25912917) Homepage
    ... It's the same old crappy writing and acting, characters and dialogue. Now, with HD, you get a crystal clear image of the crap they put on the millions of channels. Yay! Maybe once they put out something worth watching I'll worry about the picture quality.
  • by Cowclops ( 630818 ) on Thursday November 27, 2008 @11:00PM (#25912947)

    And in other news: 82% of people CAN tell the difference between SD and HD.

    You want your optimal viewing distance to be on the line for whichever format you watch the most of, which is about where you'd notice the quality difference between that and the next-worst format. If you have a TV smaller than 42" or so, or you're sitting very far away for whatever screen size you have, you won't be able to tell the difference.

    And yes, I'm going to post this on every "Stupid people can't tell SD from HD" story until people stop asserting that HD isn't that much of an improvement over SD. I use a 720p projector with a 65" screen that I sit 10 feet away from, and Transformers on HD-DVD looks CONSIDERABLY better than Transformers on DVD.

  • by tompaulco ( 629533 ) on Thursday November 27, 2008 @11:01PM (#25912965) Homepage Journal
    18% of audiophiles were surprised that they could tell no difference between sound coming through standard 18 gauge wiring and sound coming through $200 per foot premium cables. The other 82% of audiophiles distinctly heard the difference. However, it turns out that the engineers performing the test forgot to actually switch over from the cheap ones to the expensive ones so both tests were on the same cheap wires.
  • So what... (Score:4, Insightful)

    by danwesnor ( 896499 ) on Thursday November 27, 2008 @11:16PM (#25913049)
    About 70% of consumers think that hooking up an HDTV to an SDTV cable box makes it HDTV. And 99% of consumers don't realize that the big box stores have a nasty habit of piping SDTV into the cheaper HDTVs while the expensive boxes get the real deal.
    They can't tell, but I see a world of difference, and that's all that matters.
  • by SoberSage ( 1418881 ) on Thursday November 27, 2008 @11:25PM (#25913103)
    People can't tell the difference between HD and SD because, when using cable (Charter, USCable, Comcast, etc.), there is barely a difference: they compress the crap out of the signal to fit more channels into the available room on a cable line. I know this because I used to work for a contractor that was working for a major cable company here in MN. I am not talking from experience, but I hear that satellite HD is much better than cable. That is just hearsay on my side. However, DVD is much better than either, and watching Blu-ray is way better than DVD. Happy Thanksgiving!
  • by radarsat1 ( 786772 ) on Thursday November 27, 2008 @11:31PM (#25913133) Homepage

    In my experience, people tend to care more about things other than the video resolution when watching TV. Like, say, the plot, or the character development.

    Watching hockey, on the other hand, I can understand why people would want to see the puck better, but in the general case I think no one gives a *** about resolution.

    If it's a good movie I'll happily watch it at 320, blurry, at 15 FPS, if that's all I can get.

    Frankly, when it comes down to it, the sound quality matters more than the video.

    If you can't hear what the actors are saying you may as well turn it off, but if you can basically get the idea of what's going on, video isn't that critical.

    Maybe I just have low standards.

  • One important detail (Score:5, Interesting)

    by Pr0xY ( 526811 ) on Thursday November 27, 2008 @11:36PM (#25913189)

    There is something that they aren't accounting for: people (especially less tech-savvy people) don't realize that they aren't watching HD; they just assume that if it's on a newer plasma/LCD, then it's HD.

    For example, I have a relative who was watching football today on my cousin's plasma. He of course tuned to the channel he gets at home (CBS), the non-HD version, simply because he had no idea that Verizon offers HD versions of pretty much all basic cable just by going to channels above 500 in my area.

    At some point, it occurred to me that the picture didn't quite look up to snuff, so I asked him what channel he was on (since SD is often broadcast on HD channels because the original signal was SD). He said 7. I said, "a-ha! You should switch to the HD version of this channel!"

    He was confused, but told me to go for it. He was *amazed* at the difference in clarity. He claimed it looked like he was down on the field.

    Not being able to tell the difference is very different from not knowing there is a difference available.

    I would wager that if you put two screens side by side, one showing the signal in true HD and the other in SD, anyone without vision problems could tell the difference.

    • Re: (Score:3, Interesting)

      by chrysrobyn ( 106763 )

      I would wager that if you put two screens side by side, one showing the signal in true HD and the other in SD, anyone without vision problems could tell the difference.

      Who cares? Do you want to go out creating more videophiles? Does this world lack enough audiophiles for you? Side by side, I can tell the difference, but most content on television is not actually improved by increased bitrates or resolutions. Sitcoms and dramas, in particular -- you don't need to see the flaws of the actors' and actress'

  • 18%? (Score:4, Insightful)

    by iceT ( 68610 ) on Thursday November 27, 2008 @11:43PM (#25913221)

    Who gives a shit? 18% of people probably still think the world is flat. I bet a lot of those people said that because they're resisting buying a new TV - or, like my Mom, who bought a new SD TV 4 years ago and really doesn't want to buy a new one yet.

    Where's this story: 82% of people think that HD television is better than SD television. If that's not newsworthy, why is this?

  • Bad math (Score:3, Funny)

    by Groggnrath ( 1089073 ) <> on Friday November 28, 2008 @12:19AM (#25913407)
    18% of consumers can't tell the difference.
    72% can.
    Misleading facts and poor mathematics.
  • by Runefox ( 905204 ) on Friday November 28, 2008 @12:56AM (#25913573) Homepage

    It isn't that people are stupid, but that the HD content we currently have isn't exactly HD. Even the snazziest Blu-ray displays in places like the Sony Store or any big electronics retailer seem to have really nice-looking visuals, but they also seem to have a big problem not only with interlacing (?! Isn't this 1080p?!) but also with video compression artifacts. In many cases, when I look at the TVs on display, I usually can't tell that what I'm looking at is HD unless the video's been specifically tailored to show off the resolution. TV broadcasts (the few that are HD around here), Blu-ray movies (especially live action) - it doesn't matter. It all looks quite muddy, and I'm often distracted by the block and ring artifacting, just as I was when DVD was first released.

    I don't have an HDTV or an HD player, myself, so I'm not intimately familiar with how current movies are being compressed on the disc, but... Don't they have any room to turn up the bitrate a little? I mean, sure, it's not reasonable to expect an uncompressed image (though I'd really like it), but seriously, the video compression quality sucks.

    You can have as high a resolution as you want, but when artifacts are large enough to notice casually, you've defeated the purpose of that resolution; I would rather have had a cleaner lower-definition source than that.
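    For a ballpark answer to the bitrate question: compare bits per pixel at the nominal maximum video rates (roughly 9.8 Mbit/s for DVD MPEG-2 and ~40 Mbit/s for Blu-ray; real discs usually average well below these, and the codecs differ in efficiency, so this is only a rough sketch of my own):

    ```python
    # Rough bits-per-pixel comparison at nominal maximum video bitrates.
    # Real discs often average far lower, which is where visible
    # block/ring artifacts tend to come from.

    def bits_per_pixel(bitrate_mbps, width, height, fps):
        """Average coded bits available per pixel per frame."""
        return bitrate_mbps * 1e6 / (width * height * fps)

    dvd = bits_per_pixel(9.8, 720, 480, 30)      # ~0.95 bit/pixel
    bluray = bits_per_pixel(40, 1920, 1080, 30)  # ~0.64 bit/pixel
    print(f"DVD: {dvd:.2f} b/px, Blu-ray: {bluray:.2f} b/px")
    ```

    Even at its ceiling, Blu-ray spends fewer bits per pixel than DVD; H.264/VC-1 are more efficient than MPEG-2, which narrows the gap, but it does suggest there isn't much headroom once publishers start squeezing the rate down.
    
    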

  • by The Master Control P ( 655590 ) <.ejkeever. .at.> on Friday November 28, 2008 @01:26AM (#25913725)
    We've got a 42" 1080p LCD TV and a Dish HD box to feed it. The video is (presumably) 1080p over analog component. I can see the difference, but I truly don't give a shit for the most part.

    When we still had the SD DVR and I had to stretch Stargate Atlantis (meaning the effective resolution was sub-SD) to fill the screen, I got tweaked more than a little. But other than that (which doesn't happen anymore with the non-4:3-aware HD DVR), I can honestly say that I don't much care. Yeah, I can pause Law & Order and count the strands of Elizabeth Rohm's hair or stop Atlantis and count the stubble in John Sheperd's beard - but so what?

    I'm here to watch the criminals get caught or the Wraith be foiled again, not to stroke my e-penis to the thought of how awesome my screen's picture is. Unless the picture is suffering horrible aberrations or the audio is like 64kbps MP3, those don't really impede the story.

    In conclusion: It's absolutely astonishing how many details your brain can paint in or interpolate if you let it.
  • by DynaSoar ( 714234 ) on Friday November 28, 2008 @02:44AM (#25914027) Journal

    How can anyone take a study seriously that supposedly examines visual perception by talking to people over the phone? They learned nothing except that some people answer questions over the phone a certain way. That study design leads to the error of forced responses, producing responses where none would have been forthcoming except for the question having been asked. Such answers have nothing to do with any perceptual ability, bias or preference.

  • by Kokuyo ( 549451 ) on Friday November 28, 2008 @04:44AM (#25914385) Journal

    I know I am way too late to enter this discussion, but I would still like to mention that my Panasonic plasma TV does beautiful upscaling. Frankly, when I was watching Casino Royale (Blu-ray) on a friend's LCD TV, I got a headache because the whole thing was too damn sharp, especially with fast movement. I just don't have that problem with upscaled DVDs.

    And frankly, while a new release on DVD costs 22 Swiss Francs (about 18 US$) a BluRay is anywhere between 35 and 50 CHF (about 29 to 41 US$). I just don't see even ONE good reason to give the movie industry that kind of money.

  • Damned lies... (Score:3, Insightful)

    by Ginger Unicorn ( 952287 ) on Friday November 28, 2008 @10:47AM (#25915861)
    So what the article should say is that 82% of consumers CAN tell the difference between HD and SD. I wonder what percentage of those surveyed had bad eyesight? I'm pretty sure it's more than 18% of them. This is a non-story.
  • by reallocate ( 142797 ) on Friday November 28, 2008 @11:25AM (#25916145)

    It's depressing that so many folks here are using this survey to blast people as morons. Depressing, but not terribly surprising.

    Very, very, very few customers looking to buy a new TV are going to have a clue about things like FPS or pixels or whatever. There's no reason why they should.

    People will judge the quality of a TV's display by looking at it. It seems obvious that, given the variations in our eyesight, a lot of people aren't going to notice the difference between SD and HD, just as a lot of people can't notice the difference between sound reproduced on an audiophile's high-end dream and a $200 box.

    It's not important and, frankly, most people don't care about HDTV. If the programming isn't worth watching, who cares about anything else?

  • Well (Score:4, Funny)

    by jav1231 ( 539129 ) on Friday November 28, 2008 @11:40AM (#25916247)
    It really takes a side-by-side for me. When HD first came out I'd go to Best Buy and frankly spent more time trying to deduce whether there was enough difference to warrant $2k - $3k for an HD TV. I determined the difference simply didn't compel me. It still doesn't, but as prices drop I'll be more likely to get an HD TV. I don't watch much TV, and I certainly couldn't care less if the chick getting banged in my pr0n has a mole on her vulva or not. Hell, I'm only in for 3 minutes, then I'm falling asleep!
  • by tgibbs ( 83782 ) on Friday November 28, 2008 @11:45AM (#25916289)

    As stated, it implies that 18% of consumers can't distinguish SD from HD in a direct A/B comparison. I find this frankly unbelievable.

    On the other hand, I would not be at all surprised if 18% of consumers, particularly those who don't normally watch HD, might be unable to recognize HD when either SD or HD is shown on an unfamiliar monitor without the opportunity to make a direct A/B comparison.

    Another question is whether they were actually being asked to distinguish 480i SD from 720p or 1080i/p HD, or whether the "SD" was really 480p ED. On anything other than a very large-screen monitor, the distinction between ED and HD is fairly subtle. Actually, I expect the percentage of people unable to tell whether a picture is ED or HD would be considerably greater than 18%.
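    For reference, the nominal pixel grids behind those labels (standard ATSC/DVD numbers; the snippet is just my quick illustration of how close 480i and 480p really are):

    ```python
    # Nominal pixel dimensions for the formats mentioned above.
    # Note 480i and 480p share the same grid: ED's advantage is
    # progressive scan, not extra pixels.
    formats = {
        "480i (SD)": (720, 480),
        "480p (ED)": (720, 480),
        "720p (HD)": (1280, 720),
        "1080i (HD)": (1920, 1080),
        "1080p (HD)": (1920, 1080),
    }

    for name, (w, h) in formats.items():
        print(f"{name:10s} {w}x{h} = {w * h / 1e6:.2f} Mpixel")
    ```

    So an ED-vs-HD test is really asking viewers to spot a scan-mode difference plus a roughly 2.7x pixel jump to 720p, while SD-vs-ED on the same grid comes down to deinterlacing alone - which supports the point that the "18%" depends heavily on what was actually shown.
    
    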

  • by Benfea ( 1365845 ) on Friday November 28, 2008 @12:05PM (#25916435)

    I run into a lot of non-tech people who have difficulty understanding the difference between an HD TV and an HD signal. Such people would probably answer the question they thought they were asked, by correctly identifying the TV as being high definition, without ever really understanding that an HD TV can display both HD and SD content.

    Yes, the researchers probably explained the difference to the respondents during the course of the study, but many such people still don't understand the difference between HD & SD signals even after you explain it to them.

  • by Minwee ( 522556 ) <> on Friday November 28, 2008 @02:44PM (#25917593) Homepage
    82% of viewers think that the Emperor is a very snappy dresser.
