Television Media

18% of Consumers Can't Tell HD From SD 603

An anonymous reader writes "Thinking about upgrading to an HDTV this holiday season? The prices might be great, but some people won't be appreciating the technology as much as everyone else. A report by Leichtman Research Group claims that 18% of consumers who are watching standard-definition channels on an HDTV think that the feed is in hi-def." (Here's the original story at PC World.)
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Re:Are they nuts? (Score:3, Interesting)

    by tiananmen tank man ( 979067 ) on Thursday November 27, 2008 @09:41PM (#25912469)

    I have an HDTV and can tell the difference, but I don't care. I am not willing to pay the price difference for HD TV shows. My HDTV isn't going to waste though; I do use it for high-def gaming.

  • Frame rate (Score:5, Interesting)

    by Twinbee ( 767046 ) on Thursday November 27, 2008 @09:42PM (#25912471)

    Perhaps even more irritating than this is how some people can't distinguish between 30 and 60 FPS (or at least don't care), when of course there is a massive difference. The latter is much smoother for all kinds of programmes and games. 120 FPS would of course be even better...

  • by Shados ( 741919 ) on Thursday November 27, 2008 @09:43PM (#25912483)

    I'd be more interested in a comparison between upscaled SD and HD. That is, an upscaled DVD (even the Xbox 360's upscaling would do... no need to go fancy) versus a 720p source. I bet that 18% would become much, much higher. I have two TVs of exactly the same size and resolution, and I tried putting them side by side... aside from the annoying 4:3 ratio that most DVDs are in, it's freakishly hard to tell the difference on anything below 40-45 inches (at a reasonable distance... of course it's easy if you have your face in the TV).

    The biggest reason SD "looks so awful" next to HD is that the built-in upscalers in most HDTVs are completely horrible, and make SD sources look far worse than they should.

  • Re:Frame rate (Score:3, Interesting)

    by Tubal-Cain ( 1289912 ) * on Thursday November 27, 2008 @09:45PM (#25912489) Journal
    Does any framerate greater than your monitor's refresh rate matter?
  • Re:Frame rate (Score:2, Interesting)

    by Xorlium ( 1409453 ) on Thursday November 27, 2008 @09:48PM (#25912505)
    Hehe, I can definitely distinguish between HD and SD, but I definitely can't distinguish between 30fps and 60fps :) I don't see frames, I see things moving!
  • Re:Frame rate (Score:4, Interesting)

    by bhtooefr ( 649901 ) <[gro.rfeoothb] [ta] [rfeoothb]> on Thursday November 27, 2008 @09:49PM (#25912513) Homepage Journal

    In a very roundabout way, yes.

    If a graphics card can barely average 60 FPS (or whatever your monitor's refresh rate is - my ThinkPad runs its LCD at 50 Hz), then it's going to have dips well below 60 FPS.

  • Re:Its worth noting (Score:5, Interesting)

    by Shados ( 741919 ) on Thursday November 27, 2008 @09:50PM (#25912517)

    It is still important. I can most definitely tell if what I'm watching is from a crappy VHS or from a DVD. That was obvious the first time I ever saw a movie on DVD: walked into a room, saw people watching a DVD movie, and was like "Wow... so that's a DVD movie, eh?". An HD source vs. an SD source (to be fair, I'm talking about a movie or TV show... other kinds of content will be easier) gets a lot trickier.

    I remember the last time I brought my Xbox 360 to a family member's place. All of their TV's HDMI connectors were taken (which is what I normally use), so I brought the component cables (which can do 720p just fine). Since I had never used component, the console went back to its defaults: 4:3, 480 lines. After playing a few hours, I started noticing something weird... the ratio (the game I was playing didn't make it totally obvious like most would). So I went into the config to set it back to 16:9, when I noticed... 480 resolution? The hell? Switched it back up to 720p... There was a difference, but it wasn't all that obvious (no, it wasn't one of those 520p games that they upscale).

    I'm sure I'm not in the majority and that most people would have been able to tell much faster, but the point still stands: for a large number of people it's fairly irrelevant whether you give them HD for a month or a year. As long as there's no artefact in the picture (like VHS), how many pixels you pump into Sex and the City won't matter.

  • Re:Frame rate (Score:5, Interesting)

    by datapharmer ( 1099455 ) on Thursday November 27, 2008 @09:50PM (#25912521) Homepage
    There is good evidence that humans adapt to the frame rate, and that for a long time 30 FPS was enough that the flicker went unnoticed... this is an ongoing question, but some people still won't notice while others (such as gamers) may be more apt to.
  • Re:Frame rate (Score:3, Interesting)

    by s.bots ( 1099921 ) on Thursday November 27, 2008 @09:55PM (#25912547)

    We just bought a new TV not too long ago (Sony Bravia) with the option of setting the refresh rate to 120Hz. It makes an amazing difference watching sports or anything with fast motion, but makes regular TV shows very eerie - almost cheap looking. I don't know how anyone could fail to tell the difference with fast motion; maybe if you were watching the fireplace channel...

  • by ogminlo ( 941711 ) on Thursday November 27, 2008 @09:56PM (#25912551)
    I'm a broadcast engineer, and when we brought in a Teranex VC100 (broadcast version of the HQV chipset) to test how it compared to real HD we were stunned to discover that even our snobbiest and best-trained eyes could hardly tell the difference between upscaled anamorphic 480i and true 1080i. The testing was performed on a $60K Sony BVM series HD broadcast monitor. Granted, it was not trying to make 1080p and both were 29.97 fps, but the results were still very impressive. We were hoping to see it fail so we could justify a bunch of HD equipment, but the Teranex did too good a job. There is a consumer version of this chip- the Realta HQV. Higher-end home theater gear uses it to scale HDMI up to 1080p. Upconvert a 16:9 DVD with an HQV device, and you get 99% of the quality of Blu-Ray.
  • by syousef ( 465911 ) on Thursday November 27, 2008 @10:01PM (#25912579) Journal

    Please use usersAreBlind instead ;-)

    In all seriousness though, blaming people for being unable to tell the difference between SD and HD isn't productive. The irony is that if they can't tell the difference, they get to save themselves a whole lot of money. Though personally I'd rather have decent eyesight and make the choice of SD vs. HD based on whether I think it's worth it. I can tell the difference, and I'll be sticking with SD until HD is much cheaper, by the way.

  • by Bones3D_mac ( 324952 ) on Thursday November 27, 2008 @10:03PM (#25912591)

    Humans are often easily distracted creatures, as demonstrated by numerous examples of highly successful ad campaigns over the years. As long as you present the audience with enough interesting or flashy content, the quality of the medium becomes less relevant.

    The solution to speeding up HD adoption is to make the content itself less interesting. The viewers will have no choice but to start taking notice of external annoyances like picture quality.

  • by actionbastard ( 1206160 ) on Thursday November 27, 2008 @10:22PM (#25912703)
    Nearly four out of five viewers can tell the difference. This correlates well with other studies that show four out of five respondents [google.com] answer surveys.
  • Re:Frame rate (Score:3, Interesting)

    by Twinbee ( 767046 ) on Thursday November 27, 2008 @10:27PM (#25912727)

    That may have something to do with how LCD displays have poor response times (causing blurring - a separate issue from frame rate). Alternatively, perhaps the programmes you watch were shot at 30 or 60fps, so they weren't meant for 120fps TVs anyway.

    OLED technology should fix both issues in the future, as they have incredible response times, and probably excellent frame rate potential.

  • Re:Are they nuts? (Score:3, Interesting)

    by Lumpy ( 12016 ) on Thursday November 27, 2008 @10:45PM (#25912851) Homepage

    Nope. In fact, I suspect their findings are too low. MOST people who buy an HDTV and have my company install it can't tell the difference between an SD broadcast and an HD broadcast, because they sit 15 feet away from that 50" plasma above their fireplace.

    You need to sit 6-8 feet from a 42-50" display to really see the difference; farther than that and your eyes can't resolve the resolution.

    It's even worse if your HD signal is a crappy one like Comcast's. The Comcast local PBS HD QAM channel looks like hell compared to the signal I get over an antenna - it's like night and day; they compress the OTA channels so hard here. Discovery HD looks like crap as well...

    18% at a proper viewing distance I can understand, but that number will only grow the farther they sit from the TV. Most "trendy" homes have the TV way too far from the seating, and putting it above the fireplace is just plain old stupid.
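
    To put the 6-8 foot rule of thumb above in rough numbers, here is a minimal sketch, assuming the common guideline that normal vision resolves detail down to about one arcminute per pixel; the screen sizes below are illustrative, not taken from the comment.

```python
import math

def max_hd_distance_ft(diagonal_in, horizontal_px=1920, aspect=(16, 9)):
    """Farthest viewing distance (in feet) at which one pixel still spans
    roughly one arcminute -- a common approximation of normal visual acuity."""
    w, h = aspect
    screen_width_in = diagonal_in * w / math.hypot(w, h)
    pixel_pitch_in = screen_width_in / horizontal_px
    one_arcmin = math.radians(1 / 60)
    return pixel_pitch_in / math.tan(one_arcmin) / 12  # inches -> feet

for size in (42, 50):
    print(f'{size}" 1080p: ~{max_hd_distance_ft(size):.1f} ft')
# 42" 1080p: ~5.5 ft
# 50" 1080p: ~6.5 ft
```

    Beyond roughly that distance individual 1080p pixels blur together for a typical viewer, which is consistent with the comment's claim that HD detail is largely wasted from 15 feet away.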

  • Re:Many variables (Score:4, Interesting)

    by Miseph ( 979059 ) on Thursday November 27, 2008 @10:46PM (#25912869) Journal

    Last I heard, the only things actually broadcast in HD are the World Series and the Super Bowl, so yeah, I'd say that if you're watching a lot of broadcast TV but not much sports, you're just as well off getting a 720p set anyway.

    Incidentally, I spend a lot of time answering questions about TVs; selling them is part of my job. Funny thing, though: all of our display-model HDTVs are playing a looped DVD over split coax that isn't even terminated on the unused outlets... people will stand there oohing and ahhing over how great the picture is despite the fact that it is absolutely not HD in any way, shape, or form. Makes it pretty hard to convince people Sony sets are worth more than Olevia ones, too.

    This headline comes as so little a surprise to me that I have trouble believing anyone even doubts it.

  • by wylderide ( 933251 ) on Thursday November 27, 2008 @10:52PM (#25912917) Homepage
    ... It's the same old crappy writing and acting, characters and dialogue. Now, with HD, you get a crystal clear image of the crap they put on the millions of channels. Yay! Maybe once they put out something worth watching I'll worry about the picture quality.
  • Re:Frame rate (Score:5, Interesting)

    by Anonymous Coward on Thursday November 27, 2008 @10:55PM (#25912927)
    It's only really useful for film-based content. Film is shot at 24fps. On a normal 60Hz set, every frame alternates between being repeated 2 times or 3 times (2,3,2,3,2,3,2,3) in order to sync up with the 60Hz refresh rate (it's called 3:2 pulldown). This gives a slight stuttering effect, which is more pronounced during slow sideways panning shots. With displays that have a 120Hz refresh rate, this pulldown is eliminated by repeating every frame 5 times (5*24=120). This gives a more natural and fluid appearance.

    Some displays will also use interpolation to "create" frames rather than simply repeating each frame for a set period of time. This technology, IMHO, isn't quite up to snuff and gives films/shows a somewhat odd synthetic appearance. Keep in mind that this tech is separate from the 5:5 pulldown described above.
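
    The cadences described above can be made concrete with a small sketch (illustrative only, not from the comment): spread the display refreshes as evenly as possible over the 24 film frames and print how long each frame is held.

```python
def pulldown_cadence(film_fps=24, refresh_hz=60):
    """Number of display refreshes each film frame is held for, spreading
    refresh_hz refreshes as evenly as possible across film_fps frames."""
    repeats, shown = [], 0
    for frame in range(1, film_fps + 1):
        target = (frame * refresh_hz) // film_fps  # refreshes used so far
        repeats.append(target - shown)
        shown = target
    return repeats

print(pulldown_cadence(24, 60)[:8])   # [2, 3, 2, 3, 2, 3, 2, 3] -> uneven 3:2 pulldown
print(pulldown_cadence(24, 120)[:8])  # [5, 5, 5, 5, 5, 5, 5, 5] -> every frame held equally
```

    The uneven 2/3 alternation at 60 Hz is the source of the judder on panning shots; at 120 Hz every frame gets the same five-refresh hold, so the stutter disappears.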
  • by LordKronos ( 470910 ) on Thursday November 27, 2008 @11:05PM (#25912983)

    I wouldn't be surprised to hear that they are less concerned with the quality difference between the SD and HD stations than they are with how much slower their channel surfing is with the HD versions.

  • Re:Many variables (Score:5, Interesting)

    by caitsith01 ( 606117 ) on Thursday November 27, 2008 @11:16PM (#25913051) Journal

    Sure, the top-end flat-screen TVs might be ahead of the best CRTs, but I think the average CRT is still ahead of the majority of flat-screens being snapped up by budget-conscious consumers. A digital signal makes a big difference; after that, not so much.

    I waited until this was consistently, noticeably no longer the case before buying a plasma. I still would not buy an LCD, although the higher-end Sony 1080p models are starting to look pretty amazing when set up with optimal source material.

    I also had a decent Sony CRT, which I gave to my parents when I got a Panasonic plasma. Although I thought after a while that maybe the plasma wasn't *that* much better, I have since gone back and re-watched the Sony, and frankly the plasma blows it back into last century, where it belongs. You just cannot beat the clarity (not to mention size and response time) of plasmas, IMHO.

  • Re:Are they nuts? (Score:3, Interesting)

    by dlZ ( 798734 ) on Thursday November 27, 2008 @11:17PM (#25913055) Journal
    I have the most basic package, though. It's $7.55 a month and gives me channels 2-13 (something like that) and a couple of random higher channels like Spike (which comes through even if you have no TV service and just RR). It also knocks $5 off my RR bill, so I think it's worth the $2.55 + tax to get some stations in (reception at my place sucks, and over-the-air HD gives you a black screen most of the time).
  • by radarsat1 ( 786772 ) on Thursday November 27, 2008 @11:31PM (#25913133) Homepage

    In my experience, people tend to care more about things other than the video resolution when watching TV. Like, say, the plot, or the character development.

    Watching hockey, on the other hand, I can understand why people would want to see the puck better, but in the general case I think no one gives a *** about resolution.

    If it's a good movie I'll happily watch it at 320, blurry, at 15 FPS, if that's all I can get.

    Frankly, when it comes down to it, the sound quality matters more than the video.

    If you can't hear what the actors are saying you may as well turn it off, but if you can basically get the idea of what's going on, video isn't that critical.

    Maybe I just have low standards.

  • One important detail (Score:5, Interesting)

    by Pr0xY ( 526811 ) on Thursday November 27, 2008 @11:36PM (#25913189)

    There is something that they aren't accounting for: people (especially less tech-savvy people) not realizing that they aren't watching HD - they just assume that if it's on a newer plasma/LCD, then it's HD.

    For example, I have a relative who was watching football today on my cousin's plasma. He of course tuned to the channel he gets at home (CBS), the non-HD version, simply because he had no idea that Verizon offers HD versions of pretty much all basic cable just by going to channels above 500 in my area.

    At some point, it occurred to me that the picture didn't quite look up to snuff, so I asked him what channel he was on (since often SD is broadcast on HD channels because the original signal was SD). He said 7. I said, "A-ha! You should switch to the HD version of this channel!"

    He was confused, but told me to go for it. He was *amazed* at the difference in clarity. He claimed it looked like he was down on the field.

    Not being able to tell the difference is very different from not knowing there is a difference available.

    I would wager that if you put the two screens side by side, one showing the signal in true HD and the other in SD, anyone without vision problems could tell the difference.

  • Re:Frame rate (Score:3, Interesting)

    by Daetrin ( 576516 ) on Thursday November 27, 2008 @11:51PM (#25913251)
    Well, except for a couple of the really quick turns, I couldn't really tell the difference at all between the two. For the comparison of the balls bouncing back and forth they clearly looked different, but I couldn't tell which I actually preferred. The 60 fps one was obviously sharper, but it seemed to be flickering between positions or something and looked somewhat odd.

    Clearly the person who put the video together needs to work on their marketing skills, however. When trying to convince people to agree with you, declaring that everyone else would "have to be blind" to disagree doesn't help. Even if I had noticed a difference, I'd be tempted to say I didn't, just to spite them after that.
  • by the eric conspiracy ( 20178 ) on Thursday November 27, 2008 @11:56PM (#25913291)

    I have a DVD player with a Realta HQV video processor and it really does a great job. In order to visually benefit from something like Blu-ray over a top-notch scaler you have to have a pristine master and a high-quality large screen (1080p, at least 50"). It is difficult to get that good a master from older film or most video. That is great news, since the vast majority of my current DVD collection will remain satisfactory for a long time.

    But - new films mastered in HBR sound formats and 1080p on a good screen are enough better in both sound and appearance that I have stopped buying DVDs. I am renting until an acceptable BD player becomes available, at which time I will start buying Blu-ray discs.

  • Yep (Score:3, Interesting)

    by maz2331 ( 1104901 ) on Friday November 28, 2008 @12:13AM (#25913383)

    It's strange, but I work with stereoscopic video and have noticed that even 640x480 in stereo 3D looks a lot sharper than 1920x1080 mono.

    It is a psycho-visual effect, for sure. But it is real.

    IMHO - forget about HD and use the bandwidth for 3D.

  • Re:Are they nuts? (Score:3, Interesting)

    by nabsltd ( 1313397 ) on Friday November 28, 2008 @12:21AM (#25913417)

    You need to sit 6-8 feet from a 42-50" display to really see the difference; farther than that and your eyes can't resolve the resolution.

    Oh, my, you must be blind.

    I sit over 10 feet away from my 38" TV (an honest-to-goodness picture-tube set), and everyone who has seen the TV can tell the difference. Its measured resolution is about 1400x900, so almost all HD is at or close to full resolution.

    Meanwhile, SD is at most 720x480, and usually a lot less than that. It's easy to tell the difference.

    Now, the difference between 1920x1080 and 1280x720 is something that you can't really tell without a large display with the ability to fully resolve 1920x1080.

  • Re:Are they nuts? (Score:2, Interesting)

    by ngth82 ( 1261748 ) on Friday November 28, 2008 @12:55AM (#25913565)
    While the opposite is also true: some of the lower-priced "cheaper" antennas actually work better than the higher-priced ones. It depends on your location and where the antenna is installed relative to other large objects inside your house that may be blocking the signal or causing interference patterns due to reflections.
  • by Runefox ( 905204 ) on Friday November 28, 2008 @12:56AM (#25913573)

    It isn't that people are stupid, but that the HD content we currently have isn't exactly HD. Even the snazziest Blu-ray displays in places like the Sony Store or any big electronics retailer seem to have really nice-looking visuals, but they also seem to have a big problem not only with interlacing (?! Isn't this 1080p?!), but also with video compression artifacts. In many cases, when I look at the TVs on display, I can't usually tell that what I'm looking at is HD unless the video has been specifically tailored to show off the resolution. TV broadcasts (the few that are HD around here), Blu-ray movies (especially live action), it doesn't matter. It all looks quite muddy, and I'm often distracted by the block and ring artifacting, just as I was when DVD was first released.

    I don't have an HDTV or an HD player myself, so I'm not intimately familiar with how current movies are being compressed on the disc, but... don't they have any room to turn up the bitrate a little? I mean, sure, it's not reasonable to expect an uncompressed image (though I'd really like it), but seriously, the video compression quality sucks.

    You can have as high a resolution as you want, but when artifacts are large enough to notice casually, you've defeated the purpose of that resolution; I would rather have had a cleaner lower-definition source than that.
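
    For a rough sense of how hard the video on a disc is actually being squeezed, here is a back-of-the-envelope sketch; the 30 Mbps figure is an assumed, typical Blu-ray video bitrate, not something stated in the comment.

```python
# Raw vs. compressed bandwidth for 1080p24 video (8-bit, 4:2:0 chroma subsampling).
width, height, fps = 1920, 1080, 24
bits_per_pixel = 8 * 1.5                    # luma plus subsampled chroma

raw_mbps = width * height * fps * bits_per_pixel / 1e6
blu_ray_mbps = 30                           # assumed typical average video bitrate

print(f"uncompressed 1080p24: ~{raw_mbps:.0f} Mbps")
print(f"compression ratio at {blu_ray_mbps} Mbps: ~{raw_mbps / blu_ray_mbps:.0f}:1")
# uncompressed 1080p24: ~597 Mbps
# compression ratio at 30 Mbps: ~20:1
```

    Even at Blu-ray bitrates the encoder is discarding roughly 95% of the raw data, so visible artifacts on difficult scenes are not surprising; turning the bitrate up helps, but disc capacity and player limits cap how far it can go.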

  • by The Master Control P ( 655590 ) <ejkeeverNO@SPAMnerdshack.com> on Friday November 28, 2008 @01:26AM (#25913725)
    We've got a 42" 1080p LCD TV and Dish HD to feed it. The video is (presumably) 1080p over analog component. I can see the difference, but I truly don't give a shit for the most part.

    When we still had the SD DVR and I had to stretch Stargate Atlantis (meaning the effective resolution was sub-SD) to fill the screen, I got tweaked more than a little. But other than that (which doesn't happen anymore with the non-4:3-aware HD DVR), I can honestly say that I don't much care. Yeah, I can pause Law & Order and count the strands of Elizabeth Rohm's hair, or stop Atlantis and count the stubble in John Sheppard's beard - but so what?

    I'm here to watch the criminals get caught or the Wraith be foiled again, not to stroke my e-penis to the thought of how awesome my screen's picture is. Unless the picture is suffering horrible aberrations or the audio is like a 64kbps MP3, those don't really impede the story.

    In conclusion: It's absolutely astonishing how many details your brain can paint in or interpolate if you let it.
  • by Charlie Kane ( 1098491 ) on Friday November 28, 2008 @01:38AM (#25913775)

    I have a DVD player with a Realta HQV video processor and it really does a great job. In order to visually benefit from something like Blu-ray over a top-notch scaler you have to have a pristine master and a high-quality large screen (1080p, at least 50"). It is difficult to get that good a master from older film or most video. That is great news, since the vast majority of my current DVD collection will remain satisfactory for a long time.

    I don't have a player with a Realta HQV processor -- just an upconverting PlayStation 3 -- but I do work in the video industry and I've seen the Teranex in action and in my opinion there's a significant difference between Blu-ray and "a top-notch scaler." I watch at home on a Sony 47-inch XBR4, and the difference is substantial.

    As far as "older film" goes, I have The Wild Bunch, The Searchers, and Black Narcissus, and they crush the older DVDs like the proverbial grapes. Just marvelous viewing. It's not just a question of resolution but also color fidelity. (And, of course, the Blu-ray masters may well be the newest transfers, which makes any comparison to older discs a little unfair. But still -- I look at enough video on enough different screens in enough different high-end facilities to know what I'm seeing.)

    I know a lot of people claim there's not a big difference between good SD upconversion and true HD, but these people either sit too far from their screens or just don't know what they're looking for. (Some of them may be the same people who keep their LCD screens set to "Vivid" and complain about grainy pictures.) As a cinephile who'd like to see more titles beyond the frat-boy demographic become available in the format for purely selfish reasons, it frustrates me a little when people argue that HD doesn't represent a significant quality gain over SD.

  • Re:Are they nuts? (Score:3, Interesting)

    by name_already_taken ( 540581 ) on Friday November 28, 2008 @01:55AM (#25913861)

    Only if you have more than one TV.

    Keep in mind that a signal amplifier amplifies the noise just as much (sometimes more) as the signal you're interested in. You don't really need one if you're not splitting the signal downstream.

    That's a little too broad of a statement, and is of course not true in many situations. You also need one if you don't live near the transmitter. I have a good antenna (I forget the brand and model) and a good masthead amplifier. The signal is so weak I get about 2/3 of the channels that I know are out there, even with the amplifier. I have a Samsung DTB-H260F receiver, which is a reasonably good unit. I have all RG6 quad shield cable. It's still not good enough. The picture drops out a lot on some of the channels, and just enough to be annoying on the others. It's the same on the other TV in the bedroom too. It doesn't matter if I connect a single TV directly to the masthead antenna's output, either.

    Why? I'm over 50 miles from the transmitters, I have huge tall trees throughout my neighborhood and there's a river valley between me and the transmitters that cuts the signal in half according to the online signal strength maps.

    The next thing I'm going to try is putting the antenna up higher on a mast. 1080 on a 108" projected image is worth the effort.

    But, my main point was that blanket statements about when you need a decent antenna or an amplifier are often going to be false because these are not one-size-fits-all things. ATSC reception demands a consistent, strong signal but conversely seems to have low tolerance for multipath, which is a problem you can incur with a "too good" antenna or amplifier.

  • by chrysrobyn ( 106763 ) on Friday November 28, 2008 @02:12AM (#25913919)

    I would wager that if you put the two screens side by side, one showing the signal in true HD and the other in SD, anyone without vision problems could tell the difference.

    Who cares? Do you want to go out creating more videophiles? Doesn't this world have enough audiophiles for you? Side by side I can tell the difference, but most content on television is not actually improved by increased bitrates or resolutions. Sitcoms and dramas in particular - you don't need to see the flaws in the actors' and actresses' faces; in fact, they distract. The only times HD matters are sports and special-effects-laden movies.

    If you showed a relative that he's missing out, did you do him a favor? Maybe he's got a 10+ year-old TV with analog cable (or broadcast) and can't afford a whiz-bang LCD (let alone plasma) or digital cable - which describes a lot of America, by the way. People are often much happier before they find out that others are doing better than they are.

    I know the difference between HD and SD, and I can afford the upgrade, but I choose not to. I have a 32" Trinitron, and it does what I need (AppleTV for transcoded DVDs, DVDs, Wii). There's actually nothing I care about that a 720p (or higher) would improve, aside from power consumption and weight.

  • Re:Are they nuts? (Score:5, Interesting)

    by neomunk ( 913773 ) on Friday November 28, 2008 @02:15AM (#25913929)

    Time Warner in Charlotte, NC advertises "Free HD, for only $9.95 more a month, while our competitors (satellite) charge more than $100 a year for the same service".

    Given that my brain hurts whenever I get close to figuring out how $9.95 a month is "free", and that my soul hurts for anyone who would be proud of paying $9.95 a month compared to $100 per year, I'm not amenable to explanations of why I should consider $9.95 a month to mean "free".

  • Re:Many variables (Score:2, Interesting)

    by Smivs ( 1197859 ) <smivs@smivsonline.co.uk> on Friday November 28, 2008 @05:04AM (#25914453) Homepage Journal

    I've found that, as is often the case, the source is the key. My 42" plasma is NOT HD-enabled, so I am reliant on digital SD broadcasts (satellite and terrestrial) and DVD as my sources. DVD looks great, as you'd expect, but the real difference is when viewing broadcast TV. It is SO obvious which programmes have been shot in HD and which haven't.
    The bottom line is, for most people, programmes played on an SD set are more than acceptable provided the source was shot in HD, and this may explain the results of the poll in TFA. I'm sure this is true for CRTs as well as flat screens.
    So the answer might simply be that all programmes should be shot in HD; then the viewer experience will always be good, regardless of their equipment.

  • Comment removed (Score:3, Interesting)

    by account_deleted ( 4530225 ) on Friday November 28, 2008 @06:08AM (#25914653)
    Comment removed based on user account deletion
  • Re:Are they nuts? (Score:3, Interesting)

    by theaveng ( 1243528 ) on Friday November 28, 2008 @08:18AM (#25915115)

    Amplifiers worked great with NTSC because they made the "sync" signal stronger and allowed the TV's tuner to lock onto the station. While this process also added noise, the human brain has millions of years of development behind it that let it "see through" the noise and extract an image. (For example, I was watching CSI - it was a blurry image, but my brain could still see the hot blonde in the white noise.)

    Digital receivers don't like noise, so adding the amp often makes things worse. A computer, unlike our brains, can't deal with it. Instead of extracting a hot blonde, it just gives up.

  • by tgibbs ( 83782 ) on Friday November 28, 2008 @11:45AM (#25916289)

    As stated, it implies that 18% of consumers can't distinguish SD from HD in a direct A/B comparison. I find this frankly unbelievable.

    On the other hand, I would not be at all surprised if 18% of consumers, particularly those who don't normally watch HD, might be unable to recognize HD when either SD or HD is shown on an unfamiliar monitor without the opportunity to make a direct A/B comparison.

    Another question is whether they were actually being asked to distinguish 480i SD from 720p or 1080i/p HD, or whether the "SD" was really 480p ED. On anything other than a very large screen, the distinction between ED and HD is fairly subtle. Actually, I expect the percentage of people unable to tell whether a picture is ED or HD would be considerably greater than 18%.

  • by Benfea ( 1365845 ) on Friday November 28, 2008 @12:05PM (#25916435)

    I run into a lot of non-tech people who have difficulty understanding the difference between an HD TV and an HD signal. Such people would probably answer the question they thought they were asked, by correctly identifying the TV as being high definition, without ever really understanding that an HD TV can display both HD and SD content.

    Yes, the researchers probably explained the difference to the respondents during the course of the study, but many such people still don't understand the difference between HD & SD signals even after you explain it to them.
