1080p, Human Vision, and Reality

An anonymous reader writes "'1080p provides the sharpest, most lifelike picture possible.' '1080p combines high resolution with a high frame rate, so you see more detail from second to second.' This marketing copy is largely accurate. 1080p can be significantly better than 1080i, 720p, 480p, or 480i. But (there's always a "but") there are qualifications. The most obvious one: is this performance improvement manifest under real-world viewing conditions? After all, one can purchase 200mph speed-rated tires for a Toyota Prius®. Expectations of a real performance improvement based on such an investment will likely go unfulfilled, however! In the consumer electronics world we have to ask a similar question: I can buy 1080p gear, but will I see the difference? The answer is a bit more ambiguous."
  • by davidwr ( 791652 ) on Tuesday April 10, 2007 @10:13AM (#18674987) Homepage Journal
    My "real-world" conditions may be a 50" TV seen from 8' away.

    Another person may watch the same 50" set from 4' away.

    Your kids may watch it from 1' away just to annoy you.

    2 arc-minutes of angle is different in each of these conditions.

    Don't forget: You may be watching it on a TV that has a zoom feature. You need all the pixels you can get when zooming in.
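    A quick back-of-envelope way to see how those viewing conditions change things (a sketch in Python; the ~1 arc-minute-per-pixel acuity figure is a common rule of thumb rather than a hard limit, and the 50"/16:9 geometry is just the example above):

    import math

    def pixel_angle_arcmin(diagonal_in, distance_ft, horizontal_px=1920, aspect=(16, 9)):
        """Angle subtended by one pixel of the screen, in arc-minutes."""
        aw, ah = aspect
        width_in = diagonal_in * aw / math.hypot(aw, ah)  # screen width from the diagonal
        pitch_in = width_in / horizontal_px               # width of a single pixel
        return math.degrees(math.atan2(pitch_in, distance_ft * 12)) * 60

    for d in (8, 4, 1):
        a = pixel_angle_arcmin(50, d)
        print(f"50in 1080p at {d} ft: {a:.2f} arc-min per pixel "
              f"({'resolvable' if a >= 1.0 else 'below the acuity limit'})")

    At 8 feet the pixels of a 50" 1080p panel subtend less than an arc-minute, at 4 feet they are comfortably above it, and at 1 foot every pixel is obvious, which is exactly why the same screen looks different in each of those conditions.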
  • My wife and I were looking at TVs, and we walked past some gorgeous 52" LCDs that support 1080p, and I told her this is what I wanted.

    Then she walked past a smaller 32" LCD that only supported 720p/1080i and she said, "This picture looks so much better, and the TV is $1000 less! Why?"

    I casually explained that the expensive TV was tuned to a normal TV broadcast, while the cheaper TV was tuned to ESPNHD. She looked and realized that even the most expensive TV getting a crappy signal isn't going to look all that great.

    I still want a nice LCD that supports 1080p, but I'm not pushing for it until I can afford a PS3 and a nice stable of Blu-ray movies to go along with it.

    720p looks IMMENSELY better than 480i, or any crappy upscaled images my fancy DVD player and digital cable box can put out. I have yet to see a nice, natural 1080p image myself, but I'm willing to bet I will be able to tell the difference.

    If anyone recalls, there were people who insisted that you couldn't really tell the difference between a progressive and interlaced picture.
  • I recently saw an article posted by Secrets of Home Theatre, very well known for their DVD benchmark process and articles.

    The article is here [hometheaterhifi.com].

    They show numerous examples of how the processing involved can indeed lead to a better image on 1080p sets. Mind you, it is not just the resolution: 480-line material, once processed and scaled, can look better on a 1080p screen than on a 720p (or, more likely, 768p) screen. It is a very interesting read, although if you are already conversant in scaling and video processing some of it may seem very basic. I count that as a feature, though, as most non-technical people should be able to read it and come away with the information being presented.

    Definitely interesting as a counterpoint.

  • by Anonymous Coward on Tuesday April 10, 2007 @10:27AM (#18675193)
    And I'd like to thank you for using the word "moot" correctly in a sentence. Every time someone writes "its a mute point" I throw up a little bit. Not the kind of throw-up where you actually vomit everything out of your stomach, but the kind where you kinda puke a little bit in your mouth.
  • by Anonymous Coward on Tuesday April 10, 2007 @10:36AM (#18675351)
    Yes, ATSC supports 1080p, and has from the beginning. You idiots who think 1080p has to mean 60p are muddying the waters. Don't you watch DVDs and think they're just fine? They're 24p, doofus. If you can do 60i in 1080, then you can do up to 30p with identical bandwidth. Just read info on the standard instead of the bullshit the 720p proponents say.

    Wikipedia article on ATSC [wikipedia.org]
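    The raw-sample arithmetic behind the "60i vs 30p" point, as a sketch (uncompressed pixel rates only; this deliberately ignores how well each format compresses):

    width, height = 1920, 1080

    rate_1080i60 = width * (height // 2) * 60   # 60 fields/s, each field carries half the lines
    rate_1080p30 = width * height * 30          # 30 full progressive frames/s
    rate_1080p24 = width * height * 24          # film-rate progressive, like DVD/Blu-ray movies

    print(f"1080i/60: {rate_1080i60 / 1e6:.1f} Mpixel/s")
    print(f"1080p/30: {rate_1080p30 / 1e6:.1f} Mpixel/s")  # identical to 1080i/60
    print(f"1080p/24: {rate_1080p24 / 1e6:.1f} Mpixel/s")  # lower still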
  • by Hoi Polloi ( 522990 ) on Tuesday April 10, 2007 @10:39AM (#18675399) Journal
    I assume this is why TV tubes are made with leaded glass: to absorb the soft X-rays being generated. This is also why tossing out a TV tube improperly is a pollution no-no.
  • Sigh... (Score:1, Interesting)

    by Anonymous Coward on Tuesday April 10, 2007 @10:56AM (#18675721)
    Putting 200mph tires on a Toyota Prius is downright stupid. Putting them on my Porsche 928 is not quite as stupid.

    The issue with 1080p is not as clear cut, though. There are a LOT of factors you have to take into consideration. 1080p has higher resolution and a higher frame rate. Period. Is this better or not? That will SERIOUSLY depend on what material you are watching on the 1080p set, and moreover, on whether you are using a very sharp LCD/plasma display or not. (My guess is the answer is "yes" for most people.)

    One big factor is that these 1080p screens are not well suited to analog material. They handle precise digital data very well, but converting analog material for a digital screen leaves very irritating artifacts: pixels jumping back and forth, which REALLY show up badly in slow scenes or scenes with next to no motion.

    Analog SD TVs are very good at overcoming these problems. The interlaced, analog construction allows a lot of this to be fuzzed, enough that you don't notice it. In that sense, it's not fair to compare 1080p with a tube SD picture. 1080p can NOT fuzz an image; it is not capable of doing so without jumping a pixel, which is noticeable. Software can do the fuzzing, but that will degrade the image.

    By definition, analog can "fuzz" what digital can't, and most people will notice. Whether they CARE is an entirely different issue. I tend to have very good hearing and can tell a GREAT difference between a CD and an MP3/AAC/OGG song. This does not require high-end equipment. I can tell the difference between an Apple Lossless file and an AAC file of the exact same song on the exact same iPod. But the question is, do I care? The answer is no.

    Either way, I think 1080p is too LOW a resolution, as long as the screen is digital. You can actually see how crappy the picture is, which wouldn't bother you at a quarter of the resolution on an analog tube.

    But seriously, it's a matter of the source medium. Play a DVD on an SD (tube) TV set, then compare it to S-VHS, and then VHS. You can STILL see the difference, and this article is bickering about the same thing. Been there, done that. 8mm film is awesome, even though it has lower resolution than most digital camcorders these days. A 50-year-old Soviet-made 16mm camera loaded with modern film will run circles around 1080p. So what!?
  • by Anonymous Coward on Tuesday April 10, 2007 @11:01AM (#18675791)
    What about big TVs?

    Real-world example, with slightly lower resolutions. My TV is 92" - a 720p projector and screen. I sit about 11.5 feet back from the screen. I upgraded from a 480p projector a few months ago. The old projector displayed an HD signal quite well, but downscaled the resolution. Still, it looked good. On certain scenes - usually with large patches of light, solid color - I was able to clearly see the pixel structure. On other scenes, it gave a grainy appearance, but the pixels weren't as clearly delineated. Sitting any closer (which might be necessary with other people watching) highlighted these problems.

    With the new projector, HD content is usually better. Older movies, or movies deliberately made with an older, or grainier, appearance, are usually about the same, no real improvement. More recent movies, sharper images, etc. are much nicer. TV shot on film and broadcast in HD is better, sharper, more depth. TV shot with HD cameras - sports, documentaries, etc., or even the evening news on one local channel - is far, far superior. Much sharper, clearer, far more detail, and with depth approaching 3D at times. Much less visible pixel structure. I can sit 3-4 feet closer without any problems (meaning, we can pull up some extra chairs and more people can watch). But even so, it's not as sharp and detailed as 1080p TVs and HD-DVD/Blu-Ray content that I've seen in stores. OK, so that's carefully chosen content and tweaked monitors for maximum "wow" factor in the stores, and standing next to the monitor is MUCH closer than actual viewing distance, but still, there's a layer of detail that's simply not there at home. There's graininess on some content that shouldn't be there. And viewing distance is compromised - as good as my picture at home is, if I stand four feet behind my couch, the picture looks even better.

    From my experience, I doubt that on a 50" screen, going up to 1080p will make much difference. But with a big screen, yeah, it will. I can see the limitations of my setup, and I know, based on what I see and the difference from my previous, lower resolution projector, that taking the next step up will be a significant improvement on some types of content. The same kind of changes I've already experienced, but more so. Not on older movies, not on some TV. But on newer movies, and on TV shot on HD, I will definitely see the extra detail, it will give more punch to the image at normal viewing distance, it will allow for closer seating, and, with a short-throw projector, would even allow for a larger screen. Oh, and 1080i vs 1080p? On a 50", who cares? At 92", yeah, it matters.

    I can't afford 1080p right now. But when a 1080p DLP projector with a short throw and (hopefully) lens-shifting becomes available at an affordable price, I'll be watching on a new 110" - maybe even 120", a full TEN FEET - and chuckling at those who argue over its value for a "big" 50" plasma.
  • by ronanbear ( 924575 ) on Tuesday April 10, 2007 @11:22AM (#18676105)
    It's not just that. Sitting too close to the TV (especially CRTs) strains your eyes. That's because the light itself is non-parallel (single source), and your eyes have to adjust their focus to see the whole picture. Your eyes can correct for this, but over prolonged periods you tend to get headaches and tired eyes.
  • by maxume ( 22995 ) on Tuesday April 10, 2007 @11:23AM (#18676137)
    This is slashdot, no one reads the comments.

    Really, I thought your comment was pretty interesting, but I didn't see why you were so roundabout about the part where the 'apparent' size of the display is (often) the overriding factor in how much resolution is useful when looking at it, rather than the actual size, or even the resolution of the display. To me, that was the valuable nugget, and the rest were details supporting it.

    (and my apologies for being a bit more of an internet fuckwad than I should have been)
  • by badasscat ( 563442 ) <basscadet75@@@yahoo...com> on Tuesday April 10, 2007 @11:27AM (#18676239)
    I'm constantly arguing with people about whether or not you can see the difference between 1080p and 720p. As in most other things, many people want an absolute answer: yes you can, or no you can't. They read things containing ambiguity and conclude the answer must be "no you can't." But that's not what the word "ambiguity" means.

    As you point out, not everybody has the same visual acuity. My vision is corrected to better than 20/20, and even beyond that, to some extent I've been trained in visual acuity first by going to film school and then by working in the TV industry for some years. My job is to look for picture detail.

    I have a 42" 1080p LCD monitor, from which I sit about 6 feet away. I can easily distinguish the difference between 720p and 1080i programming (which gets properly deinterlaced by my TV set, so I'm seeing a full 1920x1080 image). Now, some of that is probably the scaling being done by my cable box, but almost surely not all - and anyway, scaling is never taken into account in the opposite argument (ie. nobody stops to consider that "720p" TV sets are almost all 1366x768 or 1024x768 resolution, meaning they have to scale everything).

    I think the bottom line is some people see the difference, some don't, and yes, it depends on how close you sit and how big your TV is. It depends on a lot of things. There's no "yes, you can see the difference" or "no, you can't".

    One thing I would say is that with prices being what they are these days, I don't see any reason whatsoever not to buy a 1080p set. The price difference between 720p and 1080p in the same line of sets is usually only about 10-20%.
  • by Craig Ringer ( 302899 ) on Tuesday April 10, 2007 @11:32AM (#18676347) Homepage Journal

    Sure, it's detailed. Too bad the colour is still a poor match to human vision.

    We see a huge dynamic range - we can see details in extremely dark areas and still perceive detail in very bright areas. What we see as bright or dark also depends on the surrounding lighting (and not just because your iris adapts; there are other effects at work). Even more importantly, our perception of colour intensity and brightness is not linear.

    To get truly amazing video, we'd need to switch to a non-linear colour encoding that better matches how we actually see and can represent appropriately high dynamic ranges while still preserving detail. We'd also need to dynamically adapt the display to lighting conditions, so it matched our perceptual white point and black point. Of course, we'd need to do this _without_ being confused by the light from the display itself. And, of course, we'd need panel technology capable of properly reproducing the amazing range of intensities involved without banding or loss of detail.

    We're a very, very long way from video that's as good as it can get, as anyone in high quality desktop publishing, printing, photography or film can tell you. A few movie studios use production tools that go a long way in that direction and photographic tools are getting there too, but display tech is really rather inadequate, as are the colour formats in general use.

    I call marketing BS.
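    One concrete piece of the "perception isn't linear" point is the transfer function broadcast video already uses. A minimal sketch of the Rec. 709 OETF (the constants are from the standard; the 8-bit quantization at the end is only my illustration, not part of Rec. 709 itself):

    def rec709_oetf(linear):
        """Map linear light (0..1) to the non-linear Rec. 709 signal value (0..1)."""
        if linear < 0.018:
            return 4.5 * linear
        return 1.099 * linear ** 0.45 - 0.099

    for l in (0.01, 0.05, 0.18, 0.5, 1.0):
        code = round(rec709_oetf(l) * 255)  # quantize to an 8-bit code value for illustration
        print(f"linear {l:<4} -> 8-bit code {code:3d}")

    Mid-grey (18% linear reflectance) lands around code 104 of 255, i.e. the dark half of the tonal range gets far more than half of the code values, which is the kind of perceptual allocation the post above says displays and formats still handle only crudely.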

  • by pyite69 ( 463042 ) on Tuesday April 10, 2007 @11:40AM (#18676473)
    There are several problems:

    1) The ATSC specs don't provide a 60 frame 1080p mode - only 24p.
    2) There isn't a lot of content that can use 1080p - and it is likely to just be movies, which are 24p.

    There is one benefit to getting a 1080p display, though: MythTV does a good job of deinterlacing 1080i to 1080p. You will probably also want some equipment to remove the MPEG artifacts, which is not cheap.

    Mark
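    For what it's worth, the simplest case of that 1080i-to-1080p deinterlacing is just weaving the two 540-line fields back together. A toy sketch (static scenes only; MythTV's useful deinterlacers are motion-adaptive and do a lot more than this):

    import numpy as np

    def weave(top_field, bottom_field):
        """Interleave two (540, 1920) fields into one (1080, 1920) progressive frame."""
        h, w = top_field.shape
        frame = np.empty((h * 2, w), dtype=top_field.dtype)
        frame[0::2] = top_field     # even output lines come from the top field
        frame[1::2] = bottom_field  # odd output lines come from the bottom field
        return frame

    top = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)
    bot = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)
    print(weave(top, bot).shape)    # (1080, 1920)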
  • by BakaHoushi ( 786009 ) <Goss DOT Sean AT gmail DOT com> on Tuesday April 10, 2007 @11:52AM (#18676691) Homepage
    This is why I don't care much for HD. I just want to sit down and watch my God damn show or (more often) play my damn game. When you start talking about whether the eye can even take in all that data, and from what distance, and at what screen size... it's too much.

    It's still a fucking TV. Does it really matter just how blue X celebrity's jeans look? The reason I've stopped watching as much TV as I used to is because it's become so mindless... or at least I've come to realize how mindless it is. The picture is fine to me, even on my tiny, CRT 19" screen. HD might be a nice bonus, but it's just not worth the headache, the price, and the uncertainty.
  • by clickclickdrone ( 964164 ) on Tuesday April 10, 2007 @11:59AM (#18676779)
    There was a news piece I read recently where a BBC engineer was interviewed and said their experiments had shown that a faster frame rate made a bigger difference to people's perception of an image's quality than extra resolution did. They showed a well-set-up TV at standard resolution but a higher frame rate and compared it to a 1080p screen, and the former looked better according to the writer. The BBC engineer noted that most of the 'HD is better' effect was smoke and mirrors anyway, because most people's exposure to a normal picture is via a compressed digital feed of some sort, and the apparent poor quality is a result of the compression, not the resolution.
    I certainly remember being very disappointed with both digital sat and cable images because of the poor colour gradations and sundry pixelation issues compared to my normal analogue signal, so I can well believe it.
  • by hAckz0r ( 989977 ) on Tuesday April 10, 2007 @12:14PM (#18677029)
    OK, why does everyone who has not driven a Prius think it's slow? I traded in a high-test, gas-guzzling performance V8 equipped with ZR-rated tires (220 mph) for a smaller Toyota Prius, and I can honestly say I'm not missing anything in performance. My Prius can outperform the vast majority of the cars on the street today. OK, it won't blow the doors off a Corvette or a 5.0L Mustang, but I will easily get 3-4 times their gas mileage while trying. As for top-end speed, I'm just not going to incriminate myself in this forum, but believe me, the tires delivered on the stock Prius should be upgraded!


    Two points on Prius "performance".

    1) Electric motors have maximum torque at zero RPM, so it's quick off the line, even though you may have to wait for the gas engine to start and rev up before you have full torque for full acceleration.

    2) The computer-controlled continuously variable transmission (CVT) allows the small engine to work at maximum power throughout its acceleration, so there is no lag from shifting and no slowing due to inefficient gear ratios. Smooth and constant acceleration, optimized at all times once the engine is revved up. When a Mustang shifts gears I generally catch up; then they take off again when they hit the sweet spot of their power range. Sometimes it can be annoying (lol) having to take your foot off the gas in the same rhythm as the car in front of you that is having to shift gears. Gas, brake, gas, brake, gas, brake... (no, I don't really drive like that) :-]

  • by UncleTogie ( 1004853 ) * on Tuesday April 10, 2007 @12:53PM (#18677749) Homepage Journal

    "This is also why tossing out a TV tube improperly is a pollution no-no."
    Yeah... but what are you gonna do?

    Our shop takes them to the local Goodwill computer store. Those folks are legally bound to dispose of it properly.... And 'cause I can see it coming, yes, the rest of us are similarly legally bound, just a lot less likely to get it to a recycling center.
  • Other Factors (Score:5, Interesting)

    by tji ( 74570 ) on Tuesday April 10, 2007 @12:58PM (#18677827)
    There are other variables besides "How does 'The West Wing' look in HD when I'm sitting on my couch?" Such as:

    - 1080p provides a good display option for the most common HD broadcast format, 1080i. Since most new displays are based on natively progressive technologies (DLP, LCD, LCOS), you can't just do a 1080i output. So, 1080p allows them to just paint the two 1080i fields together into a progressive frame for high quality display.

    - 720p upscales to 1080p easily - probably better than downscaling 1080i to 720p and losing information.

    - Computers attached to HDTVs are becoming more and more common (not just game consoles, but true computers). Scaling or interlacing has nasty effects on computer displays, with all those thin horizontal/vertical lines and detailed fonts. 1080p gives great display performance for home theater PCs.

    - You are not always sitting 12-15' back from the TV. 1080p maintains the quality when you do venture closer to the set.

    - Front Projectors are increasingly common (and cheap), so the display size can be quite large (100-120"), allowing you to see more of the 1080p detail.

    All that said.. If I were buying a new display today, I would still stick with 720p, for two main reasons:

    - Price / Performance. 720p displays are a bargain today, 1080p is still priced at a premium.

    - Quality of available content. The majority of what I watch in HD is from broadcast TV. Many broadcasters are bit-starving their HD channel by broadcasting sub-channels (e.g., an SD mirror of the main channel, a full-time weather/radar channel, or some new crap channel from the network in an effort to milk more advertising $$). So the 1080i broadcasts do not live up to the format's capabilities. Watching The Masters last weekend proved that dramatically. My local broadcaster has the bandwidth divided up quite aggressively, so any scene with fast movement quickly degrades into a mushy field of macroblocks. Utter garbage, and very disappointing.
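    Rough arithmetic behind the bit-starving complaint (the ~19.39 Mbit/s figure is the real ATSC payload rate; the sub-channel allocations below are made-up examples, since actual splits vary by station):

    ATSC_TOTAL_MBPS = 19.39  # MPEG-2 payload of one ATSC broadcast channel

    subchannels = {
        "SD mirror of the main channel": 3.0,  # hypothetical allocation
        "24-hour weather/radar channel": 1.5,  # hypothetical allocation
    }

    hd_budget = ATSC_TOTAL_MBPS - sum(subchannels.values())
    print(f"Left for the 1080i HD feed: {hd_budget:.2f} Mbit/s "
          f"(vs. ~{ATSC_TOTAL_MBPS:.2f} Mbit/s with the channel to itself)")

    Fast motion is exactly where a starved 1080i encode falls apart first, which is why sports turn into macroblock soup on an aggressively subdivided channel.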
  • by aibrahim ( 59031 ) <slashmail AT zenera DOT com> on Tuesday April 10, 2007 @02:38PM (#18679621) Homepage Journal
    "When talking about 10bit it is going to be 10bit per channel (R, G, B) so a total of 30bit. I would expect a claim of 18bit to be a misleading description of 6bit per channel colour depth."

    Well, to be nitpicky, that's YUV color space not RGB. Almost all LCD panels can display 8 bit per channel color, and of course they do so using RGB natively... so you have conversion issues when looking at video signals. (Computer signals are RGB to begin with so the conversion doesn't matter there.)

    Professional monitors offer more color depth, but ITU Recommendation 709 (Rec. 709) states that 8 bits per channel, non-linearly coded, is sufficient for broadcast applications. So the standard LCD panel isn't that far off the mark as a receiver for HDTV broadcasts.

    By contrast Rec 709 suggests 14 bits linear per channel for smooth shading across the entire contrast range. For most production applications a 9 bit non-linear coding will suffice.
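    The conversion being described, roughly, as a sketch (Rec. 709 matrix coefficients with 8-bit studio-range levels; the rounding and clipping at the end are where the small conversion errors mentioned above creep in):

    def ycbcr709_to_rgb(y, cb, cr):
        """Convert one 8-bit studio-range Rec. 709 Y'CbCr sample to 8-bit R'G'B'."""
        yn  = (y - 16) / 219.0    # luma: 16..235 maps to 0..1
        cbn = (cb - 128) / 224.0  # chroma: 16..240, centered on 128
        crn = (cr - 128) / 224.0
        r = yn + 1.5748 * crn                    # Rec. 709 inverse matrix
        g = yn - 0.1873 * cbn - 0.4681 * crn
        b = yn + 1.8556 * cbn
        clip = lambda v: max(0, min(255, round(v * 255)))
        return clip(r), clip(g), clip(b)

    print(ycbcr709_to_rgb(180, 128, 128))  # neutral grey stays neutral: (191, 191, 191)
    print(ycbcr709_to_rgb(81, 90, 240))    # a saturated reddish sample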
  • Re:Other Factors (Score:3, Interesting)

    by evilviper ( 135110 ) on Wednesday April 11, 2007 @08:44PM (#18697119) Journal
    Your description is a bit hard to follow. If I understand you correctly, you want to encode MPEG-2 at 30fps, and encode the other 30 frames/sec with VC-1/H.264, except that the VC-1/H.264 stream uses (all) the 30 MPEG-2 frames as reference/keyframes...

    If that is in fact your idea, there are innumerable technical problems that would prevent it from working, even in theory. (I'm focusing on h.264, as I don't know VC-1 nearly as well.)

    Let's say that h.264 can store the same quality video in 25% of the bit-rate... That should be an approximately appropriate figure.

    Before anything else, for the sake of backwards compatibility, you've already sacrificed half of that benefit: h.264 is only storing 50% of the video, while MPEG-2 is storing the other half. So even at this step, you're only (ideally) going to be able to get a 50% lower bit-rate than using MPEG-2 on all 60 frames/second.

    I'd guesstimate that more than a third of the bit-rate savings from using h.264 over MPEG-2 comes from straight I-frame compression. So, in your scheme, h.264's size benefits over MPEG-2 have already shrunk significantly; maybe now you're only getting a 33% bit-rate savings by using h.264.

    Another thing you lose is the ability to re-use the motion vectors from previous frames, rather than having to encode them again in full in each frame, as MPEG-2 doesn't have the same level of precision in motion vectors (qpel), not to mention advanced features (weighted/spatial prediction).

    You largely lose the benefits of the in-loop deblocking filter as well, as the MPEG-2 video can't take advantage of it, and so the error caused by blockiness accumulates much more than it would with pure h.264 video.

    Throw in the fact that one of the major bit-rate benefits of h.264 is better I-frame placement, which it really has no control over in such a scheme.

    There are many more such issues, but that should be more than enough to explain why you aren't getting much benefit from h.264 anymore. I've lost track of my percentages now, but I'm pretty sure we're already past the point where you're getting no benefit from this dual-codec scheme, and I've only just started.

    So far, that's all really been just assuming ideal/lossless video encoding. The perceptual losses are even greater.

    A very big benefit of h.264 over MPEG-2 is that it simply does a better job of throwing away information in the picture that is unlikely to be perceived by human eyes. But more importantly, it simply throws away DIFFERENT information than MPEG-2. So, when it has to reference MPEG-2 frames, it's going to throw away lots of the information they contain, while it's going to be lacking lots of other information it needs, which MPEG-2 has thrown away in its lossy processing step. If you limit h.264 to the same level and type of lossy compression as MPEG-2 to prevent this problem, you're seriously crippling h.264 yet again, removing even more of the potential bit-rate benefits.

    And do you remember what I said about reducing frame-rate having diminishing returns? It's quite true, because the larger the difference (spatial and temporal) between frames, the more data needs to be stored in each frame to represent the difference. Doubling the frame-rate of the MPEG-2 video from 30fps to 60fps won't take 2X the bit-rate... Assuming you double the GOP size (which is only appropriate), you'll probably find that the 60fps MPEG-2 video only needs about a third higher bit-rate than 30fps material. In your scheme, MPEG-2 can't use the h.264 frames as a reference at all, so it has to encode ALL the differences between frames TWICE: once in the MPEG-2 video, and again to generate the h.264 "between" frames. If it could just be all MPEG-2, and depend on all 60fps being there, it could be significantly more efficient.

    Of course my numbers are all ball-park figures, not thoroughly tested in anything like this scenario (as if that were really possible), but they are based on lots of experience.
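    The same back-of-envelope reasoning with the ballpark assumptions above made explicit (none of these numbers are measured; the 50% figure for the hobbled h.264 half is my own illustrative guess):

    mpeg2_30fps = 10.0                # Mbit/s for a 30p MPEG-2 encode, arbitrary baseline
    mpeg2_60fps = mpeg2_30fps * 1.33  # doubling frame rate costs roughly 1/3 more, not 2x
    h264_60fps  = mpeg2_60fps * 0.25  # ideal all-h.264 encode at ~25% of MPEG-2's rate

    # Hybrid scheme: MPEG-2 carries 30 of the 60 frames at full cost (it can't
    # reference the h.264 frames), and the h.264 half loses most of its advantages
    # (I-frame gains, motion-vector reuse, deblocking), so assume it only reaches
    # ~50% of MPEG-2's rate for its half rather than 25%.
    hybrid = mpeg2_30fps + mpeg2_30fps * 0.50

    for name, rate in [("All MPEG-2, 60p", mpeg2_60fps),
                       ("All h.264, 60p", h264_60fps),
                       ("Hybrid MPEG-2 + h.264", hybrid)]:
        print(f"{name:22s} ~{rate:4.1f} Mbit/s")

    Under those assumptions the hybrid stream comes out heavier than just encoding all 60 frames in MPEG-2, which is the conclusion being argued above.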
