1080p, Human Vision, and Reality

An anonymous reader writes "'1080p provides the sharpest, most lifelike picture possible.' '1080p combines high resolution with a high frame rate, so you see more detail from second to second.' This marketing copy is largely accurate. 1080p can be significantly better than 1080i, 720p, 480p or 480i. But (there's always a "but") there are qualifications. The most obvious qualification: Is this performance improvement manifest under real world viewing conditions? After all, one can purchase 200mph speed-rated tires for a Toyota Prius®. Expectations of a real performance improvement based on such an investment will likely go unfulfilled, however! In the consumer electronics world we have to ask a similar question. I can buy 1080p gear, but will I see the difference? The answer to this question is a bit more ambiguous."
  • Article Summary (Score:5, Informative)

    by Jaguar777 ( 189036 ) * on Tuesday April 10, 2007 @10:02AM (#18674793) Journal
    If you do the math you come to the conclusion that the human eye can't distinguish between 720p and 1080p when viewing a 50" screen from 8' away. However, 1080p can be very useful for much larger screen sizes, and is handy to have when viewing 1080i content.
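The parent's claim is easy to check with a little trigonometry. Here is a rough sketch (assuming the common 1-arcminute resolving threshold for 20/20 vision; the function name is mine, for illustration):

```python
import math

def pixel_arcmin(diagonal_in, lines, distance_in, aspect=16 / 9):
    """Angular height of one pixel row, in arcminutes, for a viewer
    at the given distance from a screen of the given diagonal."""
    # Screen height follows from the diagonal and aspect ratio.
    height_in = diagonal_in / math.sqrt(1 + aspect ** 2)
    pixel_in = height_in / lines
    return math.degrees(math.atan2(pixel_in, distance_in)) * 60

# 50" screen viewed from 8 feet (96"): a 1080p pixel subtends ~0.81',
# below the ~1' threshold of 20/20 vision, while a 720p pixel
# subtends ~1.22', above it -- consistent with the parent's claim.
print(pixel_arcmin(50, 1080, 96))  # ~0.81
print(pixel_arcmin(50, 720, 96))   # ~1.22
```

Move closer or enlarge the screen and the 1080p pixel climbs back above the threshold, which is why the parent's caveat about larger screens holds.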
  • by fyngyrz ( 762201 ) * on Tuesday April 10, 2007 @10:18AM (#18675041) Homepage Journal

    According to the linked text, the "average" person can see 2 pixels at about 2 minutes of arc, and has a field of view of 100 degrees. There are 30 sets of 2 minutes of arc in one degree, and one hundred of those in the field of view, so we get: 2 * 30 * 100, or about 6000 pixel acuity overall.

1080p is 1920 horizontally and 1080 vertically at most. So horizontally, where the 100 degree figure is accurate, there is no question that 1080p's 1920 pixels are only about a third of your ability to see detail, and the answer to the question in the summary is, yes, it is worth it.

Vertically, let's assume (though it isn't true) that only having one eye-width available cuts your vision's arc in half (it doesn't, but roll with me here.) That would mean that instead of 6000 pixel acuity, you're down to 3000. 1080p is 1080 pixels vertically. In this case, you'd again be at 1/3 of your visual acuity, and again, the answer is yes, it is worth it. Coming back to reality, where your vertical field of view is actually greater than 50 degrees, your acuity is higher and it is even more worth it.
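The arithmetic above can be sketched in a few lines (a back-of-the-envelope check, using the same rough figures from the linked article):

```python
# TFA's rough figures: 2 pixels resolvable per 2 arcminutes,
# across a 100-degree horizontal field of view.
PX_PER_ARCMIN = 2 / 2          # i.e. 1 pixel per arcminute
ARCMIN_PER_DEGREE = 60
FOV_DEGREES = 100

acuity_px = PX_PER_ARCMIN * ARCMIN_PER_DEGREE * FOV_DEGREES
print(acuity_px)               # 6000.0 -- overall horizontal pixel acuity
print(1920 / acuity_px)        # 0.32 -- 1080p fills about a third of it
```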

    Aside from these general numbers that TFA throws around (without making any conclusions), the human eye doesn't have uniform acuity across the field of view. You see more near the center of your cone of vision, and you perceive more there as well. Things out towards the edges are less well perceived. Doubt me? Put a hand up (or have a friend do it) at the edge of your vision - stare straight ahead, with the hand at the extreme edge of what you can see at the side. Try and count the number of fingers for a few tries. You'll likely find you can't (it can be done, but it takes some practice - in martial arts, my school trains with these same exercises for years so that we develop and maintain a bit more ability to figure out what is going on at the edges of our vision.) But the point is, at the edges, you certainly aren't seeing with the same acuity or perception that you are at the center focus of your vision.

    So the resolution across the screen isn't really benefiting your perception - the closer to the edge you go, the more degraded your perception is, though the pixel spacing remains constant. However - and I think this is the key - you can look anywhere, that is, place the center of your vision, anywhere on the display, and be rewarded with an image that is well within the ability of your eyes and mind to resolve well.

There are some color-based caveats to this. Your eye sees better in brightness than it does in color. It sees some colors better than others (green is considerably better resolved than blue, for instance.) These differences in perception make TFA's blanket statement that your acuity is 2 pixels per two minutes of arc more than a little bit of hand-waving. Still, the finest detail in the HD signal (and normal video, for that matter) is carried in the brightness information, and that is indeed where your highest acuity is, so technically, we're still kind of talking about the same general ballpark — the color information is less dense, and that corresponds to your lesser acuity in color.

There is a simple and relatively easy to access test that you can do yourself. Go find an LCD computer monitor in the 17 inch or larger range that has a native resolution of 1280x1024. That's pretty standard for a few years back, should be easy to do. Verify that the computer attached to it is running in the same resolution. This is about 2/3 HD across, and nearly full HD vertically. Look at it. Any trouble seeing the finest details? Of course not. Now go find a computer monitor that is closer to HD, or exactly HD. You might have to go to a dealer, but you can find them. Again, make sure that the computer is set to use this resolution. Now we're talking about HD. Can you see the finest details? I can - and easily. I suspect you can too, because my visual acuity is nothing special. But do the test, if you doubt that HD offers detail that is useful to your perceptions.

    Finally, n

  • by Zontar_Thing_From_Ve ( 949321 ) on Tuesday April 10, 2007 @10:19AM (#18675067)
Last I checked, other than HD/BR DVD players, and normal DVD players that upscale to 1080p, there are no sources from cable or satellite that broadcast in anything other than 720, so it's kind of a moot point. I have heard rumours Verizon FiOS TV will have a few 1080p channels in a few months, but nothing substantial... and last I checked, their boxes do not do 1080p (I could be wrong about the boxes statement though)

    Wow, this is wrong. Since you mentioned Verizon, you must live in the USA. NBC and CBS both broadcast in 1080i right now. Discovery HD and Universal HD do too. Those come to mind fairly quickly. I'm sure there are others. By the way, I wouldn't hold my breath about 1080p TV broadcasts. The ATSC definition for high def TV used in the USA doesn't support it at this time because the bandwidth requirements to do this are enormous.
  • by Radon360 ( 951529 ) on Tuesday April 10, 2007 @10:19AM (#18675069)

...depending on how old you are. I think the concern was associated more with X-ray radiation emissions from CRT televisions, and older ones at that (prior to the introduction of the Radiation Control for Health and Safety Act of 1968 []). I would venture to say that most of us on this site are too young to have been plopped in front of a TV that old for large amounts of time.

  • by JFMulder ( 59706 ) on Tuesday April 10, 2007 @10:22AM (#18675101)
Mod parent up. I was about to post the same thing: the ATSC standard for broadcasting doesn't permit 1080p signals.
  • by StandardCell ( 589682 ) on Tuesday April 10, 2007 @10:32AM (#18675275)
    Having worked in the high-end DTV and image processing space, our rule of thumb was that the vast majority of people will not distinguish between 1080p and WXGA/720p at normal viewing distances for up to around a 37"-40" screen UNLESS you have native 1920x1080 computer output. It only costs about $50 more to add 1080p capability to the same size glass, but even that is too expensive for many people because of some of the other implications (i.e. more of and more expensive SDRAM for the scaler/deinterlacer especially for PiP, more expensive interfaces like 1080p-capable HDMI and 1080p-capable analog component ADCs, etc.). These few dollars are not just a few dollars in an industry where panel prices are dropping 30% per year. Designers of these "low-end" DTVs are looking to squeeze pennies out of every design. For this reason alone, it'll be quite a while before you see a "budget" 1080p panel in a 26"-40" screen size.

At some point, panel prices will stabilize, but most people won't require this either way. And, as I mentioned, very few sources will output 1080p anyway. The ones I know of: Xbox360/PS3, HD-DVD, Blu-Ray and PCs. All broadcast infrastructure is capable of 10-bit 4:2:2 YCbCr color sampled 1920x1080, but even that is overkill and does not go out over broadcast infrastructure (i.e. ATSC broadcasts are max 1080i today). The other thing to distinguish is the frame rate. When most people talk about 1080p, they often are implying 1080p at 60 frames per second. Most Hollywood movies are actually 1080p but at 24fps, which can be carried using 1080i bandwidths and pulldown. And you don't want to change the frame rate of these movies anyway because it's a waste of bandwidth and, if you frame rate convert it using motion compensated techniques, you lose the suspension of disbelief that low frame rates give you. The TV's deinterlacer needs to know how to deal with pulldown (aka "film mode") but most new DTVs can do this fairly well.
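The 24fps-over-1080i trick mentioned above is the classic 3:2 pulldown cadence, which a "film mode" deinterlacer reverses. A minimal sketch of the cadence (function name mine):

```python
def three_two_pulldown(film_frames):
    """Spread 24fps film frames across 60Hz interlaced fields:
    frames alternately occupy 3 fields and 2 fields, so every
    24 film frames fill exactly 60 fields (one second of video)."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

one_second = three_two_pulldown(list(range(24)))
print(len(one_second))  # 60 fields
```

A cadence-aware deinterlacer detects the repeated fields and stitches the original progressive frames back together, which is why 1080i bandwidth suffices for 1080p24 film material.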

    In other words, other than video games and the odd nature documentary that you might have a next-gen optical disc for on a screen size greater than 40" and for the best eyes in that case, 1080p is mostly a waste of time. I'm glad the article pointed this stuff out.

    More important things to look for in a display: color bit depth (10-bit or greater) with full 10-bit processing throughout the pipeline, good motion adaptive deinterlacing tuned for both high-motion and low-motion scenes, good scaling with properly-selected coefficients, good color management, MPEG block and mosquito artifact reduction, and good off-axis viewing angle both horizontally and vertically. I'll gladly take a WXGA display with these features over the 1080p crap that's foisted on people without them.

    If you're out buying a DTV, get a hold of the Silicon Optix HQV DVD v1.4 or the Faroudja Sage DVDs and force the "salesperson" to play the DVD using component inputs to the DTV. They have material that we constantly used to benchmark quality, and that will help you filter out many of the issues people still have with their new displays.
  • by Paladin128 ( 203968 ) on Tuesday April 10, 2007 @10:35AM (#18675321) Homepage
    Though you are correct that human acuity degenerates near its edges of visual range, some people actually look around the screen a bit. I'm setting up my basement as a home theater, and I'll have a 6' wide screen, where I'll be sitting about 9' away. My eyes tend to wander around the screen, so the sharpness at the edges does matter.
  • by Thaelon ( 250687 ) on Tuesday April 10, 2007 @10:37AM (#18675359)
    Here [] is a viewing distance calculator (in Excel) you can use to figure out way more about home theater setups than you'll ever really need.

    It has viewing distances for user selectable monitor/TV/projector resolutions & sizes, seating distances, optimal viewing distances, seating heights(?!), THX viability(?!) etc. It's well researched and cited.

    No I'm not affiliated with it, I just found it and liked it.
  • by redelm ( 54142 ) on Tuesday April 10, 2007 @10:41AM (#18675437) Homepage
    The photo standard for human visual acuity is 10 line-pairs per mm at normal still picture viewing distance (about one meter). 0.1 mil. But 20:20 is only 0.3 mil (1 minute of arc). A 50" diag 16:9 screen is 24.5" vertical. 1080 lines gives 0.58mm each. At 8' range this is 0.24 mil, within 20:20, but not within photo standards.

Of course, we are looking at moving pictures, which have different, more subjective requirements. A lot depends on content and "immersion". Many people watch these horribly small LCDs (portable and aircraft) with often only 240 lines. Judged for picture quality, they're extremely poor. Yet people still watch, so the content must be compelling enough to overlook the technical flaws. I personally sometimes experience the reverse effect at HiDef -- the details start to distract from the content!
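The parent's milliradian figures check out; here is the arithmetic spelled out (assuming 25.4mm per inch, and taking 1 arcminute ≈ 0.29 mrad as the 20:20 threshold):

```python
MM_PER_INCH = 25.4

height_mm = 24.5 * MM_PER_INCH        # vertical extent of a 50" 16:9 screen
line_mm = height_mm / 1080            # ~0.58 mm per scan line
distance_mm = 8 * 12 * MM_PER_INCH    # 8 feet of viewing distance

line_mrad = line_mm / distance_mm * 1000
print(line_mrad)  # ~0.24 mrad: inside the 20:20 limit (~0.29 mrad)
                  # but coarser than the 0.1 mrad photo-print standard
```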

  • by Anonymous Brave Guy ( 457657 ) on Tuesday April 10, 2007 @10:43AM (#18675465)

Considering that many people can't distinguish between a high definition picture and a standard definition picture warped to fit their HD screen, this question seems largely academic.

    That's because, given a good upscaler, you can't distinguish much difference between DVD quality (which is most people's benchmark of what their SD TV can do) and 720p (which is what most HDTVs show). If by "standard definition" you're talking about crappy, digitally compressed TV channels at lower resolutions, then sure, there's a difference there, though I do wonder how much of the perceived improvement is due simply to using less lossy compression, rather than to genuine resolution improvement.

    Even looking at DVD vs. HD, you can see the difference in things like crowd scenes, detailed nature shots, or sports where the players are filmed from way back so you can see the field as well — basically anything where there isn't enough detail in the source material for any upscaler to work with. However, for most things I watch at least, that doesn't apply. There basically isn't much difference in face shots, action scenes set in a street/building and filmed from fairly close in, or most CGI and special effects.

  • Re:Article Summary (Score:3, Informative)

    by ivan256 ( 17499 ) on Tuesday April 10, 2007 @10:55AM (#18675697)
    Do you own stock in Faroudja or something?

    Great. You can't tell. Here's a cookie.

    The rest of us can tell the difference between well encoded 1080i content and upscaled 480p content. I'm very sorry for you that you can't.

    (And I still think that your real problem is that your television does a crappy job of downscaling 1080i to 720p, and that's why you mistakenly believe your upscaled DVDs look just as good.)
  • Geek Cred (Score:2, Informative)

    by Laoping ( 398603 ) on Tuesday April 10, 2007 @11:12AM (#18675955)
What you are forgetting is geeks like the shiny top of the line. I mean really, if we can't brag about our gear, what can we brag about? Take that, buddy, your TV only does 720p, HA!

Besides, you can always pair your 1080p with this [].

    Ohhh Shiny!
  • God Bless Wikipedia (Score:2, Informative)

    by TheNinjaroach ( 878876 ) on Tuesday April 10, 2007 @11:17AM (#18676031)
    Should have checked with Wikipedia first, sorry for the double post. They are the same show:

    Planet Earth is a BBC nature documentary series narrated by David Attenborough, first transmitted in the UK from 5 March 2006. The US version is narrated by Sigourney Weaver.

    The series was co-produced with Discovery Channel and the Japan Broadcasting Corporation (NHK) in association with the CBC, and was described by its makers as "the definitive look at the diversity of our planet". It was also the first of its kind to be filmed entirely in high-definition format. The series has been nominated for the Pioneer Audience Award for Best Programme at the 2007 BAFTA TV awards.

    Clicky []
  • by Anonymous Coward on Tuesday April 10, 2007 @11:25AM (#18676197)
    Do you own a cell phone? Because there are different types of cell phone networks and the "dust" hasn't settled on which one is better.

    Buy a 720p TV. It's so much nicer watching HD content than it is watching SD stuff.
  • by Radon360 ( 951529 ) on Tuesday April 10, 2007 @11:40AM (#18676471)

    You are correct about the lead []. According to this site [], a CRT can have 5-8 pounds of lead in it.

  • by Kadin2048 ( 468275 ) <slashdot,kadin&xoxy,net> on Tuesday April 10, 2007 @11:43AM (#18676541) Homepage Journal
    I think there are multiple techniques used to control X-ray production. Leaded glass might be one of them.

    What's interesting to note is that although you generally think about the picture tube being the source of problematic X-rays, in reality it was some of the other tubes -- particularly rectifier tubes -- back in the guts of older TVs that really had issues. Since modern televisions usually don't contain any tubes besides the one you look at, we don't think about the others very often, but they were at one point a major concern.

This Q&A [] from the Health Physics Society describes the issue: "The three major sources of x rays from these sets were the picture tube, the vacuum tube rectifier, and the shunt regulator tube. The latter (designations 6EF4 and 6LC6) were a particular problem. Over a third of the 6EF4 tubes tested produced exposure rates above 50 mR/hr at a distance of nine inches, and exposure rates up to 8 R/hr were observed at seven inches with one defective tube!" Just to put that in perspective, 8 R/hr is like ~150 chest X-rays per hour, or like getting a whole-body CAT scan once an hour. Granted, you probably don't usually sit seven inches away from your TV's power supply, but it's still unhealthy. (And a lot of people's cats do...)

    So really, sitting next to the side of that old (1965-70) TV could be a lot more hazardous than sitting in front of it.
  • by fyngyrz ( 762201 ) * on Tuesday April 10, 2007 @11:50AM (#18676647) Homepage Journal
    Well, I hope you are wearing some sweet HD VR goggles. I'm pretty sure my 48" home theater system does not take up 100 degrees of my field of view.

    There is no question that all systems are different; my theater uses a 17' diagonal 1080p projection arrangement, where the seating (that is, the seating that we intend to be theater seating) is all in a line at about 18'. So the display does cover very well in terms of average visual acuity and what 1080p offers. The distances were partially dictated by the building's interior floor plan (pre-existing; it was a church) and partially by the available screen real estate / wall space, which we used 100% of. But I knew this going in; I actually bought the building based on the potential for the theater, plus the ability to retrofit the remainder of the interior into anything we liked - it was basically an empty box when we purchased it.

    Some systems will cover all 100 degrees; some systems will be way too far away and too small to allow pixels to be resolved by any viewer in the room's normal seating. Some will cover more than 100 degrees and require scanning, or perhaps could be characterized as being "all-enveloping" - overscan for your mind. "Sweet VR goggles", indeed.

    The issue at hand, as postulated by the summary was: "Is this performance improvement [as represented by 1080p] manifest under real world viewing conditions?" The answer is certainly yes if you set your system up so as to make sure all the factors work together, but there are caveats (color, luma, center of vision, viewing preferences) that are not inconsequential and that is what I was bringing up so as to try, in my own feeble way, to address the performance improvement question a little more broadly.

    If you don't set your system up so that you can actually see the resolution of the set you choose, you have made some choices that have visual quality reducing consequences in your viewing situation. But turning that around, they have no consequences at all upon people who set their systems up so as to take best advantage of the resolution available. So I think it is still useful to go over how this all works generally, which is what I tried to do.

  • Re:Analogy (Score:3, Informative)

    by vought ( 160908 ) on Tuesday April 10, 2007 @12:36PM (#18677431)
    After all, one can purchase 200mph speed-rated tires for a Toyota Prius®.

    No you can't. []

185/65R15 (the Prius' standard tire size) is only built to an H speed rating. That makes it a 130mph speed-rated tire. No manufacturer builds this size to a Z speed rating (149mph and up).

    Car and technology analogies are mostly flawed, as are most generalizations.
  • Seeing the Grids (Score:5, Informative)

    by Doc Ruby ( 173196 ) on Tuesday April 10, 2007 @12:44PM (#18677583) Homepage Journal

    Keep in mind this article is written in general terms, so you scientists out there don't need to stand in line to file corrections!

I was in the Joint Photographic Experts Group (JPEG) when we invented the popular image format, while working for a digital camera company building an 8Kx8K pixel (40-bit color) scanner, and I studied both the physics of light and the brain neurology of the visual system in pre-med college. So I'll just jump that line of "scientists" to file this correction.

    It's safe to say, however, that increasing resolution and image refresh rate alone are not enough to provide a startlingly better viewing experience in a typical flat panel or rear projection residential installation.

    It's safe to say that only once you've dismissed the scientists who would correct you.

The lockstep TV screen is a sitting duck for the real operation of the eyes & brain, which compensate for relatively low sampling rates with massively parallel async processing in 4D.

    Joseph Cornwall's mistake in his article is to talk like viewers are a single stationary eye nailed at precisely 8' perpendicular to a 50" flat TV, sampling the picture in perfect sync with the TV's framerate. But instead, the visual system is an oculomotor system, two "moving eyes", with continuous/asynchronous sampling. Each retinal cell signals at a base rate of about 40Hz per neuron. But adjacent neurons drift across different TV pixels coming through the eyes' lenses, while those neurons are independently/asynchronously modulating under the light. Those neurons are distributed in a stochastic pattern in the retina which will not coincide with any rectangular (or regular organization of any linear distribution) grid. The visual cortex is composed of layered sheets of neurons which compare adjacent neurons for their own "difference" signal, as well as corresponding regions from each eye. The eyes dart, roll and twitch across the image, the head shakes and waves. So the brain winds up getting lots of subsamples of the image. The main artifact of the TV the eye sees is the grid itself, which used to be only a stack of lines (of nicely continuous color in each line, on analog raster TVs). When compared retinal neurons are signaling at around 40Hz, but at slightly different phase offsets, the cortex sheets can detect that heterodyne at extremely high "beat" frequencies, passing a "buzz" to the rest of the brain that indicates a difference where there is none in the original object rendered into a grid on the TV. Plus all that neural apparatus is an excellent edge enhancer, both in space (the pixels) and in time (the regular screen refresh).

    Greater resolution gives the eyes more info to combine into the brain's image. The extra pixels make the grid turn from edges into more of a texture, with retinal cells resampling more pixels. The faster refresh rate means each retinal neuron has more chance to get light coordinated with its async neighbors, averaged by the retinal persistence into a single flow of frequency and amplitude modulation along the optic and other nerves.

In fact, the faster refresh is the best part. That's why I got a 50" 1080p DLP: the micromirrors can flip thousands of times a second (LCD doesn't help, and plasma has its own different pros/cons). 1920x1080 is 2.07Mpxl, which at 24bit is 49.77Mb per image. 30Hz refresh would be about 1.5Gbps. But the HDMI cable delivering the image to the DLP is 10.2Gbps, so that's over 200FPS. I'm sure that we'll see better video for at least most of that range, if not all of it. What I'd really like to see is async DLP micromirrors, that flip mirrors off the "frame grid". At first probably just some displacement from the frame boundary, especially if the displacement changes unpredictably each flip. Later maybe a stochastic shift - all to make the image flow more continuously, rather than offering a steady beat the brain/eyes can detect. And also a stochastic di
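For what it's worth, the raw-bandwidth ceiling in that paragraph works out like this (a back-of-the-envelope figure that ignores HDMI blanking intervals and encoding overhead, so the achievable number is somewhat lower):

```python
pixels = 1920 * 1080            # one 1080p frame
bits_per_frame = pixels * 24    # 24-bit color
hdmi_bps = 10.2e9               # the quoted HDMI link rate

max_fps = hdmi_bps / bits_per_frame
print(max_fps)  # ~205 -- hence "over 200FPS" as a raw upper bound
```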

  • by drinkypoo ( 153816 ) on Tuesday April 10, 2007 @12:56PM (#18677797) Homepage Journal

    Though you are correct that human acuity degenerates near its edges of visual range, some people actually look around the screen a bit.

    Thanks to saccades [], all people actually look around the screen a bit.

    Your brain does this automatically when it wants more information about something. It doesn't bother to inform you that it's done it because it would only be disorienting. So the eye movement is not represented in your mental map.

    Keep in mind that if you wear glasses that turn the image upside down for a few months, eventually you will learn to navigate and it will all make sense to you. It's very important to remember that you are not seeing what you think you are seeing. Your conscious mind is delivered a representation of what you are seeing, which has been heavily processed by your brain.

  • by ShakaUVM ( 157947 ) on Wednesday April 11, 2007 @12:08AM (#18685307) Homepage Journal
    There are massive differences between the resolution conversion chips. Two more or less identical models, a 42" Sony and a Sharp, look exactly the same in 1080p format. But in 720, the Sharp looks *terrible*, whereas the Sony still looks pretty good. The upconverters used in a lot of set top boxes are the el cheapo ones, so what you may be seeing are simply artifacts from the conversion.

    That said, I have very sensitive eyes too, and have to return more than half of the monitors I buy due to defects of one kind or another that other people have a hard time seeing. I can easily see the difference between the different HD formats, especially on small details like text in the background. I can only stand a 1080p set, and even those bug me since they're not as good as LCD monitors.

    The price difference between 720p and 1080p sets are a lot bigger than you'd think. With a quick search:
    Sony Grand Wega 42" 1080p: $2100
    Sony Grand Wega 42" 1080i: $1800
    Sony Grand Wega 42" 720p: $1400

On most models I've been looking at (I have been TV shopping recently), the step up from a 1080i model to 1080p is on the order of a $2000 to $3000 (+50%!) price jump within the same model family.

You can find cheap 1080p sets, but a lot of them, like the Sharp Aquos, have hidden issues, like its inability to display SD TV signals very well (SD signals look like total crap on an Aquos, in fact).
