1080p, Human Vision, and Reality 403
An anonymous reader writes "'1080p provides the sharpest, most lifelike picture possible.' '1080p combines high resolution with a high frame rate, so you see more detail from second to second.' This marketing copy is largely accurate. 1080p can be significantly better than 1080i, 720p, 480p or 480i. But (there's always a "but") there are qualifications. The most obvious qualification: Is this performance improvement manifest under real world viewing conditions? After all, one can purchase 200mph speed-rated tires for a Toyota Prius®. Expectations of a real performance improvement based on such an investment will likely go unfulfilled, however! In the consumer electronics world we have to ask a similar question. I can buy 1080p gear, but will I see the difference? The answer to this question is a bit more ambiguous."
Article Summary (Score:5, Informative)
It isn't that simple. (Score:5, Informative)
According to the linked text, the "average" person can see 2 pixels at about 2 minutes of arc, and has a field of view of 100 degrees. There are 30 sets of 2 minutes of arc in one degree, and one hundred of those in the field of view, so we get: 2 * 30 * 100, or about 6000 pixel acuity overall.
1080p is 1920 pixels horizontally and 1080 vertically at most. So horizontally, where the 100 degree figure is accurate, 1920 pixels is only about a third of that roughly 6000-pixel acuity figure. There is no question that 1080p sits well within your ability to see detail, and the answer to the question in the summary is, yes, it is worth it.
Vertically, let's assume (though it isn't true) that only having one eye-width available cuts your vision's arc in half (it doesn't, but roll with me here.) That would mean that instead of 6000 pixel acuity, you're down to 3000. 1080p is 1080 pixels vertically. In this case, you'd again be at about 1/3 of your visual acuity, and again, the answer is yes, it is worth it. Coming back to reality, where your vertical field of view is actually greater than 50 degrees, your acuity is higher and it is even more worth it.
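The arithmetic above is simple enough to script as a sanity check. The inputs (2 pixels per 2 minutes of arc, 100 degree horizontal field) are the figures quoted from the article, not measurements of anyone's actual eyes:

```python
# Back-of-envelope acuity math using the figures quoted above:
# 2 pixels per 2 arc minutes, over a 100 degree horizontal field.
PIXELS_PER_2_ARCMIN = 2
ARCMIN_PAIRS_PER_DEGREE = 60 // 2   # 60 arc minutes in a degree
FOV_DEGREES = 100

acuity_px = PIXELS_PER_2_ARCMIN * ARCMIN_PAIRS_PER_DEGREE * FOV_DEGREES
print(acuity_px)                   # 6000 pixels across the full field
print(f"{1920 / acuity_px:.0%}")   # 1080p covers roughly a third of that
```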
Aside from these general numbers that TFA throws around (without making any conclusions), the human eye doesn't have uniform acuity across the field of view. You see more near the center of your cone of vision, and you perceive more there as well. Things out towards the edges are less well perceived. Doubt me? Put a hand up (or have a friend do it) at the edge of your vision - stare straight ahead, with the hand at the extreme edge of what you can see at the side. Try and count the number of fingers for a few tries. You'll likely find you can't (it can be done, but it takes some practice - in martial arts, my school trains with these same exercises for years so that we develop and maintain a bit more ability to figure out what is going on at the edges of our vision.) But the point is, at the edges, you certainly aren't seeing with the same acuity or perception that you are at the center focus of your vision.
So the resolution across the screen isn't really benefiting your perception - the closer to the edge you go, the more degraded your perception is, though the pixel spacing remains constant. However - and I think this is the key - you can look anywhere, that is, place the center of your vision, anywhere on the display, and be rewarded with an image that is well within the ability of your eyes and mind to resolve well.
There are some color-based caveats to this. Your eye sees better in brightness than it does in color, and it resolves some colors better than others (green is considerably better resolved than blue, for instance.) These differences in perception make TFA's blanket statement that your acuity is 2 pixels per two minutes of arc more than a little bit of hand-waving. Still, the finest detail in the HD signal (and normal video, for that matter) is carried in the brightness information, and that is indeed where your highest acuity is, so technically, we're still kind of talking about the same general ballpark: the color information is less dense, and that corresponds to your lesser acuity in color.
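The "color information is less dense" point is concrete in digital video: chroma is normally subsampled. A quick sketch of the sample counts, assuming 4:2:0 (the common consumer-HD case) versus full 4:4:4; nothing here is specific to TFA:

```python
# Samples per frame for a 1920x1080 picture: luma at full resolution,
# chroma either full (4:4:4) or halved both ways (4:2:0, as in most
# consumer HD video).
W, H = 1920, 1080
luma = W * H                          # brightness: carries the fine detail
chroma_444 = 2 * W * H                # two color planes, full resolution
chroma_420 = 2 * (W // 2) * (H // 2)  # two color planes, quarter resolution

print(luma + chroma_444)  # 6220800 samples/frame
print(luma + chroma_420)  # 3110400: half the data, little visible loss
```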
There is a simple and relatively easy to access test that you can do yourself. Go find an LCD computer monitor in the 17 inch or larger range that has a native resolution of 1280x1024. That's pretty standard for a few years back, should be easy to do. Verify that the computer attached to it is running in the same resolution. This is about 2/3 of HD across, and nearly full HD vertically. Look at it. Any trouble seeing the finest details? Of course not. Now go find a computer monitor that is closer to HD, or exactly HD. You might have to go to a dealer, but you can find them. Again, make sure that the computer is set to use this resolution. Now we're talking about HD. Can you see the finest details? I can - and easily. I suspect you can too, because my visual acuity is nothing special. But do the test, if you doubt that HD offers detail that is useful to your perceptions.
Finally, n
Re:Does anyone even broadcast 1080p.... (Score:5, Informative)
Wow, this is wrong. Since you mentioned Verizon, you must live in the USA. NBC and CBS both broadcast in 1080i right now. Discovery HD and Universal HD do too. Those come to mind fairly quickly. I'm sure there are others. By the way, I wouldn't hold my breath about 1080p TV broadcasts. The ATSC definition for high def TV used in the USA doesn't support it at this time because the bandwidth requirements to do this are enormous.
Mom might have been right.... (Score:2, Informative)
...depending on how old you are. I think the concern was associated more with X-ray radiation emissions from CRT televisions, and older ones at that (prior to the introduction of the Radiation Control for Health and Safety Act of 1968 [fda.gov]). I would venture to say that most of us on this site are too young to have been plopped in front of a TV that old for large amounts of time.
Comment removed (Score:3, Informative)
More info on 720p/WXGA (Score:5, Informative)
At some point, panel prices will stabilize, but most people won't require this either way. And, as I mentioned, very few sources will output 1080p anyway. The ones I know of: Xbox 360/PS3, HD-DVD, Blu-ray and PCs. All broadcast infrastructure is capable of 10-bit 4:2:2 YCbCr color sampled 1920x1080, but even that is overkill and does not go out over broadcast infrastructure (i.e. ATSC broadcasts are max 1080i today).

The other thing to distinguish is the frame rate. When most people talk about 1080p, they are often implying 1080p at 60 frames per second. Most Hollywood movies are actually 1080p but at 24fps, which can be carried within 1080i bandwidths using pulldown. And you don't want to change the frame rate of these movies anyway, because it's a waste of bandwidth and, if you frame-rate convert using motion-compensated techniques, you lose the suspension of disbelief that low frame rates give you. The TV's deinterlacer needs to know how to deal with pulldown (aka "film mode"), but most new DTVs can do this fairly well.
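The 24fps-inside-1080i trick is the classic 3:2 pulldown cadence. A minimal sketch of the counting (real pulldown interleaves top and bottom fields; this just shows how 24 frames fill 60 field slots):

```python
def three_two_pulldown(frames):
    """Map 24fps film frames onto 60 fields/sec: hold alternate
    frames for 3 fields, then 2 (the '3:2 cadence'). Real pulldown
    alternates top/bottom fields; this sketch only counts them."""
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

one_second = three_two_pulldown(list(range(24)))
print(len(one_second))  # 60: one second of film fills 1080i's field rate
```

A film-mode deinterlacer essentially runs this in reverse, spotting the 3-2-3-2 repeat pattern and reassembling the original progressive frames.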
In other words, other than video games and the odd nature documentary that you might have a next-gen optical disc for on a screen size greater than 40" and for the best eyes in that case, 1080p is mostly a waste of time. I'm glad the article pointed this stuff out.
More important things to look for in a display: color bit depth (10-bit or greater) with full 10-bit processing throughout the pipeline, good motion adaptive deinterlacing tuned for both high-motion and low-motion scenes, good scaling with properly-selected coefficients, good color management, MPEG block and mosquito artifact reduction, and good off-axis viewing angle both horizontally and vertically. I'll gladly take a WXGA display with these features over the 1080p crap that's foisted on people without them.
If you're out buying a DTV, get a hold of the Silicon Optix HQV DVD v1.4 or the Faroudja Sage DVDs and force the "salesperson" to play the DVD using component inputs to the DTV. They have material that we constantly used to benchmark quality, and that will help you filter out many of the issues people still have with their new displays.
Re:It isn't that simple. (Score:5, Informative)
Viewing distance calculator (Score:5, Informative)
It has viewing distances for user selectable monitor/TV/projector resolutions & sizes, seating distances, optimal viewing distances, seating heights(?!), THX viability(?!) etc. It's well researched and cited.
No I'm not affiliated with it, I just found it and liked it.
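The core of such a calculator fits in a few lines: find the distance at which one pixel shrinks below the eye's resolving power. This sketch assumes the common 1-arcminute figure for 20/20 vision; real calculators layer seating height, THX angles and so on on top of this:

```python
import math

def max_useful_distance(diagonal_in, horiz_px, aspect=(16, 9),
                        acuity_arcmin=1.0):
    """Distance (inches) beyond which a single pixel subtends less
    than the given visual acuity, i.e. where extra resolution stops
    being visible to the viewer."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # screen width
    pixel_pitch = width_in / horiz_px               # one pixel's width
    return pixel_pitch / math.tan(math.radians(acuity_arcmin / 60))

d = max_useful_distance(50, 1920)      # a 50" 1080p set
print(f"{d:.0f} in (~{d / 12:.1f} ft)")  # roughly 6.5 feet
```

Sit farther back than that and, by this model, a 720p panel of the same size would look identical.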
Perhaps true, but technically iffy (Score:5, Informative)
Of course, we are looking at moving pictures, which have different, more subjective requirements. A lot depends on content and "immersion". Many people watch these horribly small LCDs (portable and aircraft) with often only 240 lines. Judged for picture quality, they're extremely poor. Yet people still watch, so the content must be compelling enough to overlook the technical flaws. I personally sometimes experience the reverse effect with HiDef -- the details start to distract from the content!
It depends what you watch (Score:3, Informative)
That's because, given a good upscaler, you can't distinguish much difference between DVD quality (which is most people's benchmark of what their SD TV can do) and 720p (which is what most HDTVs show). If by "standard definition" you're talking about crappy, digitally compressed TV channels at lower resolutions, then sure, there's a difference there, though I do wonder how much of the perceived improvement is due simply to using less lossy compression, rather than to genuine resolution improvement.
Even looking at DVD vs. HD, you can see the difference in things like crowd scenes, detailed nature shots, or sports where the players are filmed from way back so you can see the field as well — basically anything where there isn't enough detail in the source material for any upscaler to work with. However, for most things I watch at least, that doesn't apply. There basically isn't much difference in face shots, action scenes set in a street/building and filmed from fairly close in, or most CGI and special effects.
Re:Article Summary (Score:3, Informative)
Great. You can't tell. Here's a cookie.
The rest of us can tell the difference between well encoded 1080i content and upscaled 480p content. I'm very sorry for you that you can't.
(And I still think that your real problem is that your television does a crappy job of downscaling 1080i to 720p, and that's why you mistakenly believe your upscaled DVDs look just as good.)
Geek Cred (Score:2, Informative)
Besides you can always pair your 1080p with this (http://www.oppodigital.com/dv981hd/dv981hd_index
Ohhh Shiny!
God Bless Wikipedia (Score:2, Informative)
Clicky [wikipedia.org]
Re:It isn't that simple. (Score:1, Informative)
Buy a 720p TV. It's so much nicer watching HD content than it is watching SD stuff.
Re:Mom might have been right.... (Score:5, Informative)
You are correct about the lead [wikipedia.org]. According to this site [state.mn.us], a CRT can have 5-8 pounds of lead in it.
Re:Mom might have been right.... (Score:5, Informative)
What's interesting to note is that although you generally think about the picture tube being the source of problematic X-rays, in reality it was some of the other tubes -- particularly rectifier tubes -- back in the guts of older TVs that really had issues. Since modern televisions usually don't contain any tubes besides the one you look at, we don't think about the others very often, but they were at one point a major concern.
This Q&A [hps.org] from the Health Physics Society describes the issue: "The three major sources of x rays from these sets were the picture tube, the vacuum tube rectifier, and the shunt regulator tube. The latter (designations 6EF4 and 6LC6) were a particular problem. Over a third of the 6EF4 tubes tested produced exposure rates above 50 mR/hr at a distance of nine inches, and exposure rates up to 8 R/hr were observed at seven inches with one defective tube!" Just to put that in perspective, 8 R/hr is like ~150 chest X-rays per hour, or like getting a whole-body CAT scan once an hour. Granted, you probably don't usually sit seven inches away from your TV's power supply, but it's still unhealthy. (And a lot of people's cats do...)
So really, sitting next to the side of that old (1965-70) TV could be a lot more hazardous than sitting in front of it.
Re:It isn't that simple. (Score:3, Informative)
There is no question that all systems are different; my theater uses a 17' diagonal 1080p projection arrangement, where the seating (that is, the seating that we intend to be theater seating) is all in a line at about 18'. So the display does cover very well in terms of average visual acuity and what 1080p offers. The distances were partially dictated by the building's interior floor plan (pre-existing; it was a church) and partially by the available screen real estate / wall space, which we used 100% of. But I knew this going in; I actually bought the building based on the potential for the theater, plus the ability to retrofit the remainder of the interior into anything we liked - it was basically an empty box when we purchased it.
Some systems will cover all 100 degrees; some systems will be way too far away and too small to allow pixels to be resolved by any viewer in the room's normal seating. Some will cover more than 100 degrees and require scanning, or perhaps could be characterized as being "all-enveloping" - overscan for your mind. "Sweet VR goggles", indeed.
The issue at hand, as postulated by the summary was: "Is this performance improvement [as represented by 1080p] manifest under real world viewing conditions?" The answer is certainly yes if you set your system up so as to make sure all the factors work together, but there are caveats (color, luma, center of vision, viewing preferences) that are not inconsequential and that is what I was bringing up so as to try, in my own feeble way, to address the performance improvement question a little more broadly.
If you don't set your system up so that you can actually see the resolution of the set you choose, you have made some choices that have visual quality reducing consequences in your viewing situation. But turning that around, they have no consequences at all upon people who set their systems up so as to take best advantage of the resolution available. So I think it is still useful to go over how this all works generally, which is what I tried to do.
Re:Analogy (Score:3, Informative)
No you can't. [tirerack.com]
185/65R15 (Prius' standard tire size) is only built to an H speed rating. That makes it a 130mph speed-rated tire. No manufacturer builds this tire to a Z speed rating or above - the range a 200mph tire would need.
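For reference, the standard speed-rating letters map to maximum sustained speeds roughly like this (values from the usual industry chart; treat them as illustrative, and check Tire Rack's own table for the authoritative list):

```python
# Common tire speed ratings and their rated maximum speeds in mph.
# ("Z" historically just meant 149+; (W) and (Y) cover the top end.)
SPEED_RATING_MPH = {"S": 112, "T": 118, "H": 130,
                    "V": 149, "W": 168, "Y": 186}

def rating_needed(target_mph):
    """Lowest-rated letter that covers the target speed, else None."""
    candidates = [(mph, letter) for letter, mph in SPEED_RATING_MPH.items()
                  if mph >= target_mph]
    return min(candidates)[1] if candidates else None

print(rating_needed(130))  # H, what the Prius tire actually carries
print(rating_needed(200))  # None: even Y tops out at 186 mph
```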
Car and technology analogies are mostly flawed, as are most generalizations.
Seeing the Grids (Score:5, Informative)
I was in the Joint Photographic Experts Group (JPEG) when we invented the popular image format, while working for a digital camera company on an 8Kx8K pixel (40-bit color) scanner, and I studied both the physics of light and the brain neurology of the visual system in pre-med college. So I'll just jump that line of "scientists" to file this correction.
It's safe to say that only once you've dismissed the scientists who would correct you.
The lockstep TV screen is a sitting duck for the real operation of the eyes & brain, which compensate for relatively low sampling rates with massively parallel async processing in 4D.
Joseph Cornwall's mistake in his article is to treat viewers as a single stationary eye nailed at precisely 8' perpendicular to a 50" flat TV, sampling the picture in perfect sync with the TV's framerate. But instead, the visual system is an oculomotor system: two moving eyes with continuous/asynchronous sampling. Each retinal cell signals at a base rate of about 40Hz per neuron. But adjacent neurons drift across different TV pixels coming through the eyes' lenses, while those neurons are independently/asynchronously modulating under the light. Those neurons are distributed in a stochastic pattern in the retina which will not coincide with any rectangular grid (or any regular, linear distribution). The visual cortex is composed of layered sheets of neurons which compare adjacent neurons for their own "difference" signal, as well as corresponding regions from each eye. The eyes dart, roll and twitch across the image, the head shakes and waves. So the brain winds up getting lots of subsamples of the image.

The main artifact of the TV the eye sees is the grid itself, which used to be only a stack of lines (of nicely continuous color in each line, on analog raster TVs). When the compared retinal neurons are signaling at around 40Hz, but at slightly different phase offsets, the cortex sheets can detect that heterodyne at extremely high "beat" frequencies, passing a "buzz" to the rest of the brain that indicates a difference where there is none in the original object rendered into a grid on the TV. Plus all that neural apparatus is an excellent edge enhancer, both in space (the pixels) and in time (the regular screen refresh).
Greater resolution gives the eyes more info to combine into the brain's image. The extra pixels make the grid turn from edges into more of a texture, with retinal cells resampling more pixels. The faster refresh rate means each retinal neuron has more chance to get light coordinated with its async neighbors, averaged by the retinal persistence into a single flow of frequency and amplitude modulation along the optic and other nerves.
In fact, the faster refresh is the best part. That's why I got a 50" 1080p DLP: the micromirrors can flip thousands of times a second (LCD doesn't help, and plasma has its own different pros/cons). 1920x1080 is about 2.07Mpxl, which at 24bit is 49.77Mb per image. A 30Hz refresh would be about 1.49Gbps. But the HDMI cable delivering the image to the DLP is 10.2Gbps, so that's over 200FPS. I'm sure that we'll see better video for at least most of that range, if not all of it. What I'd really like to see is async DLP micromirrors that flip mirrors off the "frame grid". At first probably just some displacement from the frame boundary, especially if the displacement changes unpredictably each flip. Later maybe a stochastic shift - all to make the image flow more continuously, rather than offering a steady beat the brain/eyes can detect. And also a stochastic di
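Since the poster's panel is 1080p, the raw-rate arithmetic with the actual 1920x1080 grid looks like this (a sketch: uncompressed 24-bit RGB, ignoring HDMI blanking and encoding overhead, so the FPS figure is an upper bound):

```python
# Uncompressed 1080p payload vs. a 10.2 Gbps HDMI link.
width, height, bits_per_pixel = 1920, 1080, 24
bits_per_frame = width * height * bits_per_pixel  # ~49.77 Mbit/frame

rate_30hz = bits_per_frame * 30 / 1e9             # ~1.49 Gbps at 30 Hz
hdmi_gbps = 10.2                                  # HDMI 1.3 TMDS rate
ceiling_fps = hdmi_gbps * 1e9 / bits_per_frame    # ~205 frames/sec

print(f"{rate_30hz:.2f} Gbps at 30 Hz; link ceiling ~{ceiling_fps:.0f} fps")
```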
Re:It isn't that simple. (Score:5, Informative)
Thanks to saccades [wikipedia.org], all people actually look around the screen a bit.
Your brain does this automatically when it wants more information about something. It doesn't bother to inform you that it's done it because it would only be disorienting. So the eye movement is not represented in your mental map.
Keep in mind that if you wear glasses that turn the image upside down for a few months, eventually you will learn to navigate and it will all make sense to you. It's very important to remember that you are not seeing what you think you are seeing. Your conscious mind is delivered a representation of what you are seeing, which has been heavily processed by your brain.
Re:It isn't that simple. (Score:3, Informative)
That said, I have very sensitive eyes too, and have to return more than half of the monitors I buy due to defects of one kind or another that other people have a hard time seeing. I can easily see the difference between the different HD formats, especially on small details like text in the background. I can only stand a 1080p set, and even those bug me since they're not as good as LCD monitors.
The price difference between 720p and 1080p sets is a lot bigger than you'd think. With a quick search:
Sony Grand Wega 42" 1080p: $2100
Sony Grand Wega 42" 1080i: $1800
Sony Grand Wega 42" 720p: $1400
On most models I've been looking at (have been going TV shopping recently), the step up from a 1080i model to 1080p is on the order of a $2000 to $3000 (+50%!) price jump within the same model family.
You can find cheap 1080p sets, but a lot of them, like the Sharp Aquos, have hidden issues, such as an inability to display SD TV signals very well (SD signals look like total crap on an Aquos, in fact).