When is 720p Not 720p?
Henning Hoffmann writes "HDBlog has an interesting entry about many home theater displays.
Home theater displays around the resolution of 720p (most DLP, LCD, and LCOS displays) must convert 1080i material to their native resolution for display. No surprise there. But many displays do this by discarding half of the 1080i HD signal, effectively giving 720p viewers an SD signal - not watching HD at all! "
For the inevitable /.ing (Score:5, Informative)
When is 720p not 720p?
Tom Norton, in his coverage of the Home Entertainment expo, brought something up that I was unaware of.
720p displays show native 720p signals directly, of course. They also upconvert SD signals (like DVD) up to 720p for display. And 720p displays must convert incoming 1080i signals to 720p before they can be displayed. No surprise there, this makes sense. But, Silicon Optix claims that most manufacturers do the 1080i conversion just by taking one 540 line field from each 1080i frame (which is composed of two 540 line fields) and scaling that one field up to 720p, ignoring the other field. Reason being, it takes a lot less processing power to do this than to convert the image to 1080p and scale that, which would use all the information in the original signal to derive the 720p signal. If you have a display like this, it means that you're watching 540 lines of resolution upconverted to 720p. This is not HD, just like watching a DVD upconverted to 720p is not HD. Sure, you'll get the full width of the 1080i resolution, but you're only getting half the height. While this is better than DVD, it's not HD in my mind. (Aside: Tom Norton mentions this in his review of the Screenplay 777 projector.)
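The two conversion paths can be sketched in numpy. This is a toy model, not any vendor's actual pipeline: linear interpolation stands in for whatever filter a real scaler uses, and the frame is static, but the information loss from discarding a field is the same:

```python
import numpy as np

def scale_rows(image, out_rows):
    """Linearly resample an image's rows to out_rows (toy scaler)."""
    src = np.arange(image.shape[0])
    dst = np.linspace(0, image.shape[0] - 1, out_rows)
    return np.stack([np.interp(dst, src, image[:, c])
                     for c in range(image.shape[1])], axis=1)

def cheap_1080i_to_720p(frame):
    # Keep ONE 540-line field, throw the other away, scale up.
    return scale_rows(frame[0::2], 720)

def full_1080i_to_720p(frame):
    # Use all 1080 lines (both fields) before scaling down.
    return scale_rows(frame, 720)

# Test pattern: alternating one-line black/white stripes (fine vertical detail).
frame = np.zeros((1080, 4))
frame[0::2] = 1.0

cheap = cheap_1080i_to_720p(frame)
full = full_1080i_to_720p(frame)
print(cheap.std(), full.std())   # the cheap path flattens the detail to nothing
```

With this pattern the half-field path sees only one field (all white), so the stripes vanish entirely, while the full-frame path retains them.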
If this is indeed the case, most people with 720p (or similar) projectors (and most DLP, LCD, and LCOS home theater projectors are exactly that) are not seeing what their displays are capable of. They're not, technically, even watching HD. This is crazy! How can this be? Why haven't we heard of this before? How are manufacturers getting away with it?
Over-reacting? Well, if you're an owner of a 720p (or any similar resolution) projector, you're either gonna be really upset by this or you're just gonna be laissez-faire about it, because there's nothing you can do and you're enjoying your projector just fine, thank you. But me, I don't even own any such projector and I'm a little ticked. But I guess I should really wait for evidence of how a properly-done conversion looks in comparison before making any snap judgements. I'm sure that the people selling HQV (a processor chip that does it the RIGHT way) will set something up.
To me, this is a serious issue. Comments are welcome.
from: http://www.hdblog.net/ [hdblog.net]
Workaround is to use an HTPC... (Score:5, Informative)
As a reference, my Athlon XP running at 2.4 GHz (approximately equivalent to an Athlon XP 3400+) with a GeForce 6800GT and TheaterTek 2.1 software will have (little) trouble achieving this, assuming the 1080i source isn't glitchy itself.
Alternative is to use the NVIDIA DVD Decoder version 1.0.0.67 ($20 US after 30 day trial) and ZoomPlayer 4.5 beta ($20 beta or nagware) for similar results.
TheaterTek is roughly $70 US and includes updated NVIDIA DVD decoders - too bad NVIDIA hasn't updated their official DVD decoders with the bugfixes that are present in the TheaterTek package.
If you can't see the problem, is there a problem? (Score:4, Informative)
Well, a little worse, actually... (Score:4, Informative)
Except your 720p display will hopefully have a horizontal resolution of 1280. 1080i video has a horizontal resolution of 1920. So, you're keeping half of the vertical (1080 lines down to 540) and you're keeping 2/3rds of the horizontal (1920 down to 1280).
Ouch.
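The arithmetic behind that "ouch", as a quick sanity check:

```python
# Samples actually used by the cheap conversion, vs. what 1080i carries:
kept = 540 * 1280       # one 540-line field, on a 1280-wide 720p panel
total = 1080 * 1920     # full 1920x1080 interlaced frame
print(kept, total)      # -> 691200 2073600
print(kept / total)     # one third of the source samples survive
```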
This is what you get..... (Score:3, Informative)
That said, I'm sure there are a lot of people out there who "don't care". It works for them, and that is all they care about.
Re:Reminds me of Sound Blaster (Score:5, Informative)
The first incorrect thing in the
Anyone who is serious about getting the absolute most out of their display will have an external scaler and a device to delay the audio. Frankly as digital display technologies take more of a foothold in the market I'm hoping these interlaced resolutions will become far less common.
When I first read the headlines I thought they would perhaps talk about 1024x768 plasmas with rectangular pixels being marketed as 720p. That kind of thing is far more blasphemous in my opinion.
So in summary of TFA: 720p is not 720p when it's 1080i.
Consider the source too! (Score:3, Informative)
And in other instances, the broadcaster will not use the full resolution - what looks like 1920x1080i may actually be an upconvert of 1280x1080i, 1440x1080i, or 1280x720! And then there is the overcompression - taking a 20 Mbit/sec MPEG-2 stream and cutting the bitrate in half - compression artifacts galore.
It is sad when HDTV programming available in North America can be WORSE than the DVD!
You never know what you get in US (and other?) HDTV broadcasts. My understanding is that only the Japanese use minimal MPEG-2 compression - I saw snippets of Contact (with Japanese subtitles) in full 1920x1080i at the maximum 20 Mbit/sec bitrate - and it was glorious!
The clever algorithm is a "Fourier transform" (Score:5, Informative)
It's not a distortion-free transform, since high frequency signals (e.g. sharp edges) in the original image get interpreted as smooth changes and can get blurred between multiple pixels in an upsampled signal. But then again, that's exactly the sort of thing that happens when you digitize a picture in the first place - if you have a sharp black/white edge that passes through the middle of a pixel, the most accurate thing you can do is make that pixel gray.
Re:Resampling: Imagine a 1-pixel-wide line (Score:5, Informative)
It's a bit messy. Imagine a 1080i image with a 1-pixel-wide sloping black line that is nearly horizontal on a white background. If you throw out half the data, you create an image with a dashed line: gaps in the line occur where the slanting line cut across the rows that were discarded. If you upsample from 540 to 720, you will find that the remaining dashes become fattened non-uniformly. In places where a row in the 720-row image falls directly on top of a row in the 540-row image, the line will be thin and dark. In places where the 720-row image falls midway between rows in the 540-row image, the line will be wide and less dark. The end result is that the thin, uniform black line is converted to a dashed line of varying thickness and darkness -- not pretty.
Even if you resample directly from 1080 to 720, you still run into problems where the 720-row image pixels fall between the 1080-row pixels. At best, you can use higher-order interpolation (e.g. cubic) to try and fit a curve through the original data and estimate what was in the middle of the pixels so they can be shifted halfway over. But the result will never look like an image that was taken with a 720-row camera in the first place.
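The dashed-line effect is easy to reproduce with a toy numpy image (the parameters here - a line dropping one row every 200 columns - are made up for illustration):

```python
import numpy as np

H, W = 1080, 1920
img = np.zeros((H, W))
# A nearly horizontal 1-pixel line: it drops one row every 200 columns.
for x in range(W):
    img[500 + x // 200, x] = 1.0

field = img[0::2]    # discard every other row, as in half-field scaling

# Count the columns that still contain any part of the line:
full_cols = int((img.sum(axis=0) > 0).sum())
field_cols = int((field.sum(axis=0) > 0).sum())
print(full_cols, field_cols)   # -> 1920 1000: the line is now dashed
```

Roughly half the line's columns land on discarded rows, which is exactly where the gaps in the dashed line come from.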
Re:Well, a little worse, actually... (Score:4, Informative)
Hi, I just got back from the NAB show in Las Vegas last week. The vendors had HD cams that would film and record 1920x1080i. That "some point" is today.
Re:Should have bought a 1080i screen then! (Score:3, Informative)
Personally I'm getting a samsung 6168 model.
Re:For the inevitable /.ing (Score:5, Informative)
He's leaving one step out. 1080i is 540 lines scanned 60 times per second, offset by half a vertical pitch. 720p is 720 lines scanned 30 times per second.
To try and take two fields which are not occurring at the same instant, stitch them together, remove the motion artifacts, resample, and then display is just plain silly. And fraught with errors, as you are expecting a computer to determine which parts of the motion (over 1/60 of a second) to keep and which to throw away.
If you wanted high fidelity, you'd spend the money for a 1080p60 system. Then it wouldn't matter. Except that you would complain about the quality, because each frame you see was upsampled from only 540 lines of resolution.
It all comes back to the fact that the FCC let the industry choose this "18 formats is good" spec.
Personally, I'm in favor of an olympic standard mayonnaise, but...no...wait...awww hell, I give up.
Not news - Buy a scaler. (Score:5, Informative)
Anyway, it's fairly well known that the internal scalers in many devices suck. That is why there is a market for good external scalers. If you are paranoid about watching a lot of 1080i on your 720p projector or LCD TV or Plasma, go buy a scaler. They cost about $1000 but will improve scaled display a lot.
At least if you have an external scaler you will have some options about how you convert 1080i to 720p. The article makes it sound like splitting the fields is a huge sin -- and it is if you discard one field per frame (half-field deinterlacing), but it's perfectly acceptable to scale EACH 540-line field to a separate 720-line frame and double the framerate. This is called bob deinterlacing and is often the best choice for converting 1080i video to lower resolutions. If you are watching a 1080i upconvert of a film or something, though, you can have the scaler do full-field deinterlacing and inverse telecine for you and see a nice 720p/24fps picture. Scalers also generally have internal audio delays for various types of audio feeds, so you won't have to worry about AV sync issues either.
If you have any questions about how your device does this, you should try to find out before you buy it. Most devices don't publish how they do it, though, so your only option may be to derive it -- and that will not be an easy job.
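A minimal numpy sketch of bob deinterlacing as described above - each field becomes its own scaled frame, doubling the frame rate. Linear interpolation is an assumption here; real scalers use better filters:

```python
import numpy as np

def scale_field_to_720(field):
    """Linearly resample a 540-line field to 720 lines (toy filter)."""
    dst = np.linspace(0, 539, 720)
    return np.stack([np.interp(dst, np.arange(540), field[:, c])
                     for c in range(field.shape[1])], axis=1)

def bob_deinterlace(top_field, bottom_field):
    # Each 540-line field becomes its own 720-line frame: one interlaced
    # frame in, TWO progressive frames out (60 fields/s -> 60 frames/s).
    return scale_field_to_720(top_field), scale_field_to_720(bottom_field)

top, bottom = np.random.rand(540, 8), np.random.rand(540, 8)
f1, f2 = bob_deinterlace(top, bottom)
print(f1.shape, f2.shape)   # -> (720, 8) (720, 8)
```

Unlike half-field deinterlacing, nothing is discarded: the temporal detail of both fields survives, at the cost of some vertical resolution per output frame.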
More Like When is HD Not HD (Score:4, Informative)
If you're looking to get into HD there are a *lot* of little quirks to take into account, such as:
- Officially there are two HD resolutions, 720p and 1080i.
- Most HDTVs are only capable of *one* of these resolutions. So you have to choose: 720p OR 1080i, in most cases. If you want one that can do both, check very carefully - forget DLP- or LCD-based devices (a fixed set of pixels means a fixed resolution); CRT only.
- Many HDTV's will *not* convert from one format to another. They accept only their native resolution.
- Different networks broadcast using one standard or the other. For example CBS uses 1080i and ABC 720p IIRC. Fox is way behind in HDTV support.
- Most HDTV receivers can handle either a 720p or 1080i signal and will convert as required for your TV's native resolution.
- Some TV providers only support one format, regardless of the source material. E.g., in Canada Starchoice only broadcasts in 1080i; any 720p content they have, they upconvert to 1080i before broadcasting. It's impossible to receive a native 720p signal from them.
- The Xbox supports both HDTV modes... but very few HD games actually use 1080i (Dragon's Lair being one); most are 720p. So if this is important to you, you'll possibly want a 720p-native TV: most receivers do not have HD inputs that would let you upconvert a 720p game to a 1080i signal for the TV. (The new Xbox will have more HD content than the current one, but it's a good bet that they'll be mostly 720p titles.)
- Most projectors and plasmas are *not* HDTV. They are EDTV (enhanced definition) or some such. Check the specs carefully.
- Most projectors are 1024x768. This means your HD signal of 1920x1080i or 1280x720p is being heavily rescaled horizontally! Few projectors have a true HD native resolution.
So there you go... lots of fun things to take into account!
there are not many 720p displays to begin with. (Score:3, Informative)
Only cheap projectors or displays have a maximum resolution of 720p.
I don't see many of those anyhow.
But yes, on those displays the signal is downconverted by chopping out half of it.
However, these displays are not popular.
Some of the most popular displays still can't display HD natively, but they can display either XGA or SXGA with no problem (we're getting pretty close to HD at this point).
Don't buy a cheap projector, and you won't get a cheap display. You get what you pay for.
Re:The clever algorithm is a "Fourier transform" (Score:2, Informative)
Disagree with the second paragraph. Upsampling should show more Gibbs ringing around the discontinuities in the image. This is why one introduces a Hanning or Hamming window during sinc interpolation. Whether that manifests itself as blurring depends on the context of the discontinuity within the image. Upsampling via FT is not the same as linear (or even simple non-linear) weighting during digitization.
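The ringing claim is easy to demonstrate in one dimension: zero-padding a signal's spectrum (ideal, unwindowed sinc interpolation) overshoots near a sharp edge. A numpy sketch:

```python
import numpy as np

n = 64
x = np.zeros(n)
x[n // 2:] = 1.0                       # a sharp step edge

# Upsample 2x by zero-padding the spectrum: ideal sinc interpolation,
# with no Hanning/Hamming window applied.
X = np.fft.rfft(x)
Xpad = np.zeros(2 * n // 2 + 1, dtype=complex)
Xpad[:X.size] = X
y = np.fft.irfft(Xpad, n=2 * n) * 2    # rescale for the longer inverse FFT

# The original samples are reproduced, but the interpolant overshoots
# (rings) on both sides of the discontinuity:
print(y.max() > 1.0, y.min() < 0.0)    # -> True True
```

Applying a Hanning or Hamming window to the spectrum before the inverse transform would tame the overshoot, trading ringing for blur.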
Much ado about nothing. (Score:4, Informative)
The whiners in TFA mistakenly assume that 2 fields of 1080i = 1 frame of 1080p. This is WRONG, WRONG, WRONG.
It cannot be assumed that the following field has anything to do with the current one. See the "not resized or deinterlaced" picture here:
http://www.100fps.com/ [100fps.com]
When the television takes the 540 lines in a given field, interpolates the missing lines, and scales to 720 lines, it is DOING THE RIGHT THING. Otherwise your TV would look like the first two example pictures at the above site.
Nathan
Look at the Source (Score:3, Informative)
I'm almost sure their scaler will help with most sources you feed your 720p HDTV; what it can do with 480i DVDs is impressive enough to make that believable. However, I doubt the problem is as bad as they say it is. Also, 1080p DLP sets are going to start hitting the market soon, and in a couple of years 720p will probably have been mostly pushed out of the market. Given what a scaler costs, I'd probably save my money to get the 1080p set in a couple of years, since the 720p sets still look great.
I have a 1080i set, but I considered a 720p DLP set since they looked amazing and only didn't because of cost.
Re:Reminds me of Sound Blaster (Score:2, Informative)
The NTSC video standard has 525 lines.
The PAL and SECAM video standards have 625 lines.
So where does 480 lines come from?
480 lines are actually shown on a typical television; the remainder of the signal is overscanned and not shown. Though many displays slightly overscan an HD signal, this can also be filed under the blasphemy that is 720p not being 720p, and there is no technical reason it needs to be done.
Re:More Like When is HD Not HD (Score:3, Informative)
I have a Mitsubishi X390U linked up at home - it's your typical 1024x768 resolution. I've got the comcast HD box linked up to a TiVO (SD only) and (HD feed) directly into the amp's component inputs. The result is that I can switch between HD & SD at the flick of a button. It all gets projected onto an 8' screen.
The difference between HD footage and the *same* SD footage is that of night and day. Heavily rescaled or not, at the end of a movie the tiny credit text is easily readable on HD (as opposed to a blurred, just-about-legible mess on the SD feed). The movie itself just blows me away. Every time.
Same source, same amp, same projector. World of difference.
So, (and I'm not disagreeing over the scaling), you do get an excellent upgrade from SD when using a 1024x768 projector.
Simon
Easy Fix (Score:3, Informative)
If you're a real quality nut like me, then get a tube-based HDTV; they can actually get close to doing 1080i.
Re:Reminds me of Sound Blaster (Score:5, Informative)
Lines 243-262 of each field (off the bottom of the TV) start with 0.3V for 4.7us, and the rest is 0V. This tells the TV to prepare for a new field.
This leaves just 242*2=484 lines of effective display.
http://eyetap.org/ece385/lab5.htm [eyetap.org]
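The line budget in the comment above, tallied out (using the commenter's simplified per-field numbering, not the official NTSC line numbering):

```python
lines_per_field = 262              # NTSC is really 262.5 lines per field
blanking_per_field = 262 - 242     # lines 243-262 carry sync, not picture
active = 242 * 2                   # two interlaced fields per frame
print(blanking_per_field, active)  # -> 20 484
```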
Re:If you can't see the problem, is there a problem? (Score:4, Informative)
If you've ever seen high-def on a 480p EDTV plasma, you'll understand just how superior the picture STILL is compared to 480i NTSC.
Nevertheless, true 1080p deinterlacing is coming down the pike right now. Faroudja, SiliconOptix, and Gennum have all created solutions, and we should begin seeing them in external video processors and displays soon.
Silicon Optix ASTROTURF? Naw, it couldn't be... (Score:3, Informative)
Gennum, Pixelworks, Genesis, Oplus (now Intel), and several others make their own scaler/deinterlacer chips. Most of these have already found their way into displays and have proper deinterlacing strategies in them. Nobody scales without deinterlacing first anyway in a modern image processor.
Silicon Optix's technology is based on offline film processors by Teranex. While they can certainly be high quality, they aren't the top of the heap either by volume or by prestige. Genesis/Faroudja made a name for themselves with their "line doublers", which are over 20 years old, and their more advanced but cheap gm1601 is one of the more popular solutions for HDTVs. Gennum's GF9350 with VXP technology is currently in the largest plasma TV in the world (Samsung 80"). These and other scaler/deinterlacer chips have none of the problems that Silicon Optix claims exist. If you look at the debates that rage over at the usual enthusiast sites, you'll see that there are issues with Silicon Optix's own technology, like latency and cost, that aren't present in the other solutions I mentioned.
Just like Silicon Optix's "odd film cadence technology", which requires nothing different from what everyone else has today, this reeks of a cheap PR vehicle. While the choice of scaler and deinterlacer is important, it is not the utter tragedy that SO would like to make it out to be, nor are they the saviors of the HDTV world. If they know who the culprits are, then let them name whose image processor it is that creates these problems.
Re:For the inevitable /.ing (Score:5, Informative)
720p is 720 lines scanned 30 times per second.
Mostly incorrect.
There are 18 recognized MPEG stream formats for HDTV.
In presenting these on a monitor, your receiver/set-top box/whatever is supposed to turn them into a format that your monitor can handle. This will typically be one of these four: 480i, 480p, 720p, or 1080i.
It is noteworthy, though, that some videophile monitors can handle, and set-top boxes deliver, 1080 row 60Hz progressive.
As for the presence/absence of interlacing, I agree that it is very bad to use interlacing at the stream level. This should be eliminated. I would make an exception for the 480 modes, because the material may have been originally captured on NTSC videotape, in which case some sort of conversion would have to take place to get a progressive image, and I feel very strongly that conversions should never be done for broadcast unless absolutely necessary (as when showing PAL/SECAM-native material).
On the other hand, at the monitor level, if you have an interlaced monitor, I don't think that is a major issue. In 1080 mode, the best picture that can be sent is the 30fps progressive stream. This can be interlaced for presentation on a CRT.
Now, someone commented that CRTs are dead. Not if you have a budget, they're not! I've owned an HD set for over three years now, and it only ran me $700. It is a CRT. It has a beautiful picture.
Further, I would put forth that CRTs, in addition to being significantly cheaper than the alternatives, also put out a better picture than LCD (view from any angle; accurate color rendition; no lag), are less susceptible to burn-in than plasma (which will be killed by network bugs), and do not exhibit the rainbow effect of DLP (which, in fairness, is not really all that bad). Their major failings are their physical size and power consumption.
Re:Resampling (Score:1, Informative)
It is counterintuitive, but in fact, whether a resampling is done in round numbers or not is irrelevant. As an example, take a quality image into Photoshop and resample it so that it's one pixel wider, or 3% wider, or some such. It looks exactly the same, and for good reason.
The simple, easy, and totally misleading way to think of pixels is as if they are little squares of color. But they're not: when you snap a picture, it's not a picture of a bunch of little squares. Instead, pixels are point samples of continuous data.
As a thought experiment, it's easier to work with one-dimensional data instead of two, but it works out the same. Consider an audio file. (Let's do mono for simplicity.) It consists of point-samples of an audio waveform. Plot these samples on a piece of paper and connect the dots. Now resample the continuous curve at whatever resolution you care to. You see how the question of whether the resampling ratio is integral (2x, 3x) is irrelevant? 1.5x works just fine, as does 1.183826x.
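The thought experiment above can be run directly. This sketch uses linear interpolation between the plotted points (the "connect the dots" step) and a 3 Hz sine as a stand-in waveform:

```python
import numpy as np

# Point-samples of a continuous waveform (3 Hz sine over one second):
t = np.linspace(0.0, 1.0, 100)
samples = np.sin(2 * np.pi * 3 * t)

# "Connect the dots" and resample at a non-integral ratio (1.183826x):
t_new = np.linspace(0.0, 1.0, int(100 * 1.183826))
resampled = np.interp(t_new, t, samples)

# The resampled points still lie on (very nearly) the same curve:
err = np.abs(resampled - np.sin(2 * np.pi * 3 * t_new)).max()
print(err < 0.01)   # -> True
```

The non-integral ratio introduces no special artifact; whatever error exists comes from the interpolation filter, not from the ratio being 1.183826 instead of 2.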
Re:ED displays (Score:1, Informative)
There are a few very real reasons you may prefer an ED set over an HD set.
1. DVDs are 480i - a progressive-scan DVD player can do a great job of upscaling the picture to 480p. An ED set will display this without any further scaling (scaling is never as good as direct display). Most programming these days is still 480i (and crappy quality at that); the more you upscale it, the worse it's going to look. Upscaling crap gives you magnified crap. This is a very real problem for many HD owners: they can't stand watching 480i programming, but the majority of programming is still 480i.
2. Resolution is harder and harder to discern the further you sit from your television. If you sit 8' from a 42" set YOU WILL NOT SEE A DIFFERENCE BETWEEN 480p and 720p (or 1080i for that matter). With the human eye it is impossible to see the difference. 8' is a common viewing distance. If you get a 50" set or larger I do recommend HD. For 42" it is questionable if you need HD.
The third reason is that ED sets cost less than HD (about half the price). Of course, costing less does not equal best value. However, if you mostly watch DVDs and 480i cable TV, you sit 8' from your TV, and your TV is less than 50", I think it is a very reasonable decision to opt for an ED set as opposed to HD.
You may say: but in a few years the majority of programming will be HD. Yes, this is true, but in a few years today's HD TVs will cost less than half what they do now (a predicted 30% drop each year, which so far has been holding true). So why not buy an ED now, wait for the programming to eventually become HD, and then buy an HD set at that time - for less money than if you bought an HD today?
Hmmm... even after all of this you may still prefer an HD today. BUT, are you going to be so quick to say that ED is a load of crap?
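The viewing-distance argument in point 2 can be checked with a little trigonometry. Taking the common ~1 arcminute figure for eye resolution (an assumption - acuity varies by person), a 720p pixel row on a 42" 16:9 screen at 8' lands right at that limit, while a 480p row is about 1.5x it, which is why the point is debatable rather than settled:

```python
import math

def pixel_arcmin(diag_in, rows, dist_ft, aspect=16 / 9):
    """Angular height of one pixel row, in arcminutes, on a 16:9 screen."""
    screen_height_in = diag_in / math.sqrt(1 + aspect ** 2)
    pixel_in = screen_height_in / rows
    return math.degrees(math.atan2(pixel_in, dist_ft * 12)) * 60

# One arcminute is the usual rule-of-thumb limit of visual acuity.
for rows in (480, 720, 1080):
    print(rows, round(pixel_arcmin(42, rows, 8), 2))
# 480 -> ~1.5', 720 -> ~1.0', 1080 -> ~0.7'
```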
Re:For the inevitable /.ing (Score:3, Informative)
As an example, if you own the above box, or the PVR box, (both are silver and provided by Comcast), do the following:
Turn off your cable box, then press 'setup' on the remote. You'll get a different config screen, one which allows you, among other things, to tell the box what 'resolution' to display (1080i, 720p, 480p, 480i). The Motorola scaler seems to do a great job. I'm not sure of the algorithm/method, but I'm pretty sure it doesn't dice the 1080i fieldset and then scale/stretch.
Early plasma TVs that have a native resolution of 852x480 and bad internal scalers benefit from setting the resolution to 480p.
I got lucky, my plasma has an excellent scaler so I left the cable box settings as-is.
Re:ED displays (Score:3, Informative)
Many current, popular 42" plasma sets are either "HD" resolution (typically around 720 lines) or "ED" resolution (480p). No one argues that the HD doesn't provide a slightly superior picture for HD content, but many argue that ED is preferable for non-HD (SD) and DVD sources. And the price difference between the two can be dramatic ($2500 vs. $5000).
For that $ difference, I was willing to compromise. Some purists will make a different choice. I'm very happy with my ED Panasonic Plasma, and am hoping that in five years the price of direct-view HD sets will be within my reach.
--H
Re:Workaround is to use an HTPC... (Score:3, Informative)
But does a $200 19" CRT have enough "dots" to display all the pixels in a 1920x1080p picture? (I'm not sure. I really want to know.) My knowledge of display technologies is limited, but I think 19" CRTs in this price range don't have enough "dots" (calculated from dot pitch [howstuffworks.com]) to display all of the pixels and will not give a "true" 1920x1080p picture.
Example: I've been thinking about getting a Samsung 997DF [samsung.com], which has a max resolution of 1920x1440 (a resolution I'd only use for 1080p video). However, it also has a horizontal dot pitch of 0.20mm and a viewable width of about 14.4" (about 365.76mm). That's about 1829 viewable dots across the screen, which is less than the number of horizontal pixels in a 1920x1080p picture.
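The dot-pitch arithmetic above checks out:

```python
# Horizontal "dots" across the tube, from dot pitch and viewable width
# (figures taken from the spec cited above):
dot_pitch_mm = 0.20
viewable_width_mm = 14.4 * 25.4   # ~14.4 inches viewable width
dots = viewable_width_mm / dot_pitch_mm
print(round(dots))                # -> 1829, short of 1080p's 1920 columns
```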
The ability to display 1920x1080p is the biggest reason I'd choose a $200 19" CRT over a $230 17" LCD. However, if the 1080p video is "messed up," then I'd rather get a 17" LCD and just convert everything to 720p.