
When is 720p Not 720p?

Henning Hoffmann writes "HDBlog has an interesting entry about many home theater displays.
Home theater displays around the resolution of 720p (most DLP, LCD, and LCOS displays) must convert 1080i material to their native resolution for display. No surprise there. But many displays do this by discarding half of the 1080i HD signal, effectively giving 720p viewers what amounts to an SD signal - not watching HD at all! "
  • by Anonymous Coward on Monday May 02, 2005 @11:13AM (#12407587)
    Here is tfa for you...

    When is 720p not 720p?

    Tom Norton, in his coverage of the Home Entertainment expo, brought something up that I was unaware of.

    720p displays show native 720p signals directly, of course. They also upconvert SD signals (like DVD) up to 720p for display. And 720p displays must convert incoming 1080i signals to 720p before they can be displayed. No surprise there, this makes sense. But, Silicon Optix claims that most manufacturers do the 1080i conversion just by taking one 540 line field from each 1080i frame (which is composed of two 540 line fields) and scaling that one field up to 720p, ignoring the other field. Reason being, it takes a lot less processing power to do this than to convert the image to 1080p and scale that, which would use all the information in the original signal to derive the 720p signal. If you have a display like this, it means that you're watching 540 lines of resolution upconverted to 720p. This is not HD, just like watching a DVD upconverted to 720p is not HD. Sure, you'll get the full width of the 1080i resolution, but you're only getting half the height. While this is better than DVD, it's not HD in my mind. (Aside: Tom Norton mentions this in his review of the Screenplay 777 projector.)

    If this is indeed the case, most people with 720p (or similar) projectors (and most DLP, LCD, and LCOS home theater projectors are exactly that) are not seeing what their displays are capable of. They're not, technically, even watching HD. This is crazy! How can this be? Why haven't we heard of this before? How are manufacturers getting away with it?

    Over-reacting? Well, if you're an owner of a 720p (or any similar resolution) projector you're either gonna be really upset by this or you're just gonna be laissez-faire about it because there's nothing you can do and you're enjoying your projector just fine, thank you. But me, I don't even own any such projector and I'm a little ticked. But I guess I should really wait for evidence of how properly-done conversion looks in comparison before making any snap judgements. I'm sure that the people selling HQV (a processor chip that does it the RIGHT way) will set something up.

    To me, this is a serious issue. Comments are welcome.

    from: http://www.hdblog.net/ [hdblog.net]
  • by Rectal Prolapse ( 32159 ) on Monday May 02, 2005 @11:16AM (#12407620)
    A Home Theater PC with good quality parts, drivers, and decoders will preserve the 1080i signal - it will combine the 1080i field pair into a single 1080p signal, and then downconvert (ie. downscale) to 720p.

    As a reference, my Athlon XP running at 2.4 GHz (approximately equivalent to an Athlon XP 3400+) with a GeForce 6800GT and TheaterTek 2.1 software will have little trouble achieving this, assuming the 1080i source isn't glitchy itself.

    Alternative is to use the NVIDIA DVD Decoder version 1.0.0.67 ($20 US after 30 day trial) and ZoomPlayer 4.5 beta ($20 beta or nagware) for similar results.

    TheaterTek is roughly $70 US and includes updated NVIDIA DVD decoders - too bad NVIDIA hasn't updated their official DVD decoders with the bugfixes that are present in the TheaterTek package.
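
    For readers curious what the weave-then-scale path described above looks like in code, here is a minimal sketch (illustrative only, not how TheaterTek or the NVIDIA decoder actually implement it), assuming numpy: combine the two 540-line fields of a 1080i frame into one 1080p frame, then do a simple bilinear downscale to 1280x720, with no motion handling between the fields.

      import numpy as np

      def weave_and_downscale(top_field, bottom_field):
          """Weave two 540x1920 fields of a 1080i frame into a 1080p frame,
          then bilinearly downscale to 1280x720 (toy implementation)."""
          h, w = 1080, 1920
          frame = np.empty((h, w), dtype=np.float64)
          frame[0::2] = top_field           # interleave the fields (weave)
          frame[1::2] = bottom_field
          ys = np.linspace(0, h - 1, 720)   # sample positions for the 720p grid
          xs = np.linspace(0, w - 1, 1280)
          y0 = np.floor(ys).astype(int); x0 = np.floor(xs).astype(int)
          y1 = np.minimum(y0 + 1, h - 1);  x1 = np.minimum(x0 + 1, w - 1)
          wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
          return (frame[np.ix_(y0, x0)] * (1 - wy) * (1 - wx)
                + frame[np.ix_(y0, x1)] * (1 - wy) * wx
                + frame[np.ix_(y1, x0)] * wy * (1 - wx)
                + frame[np.ix_(y1, x1)] * wy * wx)

    A real deinterlacer would additionally look for motion between the two fields and fall back to per-field interpolation where they disagree.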
  • by ... James ... ( 33917 ) on Monday May 02, 2005 @11:17AM (#12407637)
    I have a 720p projector paired with a 110" screen. Both 720p and 1080i material look fantastic. Maybe the supposed degradation would be visible side-by-side with a native resolution projector, but I certainly wouldn't worry about it based on what I've been watching.
  • by AtariDatacenter ( 31657 ) on Monday May 02, 2005 @11:18AM (#12407647)
    The article says: Sure, you'll get the full width of the 1080i resolution, but you're only getting half the height.

    Except your 720p display will hopefully have a horizontal resolution of 1280. 1080i video has a horizontal resolution of 1920. So, you're keeping half of the vertical (1080 lines down to 540) and you're keeping 2/3rds of the horizontal (1920 down to 1280).

    Ouch.
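
    To put rough numbers on that (simple Python arithmetic, nothing more):

      src = 1920 * 1080          # samples in a full 1080i frame (both fields)
      kept_field = 1920 * 540    # what a half-field converter keeps
      panel = 1280 * 720         # what a 720p panel can show at all
      print(kept_field / src)    # 0.5   -> half the transmitted samples discarded
      print(panel / src)         # ~0.44 -> even an ideal converter must drop detail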
  • by nathanmace ( 839928 ) on Monday May 02, 2005 @11:19AM (#12407667)
    This is what you get when you buy a "major" appliance without doing your research first. I know if I was planning on dropping anywhere from $700 to over $1000 on something, I would be sure to find out everything about it so I could make an informed decision. If someone didn't do that, then they got what they deserved.

    That said, I'm sure there are a lot of people out there who "don't care". It works for them, and that is all they care about.
  • by Shkuey ( 609361 ) on Monday May 02, 2005 @11:26AM (#12407752)
    Actually this is an issue of giving people what they want. In this case an HDTV that isn't a thousand bucks more expensive and doesn't have a video processing delay.

    The first incorrect thing in the /. post is that this is somehow standard definition. It's not; 540 lines is more than 480. Not only that, but they process the full 1920 pixels of horizontal resolution (scaled down to 1280 for a 720p display), which is quite a bit more than 640.

    Anyone who is serious about getting the absolute most out of their display will have an external scaler and a device to delay the audio. Frankly as digital display technologies take more of a foothold in the market I'm hoping these interlaced resolutions will become far less common.

    When I first read the headlines I thought they would perhaps talk about 1024x768 plasmas with rectangular pixels being marketed as 720p. That kind of thing is far more blasphemous in my opinion.

    So in summary of TFA: 720p is not 720p when it's 1080i.
  • by Rectal Prolapse ( 32159 ) on Monday May 02, 2005 @11:26AM (#12407755)
    From my lurking on HDTV enthusiast sites, sometimes the broadcaster will take DVD content (480i) and upconvert to 1080i! It's a terrible practice.

    And in other instances, the broadcaster will not use the full resolution - what looks like 1920x1080i may actually be an upconvert of 1280x1080i, 1440x1080i, or 1280x720! And then there is the overcompression - taking a 20 Mbit/s MPEG-2 stream and cutting the bitrate in half - compression artifacts galore.

    It is sad when HDTV programming available in North America can be WORSE than the DVD!

    You never know what you get in US (and other?) HDTV broadcasts. My understanding is that only the Japanese use minimal MPEG-2 compression - I saw snippets of Contact (with Japanese subtitles) in full 1920x1080i at the maximum 20 Mbit/s bitrate - and it was glorious!
  • by Phearless Phred ( 67414 ) on Monday May 02, 2005 @11:26AM (#12407763)
    Ok, here's the skinny. 1080i is 1920x1080 @ 59.94 fields per second, meaning at any one instant in time, you're looking at a 1920x540 image made up of every other line of the picture (the odd lines, if you will). Then, ~1/60th of a second later, you see the even lines. 720p is 1280x720 @ 60 FRAMES per second, meaning at any given instant you're looking at EVERY line of the image... not just the odd or even ones. If you were to try and take all 1080 lines from the original signal, they wouldn't really map properly to 720 at any given instant, because half of the data would be from ~1/60th of a second later. Scaling the fields up is really the best way to go, at least for stuff that's been shot interlaced.
  • And when you use it to upsample data, it is a lossless encoding that doesn't degrade the signal (unless you deliberately throw away data - discrete Fourier transforms are also used in lossy encoders).

    It's not a distortion-free transform, since high frequency signals (e.g. sharp edges) in the original image get interpreted as smooth changes and can get blurred between multiple pixels in an upsampled signal. But then again, that's exactly the sort of thing that happens when you digitize a picture in the first place - if you have a sharp black/white edge that passes through the middle of a pixel, the most accurate thing you can do is make that pixel gray.
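
    As a concrete illustration of the Fourier-domain upsampling being discussed, a small 1-D sketch (assuming numpy): zero-pad the spectrum and inverse-transform. The original samples come back exactly, which is the sense in which the operation is lossless.

      import numpy as np

      def fft_upsample(x, factor):
          """Upsample a 1-D real signal by zero-padding its spectrum."""
          n = len(x)
          X = np.fft.rfft(x)
          m = n * factor
          Xpad = np.zeros(m // 2 + 1, dtype=complex)
          Xpad[:len(X)] = X
          # rfft/irfft are an unnormalized pair, so rescale to keep sample values
          return np.fft.irfft(Xpad, n=m) * factor

      x = np.sin(2 * np.pi * 3 * np.arange(8) / 8)
      y = fft_upsample(x, 4)
      print(np.allclose(y[::4], x))    # True: the original samples are untouched

    The interpolated values between the original samples, however, can ring or blur around sharp edges, which is the distortion described above.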
  • by G4from128k ( 686170 ) on Monday May 02, 2005 @11:32AM (#12407862)
    There's got to be a fairly straightforward formula relating inherent resolution loss when performing any noninteger upsampling, or any downsampling.

    It's a bit messy. Imagine a 1080i image with a 1-pixel-wide sloping black line, nearly horizontal, on a white background. If you throw out half the data, you create an image with a dashed line. Gaps in the line occur where the slanting line cut across the rows that were discarded. If you upsample from 540 to 720, you will find that the remaining dashes become fattened non-uniformly. In places where a row in the 720-row image falls directly on top of a row in the 540-row image, the line will be thin and dark. In places where a 720-row image row falls midway between rows in the 540-row image, the line will be wide and less dark. The end result is that the thin, uniform black line is converted to a dashed line of varying thickness and darkness -- not pretty.

    Even if you resample directly from 1080 to 720, you still run into problems where the 720-row image pixels fall between the 1080-row pixels. At best, you can use higher-order interpolation (e.g. cubic) to fit a curve through the original data and estimate what was in the middle of the pixels so they can be shifted halfway over. But the result will never look like an image that was taken with a 720-row camera in the first place.
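
    The slanted-line experiment above is easy to reproduce at toy scale (a sketch assuming numpy, with the image shrunk 10x so 540-to-720 becomes 54-to-72 rows and the "discarded field" is simply every other row):

      import numpy as np

      img = np.ones((108, 192))                 # white background
      for x in range(192):
          img[54 + x // 48, x] = 0.0            # 1-pixel line, dropping a row every 48 columns

      field = img[0::2]                         # keep only the even rows ("one field")

      # Linearly interpolate the remaining 54 rows back up to 72 rows
      pos = np.linspace(0, field.shape[0] - 1, 72)
      lo = np.floor(pos).astype(int)
      hi = np.minimum(lo + 1, field.shape[0] - 1)
      frac = (pos - lo)[:, None]
      up = field[lo] * (1 - frac) + field[hi] * frac

      # Per-column darkest value: 1.0 where the line vanished with the dropped
      # rows, and a spread of greys where it survived at varying darkness.
      print(sorted(set(np.round(up.min(axis=0), 2))))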
  • by Ironsides ( 739422 ) on Monday May 02, 2005 @11:33AM (#12407882) Homepage Journal
    Current camera technology can't provide full detail at 1920 pixels yet

    Hi, I just got back from the NAB show in Las Vegas last week. The vendors had HD cams that would film and record 1920x1080i. That "some point" is today.
  • by Zed2K ( 313037 ) on Monday May 02, 2005 @11:41AM (#12407986)
    There are 1080p DLP TVs coming this year from Samsung, Panasonic, and Mitsubishi.

    Personally I'm getting a samsung 6168 model.
  • by Overzeetop ( 214511 ) on Monday May 02, 2005 @11:41AM (#12408002) Journal
    That's all well and good, but I'm afraid I tend to agree with them. If content providers want to "do it right" they should ditch the 1950's interlacing and get with the 1980s.

    He's leaving one step out. 1080i is 540 lines scanned 60 times per second, offset by half a vertical pitch. 720p is 720 lines scanned 30 times per second.

    To try and take two frames which are not occurring at the same instant, stitch them together, remove the motion artifacts, resample, and then display is just plain silly. And fraught with errors, as you are expecting a computer to determine which parts of the motion (over 1/60 of a second) to keep and which to throw away.

    If you wanted high fidelity, you'd spend the money for a 1080p60 system. Then it wouldn't matter. Except that you would complain about the quality, because each frame you see was upsampled from only 540 lines of resolution.

    It all comes back to the fact that the FCC let the industry choose this "18 formats is good" spec.

    Personally, I'm in favor of an Olympic standard mayonnaise, but... no... wait... awww hell, I give up.
  • by GoRK ( 10018 ) on Monday May 02, 2005 @11:43AM (#12408018) Homepage Journal
    This is not particularly news. Some "blogger" discovers something because he never bothered to ask and screams that the sky is falling... I'm kind of sick of this "news" reporting. Incidentally, this same issue affects owners of most plasma and LCD TVs with native resolutions below 1920x1080 too, depending on whether you look at it as a problem or not.

    Anyway, it's fairly well known that the internal scalers in many devices suck. That is why there is a market for good external scalers. If you are paranoid about watching a lot of 1080i on your 720p projector or LCD TV or Plasma, go buy a scaler. They cost about $1000 but will improve scaled display a lot.

    At least if you have an external scaler you will have some options for how you convert 1080i to 720p. The article makes it sound like splitting the fields is a huge sin -- and it is if you discard one field per frame (half-field deinterlacing), but it's perfectly acceptable to scale EACH 540-line field to a separate 720-line frame and double the framerate. This is called bob deinterlacing and is often the best approach for converting 1080i video to lower resolutions. If you are watching a 1080i upconvert of a film or something, though, you can have the scaler do full-field deinterlacing and inverse telecine for you and see a nice 720p/24fps picture. Scalers also generally have internal audio delays for various types of audio feeds, so you won't have to worry about A/V sync issues either.

    If you have any questions about how your device does this, you should try to find out before you buy it. Most devices don't publish how they do it, though, so your only option may be to derive it -- and that will not be an easy job.
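
    To make the half-field vs. bob distinction above concrete, a rough sketch (assuming numpy, with nearest-neighbour row scaling standing in for a real scaler's filtered resize):

      import numpy as np

      def scale_rows(field, out_rows):
          """Nearest-neighbour vertical scale of a 540-line field (a stand-in
          for a real scaler's filtered resize)."""
          idx = np.linspace(0, field.shape[0] - 1, out_rows).round().astype(int)
          return field[idx]

      def half_field_deinterlace(top, bottom):
          # Discard one field: one 720p frame per 1080i frame, built from
          # only 540 of the 1080 transmitted lines.
          return [scale_rows(top, 720)]

      def bob_deinterlace(top, bottom):
          # Keep both fields: two 720p frames per 1080i frame (double the
          # frame rate), so no transmitted line is thrown away.
          return [scale_rows(top, 720), scale_rows(bottom, 720)]

    The point is only the bookkeeping: half-field deinterlacing emits one 720-line frame per 1080i frame and throws 540 transmitted lines away; bob emits two frames and keeps everything, at the cost of doubling the frame rate.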
  • by Myriad ( 89793 ) <(myriad) (at) (thebsod.com)> on Monday May 02, 2005 @11:44AM (#12408036) Homepage
    While this is an interesting issue, and one I hadn't heard of before, it's only one of many mines in the field of HD.

    If you're looking to get into HD there are a *lot* of little quirks to take into account, such as:
    - Officially there are two HD resolutions, 720p and 1080i
    - Most HD TVs are only capable of *one* of these resolutions. So you have to choose, 720p OR 1080i, in most cases. If you want one that can do both, check very carefully: forget DLP or LCD based devices (fixed set of pixels so fixed resolution), CRT only.
    - Many HDTVs will *not* convert from one format to another. They accept only their native resolution.
    - Different networks broadcast using one standard or the other. For example CBS uses 1080i and ABC 720p, IIRC. Fox is way behind in HDTV support.
    - Most HDTV receivers can handle either a 720p or 1080i signal and will convert as required for your TV's native resolution.
    - Some TV providers only support one format, regardless of the source material. E.g., in Canada, Starchoice only broadcasts in 1080i. Any 720p content they have, they upconvert to 1080i before broadcasting. It's impossible to receive a native 720p signal from them.
    - The Xbox supports both HDTV modes... but very few HD games actually use 1080i (Dragon's Lair being one). Most are 720p. So if this is important to you, you'll possibly want a 720p native TV: most receivers do not have HD inputs that would let you upconvert a 720p game to a 1080i signal for the TV. (The new Xbox will have more HD content than the current one, but it's a good bet that they'll be mostly 720p titles.)
    - Most projectors and plasmas are *not* HDTV. They are EDTV (enhanced definition) or some such. Check the specs carefully.
    - Most projectors are 1024x768. This means your HD signal of 1920x1080i or 1280x720p is being heavily rescaled! Few projectors have a true HD native resolution.

    So there you go... lots of fun things to take into account!


  • by GutBomb ( 541585 ) on Monday May 02, 2005 @11:48AM (#12408092) Homepage
    The government mandate is for DIGITAL broadcast. It has nothing to do with HDTV. There are digital over-the-air broadcasts of SD content as well, and the government mandate says that all TVs made after January 2006 must be capable of receiving digital ATSC over-the-air signals, and that all over-the-air broadcast networks start broadcasting digital ATSC signals.
  • I work on this stuff every day, all day.
    Only cheap projectors or displays have a maximum resolution of 720p.
    I don't see many of those anyhow.
    But yes, on those displays the signal is downconverted by chopping out half of it.

    However, these displays are not popular anyhow.
    Some of the most popular displays still can't display native, but they can still display either XGA or SXGA with no problem (we're getting pretty close to HD at this point).

    Don't buy a cheap projector and you won't get a cheap display. You get what you pay for.
  • by mquetel ( 757379 ) on Monday May 02, 2005 @11:48AM (#12408103)
    Sharp has several models of 45" LCDs that are 1080p. The technology is available, but really expensive. See here: http://www.sharpusa.com/products/ModelLanding/0,1058,1426,00.html [sharpusa.com]
  • by Alereon ( 660683 ) on Monday May 02, 2005 @11:51AM (#12408134)
    That's not why the Soundblaster Live sucked. Resampling to 48 kHz before DAC conversion is the standard way to operate a DAC. The problem was that the resampler was BROKEN in the Soundblaster Live! series of cards, resulting in significant, audible distortion in all but 48 kHz sound signals. That, and the DAC just wasn't all that good even taking that into consideration. The drivers were awful too.
  • by Anonymous Coward on Monday May 02, 2005 @11:52AM (#12408144)
    Agreed it's lossless.

    Disagree with the second paragraph. Upsampling should have more Gibbs ringing around the discontinuities in the image. This is why one introduces a Hanning or Hamming window during sinc interpolation. Whether that manifests itself as blurring depends on the context of the discontinuity within the image. Upsampling via FT is not the same as linear (or even simple non-linear) weighting during digitization.
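
    For anyone who hasn't met windowed sinc interpolation, a small 1-D sketch of the idea (assuming numpy; the half_width parameter is an arbitrary choice): the Hann taper rolls the sinc kernel off to zero so the Gibbs ringing around discontinuities is damped.

      import numpy as np

      def windowed_sinc_interp(x, t, half_width=8):
          """Evaluate a Hann-windowed sinc interpolation of samples x
          (taken at integer positions 0..len(x)-1) at fractional positions t."""
          x = np.asarray(x, dtype=float)
          n = np.arange(len(x))
          out = np.zeros(len(t), dtype=float)
          for i, ti in enumerate(t):
              k = n[np.abs(n - ti) <= half_width]                # nearby taps only
              d = ti - k
              hann = 0.5 * (1 + np.cos(np.pi * d / half_width))  # taper to zero at the window edges
              out[i] = np.sum(x[k] * np.sinc(d) * hann)
          return out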
  • by gblues ( 90260 ) on Monday May 02, 2005 @12:04PM (#12408323)
    My goodness, there's a lot of idiots out there.

    The whiners in TFA mistakenly assume that 2 fields of 1080i = 1 frame of 1080p. This is WRONG, WRONG, WRONG.

    It cannot be assumed that the following field has anything to do with the current one. See the "not resized or deinterlaced" picture here:

    http://www.100fps.com/ [100fps.com]

    When the television takes the 540 lines in a given field, interpolates the missing lines, and scales to 720 lines, it is DOING THE RIGHT THING. Otherwise your TV would look like the first two example pictures at the above site.

    Nathan
  • by Anonymous Coward on Monday May 02, 2005 @12:05PM (#12408344)
    Fox has great HD, and yes, it's true HD. Try watching 24.
  • by steve6534 ( 809539 ) on Monday May 02, 2005 @12:08PM (#12408372) Homepage
    The NTSC video standard has 525 lines.
    The PAL and SECAM video standards have 625 lines.
    So where does 480 lines come from?

    The specification is for 525 lines, but there are only 480 lines of picture information - the other 45 are blank lines that were designed to let the electron gun get back to the top of the screen to begin drawing the next frame.
  • Look at the Source (Score:3, Informative)

    by cheinonen ( 318646 ) <cheinonen@ho t m ail.com> on Monday May 02, 2005 @12:11PM (#12408417)
    The company saying this is Silicon Optix, which makes the Realta scaler chip (featured in the $3,500 Denon 5910 DVD player), and is obviously trying to get people to use their chip, or buy a scaler based on their chip. Now, from the reviews I've read of the 5910, it's the best reasonably priced (under $10k) scaler on the market. It's amazing. However, when they say "Almost all other companies throw away this extra data and do it the lazy way" without naming the companies, I believe them as much as I believe those recent Apple benchmarks.

    I'm almost sure their scaler will help with most sources you feed your 720p HDTV; what it can do with 480i DVDs is impressive enough that you would believe that. However, I doubt the problem is as bad as they say it is. Also, 1080p DLP sets are going to start hitting the market soon, and in a couple of years 720p will probably have been mostly pushed out of the market. Given what a scaler costs, I'd probably save my money to get the 1080p set in a couple of years, since the 720p sets still look great.

    I have a 1080i set, but I considered a 720p DLP set since they looked amazing; I only didn't buy one because of cost.
  • by Shkuey ( 609361 ) on Monday May 02, 2005 @12:12PM (#12408428)
    Televisions are not computer monitors.
    The NTSC video standard has 525 lines.
    The PAL and SECAM video standards have 625 lines.
    So where does 480 lines come from?


    480 lines are actually shown on a typical television. The rest of the signal is overscanned and not shown. Many displays slightly overscan an HD signal too; that can also be filed under the blasphemy that is 720p not being 720p, since there is no technical reason it needs to be done.
  • Re:Which Models? (Score:2, Informative)

    by sxdev ( 664129 ) on Monday May 02, 2005 @12:16PM (#12408480)
    Yes. If you feed it alternating black and white interlaced fields, you should get a mid-gray if it is combining the two. If you get pure black or pure white, then it is showing one of the fields and throwing away the other one (as the article claims they do).
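
    Generating such a test pattern is trivial; actually getting it to the display as genuine 1080i fields (rather than something your player has already deinterlaced) is the hard part. A sketch of the pattern itself, assuming numpy:

      import numpy as np

      def field_drop_test_pattern():
          """Two 1080i fields: one all black, one all white.  A set that weaves
          the fields shows mid-grey; a set that discards a field shows solid
          black or solid white.  (A bob deinterlacer would instead flash black
          and white at the field rate.)"""
          top = np.zeros((540, 1920), dtype=np.uint8)          # black field
          bottom = np.full((540, 1920), 255, dtype=np.uint8)   # white field
          return top, bottom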
  • by Space cowboy ( 13680 ) * on Monday May 02, 2005 @12:20PM (#12408534) Journal
    - Most projectors are 1024x768. This means your HD signal of 1920x1080i or 1280x720p is being heavily rescaled! Few projectors have a true HD native resolution.

    I have a Mitsubishi X390U linked up at home - it's your typical 1024x768 resolution. I've got the comcast HD box linked up to a TiVO (SD only) and (HD feed) directly into the amp's component inputs. The result is that I can switch between HD & SD at the flick of a button. It all gets projected onto an 8' screen.

    The difference between HD footage and the *same* SD footage is that of night and day. Heavily rescaled or no - at the end of a movie the tiny credit text is easily readable on HD (as opposed to a blurred just-about-legible mess on the SD feed). The movie itself just blows me away. Every time :-)

    Same source, same amp, same projector. World of difference.

    So, (and I'm not disagreeing over the scaling), you do get an excellent upgrade from SD when using a 1024x768 projector.

    Simon

  • Easy Fix (Score:3, Informative)

    by KrackHouse ( 628313 ) on Monday May 02, 2005 @12:20PM (#12408537) Homepage
    You can usually go into your cable/sat box and tell it what to output. If you de-select 1080i as an output format, it will do the conversion to 720p instead of the TV, which should give better results. It's amazing how many times I've seen the cable guy screw up HD installs. They'll have both the TV and the cable box stretching the picture, which cuts off the edges.

    If you're a real quality nut like me then get a tube based HDTV, they can actually get close to doing 1080i.
  • by J Isaksson ( 721660 ) on Monday May 02, 2005 @12:22PM (#12408582)
    Due to interlacing, a single 525-line picture is split into two ~262-line fields for display on the TV screen.

    Lines 243-262 of each field (off the bottom of the TV) start with 0.3V for 4.7us, and the rest is 0V. This tells the TV to prepare for the next field.

    This leaves just 242*2=484 lines of effective display.

    http://eyetap.org/ece385/lab5.htm [eyetap.org]
  • Re:Which Models? (Score:3, Informative)

    by Shkuey ( 609361 ) on Monday May 02, 2005 @12:23PM (#12408588)
    If you're buying a 720p (or close to 720p as many are slightly off) set that is less than ~3,000 USD the answer is easy: All of them. If you're in the market for a high end unit then it is a two-step process. You need to find what kind of scaler is built-in, commonly you'll find third party scalers like Faroudja, and then research the scaler to determine how it functions. Unfortunately some manufacturers (Sony!) build their own scalers and they're probably not going to tell you how it works.
  • by mcg1969 ( 237263 ) on Monday May 02, 2005 @12:24PM (#12408597)
    Yep, I have the same thing. The reason, I believe, is that even dropping down to 540p before upscaling to 720p leaves you a LOT more information than standard NTSC. First of all, NTSC is 480i, not 480p - so really its resolution is somewhere between 240p and 480p due to the losses incurred by interlacing. And secondly, 1080i content has only 540 lines of chroma resolution anyway; you're really only losing luminance information. (Not to minimize that - luminance is the most important component - it's just that we're losing less than you might initially think.)

    If you've ever seen high-def on a 480p EDTV plasma, you'll understand just how superior the picture STILL is compared to 480i NTSC.

    Nevertheless, true 1080p deinterlacing is coming down the pike right now. Faroudja, SiliconOptix, and Gennum have all created solutions, and we should begin seeing them in external video processors and displays soon.
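
    On the chroma point above: broadcast HD is 4:2:0, so each chroma plane is half the luma resolution in both directions (simple Python arithmetic):

      luma = (1920, 1080)
      chroma = (1920 // 2, 1080 // 2)   # 4:2:0 subsampling
      print(chroma)                     # (960, 540) -> only 540 lines of colour detail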
  • by StandardCell ( 589682 ) on Monday May 02, 2005 @12:32PM (#12408694)
    Did anyone notice that, despite the accusation, Silicon Optix calls out NONE of their competitors for this supposed "issue"?

    Gennum, Pixelworks, Genesis, Oplus (now Intel), and several others make their own scaler/deinterlacer chips. Most of these have already found their way into displays and have proper deinterlacing strategies in them. Nobody scales without deinterlacing first anyway in a modern image processor.

    Silicon Optix's technology is based on offline film processors by Teranex. While they can certainly be high quality, they aren't the top of the heap either by volume or by prestige. Genesis/Faroudja had a name for a long time with their "line doublers," which are over 20 years old, and their more advanced but cheap gm1601 is one of the more popular solutions for HDTVs. Gennum's GF9350 with VXP technology is currently in the largest plasma TV in the world (Samsung 80"). These and other scaler/deinterlacer chips have none of the problems that Silicon Optix claims exist. If you look at the debates that rage over at the usual enthusiast sites, you'll see that there are issues with Silicon Optix's own technology, like latency and cost, that aren't present in the other solutions I mentioned.

    Just like Silicon Optix's "odd film cadence technology," which requires nothing different than what everyone else has today, this reeks of a cheap PR vehicle. While the choice of scaler and deinterlacer is important, it is not the utter tragedy that SO would like to make it out to be, nor are they the saviors of the HDTV world. If they know who the culprits are, then let them name whose image processor it is that creates these problems.
  • by Phreakiture ( 547094 ) on Monday May 02, 2005 @12:45PM (#12408866) Homepage

    720p is 720 lines scanned 30 times per second.

    Mostly incorrect.

    There are 18 recognized MPEG stream formats in the ATSC digital TV standard.

    • 640x480 24 fps progressive narrow
    • 640x480 30 fps progressive narrow
    • 640x480 30 fps interlaced narrow
    • 640x480 60 fps progressive narrow
    • 704x480 24 fps progressive narrow
    • 704x480 30 fps progressive narrow
    • 704x480 30 fps interlaced narrow
    • 704x480 60 fps progressive narrow
    • 704x480 24 fps progressive wide
    • 704x480 30 fps progressive wide
    • 704x480 30 fps interlaced wide
    • 704x480 60 fps progressive wide
    • 1280x720 24 fps progressive wide
    • 1280x720 30 fps progressive wide
    • 1280x720 60 fps progressive wide
    • 1920x1080 24 fps progressive wide
    • 1920x1080 30 fps progressive wide
    • 1920x1080 30 fps interlaced wide

    In presenting these on a monitor, your receiver/set-top box/whatever is supposed to turn them into a format that your monitor can handle. This will typically be one of these four:

    • 480 row 60Hz interlaced
    • 480 row 60Hz progressive
    • 720 row 60Hz progressive
    • 1080 row 60Hz interlaced
      It is noteworthy, though, that some videophile monitors can handle, and some set-top boxes deliver, 1080 row 60Hz progressive.

      As for the presence/absence of interlacing, I agree that it is very bad to use interlacing at the stream level. This should be eliminated. I would make an exception for the 480 modes, because the material may have been originally captured on NTSC videotape, in which case some sort of conversion would have to take place to get a progressive image, and I feel very strongly that conversions should never be done for broadcast unless absolutely necessary (as when showing PAL/SECAM native material).

      On the other hand, at the monitor level, if you have an interlaced monitor, I don't think that is a major issue. In 1080 mode, the best picture that can be sent is the 30fps progressive stream. This can be interlaced for presentation on a CRT.

      Now, someone commented that CRTs are dead. Not if you have a budget, they're not! I've owned an HD set for over three years now, and it only ran me $700. It is a CRT. It has a beautiful picture.

      Further, I would put forth that CRTs, in addition to being significantly cheaper than the alternatives, also put out a better picture than LCD (view from any angle; accurate color rendition; no lag), are less susceptible to burn-in than plasma (which will be killed by network bugs) and do not exhibit the rainbow effect of DLP (which, in fairness, is not really all that bad). Their major failings are their physical size and power consumption.

  • Re:Resampling (Score:1, Informative)

    by Anonymous Coward on Monday May 02, 2005 @12:46PM (#12408888)
    Yes. The resolution ratio is expressed as the size of the new resolution divided by the size of the old resolution, capped at 1.0. It's just that simple.

    It is counterintuitive, but in fact, whether a resampling is done in round numbers or not is irrelevant. As an example, take a quality image into photoshop and resample it such that it's one pixel wider, or 3% wider, or some such. It looks exactly the same, and for good reason.

    The simple, easy, and totally misleading way to think of pixels is as if they are little squares of color. But they're not: when you snap a picture, it's not a picture of a bunch of little squares. Instead, pixels are point samples of continuous data.

    As a thought experiment, it's easy to work with one-dimensional data instead of two, but it works out the same. Consider an audio file. (Let's do mono for simplicity.) It consists of point samples of an audio waveform. Plot these samples on a piece of paper and connect the dots. Now resample the continuous curve at whatever resolution you care to. You see how the question of whether the resampling ratio is integral (2x, 3x) is irrelevant? 1.5x works just fine, as does 1.183826x.
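
    The same point in code (a sketch assuming numpy and plain linear interpolation): the resampler does not care whether the ratio is a round number.

      import numpy as np

      def resample_linear(x, ratio):
          """Resample 1-D point samples by an arbitrary (non-integer) ratio."""
          m = int(round(len(x) * ratio))
          pos = np.linspace(0, len(x) - 1, m)   # new sample positions on the old grid
          lo = np.floor(pos).astype(int)
          hi = np.minimum(lo + 1, len(x) - 1)
          frac = pos - lo
          return x[lo] * (1 - frac) + x[hi] * frac

      x = np.sin(np.linspace(0, 2 * np.pi, 48))
      print(len(resample_linear(x, 2.0)), len(resample_linear(x, 1.183826)))   # 96 57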
  • Re:ED displays (Score:1, Informative)

    by Anonymous Coward on Monday May 02, 2005 @12:58PM (#12409040)
    I don't own an ED set but I will defend them anyway (I own a 20" CRT).

    There are two very real reasons you may prefer an ED set over an HD set.

    1. DVDs are 480i - a progressive scan DVD player can do a great job of upscaling the picture to 480p. An ED set will display this without any further scaling (scaling is never as good as direct display). Most programming these days is still 480i (and crappy quality at that); the more you upscale this, the worse it's going to look. Upscaling crap is going to give you magnified crap. This is a very real problem for many HD owners. They can't stand watching 480i programming, but the majority of programming is still 480i.

    2. Resolution is harder and harder to discern the further you sit from your television. If you sit 8' from a 42" set YOU WILL NOT SEE A DIFFERENCE BETWEEN 480p and 720p (or 1080i for that matter). With the human eye it is impossible to see the difference. 8' is a common viewing distance. If you get a 50" set or larger I do recommend HD. For 42" it is questionable if you need HD.

    The third reason is that ED sets cost less than HD (about half the price). Of course, costing less does not equal best value. However, if you mostly watch DVDs and 480i cable TV, you sit 8' from your TV, and your TV is less than 50", I think it is a very reasonable decision to opt for an ED set as opposed to HD.

    You may say, but in a few years the majority of programming will be HD. Yes, this is true, but in a few years today's HD TVs will cost less than half what they do now (a predicted 30% drop each year that so far has been holding true). So why not buy an ED now, wait for the programming to eventually become HD, and then buy an HD set at that time - for less money than if you bought an HD set today?

    Hmmm... even after all of this you may still prefer an HD today. BUT, are you going to be so quick to say that ED is a load of crap?
  • by Richthofen80 ( 412488 ) on Monday May 02, 2005 @01:12PM (#12409222) Homepage
    Good sources of HD, such as the Motorola 6200 series HD set-top cable box, will allow you to choose the output conversion before the signal leaves the box. That way, your TV will recognize the signal as native and not do a scale/conversion.

    As an example, if you own the above box, or the PVR box, (both are silver and provided by Comcast), do the following:

    Turn off your cable box, then press 'setup' on the remote. You'll get a different config screen, one which allows you, among other things, to tell the box what resolution to output (1080i, 720p, 480p, 480i). The Motorola scaler seems to do a great job. I'm not sure of the algorithm/method, but I'm pretty sure it doesn't dice the 1080i field set and then scale/stretch.

    Early plasma TVs that have a native resolution of 852x480 and bad built-in scalers benefit from setting the box to 480p.

    I got lucky, my plasma has an excellent scaler so I left the cable box settings as-is.
  • by interJ ( 653180 ) on Monday May 02, 2005 @01:13PM (#12409244)
    You forgot one result: Your screen flashes black to white at 60fps. This is a result of resizing each 1080i field to 720p and displaying it. While it wouldn't give good results for this example, it would probably look the best for real-world signals, because nothing would be discarded, and there is no good way to deinterlace without showing some artifacts or blurriness.
  • The difference is that with EGA/VGA you had to poll the card, but on the Amiga and even the Atari you actually got notified by an interrupt. When you got the interrupt, you could be reasonably assured that there was no latency between the interrupt trigger and the event, as interrupts ran at such a mega priority. And on those systems you could also specify the memory window versus the display window and scroll by adjusting the memory pointers as well as some shift registers. It just looked really, really good.
  • Re:ED displays (Score:3, Informative)

    by Humba ( 112745 ) * on Monday May 02, 2005 @05:00PM (#12412545)
    The ED vs. HD debate is one of the most heated that you'll find at AVS Forums [avsforum.com].

    Many current, popular 42" plasma sets are either "HD" resolution (typically around 720 lines) or "ED" resolution (480p). No one argues that HD doesn't provide a slightly superior picture for HD content, but many argue that ED is preferable for non-HD (SD) and DVD sources. And the price difference between the two can be dramatic ($2500 vs. $5000).

    For that $ difference, I was willing to compromise. Some purists will make a different choice. I'm very happy with my ED Panasonic plasma, and am hoping that in five years the price of direct-view HD sets will be within my reach.

    --H

  • by MojoStan ( 776183 ) on Monday May 02, 2005 @05:22PM (#12412833)
    A $200 19" CRT computer monitor will display more than 1080p (let alone i) with no problem.

    But does a $200 19" CRT have enough "dots" to display all the pixels in a 1920x1080p picture? (I'm not sure. I really want to know.) My knowledge of display technologies is limited, but I think 19" CRTs in this price range don't have enough "dots" (calculated from dot pitch [howstuffworks.com]) to display all of the pixels and will not give a "true" 1920x1080p picture.

    Example: I've been thinking about getting a Samsung 997DF [samsung.com], which has a max resolution of 1920x1440 (a resolution I'd only use for 1080p video). However, it also has a horizontal dot pitch of 0.20mm and a viewable width of about 14.4" (about 365.76mm). That's about 1829 viewable dots across the screen, which is less than the number of horizontal pixels in a 1920x1080p picture.

    The ability to display 1920x1080p is the biggest reason I'd choose a $200 19" CRT over a $230 17" LCD. However, if the 1080p video is "messed up," then I'd rather get a 17" LCD and just convert everything to 720p.
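
    The dot-count arithmetic above, spelled out as a quick check (Python; the 14.4-inch viewable width is the poster's own estimate):

      viewable_width_mm = 14.4 * 25.4                  # ~365.8 mm
      dot_pitch_mm = 0.20                              # horizontal dot pitch from the spec sheet
      print(round(viewable_width_mm / dot_pitch_mm))   # ~1829 dots, short of 1920 pixels across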
