
When is 720p Not 720p?

Henning Hoffmann writes "HDBlog has an interesting entry about many home theater displays.
Home theater displays around the resolution of 720p (most DLP, LCD, and LCOS displays) must convert 1080i material to their native resolution for display. No surprise there. But many displays do this by discarding half of the 1080i HD signal, effectively giving 720p viewers an SD signal - not watching HD at all! "
  • by Anonymous Coward on Monday May 02, 2005 @11:10AM (#12407538)
    This sounds like the visual version of what Creative Labs has been doing for YEARS with their Sound Blaster audio cards. With most other cards, if you want to record at a sample rate of 44.1 kHz, you record at 44.1 kHz, but even with the newer Sound Blaster cards it must be resampled to 48 kHz first.

    It doesn't matter if you are sampling up or down, resampling is bad; your best bet is to find a device without it, or, if it is necessary like in this case, the one that does the best conversions.

    If I bought one of these displays I would be pretty pissed, but I doubt there is much that can be done about it. If you COULD do something, then companies like Creative Labs would be out of business.
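(Editorial aside: a minimal sketch of why even "simple" resampling changes the data. This is a naive linear-interpolation resampler, not how any actual Sound Blaster hardware works; only the 44.1 kHz and 48 kHz rates are taken from the comment above.)

```python
import math

def linear_resample(x, src_rate, dst_rate):
    """Resample by linear interpolation -- the crudest possible method."""
    n_out = int(len(x) * dst_rate / src_rate)
    out = []
    for i in range(n_out):
        t = i * src_rate / dst_rate          # fractional position in the source
        j = int(t)
        a, b = x[j], x[min(j + 1, len(x) - 1)]
        out.append(a + (b - a) * (t - j))
    return out

# a 1 kHz sine captured at 44.1 kHz, then resampled to 48 kHz
src = [math.sin(2 * math.pi * 1000 * n / 44100) for n in range(441)]
res = linear_resample(src, 44100, 48000)
ideal = [math.sin(2 * math.pi * 1000 * n / 48000) for n in range(len(res))]
err = max(abs(a - b) for a, b in zip(res, ideal))
# err is small but nonzero: the resampled tone is no longer the original signal
```

A better resampler (polyphase filtering) shrinks the error but can never make it exactly zero for non-integer rate ratios.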
    • by Shkuey ( 609361 ) on Monday May 02, 2005 @11:26AM (#12407752)
      Actually this is an issue of giving people what they want. In this case an HDTV that isn't a thousand bucks more expensive and doesn't have a video processing delay.

      The first incorrect thing in the /. post is that this is somehow standard definition. It's not; 540 lines is more than 480. Not only that, but they process 1920 pixels of horizontal resolution (scaled down to 1280 for a 720p display), which is quite a bit more than 640.

      Anyone who is serious about getting the absolute most out of their display will have an external scaler and a device to delay the audio. Frankly as digital display technologies take more of a foothold in the market I'm hoping these interlaced resolutions will become far less common.

      When I first read the headlines I thought they would perhaps talk about 1024x768 plasmas with rectangular pixels being marketed as 720p. That kind of thing is far more blasphemous in my opinion.

      So in summary of TFA: 720p is not 720p when it's 1080i.
    • by Alereon ( 660683 ) on Monday May 02, 2005 @11:51AM (#12408134)
      That's not why the Sound Blaster Live! sucked. Resampling to 48 kHz before DAC conversion is the standard way to operate a DAC. The problem was that the resampler was BROKEN in the Sound Blaster Live! series of cards, resulting in significant, audible distortion in all but 48 kHz sound signals. That, and the DAC just wasn't all that good even taking that into consideration. The drivers were awful too.
  • It's there (Score:5, Funny)

    by Anonymous Coward on Monday May 02, 2005 @11:13AM (#12407585)
    No surprise there. But many displays do this by discarding half of the 1080i HD signal, effectively giving 720p viewers an SD signal - not watching HD at all!

    The HD signal's still there... you just have to learn how to read between the lines.
  • by Anonymous Coward on Monday May 02, 2005 @11:13AM (#12407587)
    Here is tfa for you...

    When is 720p not 720p?

    Tom Norton, in his coverage of the Home Entertainment expo, brought something up that I was unaware of.

    720p displays show native 720p signals directly, of course. They also upconvert SD signals (like DVD) up to 720p for display. And 720p displays must convert incoming 1080i signals to 720p before they can be displayed. No surprise there, this makes sense. But, Silicon Optix claims that most manufacturers do the 1080i conversion just by taking one 540 line field from each 1080i frame (which is composed of two 540 line fields) and scaling that one field up to 720p, ignoring the other field. Reason being, it takes a lot less processing power to do this than to convert the image to 1080p and scale that, which would use all the information in the original signal to derive the 720p signal. If you have a display like this, it means that you're watching 540 lines of resolution upconverted to 720p. This is not HD, just like watching a DVD upconverted to 720p is not HD. Sure, you'll get the full width of the 1080i resolution, but you're only getting half the height. While this is better than DVD, it's not HD in my mind. (Aside: Tom Norton mentions this in his review of the Screenplay 777 projector.)

    If this is indeed the case, most people with 720p (or similar) projectors (and most DLP, LCD, and LCOS home theater projectors are exactly that) are not seeing what their displays are capable of. They're not, technically, even watching HD. This is crazy! How can this be? Why haven't we heard of this before? How are manufacturers getting away with it?

    Over-reacting? Well, if you're an owner of a 720p (or any similar resolution) projector you're either gonna be really upset by this or you're just gonna be laissez-faire about it because there's nothing you can do and you're enjoying your projector just fine, thank you. But me, I don't even own any such projector and I'm a little ticked. But I guess I should really wait for evidence of how properly-done conversion looks in comparison before making any snap judgements. I'm sure that the people selling HQV (a processor chip that does it the RIGHT way) will set something up.

    To me, this is a serious issue. Comments are welcome.

    from: http://www.hdblog.net/ [hdblog.net]
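(Editorial aside: the cheap conversion the article describes can be sketched in a few lines. This is a toy model assuming plain nearest-neighbour row scaling; real scaler chips interpolate more carefully, but the field-dropping step is the point.)

```python
def cheap_1080i_to_720p(frame):
    """Convert a 1080-row frame to 720 rows the way the article describes:
    keep one 540-row field, discard the other, scale 540 -> 720."""
    field = frame[0::2]                        # one field; the other 540 lines are dropped
    # nearest-neighbour vertical scale from 540 rows to 720 rows
    return [field[i * 540 // 720] for i in range(720)]

frame = [("line", i) for i in range(1080)]     # stand-in for 1080 rows of pixels
out = cheap_1080i_to_720p(frame)
# all 720 output rows are built from only the 540 even-numbered source lines
assert all(i % 2 == 0 for (_, i) in out)
```

Every odd-numbered source line vanishes before scaling even begins, which is exactly the "watching 540 lines" complaint.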
    • by Overzeetop ( 214511 ) on Monday May 02, 2005 @11:41AM (#12408002) Journal
      That's all well and good, but I'm afraid I tend to agree with them. If content providers want to "do it right" they should ditch the 1950's interlacing and get with the 1980s.

      He's leaving one step out. 1080i is 540 lines scanned 60 times per second, offset by half a vertical pitch. 720p is 720 lines scanned 30 times per second.

      To try and take two frames which are not occurring at the same instant, stitch them together, remove the motion artifacts, resample, and then display is just plain silly. And fraught with errors, as you are expecting a computer to determine which parts of the motion (over 1/60 of a second) to keep and which to throw away.

      If you wanted high fidelity, you'd spend the money for a 1080p60 system. Then it wouldn't matter. Except that you would complain about the quality, because each frame you see was upsampled from only 540 lines of resolution.

      It all comes back to the fact that the FCC let the industry choose this "18 formats is good" spec.

      Personally, I'm in favor of an olympic standard mayonnaise, but...no...wait...awww hell, I give up.
      • by Phreakiture ( 547094 ) on Monday May 02, 2005 @12:45PM (#12408866) Homepage

        720p is 720 lines scanned at 30 times per second.

        Mostly incorrect.

        There are 18 recognized MPEG stream formats for HDTV.

        • 640x480 24 fps progressive narrow
        • 640x480 30 fps progressive narrow
        • 640x480 30 fps interlaced narrow
        • 640x480 60 fps progressive narrow
        • 704x480 24 fps progressive narrow
        • 704x480 30 fps progressive narrow
        • 704x480 30 fps interlaced narrow
        • 704x480 60 fps progressive narrow
        • 704x480 24 fps progressive wide
        • 704x480 30 fps progressive wide
        • 704x480 30 fps interlaced wide
        • 704x480 60 fps progressive wide
        • 1280x720 24 fps progressive wide
        • 1280x720 30 fps progressive wide
        • 1280x720 60 fps progressive wide
        • 1920x1080 24 fps progressive wide
        • 1920x1080 30 fps progressive wide
        • 1920x1080 30 fps interlaced wide

        In presenting these on a monitor, your receiver/set-top box/whatever is supposed to convert them to a format that your monitor can handle. This will typically be one of these four:

        • 480 row 60Hz interlaced
        • 480 row 60Hz progressive
        • 720 row 60Hz progressive
        • 1080 row 60Hz interlaced
          It is noteworthy, though, that some videophile monitors can handle, and some set-top boxes deliver, 1080 row 60Hz progressive.

          As for the presence/absence of interlacing, I agree that it is very bad to use interlacing at the stream level. This should be eliminated. I would make an exception for the 480 modes, because the material may have been originally captured on NTSC videotape, in which case some sort of conversion would have to take place to get a progressive image, and I feel very strongly that conversions should never be done for broadcast unless absolutely necessary (as when showing PAL/SECAM native material).

          On the other hand, at the monitor level, if you have an interlaced monitor, I don't think that is a major issue. In 1080 mode, the best picture that can be sent is the 30fps progressive stream. This can be interlaced for presentation on a CRT.

          Now, someone commented that CRTs are dead. Not if you have a budget, they're not! I've owned an HD set for over three years now, and it only ran me $700. It is a CRT. It has a beautiful picture.

          Further, I would put forth that CRTs, in addition to being significantly cheaper than the alternatives, also put out a better picture than LCD (view from any angle; accurate color rendition; no lag), are less susceptible to burn-in than plasma (which will be killed by network bugs) and do not exhibit the rainbow effect of DLP (which, in fairness, is not really all that bad). Their major failings are their physical size and power consumption.

    • Good sources of HD, such as the Motorola 6200-series HD set-top cable box, will let you choose the output conversion before the signal leaves the box. That way, your TV will recognize the signal as native and not do a scale/conversion.

      As an example, if you own the above box, or the PVR box, (both are silver and provided by Comcast), do the following:

      Turn off your cable box, then press 'setup' on the remote. You'll get a different config screen, one which allows you, among other things, to tell the box what '
  • Resampling (Score:5, Insightful)

    by FlyByPC ( 841016 ) on Monday May 02, 2005 @11:14AM (#12407592) Homepage
    There's got to be a fairly straightforward formula for the inherent resolution loss when performing any noninteger upsampling, or any downsampling. Any other change in resolution must necessarily degrade the signal, yes? (Except perhaps if a clever algorithm could losslessly encode the original data in a 1.5x-upsampled version, without distorting it.)
    • And when you use it to upsample data, it is a lossless encoding that doesn't degrade the signal (unless you deliberately throw away data - discrete Fourier transforms are also used in lossy encoders).

      It's not a distortion-free transform, since high frequency signals (e.g. sharp edges) in the original image get interpreted as smooth changes and can get blurred between multiple pixels in an upsampled signal. But then again, that's exactly the sort of thing that happens when you digitize a picture in the first place - if you have a sharp black/white edge that passes through the middle of a pixel, the most accurate thing you can do is make that pixel gray.
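(Editorial aside: the parent's lossless-upsampling claim can be demonstrated with a toy DFT sketch. Zero-padding the spectrum reproduces a band-limited signal exactly at the new sample points; this example deliberately uses a tone with no energy in the Nyquist bin, which would otherwise need special handling.)

```python
import cmath

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def fourier_upsample(x, factor):
    """Upsample by zero-padding between the positive and negative frequencies."""
    N, X = len(x), dft(x)
    M = N * factor
    Y = [0j] * M
    for k in range(N // 2):            # positive frequencies stay at the bottom
        Y[k] = X[k]
    for k in range(N // 2, N):         # negative frequencies move to the top
        Y[M - (N - k)] = X[k]
    return [factor * v.real for v in idft(Y)]

coarse = [cmath.cos(2 * cmath.pi * n / 8).real for n in range(8)]
fine = fourier_upsample(coarse, 2)
# fine[n] matches cos(2*pi*n/16) at all 16 points: nothing was lost or invented
```

Downsampling has no such guarantee: once you drop samples below the signal's bandwidth, the information is gone.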
      • by Anonymous Coward
        Agreed it's lossless.

        Disagree with second paragraph. Upsampling should have more Gibbs ringing around the discontinuities in the image. This is why one introduces a Hanning or Hamming window during sinc interpolation. Whether that manifests itself as a blurring depends on the context of the discontinuity within the image. Upsampling via FT is not the same as linear (or even simple non-linear) weighting during digitization.
    • by G4from128k ( 686170 ) on Monday May 02, 2005 @11:32AM (#12407862)
      There's got to be a fairly straightforward formula relating inherent resolution loss when performing any noninteger upsampling, or any downsampling.

      It's a bit messy. Imagine a 1080i image with a 1-pixel-wide sloping black line that is nearly horizontal on a white background. If you throw out half the data, you create an image with a dashed line. Gaps in the line occur where the slanting line cut across the rows that were discarded. If you upsample from 540 to 720, you will find that the remaining dashes become fattened non-uniformly. In places where a row in the 720-row image falls directly on top of a row in the 540-row image, the line will be thin and dark. In places where the 720-row image falls midway between rows in the 540-row image, the line will be wide and less dark. The end result is that the thin, uniform black line is converted to a dashed line of varying thickness and darkness -- not pretty.

      Even if you resample directly from 1080 to 720, you still run into problems where the 720-row image pixels fall between the 1080-row pixels. At best, you can use higher-order interpolation (e.g. cubic) to try and fit a curve through the original data and estimate what was in the middle of the pixels so they can be shifted halfway over. But the result will never look like an image that was taken with a 720-row camera in the first place.
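(Editorial aside: the non-uniform fattening follows directly from the 720/540 = 4/3 row ratio. A quick check, assuming plain nearest-neighbour scaling:)

```python
from collections import Counter

# nearest-neighbour map from each of 720 output rows back to a 540-row source
src_row = [i * 540 // 720 for i in range(720)]
counts = Counter(src_row)
# 720/540 = 4/3: in every group of four output rows, one source row is doubled.
# A 1-pixel line landing on a doubled row renders fat; on the others, thin.
assert sorted(set(counts.values())) == [1, 2]
assert sum(1 for c in counts.values() if c == 2) == 180   # every third source row
```

Higher-order interpolation spreads this error out instead of concentrating it, but the 540 source rows still cannot pin down 720 independent output rows.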
    • Re:Resampling (Score:3, Interesting)

      by gyg ( 785962 )
      The only effective limitation comes from linear algebra - there are only as many degrees of freedom as there are pixels, so if you downsample, you *always* lose data, like it or not.

      However, even this is not a problem in practice since in real-world pictures nearby pixels are not independent. By using an appropriate encoding dictionary such as wavelets, which zoom in on sharp edges and economize on flat surfaces, you can shrink a typical picture by something like 90% without visible quality loss.

      Now since
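(Editorial aside: the "zoom in on sharp edges, economize on flat surfaces" behaviour shows up even in a one-level Haar transform, the simplest wavelet. Toy example, with made-up pixel values:)

```python
def haar_step(x):
    """One level of the Haar wavelet transform: pairwise averages and details."""
    avg  = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    diff = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    return avg, diff

# a flat dark region, one sharp edge, then a flat bright region
signal = [10.0] * 7 + [50.0] * 9
avg, diff = haar_step(signal)
# only the pair straddling the edge carries a nonzero detail coefficient;
# the flat regions compress to zeros, which is where the ~90% savings come from
assert sum(1 for d in diff if d != 0) == 1
```

Real codecs recurse this over many levels and then entropy-code the mostly-zero detail coefficients.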
  • Which Models? (Score:5, Interesting)

    by goldspider ( 445116 ) <ardrake79NO@SPAMgmail.com> on Monday May 02, 2005 @11:14AM (#12407595) Homepage
    Is there any way of telling which sets do this? This is certainly something I'd like to know before I dropped that kind of cash.
    • Re:Which Models? (Score:5, Interesting)

      by David Leppik ( 158017 ) on Monday May 02, 2005 @11:39AM (#12407965) Homepage
      Is there any way of telling which sets do this? This is certainly something I'd like to know before I dropped that kind of cash.
      Yes. Go to a showroom and look at the displays. If you see some that have greater vertical resolution than the non-HD models, there you go. If you can't see a difference, then it doesn't make a difference.

      If there is a difference you can't see but could learn to see, don't learn; it will not bring you joy, it will only make you miserable or annoying. Long ago I learned to see the DCT distortion in JPEG and MPEG images. Has it made me happy? No. I end up making the JPEGs on my website bigger than everyone else's so I won't see wrinkles on people's faces that are apparently invisible to everyone else. And I can't stand to watch satellite television on a big screen TV because of the annoying compression artifacts.

      • by Total_Wimp ( 564548 ) on Monday May 02, 2005 @12:16PM (#12408478)
        Yes. Go to a showroom and look at the displays. If you see some that have greater vertical resolution than the non-HD models, there you go. If you can't see a difference, then it doesn't make a difference.

        This is not a solution.

        At Best Buy and Circuit City I've seen lots of SD signals on HD displays. How on earth am I going to know if it's the set or the signal that's producing all those jaggies? Ask? At Best Buy? I might as well ask them to build a moon rocket while they're at it.

        Knowing the stats won't necessarily guarantee you a better picture, but it is a better place to start.

        • Re:Which Models? (Score:3, Insightful)

          by JohnsonWax ( 195390 )
          At Best Buy and Circuit City I've seen lots of SD signals on HD displays. How on earth am I going to know if it's the set or the signal that's producing all those jaggies? Ask? At Best Buy? I might as well ask them to build a moon rocket while they're at it.

          So don't go to Best Buy or Circuit City to evaluate your monitors! Go to a high-end video shop to evaluate your monitors and then go to Best Buy or Circuit City to buy them if the prices are really that much better.

          Seriously, do you want to solve the
      • Re:Which Models? (Score:4, Insightful)

        by twelveinchbrain ( 312326 ) on Monday May 02, 2005 @12:21PM (#12408548)
        Yes. Go to a showroom and look at the displays. If you see some that have greater vertical resolution than the non-HD models, there you go. If you can't see a difference, then it doesn't make a difference.

        That's not necessarily good advice. At the showroom, everything you see is optimized for selling the display. You might not notice any problems until you start to view content outside of their controlled environment.
    • Re:Which Models? (Score:3, Informative)

      by Shkuey ( 609361 )
      If you're buying a 720p (or close to 720p as many are slightly off) set that is less than ~3,000 USD the answer is easy: All of them. If you're in the market for a high end unit then it is a two-step process. You need to find what kind of scaler is built-in, commonly you'll find third party scalers like Faroudja, and then research the scaler to determine how it functions. Unfortunately some manufacturers (Sony!) build their own scalers and they're probably not going to tell you how it works.
      • Re:Which Models? (Score:3, Insightful)

        by maraist ( 68387 ) *
        If you're buying a 720p (or close to 720p as many are slightly off) set that is less than ~3,000 USD the answer is easy: All of them.

        The problem, though, is that it's BS that the cost of the processing affects the price of a multi-thousand-dollar system. I can't imagine that a simple $100 PC video card couldn't be stripped to its bare chips and used as the filter between whatever proprietary system they use and the input signal processor. It can't possibly cost more than $500 to push an off-
  • by Rectal Prolapse ( 32159 ) on Monday May 02, 2005 @11:16AM (#12407620)
    A Home Theater PC with good quality parts, drivers, and decoders will preserve the 1080i signal - it will combine the 1080i field pair into a single 1080p signal, and then downconvert (ie. downscale) to 720p.

    As a reference, my Athlon XP running at 2.4 GHz (approximately equivalent to an Athlon XP 3400+) with a GeForce 6800GT and TheaterTek 2.1 software has little trouble achieving this, assuming the 1080i source isn't glitchy itself.

    An alternative is to use the NVIDIA DVD Decoder ($20 US after 30-day trial) and ZoomPlayer 4.5 beta ($20, beta or nagware) for similar results.

    TheaterTek is roughly $70 US and includes updated NVIDIA DVD decoders - too bad NVIDIA hasn't updated their official DVD decoders with the bugfixes that are present in the TheaterTek package.
    • I don't care what you do with the signal, DVDs only have SD information encoded on them (480i). So, snaps to you if you're watching HDTV via your HTPC, but now why the hell are you sending everything 1080i? Do you have a 1080i native display? I'll bet you don't - cause they're still pretty damned expensive.

      OK, then let's look at your DVD signal path. 480i converted to 1080i then sent to your display that converts it to 720p?? Two resolution conversions - and the article states that the second one may
      • I should also mention that TheaterTek, NVIDIA DVD Decoders, ZoomPlayer, etc. are not limited to SD resolution DVDs - they can play recorded 1080i content encoded in mpeg2 transport or program stream files.

        If you have Microsoft's Media Center Edition 2005, you can specify the NVIDIA DVD Decoder (or other competent mpeg2 decoder, such as Elecard's or WinDVD's) for all mpeg2 content including HDTV.
      • DVDs only have SD information encoded on them (480i).

        DVDs actually often contain ED (480p), not SD (480i) material. That's why there's a benefit to hooking up your DVD player to your TV's progressive inputs (if it has them).

    • it will combine the 1080i field pair into a single 1080p signal

      This will, of course, suck.

      There are two different ways to get 1080i material. You can either shoot some other format at 24 frames per second and convert it to 1080i, or you can shoot 1080i.

      If you shoot film or 1080/24 at 24 frames per second and convert to 1080i, there are a set of well-defined tricks you can use. These tricks are collectively referred to as "pulldown." It's possible to remove pulldown, which is great ... except these autom
  • by t_allardyce ( 48447 ) on Monday May 02, 2005 @11:17AM (#12407630) Journal
    When the un-washed masses can't actually tell the difference (they can't even see DCT blocking) and you can get away with selling this crap to them..
    • Which can also be the only explanation for why anyone would try to encode HD content on today's DVDs.

      I can't see how the DCT artifacts wouldn't be even more dominant than on today's DVDs (or DVD rips).

    • In a fairly recent demo of HDTV one of the manufacturers demonstrated 1080i on a 720p display and even the washed elite of the press largely failed to notice. So what chance do the unwashed masses have?
  • by ... James ... ( 33917 ) on Monday May 02, 2005 @11:17AM (#12407637)
    I have a 720p projector paired with a 110" screen. Both 720p and 1080i material look fantastic. Maybe the supposed degradation would be visible side-by-side with a native resolution projector, but I certainly wouldn't worry about it based on what I've been watching.
    • by mcg1969 ( 237263 ) on Monday May 02, 2005 @12:24PM (#12408597)
      Yep, I have the same thing. The reason, I believe, is because even dropping down to 540p before upscaling to 720p leaves you a LOT more information than standard NTSC. First of all, NTSC is 480i, not 480p---so really its resolution is somewhere between 240p-480p due to the losses incurred by interlacing. And secondly, 1080i content has only 540 lines of chroma resolution anyway; you're really only losing luminance information. (Not to minimize that---luminance is the most important component---it's just that we're losing less than you might initially think.)

      If you've ever seen high-def on a 480p EDTV plasma, you'll understand just how superior the picture STILL is compared to 480i NTSC.

      Nevertheless, true 1080p deinterlacing is coming down the pike right now. Faroudja, SiliconOptix, and Gennum have all created solutions, and we should begin seeing them in external video processors and displays soon.
  • by AtariDatacenter ( 31657 ) on Monday May 02, 2005 @11:18AM (#12407647)
    The article says: Sure, you'll get the full width of the 1080i resolution, but you're only getting half the height.

    Except your 720p display will hopefully have a horizontal resolution of 1280. 1080i video has a horizontal resolution of 1920. So you're keeping half of the vertical (1080 lines down to 540) and you're keeping 2/3rds of the horizontal (1920 down to 1280).
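(Editorial aside: taking the parent's numbers at face value, the surviving fraction of pixels works out to exactly one third:)

```python
full_1080 = 1920 * 1080        # pixels in one full 1080-line frame
kept      = 1280 * 540         # half the height, two-thirds of the width
fraction = kept / full_1080
# one third of the original pixels survive the cheap conversion path
assert abs(fraction - 1 / 3) < 1e-9
```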

    • From my lurking on HDTV enthusiast sites, sometimes the broadcaster will take DVD content (480i) and upconvert to 1080i! It's a terrible practice.

      And in other instances, the broadcaster will not use the full resolution - what looks like 1920x1080i may actually be an upconvert of 1280x1080i, 1440x1080i, or 1280x720! And then there is the overcompression - taking a 20 Mb/sec MPEG-2 stream and cutting the bitrate in half - compression artifacts galore.

      It is sad when HDTV programming available in North America
      • The truth is that a broadcast quality SD source (say from DigiBeta) is still of much higher quality than what you find on an analog NTSC signal (NTSC has a limited color space and its own resolution limits).

        So most broadcast SD material upconverted to HD resolution still looks better than had the material been broadcast in analog NTSC.
  • Good Ol' CRT (Score:5, Insightful)

    by goldspider ( 445116 ) <ardrake79NO@SPAMgmail.com> on Monday May 02, 2005 @11:18AM (#12407651) Homepage
    When I upgrade to an HD idiot box, I plan on sticking with tried-and-true CRT. IMHO, you can't beat the picture quality/price, and I have yet to hear a compelling reason to fork out thousands of dollars for the trendier offerings.
    • I have yet to hear a compelling reason to fork out thousands of dollars for the trendier offerings.

      The trendier offerings sell to first-adopters and very rich people: those in the first group get their kicks inviting friends home to hear them go "ooh..ahh..wow", not really out of the better quality, and the second group just doesn't care about the price.

      When the early adopters are done early-adopting, then it gets affordable for people with regular lives, like you and me.
    • Re:Good Ol' CRT (Score:3, Insightful)

      by GatorMan ( 70959 )
      You must live on the first floor and I also assume you don't relocate often. Try getting a 215 lb. 36" CRT HDTV up a couple flights of stairs. Even with the home delivery options you're lucky if the thugs don't damage your set, your building, or both.
    • The reason to fork over thousands of dollars for trendier offerings? Picture size.

      The reason Plasma/LCD are on the market wasn't because they were selected for image quality. Quite a ways from it. Instead, they were able to scale to sizes greater than 40".

      Tube based HDTVs are only manufactured to about 36" today. And they are an awesome value for what you get... they've got an excellent image quality. The problem is that to fully resolve a 1080i image with your eye, you've got to be sitting pretty close t
    • Re:Good Ol' CRT (Score:4, Insightful)

      by The-Bus ( 138060 ) on Monday May 02, 2005 @11:47AM (#12408074)
      Well, not sure what you mean by CRT but I would say the compelling reason is size. CRTs can only be so big. If you want to go bigger, you can go with what I call the "MTV Cribs" TVs, plasma/LCD, etc. or you can go with a quality RPTV or a projector. I have yet to see a plasma or LCD that has a better quality picture than a decent RPTV.
  • Early adopters often get slapped in the face. I've been thinking about buying an hdtv for a long time. I'm really glad I read this before I bought one.
  • by nathanmace ( 839928 ) on Monday May 02, 2005 @11:19AM (#12407667)
    This is what you get when you buy a "major" appliance without doing your research first. I know if I was planning on dropping anywhere from $700 to over $1,000 on something, I would be sure to find out everything about it so I could make an informed decision. If someone didn't do that, then they got what they deserved.

    That said, I'm sure there are a lot of people out there who "don't care". It works for them, and that is all they care about.
    • The problem is, how do you do the research? The audio/video publications out there have not even come close to adopting a standard set of measurements that would quantify the performance of processors that need to perform complex tasks like scaling, 3:2 pulldown, etc. The results from different chipsets are all over the map (chroma key errors, cheats, lame algorithms), and it's rare to be able to get any information at all on new products. You just have to wait 6 months until someone that actually knows
    • by Zed2K ( 313037 ) on Monday May 02, 2005 @11:43AM (#12408016)
      Kind of hard to "do your research" when you can't find out how the conversion takes place or can't understand how it all works. This is not a consumers fault kind of thing. This kind of information is not made known unless people ask the question. Who would have thought to ask a question like this?
    • This is what you get when you buy a "major" appliance without doing your research first.

      Yeah, that's what I thought too, except that my research is usually about 2x better than the QA of most companies today. I've found that it's best to pay a premium or do without, and when paying a premium, have some integrator or store stand behind the shitty QA of the products. Even then, it's a pain in the ass, but it certainly beats buying something from a company like Newegg and paying a 15% restocking fee for them
  • by node 3 ( 115640 ) on Monday May 02, 2005 @11:20AM (#12407678)
    If the broadcast is 1080i, and your display isn't 1080i, I don't think it's logical to assume the quality of the downsampled video will be equivalent to a true 720p broadcast.

    When I get around to buying a HD television (not any time soon, I do all my televisioning on my computer), it will be a true 1080i (are there 1080p televisions?) display so I'll know I'm getting the full potential of HD.

    Unless I'm strapped for cash, of course, in which case I'll just suck it up and know my 720p won't be the best thing for watching 1080i content on.

    On the plus side, it's important to get the facts out there for the consumer, who will likely (although not logically) assume they're getting more than they really are.
  • Hey, Bloggers... (Score:5, Insightful)

    by Gruneun ( 261463 ) on Monday May 02, 2005 @11:20AM (#12407681)
    Why not submit a link to the original article, rather than a link to your blog, which consists only of a link to the original article?

    Otherwise, people might assume this is a shameless attempt to draw traffic to your site.
  • Which set-top (cable/satellite) receivers are doing that same stingy conversion when you've told them to output 720p?

    Anyone with any real regard for picture quality, and whose equipment leaves them the choice, has probably evaluated it under both configurations anyway.
  • This is the reason I bought a 52" rear-projection set rather than LCD/plasma or whatever. That, and it was 3000 bucks cheaper and had a better picture.

  • by Phearless Phred ( 67414 ) on Monday May 02, 2005 @11:26AM (#12407763)
    Ok, here's the skinny. 1080i is 1920x1080 @ 59.94 fields/second, meaning at any one instant you're looking at a 1920x540 image made up of every other line of the picture (the odd field, if you will.) Then, ~1/60th of a second later, you see the even field. 720p is 1280x720 @ 60 FRAMES per second, meaning at any given instant you're looking at EVERY line of the image...not just the odd or even lines. If you were to try and take all 1080 lines from the original signal, they wouldn't map properly to 720 at any given instant, because half of the data would be from ~1/60th of a second later. Scaling the fields up is really the best way to go, at least for stuff that's been shot interlaced.
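(Editorial aside: a toy illustration of why naively weaving the two fields of a moving scene produces the "combing" artifact this comment alludes to. The object position and per-field motion are made-up numbers:)

```python
def make_field(x_pos, lines=4, width=32):
    """A field whose every line has a single bright pixel at x_pos."""
    return [[1 if x == x_pos else 0 for x in range(width)] for _ in range(lines)]

even = make_field(8)       # captured at time t
odd  = make_field(16)      # captured ~1/60 s later; the object has moved
woven = []
for e, o in zip(even, odd):
    woven.extend([e, o])   # interleave the two fields into one "frame"

positions = [row.index(1) for row in woven]
# adjacent lines disagree about where the object is: classic combing
assert positions == [8, 16, 8, 16, 8, 16, 8, 16]
```

Scaling each field up on its own (as the comment suggests) trades this temporal mismatch for some vertical resolution, which is the whole debate in TFA.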
  • ED displays (Score:3, Interesting)

    by justforaday ( 560408 ) on Monday May 02, 2005 @11:27AM (#12407783)
    I haven't had a chance to read through the full article/blog/whatever yet (I'll do that at lunch), but this sounds like something I noticed over the weekend while browsing the Best Buy site. Companies are now producing ED-compatible TVs. They list all sorts of compatible display modes (1080i, 720p, 480p, etc), but then mention that they downscale them for display on the TV. Is this just some way of offering half-assed support to unsuspecting consumers?
    • Having read TFA, I see that it's not even the same thing at all. I still think that these ED displays are a load of crap though...
    • Re:ED displays (Score:3, Informative)

      by Humba ( 112745 ) *
      The ED vs. HD debate is one of the most heated that you'll find at AVS Forums [avsforum.com].

      Many current, popular 42" plasma sets are either "HD" resolution (typically around 720 lines) or "ED" resolution (480p). No one argues that HD doesn't provide a slightly superior picture for HD content, but many argue that ED is preferable for non-HD (SD) and DVD sources. And the price difference between the two can be dramatic ($2500 vs. $5000).

      For that $ difference, I was willing to compromise. Some purists will make

  • only HD is HD (Score:2, Interesting)

    by Anonymous Coward
    Manufacturers had better refrain from selling displays that aren't HD-capable as HD displays. This is clearly false advertising, and there have been several successful lawsuits lately where people who were stupid enough to buy into this got their money back.
  • When stations, at least in the DC area, broadcast a 720p signal it's considered a big deal. Most broadcast 480p to utilize existing bandwidth for broadcast of two or more stations.
  • by chrisbolt ( 11273 ) on Monday May 02, 2005 @11:36AM (#12407924) Homepage
    720p displays show native 720p signals directly, of course. They also upconvert SD signals (like DVD) up to 720p for display. And 720p displays must convert incoming 1080i signals to 720p before they can be displayed. No surprise there, this makes sense. But, Silicon Optix claims that most manufacturers do the 1080i conversion just by taking one 540 line field from each 1080i frame (which is composed of two 540 line fields) and scaling that one field up to 720p, ignoring the other field.
    Then later on, he says "Tom Norton mentions this in his review of the Screenplay 777 projector [ultimateavmag.com]." The problem is, in the Screenplay 777 article, it's explained a bit differently:
    For conversion to the native 720p resolution of the projector, 1080i material undergoes an interesting process. Consider each 1080i frame as two 540p fields. Each of these 540p fields is converted to the 720p native resolution of the 777. These upscaled fields are then displayed sequentially. One could, I suppose, make the argument that this potentially reduces or eliminates the temporal (motion) problems inherent in 1080i material. One could also make the argument that it eliminates any spatial (static) resolution advantages that 1080i has over 720p--or indeed over even native 720p sources.
    So he's off, in that instead of it taking one 540 line field from each 1080i frame composed of two fields, it's taking both 540 line fields from each 1080i frame. Not as good as deinterlacing to get 1080p, then resizing that to 720p, but not as bad as the guy makes it sound. Not to mention most rear projection CRTs will display 1080i, but can't display 720p... what happens there?
  • 720p operates at 60 frames (60 full frames) per second. 1080i operates at 60 fields (30 full frames) per second.

    If they convert each individual 1080i frame (1920x540) to 720p (1280x720) then they are not tossing any fields (which seems to be the problem). So, if they are converting 1080i@60 fields to 720p@60 frames, then there is no problem here. If, however, they are converting to 720p@30 frames, then they are tossing half the fields from 1080i and we have a problem. All depends on how the conversion
  • WTF would prefer the lousy flickering interlaced picture just because of the higher resolution?

    It's a big enough shame that this crap has found its way into the HDTV specs, but why would anyone actually use it?!?

  • by GoRK ( 10018 ) on Monday May 02, 2005 @11:43AM (#12408018) Homepage Journal
    This is not particularly news. Some "blogger" discovers something because he never bothered to ask, and screams that the sky is falling. I'm kind of sick of this "news" reporting. Incidentally, this same issue affects owners of most plasma and LCD TVs with native resolutions below 1920x1080 too... depending on whether you look at it as a problem or not.

    Anyway, it's fairly well known that the internal scalers in many devices suck. That is why there is a market for good external scalers. If you are paranoid about watching a lot of 1080i on your 720p projector or LCD TV or Plasma, go buy a scaler. They cost about $1000 but will improve scaled display a lot.

    At least if you have an external scaler you will have some options for how you convert 1080i to 720p. The article makes it sound like splitting the fields is a huge sin -- and it is if you discard one field per frame (half-field deinterlacing), but it's perfectly acceptable to scale EACH 540-line field to a separate 720-line frame and double the framerate. This is called bob deinterlacing and is often the best way to convert 1080i video to lower resolutions. If you are watching a 1080i upconvert of a film or something, though, you can have the scaler do full-field deinterlacing and inverse telecine for you and see a nice 720p/24fps picture. Scalers also generally have internal audio delays for various types of audio feeds, so you won't have to worry about AV sync issues either.

    If you have any questions about how your device does this, you should try to find out before you buy it. Most devices don't publish how they do it, though, so your only option may be to derive it -- and that will not be an easy job.
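The half-field vs. bob distinction described above is easy to see in code. Here is a minimal sketch (plain Python, hypothetical list-of-rows frames; a real scaler interpolates rather than duplicating lines) of bob deinterlacing, where every 540-line field becomes its own 720-line frame, so nothing is thrown away and the frame rate doubles:

```python
def scale_field(field, out_lines=720):
    """Vertically scale one field to out_lines rows (nearest neighbour;
    a real scaler would interpolate, but the line mapping is the same)."""
    n = len(field)
    return [field[i * n // out_lines] for i in range(out_lines)]

def bob_deinterlace(interlaced):
    """Bob deinterlacing: emit each field as its own progressive frame.

    30 interlaced frames/s (60 fields/s) in -> 60 progressive frames/s out.
    Nothing is discarded -- unlike half-field deinterlacing, which would
    keep only one field per frame and drop the other entirely.
    """
    out = []
    for frame in interlaced:
        for field in (frame[0::2], frame[1::2]):  # odd field, even field
            out.append(scale_field(field))
    return out

src = [[[y] for y in range(1080)] for _ in range(30)]  # 1 s of 1080i
prog = bob_deinterlace(src)
print(len(prog), len(prog[0]))   # 60 720
```

The half-field variant the article complains about would be the same loop keeping only `frame[0::2]`, halving both the temporal and vertical information.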
  • gee, am I surprised (Score:2, Insightful)

    by lexcyber ( 133454 )
    As usual when we talk quality, in any discussion it all boils down to: You get what you pay for!

    Buy crap equipment and you will get crap.

  • by Myriad ( 89793 ) <[moc.dosbeht] [ta] [dairym]> on Monday May 02, 2005 @11:44AM (#12408036) Homepage
    While this is an interesting issue, and one I hadn't heard of before, it's only one of many mines in the field of HD.

    If you're looking to get into HD there are a *lot* of little quirks to take into account, such as:
    - Offically there are two HD resolutions, 720p and 1080i
    - Most HD TVs are only capable of *one* of these resolutions. So you have to choose, 720p OR 1080i, in most cases. If you want one that can do both, check very carefully... forget DLP or LCD based devices (fixed set of pixels, so fixed resolution); CRT only.
    - Many HDTV's will *not* convert from one format to another. They accept only their native resolution.
    - Different networks broadcast using one standard or the other. For example CBS uses 1080i and ABC 720p IIRC. Fox is way behind in HDTV support.
    - Most HDTV receivers can handle either a 720p or 1080i signal and will convert as required for your TV's native resolution.
    - Some TV providers only support one format, regardless of the source material. Ie, in Canada Starchoice only broadcasts in 1080i. Any 720p content they have they upconvert to 1080i before broadcasting. It's impossible to receive a native 720p signal from them.
    - The Xbox supports both HDTV modes... but very few HD games actually use 1080i (Dragons Lair being one). Most are 720p. So if this is important to you, you'll possibly want a 720p native TV: most receivers do not have HD inputs that would let you upconvert a 720p game to a 1080i signal for the TV. (the new Xbox will have more HD content than the current one, but it's a good bet that they'll be mostly 720p titles)
    - Most Projectors and Plasma's are *not* HDTV. They are EDTV (enhanced definition) or some such. Check the specs carefully.
    - Most projectors are 1024x768. This means your HD signal of 1920x1080i or 1280x720p is being heavily rescaled horizontally! Few projectors have a true HD native resolution.

    So there you go... lots of fun things to take into account!

    Blockwars [blockwars.com]: free, multiplayer, head to head Tetris like game

    • - Most projectors are 1024x768. This means your HD signal of 1920x1080i or 1280x720p is being heavily rescaled horizontally! Few projectors have a true HD native resolution.

      I have a Mitsubishi X390U linked up at home - it's your typical 1024x768 resolution. I've got the comcast HD box linked up to a TiVO (SD only) and (HD feed) directly into the amp's component inputs. The result is that I can switch between HD & SD at the flick of a button. It all gets projected onto an 8' screen.

      The difference betw
  • I work on this stuff, every day... all day.
    Only cheap projectors or displays have a maximum res of 720p.
    I don't see many of those anyhow.
    But yes, on those displays the signal is downconverted by chopping out half of it.

    However, these displays are not popular anyhow.
    Some of the most popular displays still can't display native HD, but they can still display either XGA or SXGA with no problem (we're getting pretty close to HD at this point).

    Don't buy a cheap projector, and you won't get a cheap display. You get what you pay for.
  • by Stavr0 ( 35032 ) on Monday May 02, 2005 @11:55AM (#12408192) Homepage Journal
    Simply provide a 1080i test pattern made of alternating horizontal lines, where the left half is black lines on even scanlines and white lines on odd scanlines, and the right half is reversed (white even, black odd).

    The results may be one of the following:

    - A screen full of tiny, shimmering horizontal lines that shift in the center of your screen: congratulations! Your HT gear is showing a true 1080i picture.
    - A full screen of gray, possibly with a line in the center: not bad, your gear is properly downscaling the signal.
    - Half the screen black, the other half white: uh oh. Your gear is taking the easy way out and dropping half the scanlines to downconvert (Bele and Lokai).
    I call that the Cheron Test.
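A sketch of that test pattern in plain Python (0 = black, 1 = white, hypothetical frame-as-list-of-rows model), plus a simulation of the lazy field-dropping converter it is designed to catch:

```python
def cheron_pattern(width=1920, height=1080):
    """Left half: black on even scanlines, white on odd.
    Right half: reversed. 0 = black, 1 = white."""
    half = width // 2
    rows = []
    for y in range(height):
        left = y % 2          # white on odd lines, black on even
        right = 1 - left
        rows.append([left] * half + [right] * half)
    return rows

def drop_odd_field(pattern):
    """Simulate a lazy converter that keeps only the even-line field."""
    return pattern[0::2]

pat = cheron_pattern()
lazy = drop_odd_field(pat)
# Every surviving row is identical (left half black, right half white):
# exactly the "half black / half white" failure case described above.
print(all(row == lazy[0] for row in lazy))   # True
```

Gear that keeps both fields still sees alternating rows and so renders shimmer (true 1080i) or averages them to gray (proper downscaling); only the field-dropper collapses to the solid halves.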
  • by gblues ( 90260 ) on Monday May 02, 2005 @12:04PM (#12408323)
    My goodness, there are a lot of idiots out there.

    The whiners in TFA mistakenly assume that 2 fields of 1080i = 1 frame of 1080p. This is WRONG, WRONG, WRONG.

    It cannot be assumed that the following field has anything to do with the current one. See the "not resized or deinterlaced" picture here:

    http://www.100fps.com/ [100fps.com]

    When the television takes the 540 lines in a given field, interpolates the missing lines, and scales to 720 lines, it is DOING THE RIGHT THING. Otherwise your TV would look like the first two example pictures at the above site.

  • Look at the Source (Score:3, Informative)

    by cheinonen ( 318646 ) <cheinonen@hoCOLAtmail.com minus caffeine> on Monday May 02, 2005 @12:11PM (#12408417)
    The company saying this is Silicon Optix, who makes the Realta scaler chip (featured in the $3,500 Denon 5910 DVD player), and is obviously trying to get people to use their chip, or buy a scaler based on their chip. Now, from the reviews I've read of the 5910, it's the best reasonably priced (under $10k) scaler on the market. It's amazing. However, when they say "Almost all other companies throw away this extra data and do it the lazy way" without naming the companies, I believe them as much as I believe those recent Apple benchmarks.

    I'm almost sure their scaler will help with most sources you feed your 720p HDTV; what it can do with 480i DVDs is impressive enough to make that believable. However, I doubt the problem is as bad as they say it is. Also, 1080p DLP sets are going to start hitting the market soon, and in a couple of years 720p will probably have been mostly pushed out of the market. Given what a scaler costs, I'd probably save my money to get the 1080p set in a couple years, since the 720p sets still look great.

    I have a 1080i set, but I considered a 720p DLP set since they looked amazing and only didn't because of cost.
    • Why can't we use our computers to scale images? I have a 720p projector (Sanyo Z2), and I scale my movies up to 1280x720 on the fly. You need a bit of processor power, but it's much cheaper than the scalers that are mentioned in your post. In addition, I'd assume that I can buy newer software packages and 'upgrade' my algorithms more easily than selling my scaler and buying a new one...
  • Easy Fix (Score:3, Informative)

    by KrackHouse ( 628313 ) on Monday May 02, 2005 @12:20PM (#12408537) Homepage
    You can usually go into your cable/sat box and tell it what to output. If you de-select 1080i as an output format, it will do the converting to 720p instead of the TV, which should give better results. It's amazing how many times I've seen the cable guy screw up HD installs. They'll have both the TV and the cable box stretching the picture, which cuts off the edges.

    If you're a real quality nut like me then get a tube based HDTV, they can actually get close to doing 1080i.
  • by StandardCell ( 589682 ) on Monday May 02, 2005 @12:32PM (#12408694)
    Did anyone notice that, despite the accusation, Silicon Optix calls out NONE of their competitors for this supposed "issue"?

    Gennum, Pixelworks, Genesis, Oplus (now Intel), and several others make their own scaler/deinterlacer chips. Most of these have already found their way into displays and have proper deinterlacing strategies in them. Nobody scales without deinterlacing first anyway in a modern image processor.

    Silicon Optix's technology is based on offline film processors by Teranex. While they can certainly be high quality, they aren't the top of the heap either by volume or by prestige. Genesis/Faroudja had a name for a long time with their "line doublers" which are over 20 years old and their more advanced but cheap gm1601 is one of the more popular solutions for HDTVs. Gennum's GF9350 with VXP technology is currently in the largest plasma tv in the world (Samsung 80"). These and other scaler/deinterlacer chips have none of the problems that Silicon Optix claims exist. If you look at the debates that rage over at the usual enthusiast sites, you'll see that there are issues with its own technology like latency and cost that aren't present in the other solutions I mentioned.

    Just like Silicon Optix's "odd film cadence technology" which requires nothing different than what everyone else has today, this reeks of a cheap PR vehicle. While the choice of scaler and deinterlacer is important, it is not the utter tragedy that SO would like to make it out nor are they the saviors of the HDTV world. If they know who the culprits are, then let them name whose image processor it is that creates these problems.
  • by karlandtanya ( 601084 ) on Monday May 02, 2005 @12:41PM (#12408809)
    Remember when CDs first came out?

    Oh, this was going to be great. Fidelity like you never had it before. No scratches. No groove wear. Dynamic range you won't believe. Crystal clear highs. Thunderous lows, with no rumbling feedback even if you sat your player on your speaker.

    Remember the little logos? AAD? ADD? DDD was the best you could have (digital recording, digital mastering, and (obviously) digital media in your hand). And a lot of hard work on the part of the engineers operating the mixing boards. It's that last part that costs time and money. Now, all the equipment is digital. So, it's all great, right? Sorry--the technology is not the limiting factor in sound quality anymore.

    The limiting factor is apathy. Most people can not really hear the difference. And fewer people care.

    Exactly the same thing is now happening in video.
    Since we can't improve the functionality (well, we could, but you'd never notice), it's pure hype from here on out.

    Now, where'd I leave that case of speaker spikes and green markers? Gotta get 'em up on ebay; David Hannum was right.

  • Decide for yourself (Score:3, Interesting)

    by DumbSwede ( 521261 ) <slashdotbin@hotmail.com> on Monday May 02, 2005 @01:28PM (#12409437) Homepage Journal
    I have an NEC 1351 projector that can display 1080i, 1080p, and 720p. I have looked in the showroom and noticed a huge range in HDTV quality. Upconverting standard definition is by far the largest area of quality difference between various models, but that is a bit of an aside to this discussion. I have always been surprised at the number of showroom models that wimp out at 720p -- and not even true 720p, but some weird non-1280 horizontal number that has to be extrapolated no matter what the source. Three-beam rear projection tube models generally support all resolutions, but have far less actual resolution than the raw numbers they cite on their spec sheets. My 1351 looks stellar compared to anything you will find at BestBuy, but I have done a lot of tweaking, and it is still short of absolutely perfect 1080 performance, having to do with a range of factors I won't go into here. The point is, even a high-end machine like mine is currently not maxing out the quality of a true HDTV signal yet. The HDTV signals themselves are of so huge a variance in quality that you have to wonder what kind of Rube Goldberg solutions they are using behind the scenes in some cases.

    I have seen pros and cons on how these sets do their sampling. Here is my advice -- go look at the picture on a set with a good HDTV source. Use the specs as a guide but don't trust them. Get what looks good to you. My father would never have been able to see anything more than the quality of a good DVD. He couldn't see the difference between crappy digital cable and DVD. Some people like me are so manic about visual quality we will devote huge amounts of time tweaking our systems. While my system is probably limited to about 1500 lines of resolution due to the lenses, I find its image much warmer, more uniform, and more pleasing to the eye than the pixelated look of some very high-end flat screen solutions that go for 10-20k.

    About the only thing that really shows how good HDTV can be is material that is shot originally with HDTV video cameras. Upconverting film inevitably introduces a softness that is exaggerated by systems like mine. For now you can only see a few things on the Discovery Channel and a few musical events in true HD (meaning not upconverted from film). I mention this because while I advise you to go see for yourself (if possible), most stores don't really offer a good enough HD signal to display the difference. If you can hold out a little longer, I would wait until either HD-DVD or Blu-Ray players hit the shelves and then demand a demo with these HD sources to make a decision.

    One final note: 1080i hasn't had as much comb artifacting during motion as I would have expected, but there is still a noticeable blurring during camera pans (maybe this is just combing in disguise). I'm sure I will get a little boost in quality when I can play off of a true 1080p source. If I were to design the next generation of video recorders, I would introduce variable frame rates in playback: the picture being refreshed at up to 120fps, but the actual picture updating depending on the need for frames to eliminate motion blur. Storing 120fps would be inefficient and overkill. The system I propose would make true frames at 30-60fps, but as the camera moves, the edges would be scrolled in as needed to keep a smooth, fluid motion. A really intelligent system might also be able to track one or two moving objects across this field and give them higher frame rates as well. At 2 megapixels, I think we need to retrench and try to slay motion blur before going on to higher pixel counts.

  • by bradleyland ( 798918 ) on Monday May 02, 2005 @02:11PM (#12410042)
    My DLP monitor's native resolution is 720p. Before purchasing the unit, I read up about signal, conversion, interlacing, etc. The conclusion I reached is that my monitor should do one thing and do it well. That one thing is to display a picture at its native resolution.

    Almost everything I've read notes that the deinterlacing hardware in most TVs flat-out sucks. My solution? I bought a Samsung DLP sans ATSC tuner. My TV is a display, nothing more. Had I been able to, I would have purchased it without the NTSC tuner as well. Buying the tuner separately affords me the opportunity to buy a better quality piece of hardware without the redundancy of having purchased the same hardware in my monitor.

    I'll deliver a quality 720p signal to my monitor, and it will display the picture. What's more to ask?

"I'm not afraid of dying, I just don't want to be there when it happens." -- Woody Allen