Television Media Displays Hardware

When is 720p Not 720p? 399

Henning Hoffmann writes "HDBlog has an interesting entry about many home theater displays.
Home theater displays with native resolutions around 720p (most DLP, LCD, and LCoS displays) must convert 1080i material to their native resolution for display. No surprise there. But many displays do this by discarding half of the 1080i HD signal, effectively giving 720p viewers an SD signal - not HD at all!"
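For scale, here is a back-of-the-envelope sketch of the numbers behind that claim (the resolutions are the standard ATSC ones; Python is used purely for illustration):

    full_frame_1080i = 1920 * 1080  # both fields of 1080i woven together
    one_field = 1920 * 540          # what remains if one field is discarded
    native_720p = 1280 * 720
    sd_480 = 720 * 480

    print(one_field, native_720p, sd_480)  # 1036800 921600 345600
    # A single field still holds more total pixels than 720p, but only 540
    # lines of vertical resolution -- closer to SD's 480 lines than to 720,
    # which is the sense in which those viewers are "not watching HD".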
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Anonymous Coward on Monday May 02, 2005 @11:10AM (#12407538)
    This sounds like the visual version of what Creative Labs has been doing for YEARS with their Sound Blaster audio cards. With most other cards, if you want to record at a sample rate of 44.1 kHz, you record at 44.1 kHz; but even with the newer Sound Blaster cards, everything must be resampled to 48 kHz first.

    It doesn't matter whether you are sampling up or down; resampling is bad. Your best bet is to find a device without it, or, if it's unavoidable as in this case, the one that does the best conversion.

    If I bought one of these displays I would be pretty pissed, but I doubt there is much that can be done about it. If you COULD do something, then companies like Creative Labs would be out of business.
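    The 44.1 kHz to 48 kHz conversion described above is a non-integer ratio, which is why it can't be done by simply dropping or repeating samples. A minimal sketch of the ratio arithmetic (Python, purely illustrative):

        from fractions import Fraction

        # Output rate over input rate, reduced to lowest terms.
        ratio = Fraction(48000, 44100)
        print(ratio)  # 160/147

        # A correct resampler upsamples by 160, low-pass filters, then decimates
        # by 147. Every output sample lands between input samples, so the quality
        # of that interpolation filter is exactly where cheap hardware cuts corners.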
  • Resampling (Score:5, Insightful)

    by FlyByPC ( 841016 ) on Monday May 02, 2005 @11:14AM (#12407592) Homepage
    There's got to be a fairly straightforward formula for the inherent resolution loss in any noninteger upsampling, or in any downsampling. Any other change in resolution must necessarily degrade the signal, yes? (Except perhaps if a clever algorithm could losslessly encode the original data in a 1.5x-upsampled version, without distorting it.)
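    One empirical way to see this, as a sketch (linear interpolation and NumPy here; the exact loss depends on the signal and the filter):

        import numpy as np

        x = np.sin(np.linspace(0, 20 * np.pi, 1080))  # stand-in for 1080 lines

        # Downsample 1080 -> 720 (a noninteger 2/3 ratio), then back up to 1080.
        down = np.interp(np.linspace(0, 1079, 720), np.arange(1080), x)
        back = np.interp(np.linspace(0, 719, 1080), np.arange(720), down)

        # Nonzero error: the noninteger round trip loses information.
        print(np.max(np.abs(x - back)))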
  • by t_allardyce ( 48447 ) on Monday May 02, 2005 @11:17AM (#12407630) Journal
    When the unwashed masses can't actually tell the difference (they can't even see DCT blocking) and you can get away with selling this crap to them.
  • Re:misty (Score:1, Insightful)

    by Anonymous Coward on Monday May 02, 2005 @11:18AM (#12407638)
    Yeah, I mean, how dare they complain. So what if they spent their hard-earned cash expecting to get a device with the specs that were claimed. Shame on them!

    (what the hell kind of attitude is that)
  • Same thought (Score:3, Insightful)

    by SuperKendall ( 25149 ) * on Monday May 02, 2005 @11:18AM (#12407645)
    I had the same line of thought: if you just use an HTPC and a monitor that can display data from it (like a projector), then you are all set - as long as whatever feeds HDTV into your HTPC for display is properly doing the conversion. That would be interesting to know: how are current HDTV cards for PCs doing any scaling? I guess they just dump the feed to disk and then it's up to the players, which hopefully use the great horsepower of a PC to scale properly...
  • Good Ol' CRT (Score:5, Insightful)

    by goldspider ( 445116 ) on Monday May 02, 2005 @11:18AM (#12407651) Homepage
    When I upgrade to an HD idiot box, I plan on sticking with tried-and-true CRT. IMHO, you can't beat the picture quality/price, and I have yet to hear a compelling reason to fork out thousands of dollars for the trendier offerings.
  • its really too bad (Score:2, Insightful)

    by bassgoonist ( 876907 ) <aaron.m.bruce@gm ... minus herbivore> on Monday May 02, 2005 @11:19AM (#12407663) Journal
    Early adopters often get slapped in the face. I've been thinking about buying an HDTV for a long time. I'm really glad I read this before I bought one.
  • by node 3 ( 115640 ) on Monday May 02, 2005 @11:20AM (#12407678)
    If the broadcast is 1080i, and your display isn't 1080i, I don't think it's logical to assume the quality of the downsampled video will be equivalent to a true 720p broadcast.

    When I get around to buying an HD television (not any time soon; I do all my televisioning on my computer), it will be a true 1080i display (are there 1080p televisions?) so I'll know I'm getting the full potential of HD.

    Unless I'm strapped for cash, of course, in which case I'll just suck it up and know my 720p won't be the best thing for watching 1080i content on.

    On the plus side, it's important to get the facts out there for the consumer, who will likely (although not logically) assume they're getting more than they really are.
  • Hey, Bloggers... (Score:5, Insightful)

    by Gruneun ( 261463 ) on Monday May 02, 2005 @11:20AM (#12407681)
    Why not submit a link to the original article, rather than a link to your blog, which consists only of a link to the original article?

    Otherwise, people might assume this is a shameless attempt to draw traffic to your site.
  • by pe1chl ( 90186 ) on Monday May 02, 2005 @11:31AM (#12407851)
    But this is true for SD display on a double-scan TV as well.
    The "digital feature box" in the TV is supposed to combine the two fields into one single frame. This is usually referred to as "motion compensation" or some other nifty marketing term.
    This is what separates the cheap from the expensive TVs.
  • Re:misty (Score:4, Insightful)

    by eyegor ( 148503 ) on Monday May 02, 2005 @11:32AM (#12407868)
    There has been a similar issue for years with audio amplifier specs.

    Manufacturers usually tout their amps as having "200 watts of pulsing music power," which usually means 100 watts per channel peak. In reality it's more like 70.7 watts/channel RMS (assuming they're not still lying).
  • Re:Good Ol' CRT (Score:3, Insightful)

    by GatorMan ( 70959 ) on Monday May 02, 2005 @11:34AM (#12407899)
    You must live on the first floor, and I also assume you don't relocate often. Try getting a 215 lb. 36" CRT HDTV up a couple flights of stairs. Even with the home delivery options, you're lucky if the thugs don't damage your set, your building, or both.
  • Comment removed (Score:4, Insightful)

    by account_deleted ( 4530225 ) on Monday May 02, 2005 @11:36AM (#12407924)
    Comment removed based on user account deletion
  • by Tiroth ( 95112 ) on Monday May 02, 2005 @11:41AM (#12407996) Homepage
    The problem is, how do you do the research? The audio/video publications out there have not come close to adopting a standard set of measurements that would quantify the performance of processors that must perform complex tasks like scaling, 3:2 pulldown, etc. The results from different chipsets are all over the map (chroma key errors, cheats, lame algorithms), and it's rare to be able to get any information at all on new products. You just have to wait six months until someone who actually knows what they are doing throws a review up on the web.
  • by Zed2K ( 313037 ) on Monday May 02, 2005 @11:43AM (#12408016)
    Kind of hard to "do your research" when you can't find out how the conversion takes place or can't understand how it all works. This is not the consumer's fault. This kind of information is not made known unless people ask the question. Who would have thought to ask a question like this?
  • gee am I surprised (Score:2, Insightful)

    by lexcyber ( 133454 ) on Monday May 02, 2005 @11:44AM (#12408035) Homepage
    As usual when we talk quality, it all boils down to this: you get what you pay for!

    Buy crap equipment and you will get crap.

  • Re:Good Ol' CRT (Score:4, Insightful)

    by The-Bus ( 138060 ) on Monday May 02, 2005 @11:47AM (#12408074)
    Well, not sure what you mean by CRT but I would say the compelling reason is size. CRTs can only be so big. If you want to go bigger, you can go with what I call the "MTV Cribs" TVs, plasma/LCD, etc. or you can go with a quality RPTV or a projector. I have yet to see a plasma or LCD that has a better quality picture than a decent RPTV.
  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Monday May 02, 2005 @11:48AM (#12408101)
    Comment removed based on user account deletion
  • by wiredlogic ( 135348 ) on Monday May 02, 2005 @11:49AM (#12408110)
    Every HD source I've ever used, from over-the-air tuner to satellite receiver to upconverting DVD player to HTPCs, have a selectable output on their side

    If you receive an OTA broadcast or cable signal in 1080i, then you don't have any control over the video source. Since the broadcasters are split between 720p and 1080i, this is a real issue.
  • Re:It's there (Score:2, Insightful)

    by ergo98 ( 9391 ) on Monday May 02, 2005 @12:04PM (#12408315) Homepage Journal
    The HD signal's still there...

    Especially given that 540p is still HD. 540p is a hell of a lot better looking than 480i. In fact, 540p is one of the HD standard resolutions.
  • by Pope ( 17780 ) on Monday May 02, 2005 @12:04PM (#12408330)
    Wow, do you have it backwards! The FCC wanted a single HDTV/digital standard, and it was the industry that pissed and moaned and dragged their feet as to what standards to support. Now you have half the broadcasters on 1080i, and half on 720p, and massive confusion in the marketplace.

    This is one place where ONE standard would be fine. Remember good ol' NTSC? Single friggin' standard, and all TVs sold here support it with little to no problem.

    The government (FCC) had a job to do, and it failed its citizens miserably, by knuckling under to the corporate interests you seem so keen on defending. Not everything can be solved with a free-for-all market battle, and certainly nothing as big as broadcast television should be. It's all a very nice theory you have, but is not practical by any stretch of the imagination.
  • Re:Which Models? (Score:4, Insightful)

    by twelveinchbrain ( 312326 ) on Monday May 02, 2005 @12:21PM (#12408548)
    Yes. Go to a showroom and look at the displays. If you see some that have greater vertical resolution than the non-HD models, there you go. If you can't see a difference, then it doesn't make a difference.

    That's not necessarily good advice. At the showroom, everything you see is optimized for selling the display. You might not notice any problems until you start to view content outside of their controlled environment.
  • by MalHavoc ( 590724 ) on Monday May 02, 2005 @12:31PM (#12408685)
    That's fine advice until you realize that at this point in time there is NO source of 1080p material. Sure, you can scale up to it, but then you're interpolating, and you're not really seeing true 1080p. It's nice to own a unit that can display 1080p natively, but it might be a while until we get a native source for it. You're damn right about the tubes in a CRT, though. Those suckers are expensive to replace, compared to replacing a single bulb in an LCD or DLP projector. It cost me three times as much to replace the CRT tubes in my Pioneer RPTV as the single bulb in my Runco projector.
  • Re:Which Models? (Score:3, Insightful)

    by maraist ( 68387 ) * <{michael.maraist ... mail.n0spam.com}> on Monday May 02, 2005 @12:59PM (#12409051) Homepage
    If you're buying a 720p set (or close to 720p, as many are slightly off) that costs less than ~3,000 USD, the answer is easy: all of them.

    The problem, though, is that it's BS that the cost of the processing meaningfully affects the price of a multi-thousand-dollar system. I can't imagine that a simple $100 PC video card couldn't be stripped to its bare chips and used as the filter between whatever proprietary system they use and the input signal processor. It can't possibly cost more than $500 to push an off-the-shelf component into their system. Since base HDTVs are just above $1,000 these days, such a low-end system with superior down-sampling shouldn't cost more than $1,500.

    I understand that most of these monitors have end-to-end proprietary solutions, and thus are internally limited in processing power and not extensible, as I said above. But if they can make a high-cost, low-volume chip, then somebody else can make another one that IS extensible for no greater cost.

    The problem ISN'T that it costs them a lot to make, but that they would prefer to sell the high-cost (and thereby high-profit-margin) sets. So if they're going to sell a cheap-o $1,000 set, they want to artificially restrict its capabilities. So they use a $50 CPU instead of a $150 CPU (made-up numbers). Of course I'd rather pay $1,100 than $1,000 for a TV that uses a better CPU. But they'd rather have me buy the $1,000 set now and, when I can afford a $5,000 set, buy again.

    Note these are unsubstantiated allegations; take them for what they're worth.
  • by espressojim ( 224775 ) <eris@NOsPam.tarogue.net> on Monday May 02, 2005 @12:59PM (#12409063)
    Why can't we use our computers to scale images? I have a 720p projector (Sanyo Z2), and I scale my movies up to 1280x720 on the fly. You need a bit of processor power, but it's much cheaper than the scalers mentioned in your post. In addition, I'd assume that I can buy newer software packages and 'upgrade' my algorithms more easily than selling my scaler and buying a new one...
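    Per frame, the software scaling described above amounts to something like the following sketch (Pillow and a Lanczos filter are stand-ins; the poster's actual player and algorithm are unknown):

        from PIL import Image

        frame = Image.open("frame.png")                    # hypothetical source frame
        scaled = frame.resize((1280, 720), Image.LANCZOS)  # projector's native 720p
        scaled.save("frame_720p.png")

    A real player does this on the fly for every frame; the point is that the algorithm is a software choice that can be upgraded later, unlike a fixed-function scaler chip.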
  • by uncoveror ( 570620 ) on Monday May 02, 2005 @01:32PM (#12409486) Homepage
    The government staying out of the equation and letting the market decide did a wonderful job with AM stereo and quadraphonic records. Someone must set standards, and only the FCC and other such agencies internationally have the authority to do so. Can you imagine how color TV would have caught on if there had been a format war instead of NTSC? We all need to stop bowing before a golden calf called the "free market".
  • by As Seen On TV ( 857673 ) <asseen@gmail.com> on Monday May 02, 2005 @01:39PM (#12409583)
    it will combine the 1080i field pair into a single 1080p signal

    This will, of course, suck.

    There are two different ways to get 1080i material. You can either shoot some other format at 24 frames per second and convert it to 1080i, or you can shoot 1080i.

    If you shoot film or 1080/24 at 24 frames per second and convert to 1080i, there is a set of well-defined tricks you can use. These tricks are collectively referred to as "pulldown." It's possible to remove pulldown, which is great ... except the automatic schemes usually break on cuts, which means they introduce more artifacts than they remove.

    If you shoot at 1080i, as for a sporting event or live broadcast, then combining the two fields into a single 1080p frame is going to make it look like ass. When you shoot 1080i, you're capturing the upper field about 16 milliseconds before you capture the lower field, which means your subject moves between capturing field A and capturing field B. If you then just shove these two fields together, you're going to get a blurry picture.

    It's a widely repeated fallacy that 1080p is inherently superior to 1080i. The two formats are equivalent in most respects, and going from one to the other is not an "up-conversion." Done wrong, it can be a down-conversion.
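    For reference, the naive field combination described above - "weave" deinterlacing - is just row interleaving. A minimal sketch (NumPy, illustrative only):

        import numpy as np

        def weave(field_a, field_b):
            """Interleave two 540-line fields into a 1080-line frame."""
            height, width = field_a.shape
            frame = np.empty((2 * height, width), dtype=field_a.dtype)
            frame[0::2] = field_a  # even rows: captured at time t
            frame[1::2] = field_b  # odd rows: captured ~16.7 ms later
            return frame

    If anything moved between the two capture times, adjacent rows of the woven frame disagree - the combing artifact the parent is describing.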
  • by ePhil_One ( 634771 ) on Monday May 02, 2005 @01:41PM (#12409619) Journal
    1024x1024 plasmas? That is weird; what's the brand?

    But getting back to the subject, this article is goofy from the start. It supposes that the "proper" way to down-convert the 1080i signal is to first convert it to 1080p and then down-convert, mostly because that's the way the "geniuses" at HQV have done it. But if you think about it, that's a brain-dead solution too. Why?

    The 1080i signal is two 540-line fields, shot 1/60th of a second apart, each with half the picture data. If anything on the screen is moving, it will be in a different place 1/60th of a second later. If you mash those two fields together into a 1080-line image, the items in motion will blur without some complex 3-D (time being the third D) reconstruction. If you instead treat the 1080i signal as a 540p signal, you eliminate the motion blur at the potential cost of some static image sharpness.

    In other words, pick your poison. The blog has shown itself to be either an advertising shill or lacking in technical competence, since it completely failed to go into the real issues here...
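    Treating 1080i as 540p, as the parent suggests, is essentially "bob" deinterlacing: each field becomes its own frame. A sketch under the same illustrative assumptions as the weave example above:

        import numpy as np

        def bob(field):
            """Expand one 540-line field to a 1080-line frame by doubling rows
            (real scalers interpolate; doubling keeps the sketch short)."""
            return np.repeat(field, 2, axis=0)

    Nothing from two different capture times is mixed, so there is no combing - at the cost of the halved static vertical resolution the parent mentions.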

  • by bradleyland ( 798918 ) on Monday May 02, 2005 @02:11PM (#12410042)
    My DLP monitor's native resolution is 720p. Before purchasing the unit, I read up about signal, conversion, interlacing, etc. The conclusion I reached is that my monitor should do one thing and do it well. That one thing is to display a picture at its native resolution.

    Almost everything I've read notes that the deinterlacing hardware in most TVs flat-out sucks. My solution? I bought a Samsung DLP sans ATSC tuner. My TV is a display, nothing more. Had I been able to, I would have purchased it without the NTSC tuner as well. Buying the tuner separately affords me the opportunity to buy a better-quality piece of hardware without the redundancy of duplicating what's already in my monitor.

    I'll deliver a quality 720p signal to my monitor, and it will display the picture. What more could you ask?
  • Re:Which Models? (Score:3, Insightful)

    by JohnsonWax ( 195390 ) on Monday May 02, 2005 @02:18PM (#12410135)
    At Best Buy and Circuit City I've seen lots of SD signals on HD displays. How on earth am I going to know if it's the set or the signal that's producing all those jaggies? Ask? At Best Buy? I might as well ask them to build a moon rocket while they're at it.

    So don't go to Best Buy or Circuit City to evaluate your monitors! Go to a high-end video shop to evaluate your monitors and then go to Best Buy or Circuit City to buy them if the prices are really that much better.

    Seriously, do you want to solve the problem or just argue?
  • by Total_Wimp ( 564548 ) on Monday May 02, 2005 @03:27PM (#12411036)
    I totally agree with the idea of your post, but the actual practice is a bit different.

    First off, DVHS and HDTV PVRs exist in tiny numbers. I am not one of the guys who owns one. Neither are the vast majority of the people reading your post. It's a great idea, but a bit of a stretch to ask someone to purchase equipment that costs half a grand minimum before even going out to purchase their first HDTV. Once again, great idea, just not something very many people are going to be able to take advantage of.

    Ok, number two: think of the specialty shop in your area that has a wide variety of HDTVs set up and ready to plug your box into. Having trouble? Me too. All of the specialty shops in my area have, at best, half a dozen sets. This is pretty far away from variety. I'm not trying to knock these guys. I bought my speakers from a specialty shop and got a good deal on some great speakers. I love those guys. But their monitor selection was pathetic, just like every other "home theater" shop I've actually walked into in my area. Are you _ever_ going to be able to see the actual display you're interested in, or just something fairly close?

    Finally, what about Best Buy with your own content? Well, this is a problem too. A big problem. Let's say you showed up at BB with your own source material and you happened to find the one guy in all the BBs throughout the land who was very knowledgeable, very friendly, and didn't mind at all walking around with you, answering your questions and helping you hook up your PVR to every box you were interested in. Well, what about the sets? How are they calibrated? Have they ever been calibrated? Did they set that CRT down hard and screw up the guns? Have they been showing their plasma monitors at full brightness for 12 hours a day for half a year? Are those dead pixels typical of the manufacturer or unique to that one display? To put it succinctly, when you go into that menu and choose the defaults for the display you're interested in, how do you know that those defaults are typical of the box you're going to open up at home? How do you know if the set next to it is just as bad, or actually much closer to factory defaults?

    I know, you can glean a lot by looking at the displays, but it's far from the only place to look. Even good sets will often look like crap at the big box store. I hate to say it, but this is one place where stats will often paint a better picture than the picture you see in front of your eyes at the store.

    TW
