Television Media Technology

NHK Working To Make HDTV Obsolete 299

Posted by Zonk
from the tech-doesn't-stand-still dept.
An anonymous reader writes "According to an article at EETimes.com, Japanese broadcaster NHK has successfully demonstrated a live relay of 'Super Hi-Vision' television, which is 16x 1080i resolution -- 7680 x 4320!" From the article: "NHK developed a Super Hi-Vision camera equipped with 8-megapixel CCD image sensors that can take 4k x 8k images. In the field test, it sent the two cameras to a sea park and sent baseband signals without image compression over a fiber-optic network formed by multiple network companies. The total 24-gigabit-per-second signal was divided into 16 1.5-Gbps HD-SDI signals and sent using DWDM (dense wavelength division multiplexing)."
This discussion has been archived. No new comments can be posted.

  • by xmas2003 (739875) * on Friday November 04, 2005 @08:26PM (#13954765) Homepage
    There's a little more info on the Open House 2005 site [nhk.or.jp] (where it was demo'd) that includes a graphic and mentions that it "employs a 22.2 channel 3D loudspeaker arrangement to realize excellent sound field reproduction and a wide listening range" ... whatever 22.2 is, it sure sounds like a lotta speakers. EETimes didn't say when this would actually be available to end-users, but PCWorld wrote on June 16th [pcworld.com] "... the NHK says its system is unlikely to be commercialized until sometime in the next decade" so it will be a while.

    As with many new technologies, the p0rn industry will probably be the first to deploy this 33,177,600 pixel technology. Boy, I feel a bit inadequate as my Halloween webcam (goes offline Saturday night) [komar.org] only has 337,920 pixels (704x480) - I guess size matters, eh? ;-)

    • I feel a bit inadequate as my Halloween webcam (goes offline Saturday night) only has 337,920 pixels (704x480) - I guess size matters, eh? ;-)

      I've been to your site before -- great job, btw. Although your server is quite impressive, I want to see a live feed of 7680x4320 video @ 60 Hz showing us the server room (preferably wired so that visitors can cut the power to the cooling on demand) while it is being Slashdotted.
    • 22.2 = 22 mains, 2 subs.

      8 mains at ear level (3 across front, 3 in rear, 2 on each center side), 7 mains each above and below ear level (no rear center).
    • As with many new technologies, the p0rn industry will probably be the first to deploy this 33,177,600 pixel technology.

      I might have to disagree. The recent "Weapons of Ass Destruction" was said to be cutting-edge--but it appeared to have been filmed with a 20 dollar webcam, and it was on a VHS in SLP mode!

      my halloween webcam (goes offline Saturday night) only has 337,920 pixels (704x480)

      If I watched that tape with this new technology, each testicle could take up as many pixels as your webcam!

    • by InvalidError (771317) on Friday November 04, 2005 @09:31PM (#13955163)
      This stuff was on Discovery Channel months ago... and NHK's plans are to use it for movie theaters. Availability for home systems was not discussed, and it will certainly take a while, if it ever does get there. The DC overview of the UHD system did not say much about the audio system that went with it, though. (Nor did it go into any detail about how the system was set up for the demonstrations.)
    • whatever 22.2 is, it sure sounds like a lotta speakers.

      this is exactly the type of sharp, quick witted analysis that keeps me reading slashdot.
    • So did you actually hook the lights up this time, or is this another case of You being full of shit?
    • by diablomonic (754193) on Friday November 04, 2005 @10:18PM (#13955350)
      What is annoying to me is that even with all those pixels, which is coming close to enough to completely trick the eye (some people estimate that the eye can see somewhere around 5000 by 10000 "pixels" across our full field of view, maybe more for some people), they have left it at ~25 frames a second (~32,000,000 pixels * 3 bytes of info per pixel (but why only use 24-bit true colour when you're going for this quality?) is roughly 96 MB per frame, and the uncompressed total was ~2.5 GB per second, which is roughly 25 frames a second).

      I realise that they most likely did this because it would be damn hard to get any higher with that amount of data per frame, but still, if you're someone who is designing a spec and aiming for a new super-duper standard, PLEASE UP THE FRAME RATE. 25 FPS SUCKS for fast action.

      also, anyone who is going to argue with this and say 25 is all you need, please read and understand this before hand, or else shut up: www.100fps.com [100fps.com]

      • hehe, I should have really read the extra info someone posted: 60 Hz progressive!!!!! YAY (although I'd prolly go even further hehe). This makes me wonder where I went wrong in my earlier calculations, though: 24 Gb/s / (60 FPS * 33,000,000 pixels) = 12 bits of info per pixel?????? 4-bit RGB channels??? heheh, yay for CGA or whatever this is hehe

        I guess they must have lowered the FPS in this one to 25fps for testing

        "I realise that they most likely did this because it would be damn hard to get any higher with that amount of data per frame, but still, if you're someone who is designing a spec and aiming for a new super-duper standard, PLEASE UP THE FRAME RATE. 25 FPS SUCKS for fast action."

        I think I read somewhere that some IMAX movies run at 60FPS, for that very reason. But I have to wonder if higher frame rates were available in films, would dramas use them? I mean, TV is capable of doing 60FPS. (50 in Europe) Yet th
      • why only use 24-bit true colour when you're going for this quality?

        Yeah, this is stupid.

        I would take HD resolution with an improved colour model over this any day.
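The frame-rate arithmetic in this subthread can be sanity-checked with a few lines of Python. This is a rough sketch that assumes full 24-bit RGB per pixel, as the original comment does; the real transmission almost certainly didn't carry that much per pixel:

```python
pixels = 7680 * 4320                 # 33,177,600 pixels per frame
frame_bytes = pixels * 3             # assuming 8-bit RGB, per the comment
link_Bps = 24e9 / 8                  # 24 Gbit/s uncompressed feed = 3 GB/s

print(frame_bytes / 1e6)             # ~99.5 MB per frame (the comment estimated ~96)
print(link_Bps / frame_bytes)        # ~30.1 fps at 24 bits/pixel

# The follow-up's puzzle: at 60 fps, 24 Gbit/s only leaves
print(24e9 / (60 * pixels))          # ~12.1 bits per pixel
```

The ~12 bits/pixel figure suggests the demo carried subsampled or per-sensor component streams rather than full RGB per pixel, but that is a guess; the article doesn't say.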
    • by Khyber (864651)
      Some doctors I've recently talked to say that the approximate resolution of the human eye is about 14 megapixels.... so... As with many new technologies, the p0rn industry will probably be the first to deploy this 33,177,600 pixel technology. I'm wondering just how sharply we (our eyes) will perceive this 33 megapixel technology???

      BTW, for the math nitpicks, 7680 x 4320 works out to exactly 33,177,600 pixels.
  • That's the pipe coming into my house so I can watch Three Stooges in Super Hi-Vision.

    still can't get the EPL matches I want though, dammit

  • Yeah. Obsolete, huh. Whatever. Nice hyperbole.

    HDTV will be hitting in three or four years. It will be the standard for the next fifty years, just as we've stuck with the outdated "standard" we have now for however many decades. Don't expect to see any of this (in America, at least) in our lifetimes.
    • Re:Obsolete? Hardly. (Score:3, Interesting)

      by timeOday (582209)
      You raise an interesting issue... will our new video standards last for 50 years like the first ones did?

      IMHO, they will not... I think we'll see more frequent improvements. First-generation equipment was all implemented in hardware with a fixed number of scan lines, refresh rate, color fidelity, and encoding scheme, yet downloaded videos vary in ALL of these parameters. From the early postage-stamp animated GIFs, to video-clip MPGs, to VCD, SVCD, XviD, and now full DVD rips seem to be catching on. An

      • What does it mean for a standard to last? If enough content is generated that is compatible with that standard, and in fact is created for it, that standard can be said to last.

        We need to be suspicious of the rhetoric of obsolescence. Obviously, we never have the "best" technology possible - it's always a balance and negotiation between budgets, economies of scale, resources, and the like. And people's lives aren't all about what's called 'progressive time,' in which we are oriented only to a promised, unfol
        • The problem with tech neophilia is that it turns into permanent future-orientation, living for a time that is always about to arrive.

          That's one of the best one-line statements I've read on slashdot.

      • ...... I think we'll see more frequent improvements......

        Sure, we have seen great progress in fancy computer generated video and sound effects, but what improvements have there been to the PROGRAMS in the last 50 years? The sex has become more explicit and violence has become more graphic, the commercials more frenetic and numerous and the news reports more biased, dismal and depressing.

        Other than having to endure all that crap in stunning high and higher resolution, what signs are there that HD or super HD
  • by Chocolate Teapot (639869) * on Friday November 04, 2005 @08:28PM (#13954782) Journal
    "Super Hi-Vision has huge information and was difficult to transmit. Using 16 waves on optic fiber, we succeeded a live relay over a long distance."

    ...but the sound is still a little disjointed.

  • by amliebsch (724858) on Friday November 04, 2005 @08:29PM (#13954790) Journal
    "The signal of the total 24 gigabits per second was divided into 161.5 Gbps HD-SDI signals to sent using the DWDM (dense wavelength division multiplex) method."

    What color ray is that disc going to need? I'm guessing puce.

    • Re:Random thought (Score:5, Interesting)

      by Punboy (737239) on Friday November 04, 2005 @09:02PM (#13954999) Homepage
      For this I am assuming 1cm x 1cm x 5mm for the size of Samsung's 16Gbit flash chip. This is probably slightly larger, but we must include the board on which they are soldered.

      People made do with huge VHS tapes for years, right?

      So let's see how much storage we can cram into a VHS tape using flash.

      First let's get the volume of a VHS tape... 7 3/8 x 4 1/16 x 1. That's in inches. So, let's use Google to convert that into cubic centimeters.
      That's about 491 cubic centimeters.

      Now let's see how many cubic centimeters a single flash chip is.
      That's 0.5 cubic centimeters. Now let's divide 491 by 0.5.
      That's a whopping 982 flash chips!

      Now, how many gigabits of storage is that?
      15,712 gigabits of storage space in a single VHS tape filled with 16Gbit flash. Wow. What is that in GB?
      1,964 gigabytes

      OK, so we'd need 10 of those for a 2-hour movie. But you have to remember, that's uncompressed. If we compress it, we just may get a single movie into a 1,964GB flassette (flash-cassette, something I just made up).

      Woot.
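The flassette arithmetic above checks out; here it is as a small Python sketch (the chip dimensions and the 16 Gbit Samsung part are the comment's assumptions, not datasheet figures):

```python
IN3_TO_CM3 = 2.54 ** 3                        # 16.387 cm^3 per cubic inch
vhs_cm3 = 7.375 * 4.0625 * 1.0 * IN3_TO_CM3   # VHS shell volume, ~491 cm^3
chip_cm3 = 1.0 * 1.0 * 0.5                    # assumed 1 cm x 1 cm x 5 mm per chip

chips = round(vhs_cm3 / chip_cm3)             # ignores boards, packing, connectors
gbit = chips * 16                             # 16 Gbit of flash per chip
print(chips, gbit, gbit / 8)                  # 982 chips, 15712 Gbit, 1964.0 GB
```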
  • Let's see data transfer and storage catch up with this development to consider it an alternative to HDTV instead of its eventual replacement....
  • by Kjella (173770) on Friday November 04, 2005 @08:31PM (#13954804) Homepage
    Resolution doesn't make sense unless you can see it. HDTV adoption is slow at best, and consumers aren't going to move to a better format than that for many many decades. This format might be interesting for cinemas and such, but it's not significant to HDTV at all.
    • HDTV adoption is slow at best, and consumers aren't going to move to a better format than that for many many decades.

      One of the benefits of HDTV, as commonly deployed, is that it decouples the display from the source - e.g. you can watch a 1080i signal on a 480i SDTV screen, a 720p, or a 1080i, or hypothetically anything larger. My LCD TV accepts a DVI input feeding anything from 480i to 1080i, and it displays it on the 720p screen.

      This decoupling is a major benefit, because if one of the satellite providers wanted
    • I've had HDTV for a few years now and honestly I never was blown away. What I did get was a big, hard-to-move cinema-screen set that I'd be more than happy to part with for something better and thinner/lighter.

      It's all about how they position it. You plasma screen owners might be a little harder to convince, but that resolution is a big jump, and by the time the rest of the technology catches up you might be ready for an upgrade anyway (we don't have 21TB DVDs yet, and at least here in the US streaming that kin
  • 16x 1080i What?? (Score:3, Insightful)

    by mpapet (761907) on Friday November 04, 2005 @08:33PM (#13954811) Homepage
    In typical slashdot byline fashion: Is this the end of HDTV? Tune in and see!

    The two places it would be great are:
    -Digital cinema. It might keep the movie theaters open a few more years. On the production side: Talk about a storage problem when you have to store all of the raw footage!
    -"jumbotron" type displays for arena-style live events.

  • by molrak (541582)
    I skimmed the article - it's a bit light on the details, but I didn't read anywhere in the article that their Super Hi-Vision was interlaced. I would hope that the next generation of TVs (or whatever form they take) would get rid of this idiotic interlacing nonsense.
  • 'Super High Vision' sounds SuperUltraMegaCool!
  • Per hour (Score:5, Informative)

    by Punboy (737239) on Friday November 04, 2005 @08:39PM (#13954840) Homepage
    24 (gigabits / sec) = 10.546875 terabytes / hour

    That's 21TB for a standard-length movie! ~21,000GB! Foly Huck!
    • Re:Per hour (Score:2, Interesting)

      by NanoGator (522640)
      "Thats 21TB for a standard-length movie! ~21,000GB! Foly Huck!"

      I remember thinking 'Foly Huck!' when there was talk of CD-ROMs holding 650 megs of data. Back then a big hard drive was a whopping 40 megs. Now I think CDs are minuscule. I'm guessing in a decade or so 21 terabytes will be like what a gig is today. Not that I did any real math to arrive at that conclusion, but man, as the years go by, it's amazing what storage capacities turn into.

      In any event, that's uncompressed. With compression (which is
    • Re:Per hour (Score:3, Informative)

      by Ironsides (739422)
      24 (gigabits / sec) = 10.546875 terabytes / hour. That's 21TB for a standard-length movie! ~21,000GB! Foly Huck!

      Remember also that the 24 Gbps is UNCOMPRESSED. Compressed it would be much, much less. Probably at most (*thinks* 50Mbps*16=800Mbps) 800 Mbps or ~360GB/hour. They could probably compress it a bit more without much loss of quality. As for the 21TB, that is easy to do with today's Fibre Channel storage (~25TB using 42 500GB drives in RAID on an ATA Beast). The problem is the max sustained read spea
      • Interestingly that's roughly 1000 times the bandwidth of an NTSC DV-25 data stream. DV-25 happens to use 5:1 compression. Discuss.
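Both figures in this subthread are easy to reproduce. Note that the 10.546875 TB/hour number only comes out exactly if "giga" and "tera" are treated as powers of two, which is presumably what Google's calculator did:

```python
# Uncompressed: 24 Gbit/s, with binary prefixes throughout
tib_per_hour = 24 / 8 * 3600 / 1024   # Gbit/s -> GiB/s -> GiB/hour -> TiB/hour
print(tib_per_hour)                   # 10.546875, matching the parent comment
print(2 * tib_per_hour)               # ~21.1 TiB for a two-hour movie

# Ironsides' compressed guess: 16 HD streams at ~50 Mbit/s each
gb_per_hour = 50e6 * 16 * 3600 / 8 / 1e9
print(gb_per_hour)                    # 360.0 GB/hour
```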
  • Why did they not make a resolution-independent standard for HDTV?
  • by dada21 (163177) <adam.dada@gmail.com> on Friday November 04, 2005 @08:43PM (#13954882) Homepage Journal
    HDTV is old news and an antiquated format. It was a government standard based on OTA standards.

    Tomorrow's receivers will be much faster (a la XPMCE or MythTV). OTA is dead, we want IPTV. 7.2 surround is ready. 2.35:1 is required, at a resolution of 3392 x 1440, progressive.

    We want fixed 6500K color standard, with no flesh-push or blue-push. We want an adaptable decoding processor, not something stuck in one mode.

    HDTV isn't the future. A PC, Gnutella, and an HD2 projector are.
    • by MichaelSmith (789609) on Friday November 04, 2005 @09:13PM (#13955068) Homepage Journal
      HDTV isn't the future. A PC, Gnutella, and an HD2 projector are.

      Yep. There are too many layers in these TV specifications. What field are they in? Video or communications? There will be a need for ultra high res video in the future, but TV is dying.

      Every evening TV competes with /. for my time, and mostly loses. And I am not one of those who exhaust themselves on World of Warcraft until 3am then stagger into the office and pretend to work.

      The Broadcasting model came out of the basic physics of radio transmission. We are not limited by that anymore, so broadcasting is out.

  • Assuming the eye's lens is about 1 cm, the angular resolution of the human eye would give you about 54,212 x 54,212 pixels, assuming a 180x180 degree field of view (with blue light). So we've still got a ways to go!
    • However, since we have two eyes, our combined field of view is wider than it is tall. So either our vision is not 54,212 pixels tall, or it is more than 54,212 wide.
    • .....the angular resolution of the human eye would give you about 54,212 x 54,212 pixels.....

      Yeah, and the human eye was not designed, unlike a super HD TV system. The construction of the eye-and-brain system just happened by the blind random-chance vibrations of atoms over millions of years.

      The Apostle Paul got it right when he wrote: "Although they claimed to be wise, they became fools
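For what it's worth, the 54,212-pixel figure above looks like a straight Rayleigh diffraction-limit calculation. Assuming blue light at about 475 nm (my guess; the comment only says "with blue light") and the stated 1 cm aperture, it reproduces almost exactly:

```python
import math

wavelength = 475e-9                   # blue light, ~475 nm (assumed)
aperture = 0.01                       # 1 cm lens, per the comment
theta = 1.22 * wavelength / aperture  # Rayleigh criterion, radians per resolvable spot

print(round(math.pi / theta))         # resolvable spots across 180 degrees: ~54212
```

Real eyes are nowhere near diffraction-limited outside the fovea, so this is an upper bound, not a measurement.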
  • Compression? (Score:3, Interesting)

    by TommydCat (791543) on Friday November 04, 2005 @08:51PM (#13954925) Homepage
    What's the point if any service that feeds video to you compresses the crap out of it?

    I've got an HD DirecTiVo, but the compression is fairly obvious when compared to OTA broadcasts, and even in those it's easy to pick out artifacts.

    I don't see any huge leaps in bandwidth from any provider Real Soon Now, and wouldn't any compression to fit the available bandwidth reduce the effective resolution?

    However if this is for closed-circuit feed from Hugh Hefner's humble abode, I may be interested :)

    • Umm, my first BBS in '85 was 300 bps. Today I have 10 Mbps at home, 6 Mbps sustained.

      Tomorrow (2007) I expect 24Mbps (I could alpha test 18Mbps right now). 100Mbps is google close.
    • Your DVR transcodes the signal from your digital source to another format? Why? My family has a HD DVR and it just saves the digital stream directly. The quality is exactly the same as OTA.
    • You've touched on a huge annoyance of mine regarding digital TV. Cable companies have created the marketing myth that "digital" == "flawless", and they compress the hell out of the signals on the digital channels in order to squeeze more of them into the service. (I'm not sure, but I suspect that satellite TV companies do this too.)

      The result, as you say, is artifacts, sometimes so bad that they can completely ruin the aesthetic experience of watching a movie. One of the most glaring examples I have e
  • The signal of the total 24 gigabits per second was divided into 161.5 Gbps HD-SDI signals to sent using the DWDM (dense wavelength division multiplex) method.

    This must be very innovative technology. I wonder how they can divide one 24 Gbps signal into several 161.5 Gbps signals?

    Even to divide into a single 161.5, they need to divide the 24 signal by 0.148606811145511.

    This DWDM stuff sounds weird...
  • Now these recent words from Ken Kutaragi of Sony Computer Entertainment, the PlayStation guru, start to make sense...

    http://www.next-gen.biz/index.php?option=com_conte nt&task=view&id=1470&Itemid=46 [next-gen.biz]

    >Generations to come

    >Sony gave also a vision of things to come in terms of video
    >quality and the format to support it. Today's TV sets are
    >allowing resolution of 720 to 1080i. Sony calls it the 'HD ready
    >generation' with a frame rate of 60 to 90 fps. This is
    >symbolized by the DVD for
  • The only real use I can think of for this technology, considering the bandwidth requirements, would be public closed-circuit viewing of live remote events. For example, imagine this set up at the Olympic games and transmitted to 5 or 6 special "viewing arenas" (glorified theaters) worldwide so that if you can't make it to the games physically, you can go to your nearest viewing arena and pay to watch it live. Or imagine it for NFL: a team establishes a dedicated link to their home stadium whenever they pl
  • OK, cool.. but... (Score:4, Insightful)

    by loraksus (171574) on Friday November 04, 2005 @09:07PM (#13955035) Homepage
    1080i transport streams run about 5 gigs for 40 minutes and require a ~2GHz processor to decode without dropping frames or choppiness. I know 2GHz isn't considered too fast, even now, but I find the trend of requiring an insanely fast machine to watch / record TV slightly odd. Without someone out there creating a unit that makes it easy to view HD content - and by easy, I mean "dear old mom and dad" easy - I'm worried that people won't adopt it and will choose to just stick with plain-jane devices (which won't drop the price on the cool stuff for us).
    There really isn't a lot of really great HDTV-compatible stuff out there either. DirecTV is dragging their feet and the rest of the major players aren't exactly pushing anything terribly innovative either. Software for it is also pretty bad. I know a lot of people like MythTV, etc., but it could be a lot better.
    There really isn't an efficient way to compress 1080 streams either - you need loads of time, a fair bit of RAM and a great machine - and even then a 250gig drive fills up really quickly.

    Also, and this is somewhat of a pet peeve of mine: with 1080i (and 720p), you can see if the camera isn't focused perfectly. I find this incredibly annoying. If the quality gets bumped up another couple of levels, this will be more noticeable. I'm guessing this will be corrected as more and more people realize that it looks sloppy on the cameraman's part.

    If you're bored, try and figure out storage requirements for the folks who film your favorite shows in 24p (BSG does, as well as a bunch of other shows) and then figure out the storage requirements for something recorded in this format ;)
    • Ah... the current generation of CPUs is quite pathetic at processing media streams.

      Just wait a year or so. HDTV hardware decoding in video cards will become very popular. In fact, I wonder why the pixel-shader GPU units can't be programmed to do some of the calculations. I'm very disappointed with QuickTime 7, for example, which requires such an insane CPU speed (dual core even???) just to watch 1080i video. It's obvious Apple did not bother to optimize their software (of course, it helps sell new Macs do
    • ...a pet peeve of mine...is that with [ultra hi-res], you can see if the camera isn't focused perfectly.

      IANAExpert, but I think that sequential generational losses are multiplicative... meaning that even the lowest-res TV presentation won't conceal bad focusing.
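The storage exercise suggested a few comments up works out roughly as follows, assuming uncompressed 24-bit frames and a 44-minute episode (both assumptions mine):

```python
def uncompressed_gb(width, height, fps, minutes, bytes_per_px=3):
    """Raw storage for a recording: pixels * bytes * frames, no compression."""
    return width * height * bytes_per_px * fps * minutes * 60 / 1e9

# A 44-minute episode shot in 1080p at 24 fps (BSG-style, per the comment):
print(round(uncompressed_gb(1920, 1080, 24, 44)))          # ~394 GB

# The same episode in Super Hi-Vision at 60 fps:
print(round(uncompressed_gb(7680, 4320, 60, 44) / 1000))   # ~16 TB
```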

  • by inkdesign (7389) on Friday November 04, 2005 @09:07PM (#13955038)
    6.8ghz Laptop [slashdot.org]
  • And yet... (Score:2, Insightful)

    by Evil Butters (772669)
    And yet, there will still only be 3-4 programs on TV/cable/satellite actually worth watching -- no matter what the resolution is!
    • I wait with bell on my toes for "Blue Collar Comedy Hour" in SHDTV!

      Git'R'Done!

    • "And yet, there will still only be 3-4 programs on TV/cable/satellite actually worth watching -- no matter what the resolution is!"

      I agree! The popularity of Bittorrent, seasons of TV shows on DVD, and TiVo around here is so peculiar!!
  • Ok, I suck at math and all, but I can't be the first one who picked this up.

    1920 x 16 = 30720 != 7680
    1080 x 16 = 17280 != 4320

    Also, more resolution isn't always better. I don't really want to see the exact number of pimples on someone's face.
    That sort of stuff is visible even in this screen cap [vehiclehitech.com], which is in 720p and encoded with xvid (cap is from HBO's "Rome", BTW). I really don't want to be able to make out globs of makeup on someone's face. Ignorance is bliss I guess (teeny bopper pop stars woul
    • Re:t3h new maths? (Score:3, Informative)

      by LocalH (28506)
      1920 x 1080 = 2073600
      7680 x 4320 = 33177600

      33177600 / 2073600 = 16
    • 7680 / 1920 = 4 times as wide
      4320 / 1080 = 4 times as tall
      4 screens wide * 4 screens tall = 16 times the size

      According to the article, this would be used in theater environments; 16x HDTV res is reasonable on an 8-meter screen...
  • THIS is why you don't want governments stepping in and saying, "Okay everybody, you've got five years to broadcast in X format and only in that format." Left to the free market, we would not have bound ourselves so tightly to something inferior to this (possibly).
  • Neil Patrick Harris wouldn't say that about HDTV!
  • I am not the biggest HD fan. I only see marginally better quality in HD. I am just wondering what rez the eye really starts to get impressed by.

    I have wondered why there aren't higher-resolution TVs out there. It's because HDTV is the standard, and I doubt we will see anything better come out. Kind of sucks.

    I don't think the manufacturers will be coming out with 3000x2000-pixel TVs.

    Computer displays may play a key part in new display technologies. Since there is one resolution, I can see home t
    • I've heard the holy grail of TV is 2160p at 120fps. Supposedly in Japan or someplace else they showed a demo of this and people in the audience got motion sickness looking at the screen. However, our current consumer 'Max' TV is 1080p at 60fps. 2160@120 is about 8 times as much raw info. So it is going to be a bit before it comes out where most people can afford it.
  • by Max Nugget (581772) on Friday November 04, 2005 @09:43PM (#13955214)
    They seem to have forgotten to write a press release to go with this big story so I wrote it for them:

    This is great news for ____! Eventually it will improve ____ for all consumers but initially it will be used in the ____ industry to improve ____. Sony, Samsung, and Toshiba have all announced they will be introducing their own versions, which will be available in 21__ and are eventually expected to saturate the market at prices as low as $...,...,... Said one executive, "We're incredibly excited about this. We have invested $...,...,...,...,... in this project and are very confident it will succeed and dominate the ____ market. The new technology will first be experienced by consumers in selected ____ during a special ____-enhanced presentation of Star Wars: Episode OMG.

    Have fun filling in those blanks. I sure couldn't.
  • I suspect that people WILL be able to see this; I seem to recall that when 6-megapixel cameras came out, camera connoisseur John Dvorak wrote in his column that they were then hitting the effective resolution of 35mm film.

    But movies have had some success at 70mm frames and IMAX frames that are each about 4X the pixel count, successively...I'm not sure that 33 megapixels is yet as good as an IMAX frame.

    So even without ultraviolet-ray 18-layer disks, I'm imagining IMAX Theatres going digital by just mailing 1
  • Someone is going to start making converters of signals and formats, 'cause I ain't buying no HDTV to replace the working 32" TV I have, and I'm certainly not going to buy this newfangled tech... only to then deal with the next latest-and-greatest video tech....

    And that converter is gonna have to be less than $40 before I'll buy it.... Who loses? Well, I don't see the commercials then, do I?
  • Whoever said that HDTV will be the standard for the next 50 years is exaggerating a wee bit. Think about the internet connection you (might have) had 20 years ago. If you were BBSing it in 1985, it was probably with a 300-baud modem. The speed with which we can get home internet access today is exponentially faster than back then. It really isn't a difficult stretch to extrapolate another 20 years into the future. I wouldn't be at all surprised if we have multi-terabyte-speed pipes in our homes in 2025.
  • by istartedi (132515) on Friday November 04, 2005 @10:28PM (#13955386) Journal

    The whole idea of just one standard for TV is obsolete anyway. Just about every cable system offers broadband, and many offer "digital cable". The general-purpose PC, and specialized computers like TiVo are becoming more common. So instead of having just one standard for TV, it seems pretty reasonable to push codecs out to viewers once in a while.

    OTOH, as far as broadcast over the air is concerned, digital is all too often a joke. When analog goes sour, you get a little "static" or "fuzz". It's not too bad, usually. When digital goes bad, the sudden cut-outs of sound, frozen images, and blocks appearing on the screen are much more annoying. We had a little analog TV for a while with a digital tuner. It responded to signal weakness by dropping out EVERYTHING and turning the screen blue, then flashing back to the picture when the signal was stronger. Oh please, bring back my snowy picture!

    What would really be cool is a standard for specifying variable quality of analog signals, and a tuner that could adjust to (or report that it isn't capable of handling) high-quality analog. That would be the best of both worlds.

    • I hear these stories and I am rather surprised.

      I have an antenna in my attic, and I pick up 8 UHF stations from 40 miles away perfectly. This is IMPOSSIBLE with analog. They break up maybe once every couple hours. If I look at the analog versions, they're very snowy and ghosty all the time.

      I know digital goes abruptly from great to nothing, but in my experience, it is still great when analog is so ugly as to be bothersome.

      As to digital TV being obsolete when it started out, it's just not true. You were neve
  • Now THAT'S odd, I was just reading about this yesterday in the latest SMPTE Journal on my to-read pile. I think it was the July issue. Alas, it's not online and I finished it and threw it out.

    Anyway, just like DVD-A and SACD are/will be failures, this 33-megapixel technology will be a failure. 1.41 Mbps CD audio is Good Enough for 99% of the people (and far better than the analog format previously available), and 19 Mbps HD video is Good Enough for 99% of the people (and far better than the analog format
  • by Arkaein (264614) on Friday November 04, 2005 @10:47PM (#13955484) Homepage
    At first when I saw the listed resolution I thought that it was total overkill, that no one would even be able to see anything near that detail. I own an HDTV (720p resolution, or 1280x720), and at a normal viewing distance you aren't missing a lot of detail.

    Coincidentally though, I'm taking a class in visual perception and we've just been discussing optimal human visual acuity, specifically as measured with sine wave patterns. Maximum human acuity is about 60 cycles per degree of visual angle. One cycle in a sine wave can be roughly represented with two rows or columns of pixels, so you really can't do any better than 120 pixels per degree (which is also the approximate density of photoreceptors in the fovea, the highest resolution spot in the retina).

    So what's a reasonable viewing angle? When developing 3D graphics applications I find that a perspective projection angle less than about 60 degrees requires getting pretty close to the screen for realistic perspective. This seems reasonable for a closest comfortable viewing distance. I know I usually sit farther away from my TV than this, probably at less than a 30-degree viewing angle.

    At 60 degrees this monitor has just about 120 pixels per degree (128 to be exact). At a farther distance the pixel density will be even higher.

    In a practical sense this monitor still seems like massive overkill to me. HDTV is great for TV, and even computer screens will see considerably diminishing returns by this point. In a theoretical sense though, it might be the perfect resolution.
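The numbers in the comment above are straightforward to verify; the sketch below assumes the 60-degree "closest comfortable" viewing angle from the comment:

```python
import math

h_pixels = 7680
view_deg = 60                                # assumed closest comfortable viewing angle
print(h_pixels / view_deg)                   # 128.0 pixels/degree vs ~120 acuity limit

# How close is that? Viewing distance as a fraction of screen width for a 60-degree field:
d = 1 / (2 * math.tan(math.radians(view_deg / 2)))
print(round(d, 2))                           # ~0.87 screen widths away
```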
  • by catdevnull (531283) on Saturday November 05, 2005 @12:42AM (#13955978)
    That's not TV, that's a freakin' hologram.

    Warp speed, Mr. Sulu--and don't look at me like that...I was really drunk and the tribbles had me confused.
