Ultra High Definition Video 290
hovermike writes "This story about UHDV (Ultra High Definition Video) comes from the NY Times. Here are a few specs from the article: 'picture size of 7,680 by 4,320 pixels'; 'UHDV's beefed-up refresh rate of 60 frames per second (twice that of conventional video), projected onto a 450-inch diagonal screen with more than 20 channels of audio'; '22.2 sound: 10 speakers at ear level, 9 above and 3 below, with another 2 for low frequency effects'; AND THE KICKER, 'All those sound channels and all those image pixels add up to a lot of data. In tests, an 18-minute UHDV video gobbled up 3.5 terabytes of storage (equivalent to about 750 DVD's). The data was transmitted over 16 channels at a total rate of 24 gigabits per second.' Don't think I'll wait to buy regular 'old' HDTV..."
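A quick back-of-the-envelope check of those figures (assuming "terabyte" means 10^12 bytes and that the test stream was essentially uncompressed; the article doesn't break the numbers down):

    # Sanity-check the quoted UHDV figures.
    bytes_total = 3.5e12                  # 3.5 TB for the 18-minute clip
    seconds = 18 * 60
    rate_gbps = bytes_total * 8 / seconds / 1e9
    print(f"implied rate: {rate_gbps:.1f} Gbit/s")   # ~25.9, near the quoted 24

    pixels_per_second = 7680 * 4320 * 60
    bits_per_pixel = 24e9 / pixels_per_second
    print(f"bits/pixel at 24 Gbit/s: {bits_per_pixel:.1f}")   # ~12

The numbers hang together: roughly 12 bits per pixel is what raw 8-bit video with 4:2:0 chroma subsampling costs, with the audio a rounding error next to it.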
Viewscreen (Score:5, Funny)
Re:Viewscreen (Score:2, Funny)
Re:Viewscreen (Score:2)
Re:Viewscreen (Score:2)
First Thought.... (Score:5, Funny)
Please note: first thoughts != best thoughts
Confused (Score:5, Funny)
Re:First Thought.... (Score:3, Informative)
Many TV stations found out that their sets looked really cheesy when they tested them with HD video cameras. Not to mention the faces of the "talent".
Re:First Thought.... (Score:3, Insightful)
Right ... (Score:2, Insightful)
It's easy to make up insane specs and such; being able to use them is another matter.
Re:Right ... (Score:5, Funny)
"And here you can see the distribution of Influenza cases superimposed across this landsat image lower manhattan... and my apartment. Hey! There's me! And I have the flu."
I dunno (Score:2)
A cluster of ASICs could be used to decode the signal.
They've done this sort of work before... (Score:2, Informative)
Re:Right ... (Score:2)
thus, higher framerate actually makes the game move faster.
Re:Right ... (Score:2)
With rendered frames, each frame is independent. There is no blur of the moving items. Because the brain doesn't have those cues to follow, the illusion is less complete. At around
Re:Right ... (Score:2)
That's why you need to get one of those newfangled 3dfx cards with a T-Buffer (tm)
Re:Right ... (Score:3, Informative)
Re:Right ... (Score:2)
Even after you sift through the terrible post, which he apparently tried to correct, I have to disagree that you can't notice the difference between 30FPS and something higher. And I'm talking about FPS, not refresh.
You go to the movies and watch the screen when they pan. You can see how jerky it is. You notice it on TV too. And try to play a video game at 30FPS, it's terrible. (Ohh, and don't try to tell me that console game syst
Re:Right ... (Score:2)
To be fair, film is 24FPS, not 30.
TV isn't a good choice for comparison. It is interlaced, not progressive, NTSC is telecined, and all too often, it's a matter of junky TV sets causing problems, not the signal itself.
In other words, I'd find it hard to believe that the jerkiness is a case of a refresh rate that's too low... Mainly because NTSC TV is really 60FPS, it's just that they are half-h
Re:Clarifying my previous post (Score:3, Informative)
Sure, you'll have less flicker at higher refresh rates, but a higher refresh rate will not improve the frame-rate, of course. Something like 10FPS video will still look plenty choppy at 120Hz, and you will notice that.
Re:Right ... (Score:4, Informative)
The NTSC standard is not 60Hz refresh. An NTSC TV draws the even lines first, then the odd lines. Each of these passes is called a 'field'. There are 60 fields per second, but pairs of fields are put together to make a 'frame', so there are 30 frames per second. (When they added colour, it got switched down to 29.97... bandwidth issues.)
Anyways, this even-odd line drawing is called interlacing. It tricks the eye into aaalmost seeing 60 frames per second.
NTSC TVs, unlike monitors/video games, don't have separate frame/refresh rates, because inside the TV it's all analog circuitry driving the electron beam straight off the RF signal. Not like a video game, where the computer might be generating 120fps but the monitor is refreshing at 75Hz. In that case the monitor redraws itself completely (called progressive scan) 75 times a second. When your video game framerate is higher than your monitor refresh, you will certainly experience 'tearing': this is when the frame changes in the middle of a monitor drawing cycle.
Most people who want a nice-looking picture will turn on vertical synchronization. This ensures that no frame changes in the middle of a monitor redraw, by limiting the maximum framerate to the monitor refresh rate and synchronizing the two. Once this is on, it becomes a lot more like NTSC, except not interlaced: one video game 'frame' is served up every monitor 'frame' and it all looks very nice.
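A minimal sketch of that loop (Python; the real vblank wait comes from the driver, simulated here with a plain sleep):

    import time

    REFRESH_HZ = 75                    # monitor refresh rate
    FRAME_TIME = 1.0 / REFRESH_HZ

    def render_frame():
        pass                           # placeholder for the game's drawing work

    # With vsync on, the buffer swap waits for the next vertical blank, so
    # the game never presents more than REFRESH_HZ frames per second and no
    # swap ever lands mid-redraw (no tearing).
    next_vblank = time.monotonic()
    for _ in range(REFRESH_HZ):        # simulate one second of frames
        render_frame()
        next_vblank += FRAME_TIME
        delay = next_vblank - time.monotonic()
        if delay > 0:
            time.sleep(delay)          # stand-in for blocking on the vblank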
The reason a video game looks so shitty at 30fps and TV doesn't is twofold.
1. What the above poster said: 30fps is an average, and if you are getting 30 on average, you are probably sometimes getting more than 30 and sometimes less... and you will really notice the 'less'.
2. Video games don't have motion blur. With video cameras (and your eyes), moving objects are automatically blurred, because neither 'updates' all that fast. A video game draws each frame distinct and unblurred: any single frame could be pulled out as a sharp still. Not so on TV. (Oh, and I know some hardware/software inserts crappy motion blur routines... the fact that you can plainly see them means they look nothing like the real thing.)
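Point 2 is easy to demonstrate numerically: a camera frame is the average of many instants while the shutter is open, a rendered frame is one instant. A toy 1-D version (numpy; the 4-pixel object and 12-pixel velocity are made-up values):

    import numpy as np

    WIDTH, SUBSAMPLES = 64, 8

    def render(x):
        """Draw a 4-pixel-wide object at position x on a 1-D scanline."""
        frame = np.zeros(WIDTH)
        frame[int(x):int(x) + 4] = 1.0
        return frame

    velocity = 12.0                           # pixels travelled in one frame time
    crisp = render(10.0)                      # game-style frame: one sharp sample
    blurred = np.mean([render(10.0 + velocity * i / SUBSAMPLES)
                       for i in range(SUBSAMPLES)], axis=0)
    print(np.count_nonzero(crisp))            # 4  -- sharp, could be a still
    print(np.count_nonzero(blurred))          # 14 -- smeared along the motion path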
Re:Right ... (Score:2)
At least we have some good news (Score:5, Insightful)
Re:At least we have some good news (Score:2, Insightful)
Re:At least we have some good news (Score:5, Insightful)
KFG
Re:At least we have some good news (Score:3, Funny)
Bah! (Score:3, Funny)
Yes, but don't worry.... (Score:2, Redundant)
Of course, all I get here is interlaced PAL on cable. Earth to TV networks: I'm getting better progressive/HDTV feeds via newsgroups
Re:At least we have some good news (Score:3, Insightful)
This technology probably won't be used to make a better picture on the 20" screen but will give you the ability to have a 200" screen without looking at gigantic pixels.
Re:At least we have some good news (Score:2)
I've got a 9' wide projection setup that displays 1365x768, and from 15' or so away, the pixels are small enough that I'm losing detail because of my vision, not because of the resolution of the image. I typically sit more like 8' or 9' away where I can pick out some pixel-y details (like jaggies), but sitting that close means that the image takes up a huge portion of my FOV.
I imagine for typical setups of 42" or 50" plasma displays that do 720p nati
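For what it's worth, the parent's experience lines up with the usual one-arcminute figure for 20/20 acuity (rough math, flat screen viewed on-axis):

    import math

    pitch = (9 * 12) / 1365                   # 9-ft-wide screen, 1365 px across
    for feet in (8.5, 15):
        arcmin = math.degrees(math.atan2(pitch, feet * 12)) * 60
        print(f"{feet} ft: {arcmin:.2f} arcmin per pixel")
    # ~2.7 arcmin at 8.5 ft (jaggies visible), ~1.5 arcmin at 15 ft --
    # right around the ~1 arcmin resolving limit of 20/20 vision.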
Re:At least we have some good news (Score:2, Informative)
http://clarkvision.com/imagedetail/eye-resoluti
Re:At least we have some good news (Score:2)
It took about forty years to take HDTV from the lab to a commercial product. (Development first started on it at NHK in 1964.)
I don't think we're "about a decade away" from anything like that.
Re:At least we have some good news (Score:2)
Some people will say it is too sharp, but the beauty is that you can add analog-looking grain and probably make it indistinguishable from a good modern film projection. Personally, I prefer analog-looking artifacts to digital-looking ones, because they often represent reality better.
Re:At least we have some good news (Score:4, Informative)
The resolution of the human eye is about 2500x2500 (6-7,000,000) cone cells (colour) and roughly 11,000x11,000 (120,000,000) rod cells (grey). They are not evenly spread, and the rods are not individually sensitive: multiple rods trigger a single nerve. See this [gsu.edu] for more detail.
---
It's wrong that an intellectual property creator should not be rewarded for their work.
It's equally wrong that an IP creator should be rewarded too many times for the one piece of work, for exactly the same reasons.
Reform IP law and stop the M$/RIAA abuse.
Thank you very much (Score:5, Funny)
nice, but what is the point? (Score:4, Insightful)
Even if the price is within our reach, this piece of technology is going to be left to corporations and ultra-rich people with lots of real estate. I fail to see the point of having this, except for new digital cinemas.
My god, watching the latest holycrud with mind-boggling resolution...
Re:nice, but what is the point? (Score:3, Interesting)
consider that a 17" WXGA screen is 1440x900; you could definitely go up to, say, a 50" screen with this resolution and have something really photoreal, even when you're standing up close to it.
i'd argue that the 20 channels of sound would be much less noticeably better than the higher re
Re:nice, but what is the point? (Score:2)
Um, you're actually contradicting yourself here. If your parents have a 51" HDTV, it should be able to do 1920x1080, which is quite a bit higher than your "photoreal" 1440x900.
Now, there's the problem tha
regfree link (Score:4, Informative)
Re:regfree link - make it automagical (Score:2)
Half Rez (Score:3, Funny)
24 Gb/s (!) (Score:3, Interesting)
I wonder how long it will be before the local utility offers a 24 Gb/s connection. (of course it will all be for naught if the uploads are still snaily)
Re:24 Gb/s (!) (Score:2)
Probably around the same time as when they start bringing fibre to the home.
Re:24 Gb/s (!) (Score:2)
Re:24 Gb/s (!) (Score:2)
Monitors First (Score:4, Interesting)
The power of modern GPU's could be put to use with this resolution, and we could once again have a resolution war between the various chip makers.
Let's learn to "walk" with images of this resolution, before we try and run.
Re:Monitors First (Score:2)
Hmm. I don't think that's really right. I saw my first 1920x1200 monitor in the mid-1990's, which was well after HDTV became commercially available. (It wasn't cheap or abundant, but it was available.) Before that, 1280x1024 was pretty much the most common resolution, or 1600x1200 if you liked to squint. (21" CRT's can't really resolve that many pixels; they get fuzzy.)
In this particular case, the TV's beat the computer folks to th
Re:Monitors First (Score:2)
It took decades for TVs to really take advantage of NTSC resolution, and only in the past decade did they surpass it. As it is, only a few displays can really show HDTV in its full detail; they all have some shortcoming, but it's doable with proper tweaking and not cheaping out.
IBM has LCD monitors of 4000x3000 resolutions NOW, for scree
Rips should look great... (Score:3, Funny)
The point is... (Score:4, Interesting)
I expect you to give the quote "640k ought to be enough for anyone", and you are right, but by the time anyone can store this much data, we'll probably have holographic projectors and 3D TV.
And would you like Ultra-High-Mega-Super-Happy-Fun Resolution 2D TV, or SDTV-quality 3D?
Why do I bother asking....
Oh, Great... (Score:5, Funny)
Damnit!
It was great until... (Score:5, Funny)
Serenity NOW!
Tim
Well, then no more of that (Score:3, Funny)
Why just 60 Hz? (Score:3, Funny)
57 channels and nothing on. Bah humbug!
Re:Why just 60 Hz? (Score:2)
I'm not sure whether going up to a multiple of the frequency (e.g. 120Hz on a 60Hz supply, or 100Hz on 50Hz) would give the same benefits.
Re:Why just 60 Hz? (Score:2)
No, it's really much simpler than that. In order to refresh a scanning CRT at a given rate, you have to have a clock that runs at that rate. To keep TV's cheap and simple, they left out the clock and synchronized the vertical scan to the incoming AC. Which, in the US, was 60 Hz, and in the Commonwealth and Europe was 50 Hz.
When color came along, everyt
12TB/hr? (Score:5, Funny)
Of course... (Score:5, Interesting)
Information (as in raw bytes/sec) will continue to become cheaper and cheaper. The price of content is quite stable. Add 2+2 and see where this is going: more, faster, and more "profitable". I know several people who are probably "millionaires" by now.
At the usual estimates for piracy, using the full penalty of the law, total piracy comes to more than the GNP of the world: not just this year's, but (extrapolating the series) everything since the dawn of time.
How's that possible? Simple. We make "money" out of thin air. You give me a million, I give you a million, and we both keep it as well. At $0/content, we could all have all the content in the world. So the loss = 7 billion people * millions of CDs/DVDs/apps/games/whatever * full retail price. Yeah. Right.
Copyright will have to change, because pretty soon everyone will have millions in liability; it will simply be common. I've seen it in every age group from 8 to 80, both sexes, all sorts of people. It's bigger than Prohibition in the sense that "everybody" is doing it. There's simply no stopping that.
Kjella
Re:Of course... (Score:2)
Re:Of course... (Score:2)
No bittorrent link? (Score:5, Funny)
Is that the best you can come up with? (Score:5, Funny)
But can I connect it to my PC? (Score:2)
where? (Score:3, Insightful)
Maybe many years from now we'll see it in a home setup... of course, it was about 8 years ago, when I bought a 1-gig HDD for 200 or 300 bucks (don't remember exactly), that someone told me "what are you going to use that for? you'll never be able to use all that space!"
Who's laughing now?
Re:where? (Score:2)
I wonder... (Score:2, Interesting)
Re:I wonder... (Score:3, Informative)
HDTV (ATSC) supports 1080i and 720p.
I wonder: (Score:2)
If it approximates "being there", you could make some mind blowing 3D movies with this technology!
Dupe (Score:3, Informative)
Re:Dupe (Score:2)
Media beats reality? (Score:3, Informative)
The FCC-mandated transition to digital broadcasts probably won't help make HD content mainstream either. Stations may be broadcasting all digital, but they'll still be broadcasting Gilligan's Island [imdb.com] reruns at SD or (gasp) upconverted to 1080i.
UHDV technology may be the future, but the expense of producing content won't make it mainstream. Oh, and Slashdot covered this before [slashdot.org].
Re:Media beats reality? (Score:2)
[Omni|I]MAX (Score:4, Interesting)
My biggest problem with [Omni|I]MAX is that at 24fps, scenes with slow pans get very jumpy (fast pans blur enough not to be noticeable). However, if you just ran the film at 60fps, the size of the reels would be unmanageable, and the speed of the film through the transport would be dangerous!
But imagine an [Omni|I]MAX theater with 100TB of storage (not a big deal nowadays) and a DMD/DLV projector at these kinds of resolutions and refresh rates. They could play any movie they have pretty much instantly, they could run longer movies, and the movies would be absolutely immersive (especially OmniMAX movies: on a 120-degree screen, pretty much your entire field of view would be the movie).
Of course, they'd need to make sure people understood the "If you feel yourself getting sick, just CLOSE YOUR EYES AND BREATHE!" warning a bit better.
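The storage math holds up, assuming the server feeds the projector at something like the UHDV test's raw 24 Gbit/s (a real system would surely compress):

    storage_bytes = 100e12             # the hypothetical 100 TB server
    rate_bits_per_s = 24e9             # UHDV's quoted 24 Gbit/s
    hours = storage_bytes * 8 / rate_bits_per_s / 3600
    print(f"{hours:.1f} hours of raw UHDV")    # ~9.3 hours uncompressed
    # Even mild 10:1 compression would stretch that to a ~90-hour library.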
*Max is already better than this, IIRC (Score:2)
I can't speak to the audio channels, and I think the frame rate is only 30 fps, but I think *Max wins hands down on resolution.
Tech notes: IMAX uses "15 hole by 70mm film", but the frames ar
Re:*Max is already better than this, IIRC (Score:2)
Oh, don't be absurd. If you scan a single frame of Super 35 at anything higher than 4K (4096x3112), you're just scanning noise. For all commercial applications short of digital mastering, 2K is as high as you need to go (2048x1556).
gah (Score:2, Insightful)
Let's get HDTV first (Score:2)
Expo86 (Score:3, Informative)
The high frame rate eliminates the strobing effect that occurs when the camera pans, or an object moves quickly across the screen. I noticed the strobing when watching LOTR in the movie theater, but the effect isn't visible on TV.
Re:Expo86 (Score:2)
There has been a serious attempt to improve film projection with the MaxiVision format (which runs at 48 frames per second), but between the necessity of needing
I get a sense of deja vu (Score:3, Interesting)
For this new system to work, we need much larger bandwidth and/or much better compression than we have now, which in practice means more powerful CPUs than we have now. This will come, but I think it is a decade or more off. (At which point, any system invented by the Japanese right now will have been superseded by something newer invented in the meantime.)
Personally, I like the idea of this new system in principle. It is the first television system I have seen that generates pictures as good as or better than conventional film. It will look fabulous if used in a digital cinema. (Current digital cinema technology only uses 1000 lines or so, which is seriously lacking compared to film.)
As for viewing this in your living room, it is probably overkill, unless we have screens covering entire walls of rooms (which of course we may). The 1080-line maximum of conventional HDTV is probably good enough for 40-inch screens and the like. Current-generation screens do show various digital artifacts, but these have more to do with the inadequacies of the display technologies than with the number of pixels on the screen. (Things like LCD and plasma displays are simply not as good as conventional CRTs in terms of picture quality.) Increasing the number of lines will certainly improve the picture further, and it will probably happen some day, but for the moment larger gains in picture quality can probably be had more easily in other ways.
Re:I get a sense of deja vu (Score:2)
The next leap forward is 1080-line non-interlaced display, which may become available to consumers by 2010. 1080-line non-interlaced is extremely sharp, something like 25 to 50 percent clearer than the current USA standards. Such displays will happen when we get DLP and LCOS rear projection
Download speeds (Score:3, Interesting)
So, start download when leaving the house for work on Monday...
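Monday may not be early enough. Rough figuring for the 3.5 TB test clip (line speeds are illustrative, not anyone's actual offering):

    clip_bytes = 3.5e12                        # the 18-minute UHDV clip
    for name, mbps in [("1.5 Mbit/s DSL", 1.5),
                       ("10 Mbit/s cable", 10),
                       ("100 Mbit/s fibre", 100)]:
        days = clip_bytes * 8 / (mbps * 1e6) / 86400
        print(f"{name}: {days:.1f} days")
    # ~216 days on DSL, ~32 on cable, ~3.2 on fibre -- for 18 minutes of video.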
Chip H.
Re:Download speeds (Score:2)
Why so many audio channels? (Score:5, Interesting)
Two channels does quite a good job of reproducing all the sounds of an environment, assuming the stereo speakers are appropriately far apart.
5.1-channel sound added a subwoofer, which is a positive development, and then 3 more speakers. Okay, the two rear channels I can understand, because most people don't have their speakers located well, and there's a certain gee-whiz factor in hearing something that is distinctly behind you. However, the center channel still makes little sense to me, since the stereo speakers can handle that area just as well (the center channel is usually a crappy little treble-only speaker anyhow).
Now, I am really at a loss to understand why you need even more, especially 20+... Put on a pair of stereo headphones and pick any location, 360 degrees around, and I'll make it sound like a noise is coming from that exact spot. So what can 20+ channels do for you?
Even if we start getting holograms coming out of the screen, I could still make a sound seem like it's coming from wherever that object is located with just 4 speakers, and I could do a pretty good job with just 2 if needed.
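The trick rests on the interaural cues the replies below get into. A toy version of the timing cue (Woodworth's spherical-head approximation; real HRTF processing also shapes each ear's spectrum):

    import math

    HEAD_RADIUS = 0.0875    # metres, roughly an average human head
    SPEED_OF_SOUND = 343.0  # m/s

    def itd(azimuth_deg):
        """Interaural time difference: extra path length to the far ear."""
        a = math.radians(azimuth_deg)
        return HEAD_RADIUS * (a + math.sin(a)) / SPEED_OF_SOUND

    for deg in (0, 30, 90):
        print(f"{deg:3d} deg off-centre: {itd(deg) * 1e6:.0f} microseconds")
    # 0 us dead ahead, ~260 us at 30 deg, ~660 us at 90 deg: delay one
    # headphone channel by that much (plus a level difference) and the
    # brain places the sound accordingly.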
Re:Why so many audio channels? (Score:2)
No, I doubt you will, assuming you're just planning to crossfade between the right and left channels. When sound enters the human ear, the amplitudes of various frequencies are affected by the direction the sound came from. For example (and this is
Re:Why so many audio channels? (Score:2)
No, actually it's more than that. It's really a matter of getting dual microphones positioned correctly, then recording sounds relative to the microphones.
For instance, if a sound is coming from in front of you but off to the left, you'll hear it in your left ear an instant before your right. By exaggerating the spacing a bit, it sounds much clearer where the noise is coming from. Now,
Re:Why so many audio channels? (Score:2)
Twenty-two channels may seem like overkill, but I won't write it off just yet. It's only when people experience it that they can truly tell what they have isn't everything. I ex
Re:Why so many audio channels? (Score:5, Interesting)
And that doesn't even get into the social isolation headphones encourage. In most settings, it is better to have multi-channel surround sound covering an area that can encompass a group than a stack of headphones (and the sound processors that come with them).
Your comments about the center channel are also mistaken. The center channel is a treble speaker, so it projects voices from the center/screen area best. Panning voices between front left and right can cause the voices to seem off-screen, which can be very distracting to some people.
So now you ask, what is my background? I do VR environments in collaboration with these people: http://imsc.usc.edu/research/project/immersiveaud
Human Eye = 17,000 DPI (Score:3, Informative)
name Herbert
status educator
age 60s
Question - Presently there is quite a bit of talk about pixels. Each digital camera manufacturer claims their camera has 3 million pixels, another 3.5 million, on and on. This reminds me of the 50's & 60's, when hi-fi audio manufacturers claimed their equipment had a wider bandwidth than its competitors'. So the question is: what is the resolution of the human eye, and can the figure be quoted in pixels?
I will answer as much as I can, but your questions about the limits of the human eye should really be directed to a specialist. It is a question that has been researched quite well, and there are several formulas to help predict the answer.
From what I understand, the resolution of the human eye is not measured directly in pixels, but by the angular difference between two points of light that can be resolved. Here is a very good article on that: http://www.madsci.org/posts/archives/may97/864446241.Ph.r.html
From this article, if I have done the math right, I understand that a typical person has a maximum resolution of about 17,000 point sources per inch. This doesn't really equate to pixels, but it can be converted into pixels per inch, and that should be close enough.
Digital cameras do brag about their resolution, because, well, it really does matter. It matters because their resolution is so poor compared to a real camera, or a decent printer, that it is pathetic.
For example, a really good digital camera might have a resolution of 2160 x 1440. If you made that into a 4x5 picture, you would have a resolution of about 400 pixels per inch. That isn't bad, but photo-quality printers print at 2400 pixels per inch. If you decided to make it into an 8x10 photo, you would end up with about 200 pixels per inch. This was considered excellent quality 10 years ago, but is very poor quality by today's standards.
So, compared to the human eye, a real camera, or good printed material, digital cameras aren't there yet. They do use a wide variety of software to try to enhance the quality for printing, but there is still room for improvement.
That doesn't mean digital cameras don't have a use. If you need pictures in digital form to be displayed on computer screens, then you have something. A computer screen has a resolution of about 72 pixels per inch, and digital cameras are definitely better than that. Also, since it is basically one step from taking the picture to downloading it onto your computer, you get better results than if you took a picture, developed it, and then scanned it in, not to mention much faster results. With the popularity of the web, digital cameras are great for creating images to place on a web site.
I hope this helps.
--Eric Tolman
http://www.madsci.org/posts/archives/may97/864446241.Ph.r.html
From this article, if I have done the math right, I understand that a typical person has a maximum resolution of about 17,000 point sources per inch. This doesn't really equate to pixels, but it can be converted into pixels per inch, and that should be close enough.
It would seem to me that if the resolution of the human eye is one arcminute at 10 inches, then the maximum resolution of the human eye is found as follows: take the circumference of a circle of radius 10 inches, which comes to 62.83 inches. 1/21600th of that (1/60th of a degree) is 0.002908 inches, the minimum distance perceptible by the human eye at 10 inches.
To get that much resolution, you need 343 pixels per inch, not 17,000.
Of course, if you get even closer the story changes, but I am not sure what the resolution of the human eye is at any distance other than 10 inches.
Even taking a hypothetical one inch of distance with the exact same eye resol
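Carrying that arithmetic through to the one-inch case (same one-arcminute assumption; note the eye can't actually focus that close):

    import math

    def max_ppi(distance_in):
        # Smallest resolvable spacing: 1/60 of a degree of arc at this radius.
        spacing = 2 * math.pi * distance_in / 21600
        return 1 / spacing

    print(round(max_ppi(10)))   # 344  -- matches the 343 figure above
    print(round(max_ppi(1)))    # 3438 -- still nowhere near 17,000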
This is an article from last year. (Score:2)
Super high resolution monitors have been done before, but usually as CRTs. Greyscale CRTs are easy to make, and have been used for medical X-ray viewing [clintonelectronics.com], where 5-megapixel displays are often used. The medical monitor makers are now offering 5-megapixel greyscale LCD panels. Color panels still have lower resolutions.
Perspective - Engineering drawings are 9600x7200 (Score:3, Informative)
IMHO 200 dpi is about right for viewing without noticeable digitizing effects - moire, rasters, etc. Pencil lines at this resolution don't have visible jaggies if they're antialiased, and don't look out of focus either.
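Which is consistent with the title figure: at 200 dpi, 9600x7200 is exactly an Arch E (48x36 inch) sheet:

    dpi = 200
    print(9600 / dpi, "x", 7200 / dpi, "inches")   # 48.0 x 36.0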
analog eyes (Score:2)
SHMUHDV coming soon (Score:2)
great, that's like having a ferrari (Score:3, Insightful)
Seriously, TV doesn't have enough decent CONTENT to use something like this. My roommate has a 36" HDTV and we cancelled the HDTV package from Comcast because there is literally nothing to watch, and it's pointless to pay an extra $35+ for hi-res newscasts.
The only place the HDTV shines so far is in showing CG scenes (Return of the King is fucking amazing on that TV, for example, much better than it was in the theater), but nothing else really improves, so you have to ask: why bother?
Could it be that they're trying to outpace Moore's law? How many people have multi-terabyte Beowulf clusters set up to manipulate video that massive?
Re:But... why? (Score:2, Interesting)
Re:But... why? (Score:2)
Fax at normal quality = 144 dpi. This screen = 7,680 pixels over about 300 inches = about 25 dpi. Of course, you would expect to be a lot closer to a fax than to this screen.
The human eyes have about 250 million photoreceptors per pair, and they are not evenly distributed, so the resolution is much higher around the area you are looking directly at; this screen's 33 million pixels are evenly distributed.
B
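The flip side is viewing distance: at 25 dpi the pixels stop being resolvable surprisingly close (rough figuring against the usual one-arcminute acuity limit):

    import math

    pitch_in = 300 / 7680                     # ~0.039 in/pixel on the big screen
    one_arcmin = math.radians(1 / 60)
    distance_ft = pitch_in / math.tan(one_arcmin) / 12
    print(f"pixels blend together beyond ~{distance_ft:.0f} ft")   # ~11 ft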
Re:But... why? (Score:2)
Visual acuity isn't measured in points per inch, for obvious reasons: there's a third dimension to deal with. Visual acuity is measured in arcseconds. How many seconds of arc does a spot have to subtend before your eye can resolve it?
The most-often-repeated statistic is that 20/20 vision can resolve about one arcminute, or 60 arcseconds, or 1/60th of a degree. How much is that in terms of inches? It depends on how far away t
Re:Here's my own video format.. (Score:2, Funny)
> as atoms. Each atom has its own dynamic path.
> This allows the viewer to move around in the
> movie as they would in real life, and even
> interact with it.
Sadly, most nerds around here cannot figure out how to work this video style's porn.
Re:Steve Jobs' prediction was a little bit wrong (Score:2, Funny)
R e a d? B o o k?
Oh, well then I'll just get the audible.com file and put it on my iPod. Plus, I got these cool earphones that look like seashells--AWESOME!
Re:Steve Jobs' prediction was a little bit wrong (Score:2)
Re:Damn and we missed the boat again! (Score:2, Funny)
Numbskull.