YouTube Adds 'Leanback,' Support For 4K Video
teh31337one writes with news that YouTube has announced support for 4K video, which runs at a resolution of 4096 x 3072. From their blog: "To give some perspective on the size of 4K, the ideal screen size for a 4K video is 25 feet; IMAX movies are projected through two 2k resolution projectors. ... Because 4K represents the highest quality of video available, there are a few limitations that you should be aware of. First off, video cameras that shoot in 4K aren't cheap, and projectors that show videos in 4K are typically the size of a small refrigerator. And, as we mentioned, watching these videos on YouTube will require super-fast broadband." They provided a small playlist of videos shot in 4K. This announcement comes a few days after YouTube debuted "Leanback," a service that attempts to find and serve videos you'll like based on past viewing habits, as well as offering a simplified method of browsing.
not the highest resolution: 8k super hi-vision (Score:5, Informative)
http://en.wikipedia.org/wiki/Super_Hi-Vision [wikipedia.org]
4k video is so legacy.
4K?? They can't even do 1080p yet (Score:1, Informative)
I'd be far more impressed by this news had it not been for YouTube's dismal implementation of 1080p, which in reality is only 1920x540. Yes, they effectively do 1080i, but throw away one of the two fields entirely.
This should prove the point: http://www.youtube.com/watch?v=MyFGDHPm-tc
Re:How about less compression? (Score:3, Informative)
What that basically means is that your 1080p video was overcompressed and did not actually contain 1080p's worth of information. The 4K video is probably overcompressed too and doesn't contain 4K's worth of information either, but it had more than the 1080p video. (In fact there's a decent chance the 4K video is simply about 1080p's worth done right, in which case you shouldn't be able to tell the difference.)
Variable bit rate encoding means that the stated resolution is pretty much a fiction, as others have pointed out in this discussion.
This is one of the reasons that Blu-ray won't die as fast as some people say. While it is technically possible to stream Blu-ray-quality video, we're a ways away from it being practical at large scale, and we're even further away from it being so dirt cheap that big corporations finally decide they might as well not compress the video to death. (I've certainly streamed some video from Netflix I'd call "better than SD", but definitely not "Blu-ray quality".) Until then, streams can label themselves "1080p" all they like, but without the bits they're just equivalent to a lower-resolution video upsampled. There are different levels of "pixel quality".
In other news, a DVD can have better quality than a streamed, putatively-HD video, because the DVD may have less resolution, but (like Blu-ray) it's full of high-quality pixels, where the HD stream may just consist of impressionistic blobs when you really look at it. Bits matter.
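To put rough numbers on "bits matter", here's a minimal back-of-the-envelope sketch comparing how many compressed bits each format gets per pixel per frame. The bitrates (8 Mbps DVD, 4 Mbps streamed "HD", 25 Mbps Blu-ray) and the flat 24 fps figure are assumed ballpark values for illustration, not measurements, and codec efficiency differences (MPEG-2 vs. H.264) are ignored.

```python
# Rough bits-per-pixel-per-frame comparison: why a well-fed DVD can beat a
# starved "1080p" stream. All bitrates below are assumed ballpark figures.

def bits_per_pixel(bitrate_mbps, width, height, fps=24):
    """Average compressed bits available per pixel per frame."""
    return (bitrate_mbps * 1_000_000) / (width * height * fps)

sources = {
    "DVD (720x480, ~8 Mbps MPEG-2)":    (8, 720, 480),
    "Streamed 'HD' (1080p, ~4 Mbps)":   (4, 1920, 1080),
    "Blu-ray (1080p, ~25 Mbps H.264)":  (25, 1920, 1080),
}

for name, (mbps, w, h) in sources.items():
    print(f"{name}: {bits_per_pixel(mbps, w, h):.2f} bits/pixel/frame")

# DVD (720x480, ~8 Mbps MPEG-2):   0.96 bits/pixel/frame
# Streamed 'HD' (1080p, ~4 Mbps):  0.08 bits/pixel/frame
# Blu-ray (1080p, ~25 Mbps H.264): 0.50 bits/pixel/frame
```

The codecs aren't equally efficient, so this is only a first-order comparison, but the order-of-magnitude gap is the point.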
Ya 4k is stupid (Score:3, Informative)
There are no consumer 4k monitors out there, none. You CAN find large displays approaching 4k if you try; Barco makes some that are close (3840x2160), like their LC-5621, but that costs nearly $40,000. 4k is just not the sort of thing you find on a desktop PC or in a consumer's home.
As such, offering 4k video on a site like YouTube is worthless. Actually it is worse than worthless since, as you noted, it overloads the decoding ability of current hardware and causes slowdown. There is just no point, at all, on current desktop systems. Until homes have displays that can handle the image and graphics hardware that can handle the decoding, it doesn't do shit.
As you said, they need to improve their 1080p stuff. Reason is that their 1080p stuff... well... isn't. I mean yes, it is encoded at 1920x1080, but the compression is so heavy that you do not really get that level of detail. Consider that Blu-ray usually runs in the realm of 25 Mbps H.264 for 1080p. YouTube's problem with regards to visuals isn't resolution; they can already handle the resolution of all but the biggest computer displays (27" and 30" LCDs are slightly past 2k, but that's as high as it goes). Their problem is that they compress things to a very low bitrate to aid in streaming.
So fix that first. Don't add a 4k option, add a 1080p HBR (high bit rate) option. Until you are streaming at a high enough bitrate to make 1080p look really good, there's no point in going any higher. Once you've got that, then maybe add 2k for the really big monitors (not that a 2k source is easy to get). Don't add 4k until there is actual 4k hardware in homes. That day will come, I'm sure, but probably not for another 5-10 years.
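As a rough sketch of that argument: if you insist on Blu-ray-class compression (roughly 0.5 bits per pixel per frame for ~25 Mbps 1080p24 H.264), here's how much real detail various bitrates can actually feed. The 4 Mbps YouTube figure and the 12 Mbps "HBR" figure are assumptions for illustration, not anything YouTube has published.

```python
# How many pixels can a given bitrate "feed" at Blu-ray-like compression?
# REFERENCE_BPP and the bitrates below are assumed figures for illustration.

REFERENCE_BPP = 0.5   # assumed Blu-ray-class H.264 bits/pixel/frame
FPS = 24

def effective_resolution(bitrate_mbps, aspect=16/9):
    """Frame size whose pixels each get REFERENCE_BPP bits at this bitrate."""
    pixels = (bitrate_mbps * 1_000_000) / (REFERENCE_BPP * FPS)
    height = (pixels / aspect) ** 0.5
    return int(height * aspect), int(height)

for label, mbps in [("Assumed YouTube '1080p' stream", 4),
                    ("Hypothetical 1080p HBR stream", 12),
                    ("Blu-ray", 25)]:
    w, h = effective_resolution(mbps)
    print(f"{label} ({mbps} Mbps): roughly {w}x{h} worth of real detail")

# Assumed YouTube '1080p' stream (4 Mbps): roughly 769x433 worth of real detail
# Hypothetical 1080p HBR stream (12 Mbps): roughly 1333x750 worth of real detail
# Blu-ray (25 Mbps): roughly 1924x1082 worth of real detail
```

In other words, under these assumptions a starved "1080p" stream carries roughly standard-definition levels of detail, which is the whole case for an HBR option.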
Re:Framerate, not resolution (Score:1, Informative)
Yes, essentially, because of the way the 3D works -- you're using alternating frames for left, right, left, etc. This means that stuff close to the "viewpoint", where there's a significant difference between the left and right images, winds up at only 12 fps.
I know a few people who got severe headaches from watching Avatar because of the 3D; I imagine that improving the framerate will go a long way toward fixing this.
RealD, the leading 3D projection system, projects alternating views at 144 fps, not 24 fps. Each eye's view of each frame is projected 3 times.
Most film projectors have 3-bladed shutters, so running at 24 fps, they will also display each frame 3 times.
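For anyone wondering where RealD's 144 comes from, it's just the triple-flash arithmetic described above:

```python
# RealD "triple flash": each of the 24 source frames is shown 3 times per eye.
SOURCE_FPS = 24
FLASHES_PER_FRAME = 3
EYES = 2

print(SOURCE_FPS * FLASHES_PER_FRAME * EYES)  # 144 flashes/second at the projector
print(SOURCE_FPS * FLASHES_PER_FRAME)         # 72 flashes/second reaching each eye
```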
It's the highest in actual use (Score:5, Informative)
SHV is experimental tech. NHK is still playing with it, but it isn't in use anywhere, even in the pro world. It is just a proof of concept in early testing.
4k is the high end of cinema. 4k is normally what you scan in and process film at (it is considered to be about the same as good 35mm). You can also get monitors that are very nearly 4k, and the high end digital cinema projectors are 4k. It is a currently used and in production format. If you go to a new, spiffy, digital theater and watch a movie like Avatar, it is probably a 4k projector (though some places with smaller screens use 2k instead, which is just a bit higher than 1080p).
There's a difference between "Technology that is being developed," and "Technology that is being used."
Take Ethernet. 100 Gigabit Ethernet is currently under development. Test units exist, and the standard was finalized last month. However, it is not a deployed technology; your network does not have 100 Gigabit Ethernet backbones. 10 Gigabit is currently the fastest Ethernet out there, the fastest deployed in actual networks right now (fastest Ethernet, that is -- I know there are faster POS links and so on).
So it is accurate to say 4k video is the highest for now.
Re:not the highest resolution: 8k super hi-vision (Score:2, Informative)
For the non-EEs out there, the Nyquist frequency is half the sampling frequency, and it is the highest frequency that can be resolved at a given sampling rate (strictly, only frequencies below it are resolved). Any frequencies higher than that will be aliased to a lower frequency, with signals at exactly the sampling frequency appearing at DC.
So given the GP's ocular resolution of 400 dpi, we'd need something printed at over 800 dpi in order for the print to appear non-pixellated (ignoring any ink/toner blending). Add to that the fact that most people don't hold a printed document 1 m from their eyes for reading (if they even have arms that long), and I can see a market for the 1200 dpi printers.
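A quick sketch of that folding behaviour, using the thread's 400 dpi figure (the numbers are illustrative, not measured):

```python
# Aliasing in spatial-frequency terms. fs is the sampling rate (think dpi);
# detail finer than the Nyquist frequency (fs/2) folds down to a coarser pattern.

def aliased_frequency(f, fs):
    """Apparent frequency of a component at f after sampling at rate fs."""
    f_mod = f % fs
    return min(f_mod, fs - f_mod)

fs = 400  # e.g. a 400 dpi device

print(aliased_frequency(150, fs))  # 150 -> below Nyquist (200), reproduced faithfully
print(aliased_frequency(300, fs))  # 100 -> above Nyquist, folds down
print(aliased_frequency(400, fs))  # 0   -> equal to the sampling rate, lands at DC

# Flip it around: detail the eye resolves at 400 dpi needs a device sampling
# at more than 2 * 400 = 800 dpi to come out non-aliased.
```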
IMAX is NOT 2K (Score:3, Informative)
35mm film is about equal to, or a little better than, 4K digital in terms of resolution. Most of the time when you go see a movie these days, it is still being projected on 35mm film. It's cheaper than a digital projector and looks better. When you went and saw Avatar in 3D it was being projected in 2K digital (for almost everyone), and that's why you could see the pixels on the screen. 2K is NOT good enough for anything but very small movie screens. Anyone who says it is good enough is not a cinematographer (I am) and has never used both a 35mm film camera and the best digital cinema cameras (I have), and probably doesn't know what a cinematographer even really is. 99% of all big-budget films are still shot on 35mm film because it is simply the third-best image-capturing method out there, better than ANY digital camera in existence today. It is also much more expensive, but on large films the price of film stock is a drop in the bucket compared to everything else.
The second best cinema image-capturing method out there is 65/70mm film.
The best is IMAX.
IMAX is 65/70mm film travelling through the camera horizontally; each frame is about 2.75 by 2 inches. That is enormous. It's like a medium-format still camera... except 24 times a second. Here's a comparison of IMAX to regular 35mm film (most digital cinema cameras, by the way, have sensors roughly the same size as a 35mm film frame): http://en.wikipedia.org/wiki/File:Imaxcomparison.png [wikipedia.org]
IMAX is NOT projected at 2K. "Digital IMAX" is. Digital IMAX is pretty much useless and is not even as good as standard 35mm film projection; it uses two 2K projectors overlapping each other to give a slightly higher than 2K theoretical resolution. For those of you with still cameras, 2K works out to only about 2 megapixels. Congratulations. Your $1000 SLR has way more resolution than a digital cinema projector that costs a half million dollars.
Real IMAX, i.e. horizontal 65/70mm film, has an estimated resolution of about 104 million pixels; you would need a 12K x 9K digital sensor to even come close to the resolution of IMAX. No one makes those and no one will for a long time, if ever. The highest-resolution digital photographic sensors outside of the military are probably Hasselblad digital medium-format backs; they are about 60 megapixels, or roughly half the resolution of IMAX film, and they are still-photo cameras capable of only about one frame per second.
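For anyone keeping score, here are the raw pixel counts behind those claims. The DCI frame sizes are the standard ones; the 12K x 9K figure is just the estimate from the paragraph above.

```python
# Back-of-the-envelope pixel counts for the formats discussed above.

formats = {
    "2K DCI projector":          (2048, 1080),
    "4K DCI projector":          (4096, 2160),
    "12K x 9K (estimated IMAX)": (12000, 9000),
}

for name, (w, h) in formats.items():
    print(f"{name}: {w * h / 1e6:.1f} megapixels")

# 2K DCI projector: 2.2 megapixels            (far below a $1000 SLR)
# 4K DCI projector: 8.8 megapixels
# 12K x 9K (estimated IMAX): 108.0 megapixels (near the ~104 MP figure above)
```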
IMAX is not 2K. Digital IMAX is not IMAX.
IMAX is film. Film is incredible.