Television Media Graphics Software

On NTSC Video, Blue Blurring, Chroma Subsampling 308

NEOGEOman writes "Something I've been fascinated with for a long time is video signals. On my website I've spent over six years collecting video and other hacks for game consoles. I've recently put together the fourth revision of my video signal primer, and it's expanded to six pages now, including strange subjects like chroma subsampling, horizontal colour resolution and a rather interesting revelation: your eyes suck at blue."
  • Uhm... (Score:4, Funny)

    by the unbeliever ( 201915 ) <chris+slashdot&atlgeek,com> on Wednesday December 24, 2003 @10:35PM (#7806339) Homepage
    Can anyone possibly tell me why the hell this site is listed in my company's web filter as pornography?

  • by danamania ( 540950 ) on Wednesday December 24, 2003 @10:39PM (#7806352)
    My eyes suck especially badly at blue. A pure-blue image is something my eyes completely refuse to focus on. I can see the image is there, see that it's blurry, but whatever makes my eyes focus just doesn't work on blue. Light a room with pure blue light and I'm almost blind. Gah!

    Add some other colours and I'm fine. Curiously, given a red line of text, a green line of text and an off blue line of text, I have to focus differently for all three. (Fully blue is, of course, a complete waste of time :)
    • Wow, you know, I see the same thing, but never met anyone else who noticed it.

      My parents had a blue digital clock that I couldn't focus on in the dark (when it was the only color). For some reason, no one else noticed it. I wonder if they just didn't realize it. :)
    • What I find especially straining is the default colors for ls --color, at least under fairly recent Linux distros.

      Compressed files get a bold red, directories get a slightly dimmer blue. I use a black background on my xterms, and I've found that when I try to read a directory that has a lot of both I'm constantly having to refocus when I go between blue and red areas.

      It's annoying enough that on any new machine the first thing I do is change my alias for ls to no colors. :p

      I wonder if this is something t
      • That is exactly what I see too, and it's much worse at smaller text sizes. Looking at the same colours on a white background seems to completely remove any focus problems, so I settle for a black/colour on white/pastel terminal :).

        This is an example of what's horrible for my eyes [danamania.com]
      • I also have this problem, and many times the red seems like it is higher than the blue, as if it's a 3D picture. Does anyone have that experience? I think it's in the glasses I use because before I had them I didn't have that problem.
    • The color blue sits at the short-wavelength end of the visible spectrum. The bluer the color, the harder it is to focus on. The wavelengths are so short that they won't focus on the retina properly, hence blue is blurry.

      It is a common thing to see a deep blue color and not be able to distinguish edges but notice a "glow" around the colors.

      Oh great, now I can't find my keys.

    • by zakezuke ( 229119 ) on Thursday December 25, 2003 @01:17AM (#7806884)
      Wavelength issue? As in, better manual-focus lenses have an infrared mark on them: basically you focus normally, then step it back a notch for IR to be in focus. Whether or not the human eye must focus differently for nearby objects showing different wavelengths is beyond me.

      I do know that in order to tell red from green I look at the effect on the surroundings, as in green reflects more light than red does... as in green LEDs are annoying because they make the whole room bright... but the red ones do not.

      Being color blind, I've studied this quite a bit. I find that I'm fond of purple text rather than blue despite the total lack of contrast between the two. I'm the one who made purple British flags in grade school and didn't understand what I was doing wrong. Stupid unlabelled markers. Red text on black is my worst color combo; I usually can't see the contrast.

      I find I'm better with blues than most, but nevertheless I'm familiar with this lack of focus, especially when the blue is on a contrasting background: yellow/green/red. I know on my old Amiga my text colors were hard for others to see, as I picked what I considered to be high contrast to my eyes... blue / black / green. No other bugger could read the blue I picked.

    • by SirNarfsALot ( 536889 ) <narfinity@oper[ ]il.com ['ama' in gap]> on Thursday December 25, 2003 @01:25AM (#7806926)
      You have to focus on different colors differently because red, green, and blue light are all different wavelengths and therefore are refracted slightly differently by your eye (shorter wavelengths, i.e. blue, refract more, longer wavelengths - red - refract less). You have to focus closer in on a red light than a blue light for it to be sharp, if you can even do so.

      Something I don't understand is that blue light seems to exaggerate my mild astigmatism; I have a Logitech mouse that drives me crazy if I try to focus on the blue taillight. Red lights I can focus on quite clearly and from further away (I am also nearsighted) than any other color lights.

      Fun related trivia bit (and forgive me if this is common knowledge):
      If you have a decent old 35mm SLR camera with a normal lens (other lenses may have this too) look at the focus ring. There is a marker for where to line up the focus ring in normal conditions, and then there should also be a little red dot a fraction of an inch to the side of it to show where to line up the ring when shooting on infrared film. You have to focus the lens closer for infrared than for visible light, because longer wavelengths refract less.

      This is all related to the prism rainbow effect, too.
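      To put rough numbers on the "shorter wavelengths refract more" point, here is a small back-of-the-envelope Python sketch. It uses the thin-lens formula with the usual published refractive indices for BK7 camera glass, purely as an illustration (the eye is not made of BK7, so read the millimetre figures as relative, not literal):

          # Thin symmetric biconvex lens: 1/f = 2*(n - 1)/R, so f = R / (2*(n - 1)).
          # Indices are the standard BK7 values near the F, d and C lines
          # (blue 486 nm, yellow 588 nm, red 656 nm).
          R_MM = 100.0
          INDICES = {"blue 486 nm": 1.5224, "yellow 588 nm": 1.5168, "red 656 nm": 1.5143}

          for colour, n in INDICES.items():
              focal_length = R_MM / (2.0 * (n - 1.0))
              print(f"{colour}: focal length ~ {focal_length:.1f} mm")

          # Prints roughly: blue 95.7 mm, yellow 96.8 mm, red 97.2 mm. Shorter
          # wavelengths come to a focus closer to the lens; infrared focuses
          # farther still, which is why SLR lenses carry that separate IR mark.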
  • by whoever57 ( 658626 ) on Wednesday December 24, 2003 @10:44PM (#7806363) Journal
    RGB is standard on most equipment as it is included in the SCART [knoware.nl] connector usually found on any TV/VCR/DVD sold in the last decade.
  • It really needs a summary on the front page for those with ADD:

    Digital > RGB > Component > S-Video > Composite > RF modulator

    The difference between Composite and S-Video is HUGE; please change if you are still using Composite!

    • The author also combines a description of the cabling with the color model. This is someone who understands game creation for TVs?!?! NTSC DV pumped through an S-Video cable is 4:1:1 color; PAL DV is 4:2:0. Artifacts? Yes. The same type? No. Can the RGB color model be sent over analog cables? Yup.

      The author has a classic case of knowing enough to be dangerous but not knowing enough to know you don't know enough.
      • by NEOGEOman ( 155470 ) on Thursday December 25, 2003 @12:26AM (#7806699)
        The author knows he knows nothing, but knows more than the hundreds of info-starved fanboys on forums the world over, and knows enough to put together a primer (And I'm repeating myself here) that is an introduction to a subject, not a comprehensive guide.

        Also, as far as I know no one pumps "NTSC DV" through an S-Video cable, unless they're way off spec. S-Video is analogue, not digital. Or are you using a non-spec definition of "DV"? ;)
      • I'm sorry, but "4:1:1" and "4:2:0" amount to much the same thing (both keep only a quarter of the chroma samples); the real difference is how the subsampling is arranged (4:1 horizontally for 4:1:1, 2:1 in each direction for 4:2:0) and where the subsampling filter is centred.
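        For anyone who wants to see the two layouts side by side, here is a rough NumPy sketch (purely illustrative, not any particular codec's internal format). Both schemes keep a quarter of the chroma samples; 4:1:1 throws away horizontal detail only, while 4:2:0 averages 2x2 blocks:

            import numpy as np

            def subsample_411(chroma):
                # 4:1:1: keep every 4th chroma sample horizontally, full vertical resolution.
                return chroma[:, ::4]

            def subsample_420(chroma):
                # 4:2:0: average 2x2 blocks, halving chroma resolution both ways.
                h, w = chroma.shape
                blocks = chroma[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2)
                return blocks.mean(axis=(1, 3))

            cb = np.random.rand(16, 16)        # toy 16x16 chroma plane
            print(subsample_411(cb).shape)     # (16, 4) -> 64 samples kept
            print(subsample_420(cb).shape)     # (8, 8)  -> 64 samples kept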

  • Girl (Score:5, Funny)

    by Anonymous Coward on Wednesday December 24, 2003 @10:49PM (#7806385)
    Who's the girl? She's hot ;)
  • Obvious Physics (Score:5, Informative)

    by rbruels ( 253523 ) on Wednesday December 24, 2003 @10:50PM (#7806387) Homepage
    If NEOGEOman had bothered with a freshman-level physics or astronomy course, the conclusion that "your eyes suck at blue" would have been obvious some time ago.

    It's well known; as our eyes drift to the blue and red end of the spectrum, we lose our sensitivity, off by many orders of magnitude from say, yellow. This is why you see blue, and more commonly, red, lights as "night" light sources.

    The general reasoning: our eyes evolved with a single primary light source, the Sun, which has quite the yellow tinge to it. Our eyes adapted to this, and as such gave yellow the highest sensitivity, drifting off in a rough bell curve [ndt-ed.org] from there.

    It was an interesting article, and certainly put the RGB sensitivity into perspective, but ... it's not entirely new or surprising, either. Nor does the human eye really respond at RGB -- its response curves (beta, gamma, and rho) more closely correspond to blue, green/yellow, and yellow/orange.

    That all being said, thanks for letting us meet Traci. ;)

    • Re:Obvious Physics (Score:5, Interesting)

      by Anonymous Coward on Wednesday December 24, 2003 @11:03PM (#7806453)
      You have been misled.

      It is in fact BLUE at 445nm that the eye is most sensitive to. Blue receptors are the most sensitive.
      This "sucking at blue" thing has nothing to do with sensitivity of the receptors, but with the fact that only 2% of the cone receptors are the blue sensitive ones, so you have no resolving capability in the blue part of the visible spectrum.

      This is an issue of resolution NOT sensitivity.

      Furthermore, I have been researching vision for about 10 years now, and I can tell you that the curve you linked to is totally fucked up. The leftmost curve is not far enough to the left at all. 445nm, which is what your blue cones are sensitive to, is far more purple than that stupid graph would have you believe.

      You need more reputable sources.
      • Re:Obvious Physics (Score:2, Insightful)

        by NEOGEOman ( 155470 )
        OK, so you're saying our eyes don't suck at blue, or that they suck for different reasons? The end result as I see it is exactly the same: we still perceive very low resolution in blues, and we can reduce the detail in an image's blue channel without a noticeable drop in image quality. This is shown in the pictures I provided. Is there more to the story? There sure as hell is! Am I the world's leading comprehensive source for Eyes+Blue information? Nope, nor do I claim to be.

        What you say is fa
        • Well, he is right after all, and you should have guessed, since it's implicit in your article. Nowhere have you said that blue perception has poor contrast, and in fact lossy compression throws away spatial samples rather than diminishing the dynamic range. So in a sense you had the answer but didn't know it; don't worry, it happens to me all the time ;-)
          Anecdotal: try to focus on something under a violet light (don't stare at the lamp; you can't feel it, but it is bright) and feel the frustration of someone tha
        • He's agreeing with you that the human eye sucks at blue (where by "suck" we mean "has low resolution for detail"), and furthermore is saying that the reason for this has to do with the spatial density of blue-sensitive cones on the retina.

          Here, take a look at this page [uh.edu] by Austin Roorda, a vision scientist at the University of Houston. He shows images of the cells in some test subjects' eyes, with the images colored to indicate which cells detect which colors of light. You can see at a glance that the eye

    • Re:Obvious Physics (Score:2, Insightful)

      by Hard_Code ( 49548 )
      I don't doubt that our eyes are less responsive to red and blue (I already knew that), but isn't the example sort of contrived? I mean, if I showed you a completely RED square and then proceeded to remove both the RED and, say, BLUE color components, would that be proving anything? Obviously removing all the RED in a primarily RED picture is going to have more of an effect than removing any other color. The question is was the blue component in that picture significant to begin with (and in reverse deduc
    • Re:Obvious Physics (Score:2, Interesting)

      by kaphka ( 50736 )

      It's well known; as our eyes drift to the blue and red end of the spectrum, we lose our sensitivity, off by many orders of magnitude from say, yellow. This is why you see blue, and more commonly, red, lights as "night" light sources.

      No. Red lights (certainly not blue) are used in low-light situations, e.g. a ship's bridge at night, because the photoreceptors that are used in scotopic conditions are most sensitive to short wavelength ("blue") light. If you stepped out of the starlight outside and into a ro

      • If you stepped out of the starlight outside and into a room with a blue (or white) light source, the rods in your eyes would immediately be saturated, and it would take up to a half hour of darkness before your night vision was fully restored.
        This explains why I find those new bluish headlights so annoying.
        I have almost resorted to wearing sunglasses at night to block them out.
        What were the manufacturers thinking?
    • as our eyes drift to the blue and red end of the spectrum...
      Um. Dude. Blue & Red are at opposite ends of the spectrum. Hence UV at one end and IR at the other...(ok, sure, violet is on the other side of blue from red, but in between you've got all the greens, yellows & oranges...)
  • by IpSo_ ( 21711 ) on Wednesday December 24, 2003 @10:57PM (#7806414) Homepage Journal
    I realise this is completely off topic, but is it ever a small world... I went to high school with the lady in the picture lying on the car (Traci). A year after high school I moved 400 km away to a large city, and about 2.5 years after that I ran into her working at the Red Robin a block from my house. Now I see her picture on Slashdot of all places; what's the chance of that happening? :)

    To top it off, the guy who apparently owns the website (gamesx.com) runs (or ran?) a console game rental store in my home town, and used to date my sister!

  • Wow. (Score:5, Interesting)

    by TheSHAD0W ( 258774 ) on Wednesday December 24, 2003 @10:58PM (#7806423) Homepage
    I'm red-green colorblind, and the pictures on the LEFT, with the low-resolution red images, look as good to me as the original or the one with the low-res blue. Does anyone else notice this?
    • Re:Wow. (Score:2, Interesting)

      by NEOGEOman ( 155470 )
      I'm not colourblind at all, and I consider my vision to be pretty good. The difference on the red-reduced image is less noticeable than on the blue. It varies, though, depending on the source image and reduction method; green is always noticeable and blue almost always invisible.
    • Absolutely; in fact, blue is the best color I can see because of my red-green color blindness. The red I can't see at all; the green is just a little worse than the blue.
      • I thought *I* had red-green color blindness, but I can only see jaggies in the center picture. Perhaps my color blindness is more severe than I'd thought.
      • Eeeeexcellent. Then you will never know the start date of the first wave of alien attack hidden in the red spectrum...

        OH NO I'VE SAID TOO MUCH
  • by K8Fan ( 37875 ) on Wednesday December 24, 2003 @11:01PM (#7806441) Journal

    Our eyes suck at seeing blue for an even better reason - there are very few blue things that:


    1. You can eat.
    2. Can eat you.

    From an evolutionary perspective, that's the most important thing. We got good at seeing green because many green things are edible, and some things that want to kill us are good at hiding in green areas. So people who were especially good at seeing movement in green areas and finding edible green things tended to survive, while those who didn't died out.

  • SCART != RGB (Score:5, Informative)

    by JazFresh ( 146585 ) on Wednesday December 24, 2003 @11:02PM (#7806448)
    I always see people saying things like "I connect via a SCART socket, so it's RGB."

    Not necessarily.

    SCART connectors are huge chunky things that can handle a number of video formats, including RGB, S-Video and Composite (maybe others too). But that's not the same as saying that a given SCART cable or socket will support all those formats. Many cheaper cables only support Composite (fewer wires mean a lower cost). And on some high-end TVs with multiple SCART inputs, only some of those will support RGB.

    So if you're playing your PS2 or whatever through a SCART cable, the TV might be using the S-Video or Composite signal rather than RGB.

    The lesson is, be sure to check your TV inputs, and always buy good quality cables!

    • ...as I just learned the hard way.

      I live in the US, but I've got a couple of PAL tapes, so I bought a Samsung Worldwide VCR. It plays any format of tape, but doesn't have S-Video output.

      It does have a SCART connector, so I thought I'd out-smart the designers and get a SCART -> S-Video adapter. It works, but the signal is black and white. I assume this is because there's no separate chrominance line hooked into the jack.

      It's only VHS anyway, so the quality will never be superb, but it would have been n
      • It does have a SCART connector, so I thought I'd out-smart the designers and get a SCART -> S-Video adapter. It works, but the signal is black and white. I assume this is because there's no separate chrominance line hooked into the jack.

        Sounds like the colour information is still in PAL format and your TV can't decode it. Or the VCR dumps the colour information instead of converting it (some of the cheaper NTSC-playback-on-PAL-TV models do that).

    • Re:SCART != RGB (Score:3, Informative)

      by Jeff DeMaagd ( 2015 )
      The article seems to be a little uneducated or assumes a lot.

      "Using it instead of RGB makes your DVD player cheaper by about $0.02 and that's significant savings! On the other hand the price of your TV goes up with the extra equipment needed to decode this component signal."

      This complains too much about the cost of converting component to RGB, and no explanation is given as to why it costs more to integrate comp->RGB into the TV than it would in the DVD player.

      Where the conversion to RGB happens doesn't m
      • I agree. This guy isn't a professional in this space. He makes a lot of the assumptions and errors I made five years ago. But they're smart errors and assumptions, and I'm sure he'll find the Slashdot experience a great way to get all his errors pointed out quickly :).

        He really hasn't done the math to realize the advantages of Y'CbCr are more about bandwidth reduction, interoperability with existing color video technology, and easier signal processing. While we certainly could build an RGB DVD player for
        • >While we certainly could build an RGB DVD player for roughly the cost of a Y'CbCr one, it'd only get half the run time, and would look worse on standard television (only VGA is natively RGB for signal).

          This may be true in the USA, but in Europe RGB had been the standard for professional and consumer color systems for a very long time before VGA. 20-year-old television sets have SCART connectors that accept RGB, from before S-Video even existed as a consumer video standard.

          While you could think that a comp
  • This is Traci with her blue elements increased in size NINE TIMES. Each blue pixel is now 3 x 3 pixels in size, a nine-fold increase in the area of each blue pixel. You can't really tell, can you?

    Sorry. I wasn't looking at her blue elements. And what's this about making something 9 times larger?

    Wait, weren't you saying something about TVs?

  • It's NOT RGB. (Score:4, Informative)

    by dtfinch ( 661405 ) * on Wednesday December 24, 2003 @11:17PM (#7806495) Journal
    It's only RGB after decoding.

    NTSC video uses the YIQ color space, very similar to YUV (used in PAL, JPEG, DVD, & stuff). Y is the brightness, which gets the highest resolution, and I & Q (or U & V) are the chroma values, which can be greatly subsampled because they have no effect on brightness (when everything's working correctly).

    Most lossy image compression formats involve first transforming the image to the YUV color space. The RGB->YUV transform is also used by many paint programs for things like estimating differences between colors for color reduction and such.

    First match on Google for "YIQ YUV":
    http://astronomy.swin.edu.au/~pbourke/colour/convert/ [swin.edu.au]
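    As a rough illustration of the transform dtfinch mentions, here is a minimal NumPy sketch of RGB -> YIQ using the commonly quoted NTSC coefficients (rounded to three decimals, so treat them as approximate):

      import numpy as np

      # Commonly quoted NTSC RGB -> YIQ matrix (rounded).
      RGB_TO_YIQ = np.array([
          [0.299,  0.587,  0.114],   # Y: luma, kept at full resolution
          [0.596, -0.274, -0.322],   # I: chroma
          [0.211, -0.523,  0.312],   # Q: chroma, the narrowest broadcast bandwidth
      ])

      def rgb_to_yiq(rgb):
          # rgb: array of shape (..., 3) with components in [0, 1].
          return rgb @ RGB_TO_YIQ.T

      print(rgb_to_yiq(np.array([1.0, 1.0, 1.0])))   # white -> [1, 0, 0]: all luma, no chroma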
  • "Your eyes suck at blue"

    Which is why red light districts do so much better than the ones in blue.
  • The limits of NTSC (Score:3, Informative)

    by Animats ( 122034 ) on Wednesday December 24, 2003 @11:28PM (#7806523) Homepage
    NTSC has so little color bandwidth that you can only change the color through its full range about ten times horizontally across the screen. Try narrow RGB color bars on an NTSC receiver and see how bad it really is.

    JPEG also relies on this. But JPEG could provide considerably more compression if it didn't introduce those highly visible high-frequency artifacts.

  • NEOGEOman, I just wanted to say great job on the signal primer. I had always been a bit curious about video signals, though I had never looked into the subject. So I just wanted to thank you for making an easy-to-understand document that covered everything I wanted to know (and more).
  • There is a lot of evidence in literature that our eyes have evolved to see the color blue over time. A lot of biblical and Grecian writings describe the sea as the color of wine (purplish)... and there are a lot of other examples, but that is the one I can think of off the top of my head.
    • Yes, there is evidence that we developed blue sensitivity last, say 10,000 years ago or so. But that's not necessarily why Greeks and Bible writers would describe things differently. Color blindness, for example, is fairly common among men and, guess what, the most common form is the inability to see blue.
    • There is a lot of evidence in literature that our eyes have evolved to see the color blue over time. A lot of biblical and Grecian writings describe the sea as the color of wine (purplish)

      Actually, the relevant phrase is "wine-dark sea", and refers not to the color of the water, but the observation that, because the Mediterranean falls off into deep water rapidly, the water is comparable in clarity to that of the open ocean, which, because the light hitting the water is not diffused by phytoplankton

    • Words (Score:3, Interesting)

      by cirby ( 2599 )
      I have one odd little hobby... I collect foreign-language dictionaries. One of the funny things you notice when you browse through languages is that the less "sophisticated" ones have fewer color words. Some of the lesser-known tribal languages have one word that stands for both blue and green, because the difference is really not very important to the average guy living way out in the middle of nowhere.

      The more urban/technical a culture is, the more words for color the average person knows.
  • Fun and depressing (Score:4, Interesting)

    by pc486 ( 86611 ) on Wednesday December 24, 2003 @11:52PM (#7806597) Homepage
    <rant>
    After fooling around with video for quite some time now, I have come to the same conclusion that NEOGEOman does: Macrovision and the entire industry blow. Sure, we all know that the MPAA sucks, but that they stoop so low as to mess with the video to the point of its being almost unwatchable is absurd. Here's a small list of things they do to mess up composite video (NTSC):

    - variation of the black level (confuses AGCs)
    - phase modulation of the color burst (later macrovision versions, like DVD players)
    - removal of lines from one field and putting them on the other field.
    - bursts in the VBI

    And then the industry refuses to move on until they can get some other "protection" on the video feed. Who do they worry about? The "Casual copier," "hobbyist," "hacker," "small scale pirate," and the "professional pirate" (DDWG powerpoint presentation [http://www.ddwg.org]). The cost? Remotely decent video and your right to fair use.

    Arg!
    </rant>

    As a side note, if you're interested in chroma sampling and how it can go wrong, check out this page: http://www.hometheaterhifi.com/volume_8_2/dvd-benchmark-special-report-chroma-bug-4-2001.html

    It's an interesting read.
    • Macrovision also raises the sync level. In NTSC, normal sync is around -40 IRE, while on average, a video with Macrovision will have a -30 IRE sync. Sometimes there's a little more compression on the high end too, with max brightness being 97 IRE instead of 100 IRE, but I've found that to be less of an issue than the overall flux.
    • Yeah, Macrovision is a pain - but any good hardware/software hacker can remove it - it's pretty much a no-op.

      (note I have 15 years doing TV hardware and software - I have a little experience in this field) ...

      My main complaint with his pages is that he busts on component video (vs. RGB) because of Macrovision - they're really two different things (you can put MV on RGB, BTW). The main reason component exists (YPrPb) is because color TV has always worked that way (YIQ/YUV) - these are all color spaces that h

      • > PAL systems tend to have much more garish advertising - lots of bright oranges that you just can't see in the US:-(.

        Two words: Colin Baker.

  • by fozzy(pro) ( 267441 ) on Thursday December 25, 2003 @12:09AM (#7806658)
    you would realize that most of the image data is in fact in the blue range; on NTSC-M RGB systems, blue holds nearly 85% of the image data.
  • Please tell me that the author did NOT just split an image interpolated from a digital camera, which probably used a sensor with pixels laid out in a Bayer pattern, a CCD-type sensor that traditionally shows the most noise in the blue channel...

    Well DUH the green channel is going to have more detail -- there are twice as many green pixels on the sensor. And DUH the blue channel is going to be the fuzziest, given that is the shade digital cameras have the hardest time capturing (take a picture o
    • Alright, I'll tell you exactly that. I used a digital camera, and reduced the image to 25% using an interpolated scaling to create an image much closer to 'balanced' than a raw digital image would be. By scaling it this way the image was resampled to the new size by combining and averaging the pixels, negating any effect the camera's CCD had.

      According to the histogram the colours are very nearly the same but for some variation in the highs.

      What you say is irrelevant for the discussion, really, because a
      • Alright, I'll tell you exactly that. I used a digital camera, and reduced the image to 25% using an interpolated scaling to create an image much closer to 'balanced' than a raw digital image would be. By scaling it this way the image was resampled to the new size by combining and averaging the pixels, negating any effect the camera's CCD had.

        This depends on the algorithm used to create the image from the sensor. Each pixel in the acquired image influences the color of surrounding pixels. Most of the tim
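      For reference, the "combine and average the pixels" downscale NEOGEOman describes above is easy to sketch. Here is a rough pure-NumPy version (a box filter over non-overlapping blocks; illustrative only, not necessarily the exact resampler he used):

        import numpy as np

        def box_downscale(img, factor=4):
            # Average non-overlapping factor x factor blocks of an H x W x 3 image,
            # smearing any per-pixel Bayer/CCD pattern across each output pixel.
            h, w, c = img.shape
            img = img[:h - h % factor, :w - w % factor].astype(np.float32)
            h, w, _ = img.shape
            blocks = img.reshape(h // factor, factor, w // factor, factor, c)
            return blocks.mean(axis=(1, 3)).astype(np.uint8)

        small = box_downscale(np.zeros((480, 640, 3), dtype=np.uint8))   # -> 120 x 160 x 3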
  • Your eyes (and mine) "suck" at blue for two very clear physiological reasons:

    1. Very few blue-sensing cones. Your retinas have two kinds of light receptors. There are "black and white" light intensity sensors called rods, and color-sensitive sensors called cones. Of the cones, only a few (something like 1% or 2%, I don't recall exactly) are sensitive to blue light, while the rest are sensitive to either red or yellow-green.

    2. Blue-sensing cones are outside the fovea. The part of the retina that we depend
  • The page that demonstrates the effects of a noisy blue channel contains an error: DVD doesn't encode video in RGB. MPEG uses a YUV color space, in which there's a luma component, encoding the light intensity, and two chroma components, encoding the color. Typically, the chroma channels are encoded at half the resolution of the luminance. While working with MPEG encoders and decoders I got to "experiment" with the luma channel (read: I had horrible bugs that kept it either at its minimum or maximum intensity
    • Yes, but the author continually mixes and interchanges the concepts of color space, compression, video signal encoding and cabling.
      As others have pointed out, he knows enough to be dangerous. The article sounds quite informative, but is misleading, incomplete, and just plain wrong on many points.
  • by captaineo ( 87164 ) on Thursday December 25, 2003 @01:38AM (#7806963)
    The slow merging of HDTV and film technology is pushing to eliminate subsampling and Y'CbCr (YUV) color space entirely. George Lucas, after shooting Episode II in 4:2:2 Y'CbCr, insisted that Sony develop 4:4:4 RGB equipment for Episode III.

    I *hope* this will continue to the point where Y'CbCr can be dropped entirely (there isn't much use for it aside from chroma subsampling), as well as interlacing. These things cause serious problems in computing... Every time you see stair-step artifacts, improper telecine, mis-matched black levels, banding in gradients, or black rectangles in screenshots of media players, you can thank interlacing and Y'CbCr color space.

    (but they *are* quite effective as compression algorithms, and also clever hacks, in their time - how *else* are you going to send full-color motion video in 6MHz of radio bandwidth using 1950's technology?)
    • by benwaggoner ( 513209 ) <.ben.waggoner. .at. .microsoft.com.> on Thursday December 25, 2003 @02:17AM (#7807040) Homepage
      Oh, I think we've got decades of Y'CbCr to look forward to. It's really an advantageous format in a lot of ways.

      First, 4:2:0 Y'CbCr is half the bandwidth of 4:4:4 RGB (the sample-count arithmetic is sketched below). We're a long way from the point where halving the required processing power, bandwidth, storage, etcetera simply doesn't matter. My RAID is 2 TB formatted, but I regularly have projects that take up over 50% of the space.

      Second, Y'CbCr is a better native space for video processing, since the channels align better with what we want to filter. Luma filters like gamma or contrast are more than 3x faster in Y'CbCr than in RGB, since only one channel needs to be processed. Saturation is more than 6x faster in 4:2:0, since only two channels, each at 25% bandwidth, need to be processed. Plus a lot of filters have to convert from RGB to another color space to run. Y'CbCr is closer to those other spaces, and often doesn't require any conversion. You can say whatever you will about Moore's law, the difference between 4 and 8 real-time layers will matter for a while. Even the audio guys, who have it a lot easier, still run into performance limits with enough simultaneous tracks and such.

      Lastly, our entire video infrastructure is built around subsampled Y'CbCr. Never underestimate the lock-in of standards like this. If computer people couldn't kill interlaced video in HDTV, they're never going to kill subsampling for lots of applications. Color video has always been Y'CbCr, and that's how everyone has worked and thought for decades now.

      That said, Hollywood is likely to pick a >8-bit RGB solution for digital projection. For digital projection, bandwidth is a non-issue; what matters is quality, quality like that of film. Film guys live in RGB. Plus, it's a win for that industry to have digital cinema be as INCOMPATIBLE with consumer digital video tech as possible, in order to reduce the ease of piracy and to maintain the advantage of the theatrical experience over home theater.

      FWIW, I'm a member of the SMPTE groups working on both video compression and digital cinema.
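      To make the "half the bandwidth" point concrete, here is a tiny Python sketch counting 8-bit samples per 2x2 pixel block for each sampling pattern (the 720x480 frame size is just an example):

        # Samples per 2x2 pixel block: (Y, Cb, Cr)
        PATTERNS = {"4:4:4": (4, 4, 4), "4:2:2": (4, 2, 2), "4:2:0": (4, 1, 1)}

        def megabytes_per_frame(width, height, samples_per_block):
            # Uncompressed size of one 8-bit frame for a given sampling pattern.
            blocks = (width * height) / 4
            return blocks * sum(samples_per_block) / 1e6

        for name, pattern in PATTERNS.items():
            print(f"{name}: {megabytes_per_frame(720, 480, pattern):.2f} MB per frame")

        # 4:4:4 -> 1.04 MB, 4:2:2 -> 0.69 MB, 4:2:0 -> 0.52 MB: exactly half of 4:4:4.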
      • I hope you're wrong, but I have a feeling you're right.

        I would like to see Y'CbCr recognized for what it is: a lossy compression method. Nothing more, nothing less.

        I am fine with Y'CbCr as a compressed transmission and storage format, but not as an interface (in the same sense that JPEG DCT coefficients are a storage/transmission format, not an interface). It just causes too many problems going to/from RGB (especially at 8 bits), and is limiting in the sense that nobody in the video world has an incentive
  • Does your blue vision suck? Pop some Viagra! [information-viagra.com]
  • One of my coworkers used to tell me that NTSC stood for "Never Twice the Same Color". Didn't find that very funny until today.
  • Cool article on Blue. I will have to read the rest of your Website as it seems rather interesting. It's cool to see sites and research by regular folk like this.

  • The reason there's so little blue visible is that there's so little blue in that picture.

    Of course it doesn't make a difference if you blur the blue, because it's already 20 dB down from the other colors.

    I'd like to see this with calibrated images.
  • I hope people can see this.

    Since we suck at seeing blue, couldn't the blue bit depth be reduced in the video codec? Or even an indexed blue palette with fewer colors?

    That would make for better video compression right?

"Confound these ancestors.... They've stolen our best ideas!" - Ben Jonson

Working...