HDMI 1.3 and Color Standards

March 9, 2007

I've started researching color spaces, and discovered that the transition from our constricted TV color standard to a wider, richer HDMI 1.3-supported one is not going to be nearly as easy as buying a new TV.

For almost 50 years the colors we saw on TV were limited to what could be reproduced by combining the three colored phosphors used in CRT televisions -- red, green, and blue. As it turns out, the color space that can be described by these three phosphors is only a fraction of the colors the human eye can perceive, and certainly more limited than what film can reproduce on the movie screen or the colors you see when you walk outside.

These two charts show the extent of 1) the colors that can be captured by film, and 2) naturally occurring surface colors, neither of which can be fully displayed by TV RGB color (white triangle).

Because TVs in the past were small and their resolution too low to show any real detail, expectations for realistic color were not very high. With the coming of digital TV and associated new display technologies, all of that is beginning to change.

This color transition started showing signs of life a couple of years ago, but it has been the adoption of HDMI version 1.3 that has raised awareness of it as an industry-wide phenomenon. In particular, HDMI 1.3 promises support for increased color depth (from 8-bit to 10-, 12-, or 16-bit color--"Deep Color") and support for the expanded xvYCC color standard.

xvYCC is the abbreviation for "Extended YCC Colorimetry for Video Applications," in case you were wondering.

Color depth

Increasing color depth from 8-bit to 10-bit takes each color channel from 256 shades to 1,024 shades, making the elimination of banding possible. Gray shading is greatly improved. Does this automatically happen if your gear has HDMI 1.3? Or if a new TV touts its 10-bit panel?

Well, no. The people who compress video for broadcast or DVD comply with standards that call for 8-bit color. If the color has already been reduced to 8 bits, trying to convert those 256 shades to 1,024 (10-bit) might yield an improvement for certain types of scenes, but more likely it would yield something ugly. What we need is 10-bit content, which is not yet here.
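
To make the banding point concrete, here's a minimal sketch (in Python, using a made-up brightness ramp) of how many distinct shades survive 8-bit versus 10-bit quantization, and why simply rescaling 8-bit codes up to 10-bit doesn't bring the missing shades back:

```python
# Minimal sketch: quantize a smooth brightness ramp to 8-bit and 10-bit codes.
# The 8-bit version has far fewer distinct shades (bigger steps = visible
# banding), and "upconverting" 8-bit codes to 10-bit doesn't create new shades.

def quantize(value, bits):
    """Round a 0.0-1.0 brightness value to the nearest n-bit code."""
    levels = (1 << bits) - 1          # 255 for 8-bit, 1023 for 10-bit
    return round(value * levels)

ramp = [i / 999 for i in range(1000)]          # a smooth dark-to-light gradient

codes_8 = sorted(set(quantize(v, 8) for v in ramp))
codes_10 = sorted(set(quantize(v, 10) for v in ramp))
print(len(codes_8), "distinct 8-bit shades")    # 256
print(len(codes_10), "distinct 10-bit shades")  # 1000 (of 1024 possible codes)

# Rescaling the 8-bit codes to the 10-bit range just spreads out the same
# 256 shades; no in-between values appear, so the banding stays.
upconverted = set(round(c * 1023 / 255) for c in codes_8)
print(len(upconverted), "shades after an 8-bit to 10-bit upconversion")  # 256
```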

Broadcast TV codes for 8-bit color, as do DVD, Blu-ray, and HD-DVD. The potential for a transition to 10-bit color by the high-def disc formats is there, but you're still looking at a huge 8-bit infrastructure, from capture to distribution to display.

TV further skimps on bits by coding the chrominance part of the signal at a lower sampling rate than the luminance, based on the human eye's greater sensitivity to brightness than to color. Hence, instead of R-G-B, we have YCbCr efficiently coded with 4:2:2 or 4:2:0 subsampling (instead of 4:4:4, with equal samples for each component), the two chrominance components being sampled at half the rate of the luminance, or less.
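
Here's a minimal sketch of the 4:2:0 idea (Python, with a toy 4x4 chroma plane I made up): the luminance samples are left at full resolution, while each 2x2 block of pixels ends up sharing one averaged Cb and one averaged Cr value, roughly halving the raw pixel data.

```python
# Minimal sketch of 4:2:0 chroma subsampling: luminance (Y) keeps one sample
# per pixel, while Cb and Cr are averaged over 2x2 blocks, so a full frame
# carries roughly half the raw data of 4:4:4 (12 bits/pixel vs. 24 at 8-bit).

def subsample_420(cb, cr):
    """Average the two chroma planes over 2x2 blocks (even dimensions assumed)."""
    out_cb, out_cr = [], []
    for y in range(0, len(cb), 2):
        row_cb, row_cr = [], []
        for x in range(0, len(cb[0]), 2):
            block = [(y, x), (y, x + 1), (y + 1, x), (y + 1, x + 1)]
            row_cb.append(sum(cb[j][i] for j, i in block) / 4)
            row_cr.append(sum(cr[j][i] for j, i in block) / 4)
        out_cb.append(row_cb)
        out_cr.append(row_cr)
    return out_cb, out_cr

# A tiny 4x4 example: 16 luma samples would be kept, but only 4 Cb and 4 Cr survive.
cb = [[128, 130, 90, 92], [129, 131, 91, 93], [40, 42, 200, 202], [41, 43, 201, 203]]
cr = [[110] * 4 for _ in range(4)]
small_cb, small_cr = subsample_420(cb, cr)
print(small_cb)   # [[129.5, 91.5], [41.5, 201.5]]
print(small_cr)   # [[110.0, 110.0], [110.0, 110.0]]
```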

The object is to save storage space and end up with a smaller bit-rate, and studies have shown that people notice little or no difference when this is done. But with high-resolution TVs having enhanced color display capabilities, is this still the case?

How much better will displays have to become before new standards are adopted?

Color Space (or Gamut)

HDMI 1.3's seeming promise to expand the color range on our TVs to everything the human eye can see is unfortunately widely misunderstood. While it is true that the xvYCC color standard encompasses virtually all the colors we can see, our display technology is far from achieving that capability.

Rather, HDMI 1.3 adopted the xvYCC color standard because display technology will never outgrow it. HDMI 1.3 supports xvYCC; it does not make it so.

The current TV color spaces (sRGB and NTSC) are too small for the capabilities of emerging video technologies. Film and many video cameras can capture more colors than those standards allow for. As a result, the most vibrant saturated colors are typically clipped; we never get to see them. (The same thing is happening with resolution: some video cameras capture images at 1080/60p, but the signal is then processed and recorded at lesser standard formats.)

Color spaces are described by coordinates in a three-dimensional space that can be represented on a two-dimensional graph if we leave out brightness. This is called a "CIE xy chromaticity diagram." (CIE = International Commission on Illumination) The numbers around the curved part of the horseshoe are the wavelengths of the monochromatic (pure) colored light at the edges.

Because TV color has been based on three primary colors--red, green, and blue (RGB)--TV color standards are represented by a triangle, with each primary color at a corner of the triangle. The "horseshoe" shape contains all the colors we can see.
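
To make the triangle idea concrete, here's a small Python sketch (the sRGB/BT.709 primary coordinates are the published ones; the test points are just examples I picked) that checks whether a given chromaticity lands inside a display's RGB triangle; anything outside the triangle simply can't be shown by that display:

```python
# Small sketch: test whether a chromaticity point (x, y) falls inside an RGB
# display's gamut triangle on the CIE xy diagram, using the sRGB/BT.709
# primaries. A color whose chromaticity lies outside the triangle cannot be
# reproduced by the display and ends up clipped toward the nearest edge.

SRGB_PRIMARIES = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B

def cross(o, a, b):
    """z-component of the cross product of the vectors o->a and o->b."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def inside_triangle(point, tri):
    """True if the point is inside (or on an edge of) the triangle."""
    signs = [cross(tri[i], tri[(i + 1) % 3], point) for i in range(3)]
    return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

print(inside_triangle((0.3127, 0.3290), SRGB_PRIMARIES))  # D65 white point: True
print(inside_triangle((0.170, 0.700), SRGB_PRIMARIES))    # a deep green: False
```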

The xvYCC color space pretty much fills the entire horseshoe. That area covers many more colors than are inside the NTSC triangle (the sRGB/BT.709/HDTV color standard is almost the same as the current 1979 NTSC color space). xvYCC also covers many more colors than appear in nature, and more colors than can be captured by film; both of these latter color spaces (nature and film) are in turn larger than the NTSC/sRGB color space.
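
As I understand the xvYCC scheme, it reuses the familiar YCbCr encoding but treats decoded R'G'B' values that fall outside the conventional 0-to-1 range as legitimate wide-gamut colors rather than as errors to be clipped. Here's a rough Python sketch of that idea (the decode constants are the standard BT.709 ones; the sample values are made up):

```python
# Rough sketch: decode Y'CbCr with the standard BT.709 equations. A conventional
# pipeline clips any result outside 0.0-1.0 back into the BT.709 triangle;
# in xvYCC, as I understand it, those out-of-range values are kept and carry
# the wider-than-BT.709 colors.

def ycbcr_to_rgb_709(y, cb, cr):
    """Decode non-linear Y'CbCr (Y in 0..1, Cb and Cr in -0.5..+0.5)."""
    r = y + 1.5748 * cr
    g = y - 0.1873 * cb - 0.4681 * cr
    b = y + 1.8556 * cb
    return r, g, b

def clip(rgb):
    """What a conventional BT.709 display pipeline does with out-of-range values."""
    return tuple(min(1.0, max(0.0, c)) for c in rgb)

# A strongly saturated (made-up) color: R decodes above 1.0 and B below 0.0.
sample = ycbcr_to_rgb_709(0.5, -0.35, 0.45)
print("decoded:", tuple(round(c, 3) for c in sample))        # (1.209, 0.355, -0.149)
print("clipped:", tuple(round(c, 3) for c in clip(sample)))  # (1.0, 0.355, 0.0)
```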

Many televisions cannot display even all the colors within the NTSC triangle. TV displays that use the same type of emissive phosphors that have been used in CRT TVs for 50 years are only capable of displaying about 75% of the NTSC colors. Besides CRT TVs, plasma and SED flat panels are in this category. (SED=Surface-conduction Electron-emitter Display)

Older-technology LCD flat-panels are capable of displaying from 72% to 82% of NTSC colors. The most advanced LCD TVs are capable of displaying up to about 110% of the NTSC color gamut. More common advanced technologies yield around 90% to 94% NTSC color.

[Please note that there are two NTSC color spaces. The older one (FCC 1953) has its green primary further out, and therefore that color space is larger than the more modern NTSC color space, but it is no longer in general use. It is not clear to me, however, which of these NTSC color spaces is being referenced with these percentage comparisons. Nevertheless, the percentages are useful as a relative comparison between technologies.]
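
To show why the ambiguity matters, here's a rough Python sketch comparing gamut-triangle areas on the xy diagram, using the published primary chromaticities for 1953 NTSC, SMPTE "C" (the modern NTSC set), and BT.709/sRGB. Treat the numbers as illustrative only; published "% of NTSC" figures are often computed differently (for example, in the CIE u'v' diagram).

```python
# Rough sketch: the same BT.709/sRGB gamut scores very differently as a
# "% of NTSC" depending on which NTSC triangle you divide by.

def xy_area(primaries):
    """Shoelace formula for the area of a gamut triangle on the CIE xy diagram."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

NTSC_1953 = [(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)]        # FCC 1953 primaries
SMPTE_C = [(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)]    # "modern NTSC" set
BT_709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]     # sRGB / HDTV

for name, gamut in [("1953 NTSC", NTSC_1953), ("SMPTE C", SMPTE_C)]:
    print(f"BT.709 area as % of {name}: {xy_area(BT_709) / xy_area(gamut):.0%}")
    # prints roughly 71% against 1953 NTSC, roughly 108% against SMPTE C
```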

One common LCD color-space enhancement uses wide-spectrum cold-cathode fluorescent lamp (CCFL) backlights. These incorporate new phosphors that produce more saturated reds and/or greens. (Example: Sharp's "four-wavelength" and "five-wavelength" backlight TVs, which I had mistakenly reported as using four or five primary color optical filters/subpixels.)

Using LED (light-emitting diode) backlights, either with three separate colors or with all-white LEDs, has yielded impressive results for LCD flat-panels, but with technical challenges. (Example: Sony's higher-grade LCDs)

Mitsubishi will introduce a DLP RPTV later this year that uses laser diodes (three different colors) to expand the gamut. Laser light is pure, single-frequency (monochromatic) light; the color is therefore fully saturated in an absolute sense. Those colors lie on the outer boundary of the horseshoe; the laser-light triangle is therefore bigger and encompasses more colors.

Organic Light Emitting Diode (OLED) TVs are capable of displaying expanded color gamuts (although they are now in the prototype stage).

Using more than three primary colors (four or five) in LCD flat-panel TVs, with each color a sub-pixel, can produce a fuller color space up to around 110% NTSC. Genoa Color Technologies has developed this approach.

Some manufacturers of conventional DLP RPTVs use more than three primary colors (up to six) in their spinning color wheel.

With four, five, or six colors to work with, a display's color space is not limited to a triangle. The extra primary-color corners can be selected to include new colors, yielding a more realistic image.

The additional primary colors can also indirectly enhance other colors within the gamut. If you must make all colors from only red, green, and blue, then the green and blue primaries especially will suffer a loss of saturation, because they must be made very bright to produce a vivid yellow, for example, and increasing a primary's brightness decreases its saturation.

If one of your added primary colors is yellow, not only will you get better yellows and oranges, but you'll also get more saturated greens and blues because you can move the green primary further away from the (white) center of the color space.
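
Here's a small Python sketch of the geometry: with a fourth primary, the gamut becomes a quadrilateral on the xy diagram rather than a triangle. The yellow coordinates below are hypothetical, chosen only so they sit outside the sRGB red-green edge; real multi-primary sets will differ.

```python
# Sketch: gamut area of a three-primary triangle vs. a four-primary polygon.
# The yellow primary here is a hypothetical point outside the sRGB R-G edge.

def polygon_area(vertices):
    """Shoelace formula; vertices must be listed in order around the polygon."""
    area = 0.0
    for i, (x1, y1) in enumerate(vertices):
        x2, y2 = vertices[(i + 1) % len(vertices)]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2

R, G, B = (0.640, 0.330), (0.300, 0.600), (0.150, 0.060)   # sRGB/BT.709 primaries
YELLOW = (0.450, 0.530)                                    # hypothetical 4th primary

rgb_area = polygon_area([R, G, B])
rgby_area = polygon_area([R, YELLOW, G, B])   # ordered around the spectral locus
print(f"Extra gamut area from the yellow primary: {rgby_area / rgb_area - 1:.1%}")
# a bit over 7% more area for this particular (made-up) yellow point
```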

[Note that the diagram colors outside the triangle look like the colors inside the triangle because our computers cannot display more saturated colors than are inside the sRGB triangle.]

There is a catch to expanding outside the NTSC/sRGB triangle: it doesn't conform to TV standards. Programming coming into the TV's video processor will be encoded for a standard color space. This is not to say that the colors that have been encoded are strictly realistic. The most advanced TVs will likely be able to display more realistic colors than what has been coded.

The producer of a DVD or broadcast show by necessity converted the original realistic colors to the most pleasing colors possible given the constraints of the NTSC color space.

I have the sense (from what I've been reading) that manufacturers of the new TVs that are capable of displaying colors outside of the NTSC triangle are taking one of two approaches.

1) Use that capability to display ALL of the NTSC colors encoded in the signal and none outside. After all, this is still a lot better than mapping the program's full NTSC color space down to the perhaps 75% that other, lesser sets are capable of displaying. Manufacturers taking this option are waiting for the content producers and broadcasters to adopt a new standard that reflects the emerging display technologies. Or . . .

2) Develop algorithms, based on the source of the programming (e.g., film or video cameras), that attempt to undo or modify the producer's original mapping into the confines of the NTSC color space, then remap those colors to the capabilities of the TV (i.e., extending outside the NTSC color space) so that they match as closely as possible the colors that were present in the original captured image.

There are no rules for this latter alternative; manufacturers must develop their own proprietary approaches. All this stuff is difficult; you won't find that sort of processing in budget-priced TVs.
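
Just to give a flavor of what such processing might look like, here's a purely hypothetical Python sketch (not any manufacturer's algorithm, and far cruder than what a real set would do): one naive expansion step pushes each pixel's R, G, and B away from its own gray level, so already-saturated colors move outside the standard gamut while near-neutral colors are barely touched.

```python
# Purely hypothetical illustration, not a real product's algorithm: stretch each
# pixel's color away from its own gray value so saturated colors are pushed
# toward (and beyond) the edge of the standard gamut.

def expand_saturation(r, g, b, strength=1.2):
    """Scale a pixel's distance from gray by `strength` (inputs in 0.0-1.0)."""
    gray = 0.2126 * r + 0.7152 * g + 0.0722 * b   # BT.709 luma weights

    def stretch(c):
        return gray + strength * (c - gray)

    return stretch(r), stretch(g), stretch(b)

# A saturated red moves noticeably; a near-gray pixel barely changes.
print(expand_saturation(0.80, 0.20, 0.15))   # pushed further from gray
print(expand_saturation(0.52, 0.50, 0.49))   # almost unchanged
```

A real implementation would be far more sophisticated, taking the source of the programming into account and keeping the results within what the panel can actually display.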

So if broadcasters adopt a new higher-fidelity, higher-bit color standard, such as xvYCC, what happens when the resulting coded content is fed into an older RGB-based TV, with its older video processor?

People in the industry are talking about this, but don't expect quick action (industry technical standards committees, etc.) before there are lots of TVs with expanded gamut and processing capabilities in people's homes. Switching standards is a big deal, and the government is not going to mandate this one.

What does any of this mean for the consumer?

It means that you can now buy a TV that displays more colors than televisions used to be capable of, that will look significantly better, and that just might cost more.

If content producers are still encoding for the old NTSC triangle, should you wait for a TV that has HDMI 1.3 inputs (that will support "Deep Color" and the xvYCC color standard)?

I suppose that depends on how future-proof you want your TV to be. You can get improvements in color space with an advanced TV that doesn't have HDMI 1.3, but if higher-bit-depth "Deep Color" and xvYCC content comes along in a few years, will you care? And would you notice a difference in the colors? Probably not. Most consumers don't even bother to adjust the brightness and contrast on their new TVs when they get them home, much less use a calibration disc to optimize the picture, let alone bring in a professional calibrator to get the best picture the TV is capable of.

At this point, I doubt that broadcasters and cable/satellite service providers are even thinking about moving to a new standard. It would be hugely expensive. Broadcasters are saddled with 6 MHz of bandwidth and MPEG-2 compression (for now) and they seem to want to compress more so they can add standard-definition subchannels to their primary HD channel, cutting bits from all of them. Cable and satellite want the bits for more channels and non-video services; their video quality tends to be lower than broadcast.

Blu-ray and HD-DVD are more likely to add the bits needed for greater color depth and range, but I have yet to see or hear anything about such plans. Substantial improvements in color are already being realized due to the much greater fidelity of these high-definition formats, even without HDMI 1.3.

The more immediate advantage of HDMI 1.3 is in its lossless audio capability. But to take advantage of that, you'll be using a separate A/V receiver and not the TV's internal speakers, so the TV's HDMI 1.3 status will not matter.

Either your Blu-ray (or HD-DVD) player or the A/V receiver has to be able to decode the Dolby TrueHD or DTS-HD Master Audio track; it does not have to be both. But both would need HDMI 1.3 interconnects if you want the receiver to do the decoding, because the losslessly compressed bitstream will only pass over an HDMI 1.3 link. Other options are available if the player does the decoding.

One thing is apparent. TV display technology is not going to plateau any time soon.

- - -

For more on multi-primary color LCD TV technology, see www.genoacolor.com.