
What would motivate you to buy a new TV receiver? 3D? Probably not. 4K resolution? Well, maybe, if you can afford an 80-inch display. High dynamic range? Once you see HDR video on an HDR display, yes, that’s the ticket.

HDR and its technology cousin wide color gamut (WCG) are advances in digital video technology that you can really see, unlike 4K (3840 x 2160 pixels) resolution. HDR video provides brighter highlights and darker blacks. Side by side with standard dynamic range (SDR), the difference is obvious. WCG delivers a wider range of colors, closer to what our eyes can actually perceive, than today’s TV displays can reproduce.

First, some technical background.

The candela is the unit of luminous intensity. The unit of luminance is candela per square meter, also known as a “nit.” The majority of TVs today have a peak luminance of perhaps 100-400 nits. The first generation of HDR displays will have a peak luminance of perhaps 1,000 nits. In the real world, for comparison, a fluorescent bulb might have a luminance of 6,000 nits, and a glint of sunlight reflecting off a shiny surface might have a luminance of 100,000 nits or more.

The dynamic range of the human visual system ranges from about 100,000:1 to 1,000,000:1, depending on the scene. Digital TV programming has been created to be consistent with the 100:1 dynamic range of CRT television displays. While flat panel displays could be built with greater dynamic range, today they are matched to the programming, which continues to be produced to align with the capabilities of CRT displays.
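To put those ratios in perspective, here is a small Python sketch that expresses dynamic range both as a contrast ratio and in photographic stops (doublings of light). The HDR black level used at the end is an illustrative assumption, not a figure from any standard.

```python
import math

def stops(contrast_ratio):
    """Express a contrast ratio as photographic stops (doublings of light)."""
    return math.log2(contrast_ratio)

# Ratios from the discussion above.
examples = {
    "CRT-era SDR programming": 100,        # ~100:1
    "Human vision (low end)":  100_000,    # ~100,000:1
    "Human vision (high end)": 1_000_000,  # ~1,000,000:1
}

for label, ratio in examples.items():
    print(f"{label}: {ratio:>9,}:1 = {stops(ratio):.1f} stops")

# A 1,000-nit first-generation HDR display with an assumed 0.05-nit black level:
hdr_ratio = 1000 / 0.05
print(f"Hypothetical HDR panel: {hdr_ratio:,.0f}:1 = {stops(hdr_ratio):.1f} stops")
```

Run as is, the sketch shows that 100:1 programming spans fewer than 7 stops, while human vision spans roughly 17 to 20.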

Color gamut deals with the differences between the range of physical pure colors (wavelengths), the range of colors that can be perceived by the human eye, and the range of colors that can be reproduced on a TV display. Human color vision response is usually described in terms of the “CIE 1931 color space,” a two-dimensional figure that shows all the colors the human eye can perceive. (The CIE is the International Commission on Illumination.)

ITU-R Recommendation BT.709 specifies the color components used in digital television today. The range of colors supported by BT.709 is far smaller than the range of colors in the CIE 1931 color space; according to one source, the BT.709 color space covers 35.9 percent of it. Today’s digital televisions are designed to reproduce the BT.709 colors, and might not have the capability to reproduce a wider gamut of colors. But there is a newer ITU-R Recommendation, BT.2020, that provides for a wider color gamut; BT.2020 is said to cover 75.8 percent of the CIE 1931 color space.
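To get a feel for how much larger the BT.2020 gamut is, here is a short sketch that computes the area of each gamut triangle in CIE 1931 xy coordinates from the primaries defined in the two Recommendations. (The coverage percentages quoted above depend on the color space and method used for the calculation, so this simple xy-area ratio won't reproduce them exactly; it just illustrates the relative size of the two triangles.)

```python
def triangle_area(points):
    """Shoelace formula for the area of a triangle given three (x, y) vertices."""
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# CIE 1931 xy chromaticities of the R, G, B primaries in each Recommendation.
bt709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
bt2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

a709, a2020 = triangle_area(bt709), triangle_area(bt2020)
print(f"BT.709 gamut area (xy):  {a709:.4f}")
print(f"BT.2020 gamut area (xy): {a2020:.4f}")
print(f"BT.2020 covers roughly {a2020 / a709:.1f} times the area of BT.709")
```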

Here’s a goal for the next few years: TVs that comply with the BT.2020 color gamut and have a peak luminance of 4,000 nits. So far as I know, there aren’t any commercially available TVs today that meet this goal. But in the near term there will be displays with wider color gamut and peak luminance of 1,000 to 2,000 nits.

So what happens when an HDR signal is delivered to an SDR display, or a WCG signal is delivered to a BT.709 display? Things get ugly. That’s why the Blu-ray Disc Association asked the Consumer Electronics Association to define a way to signal across the HDMI interface when the content is HDR. Blu-ray discs with HDR and WCG content are expected to be on the market later this year, as well as disc players and displays that are compatible. 

Consequently, CEA recently adopted the 861.3 extension to CEA 861, which provides for signaling when the content complies with two new HDR standards from the Society of Motion Picture and Television Engineers (SMPTE): 

  • SMPTE ST 2084:2014, “High Dynamic Range Electro-Optical Transfer Function of Mastering Reference Displays”
  • SMPTE ST 2086:2014, “Mastering Display Color Volume Metadata Supporting High Luminance and Wide Color Gamut Images”

These two SMPTE standards define one version of HDR and WCG, and it’s the version that Blu-ray discs will employ initially, but there are other proposals as well. So if necessary, the CEA 861.3 standard can be extended later to add signaling for other versions, such as the proposals from Dolby, Philips and Technicolor.
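For readers who want to see what ST 2084 actually specifies: it defines the “PQ” (perceptual quantizer) electro-optical transfer function, which maps a normalized code value to an absolute luminance of up to 10,000 nits. Here is a minimal sketch of that curve, using the constants published in the standard:

```python
def pq_eotf(code_value):
    """SMPTE ST 2084 (PQ) EOTF: map a normalized code value in [0, 1]
    to absolute luminance in nits (cd/m^2), with a 10,000-nit peak."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32

    e = code_value ** (1 / m2)
    return 10000 * (max(e - c1, 0) / (c2 - c3 * e)) ** (1 / m1)

for v in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"code {v:.2f} -> {pq_eotf(v):8.2f} nits")
```

Notice how the curve spends most of the code range on the dark end of the scale, where the eye is most sensitive: a code value of 0.5 comes out to only about 92 nits, leaving the upper half of the range to cover highlights all the way up to 10,000.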

Meanwhile, CableLabs is doing research into consumer preferences. Although the initial evaluation of picture resolution was done under very informal conditions, not in controlled experiments, the results were pretty clear: most consumers could tell little or no difference between up-scaled 1080p pictures and native 4K pictures. CableLabs recently concluded formal consumer preference testing of HDR, and the results indicate that a high percentage of participants preferred HDR over SDR.

One other factor: the data rate difference between 1080p pictures and 4K pictures is huge, up to a factor of four. On the other hand, going from SDR to HDR requires only 10-25 percent more data capacity. That has implications for cable network resource allocations.
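A quick back-of-the-envelope sketch of where those numbers come from, assuming for illustration that HDR is carried as 10-bit samples instead of 8-bit, with frame rate and chroma subsampling unchanged:

```python
# Pixel counts: 4K carries exactly four times the pixels of 1080p.
pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160
print(f"4K / 1080p pixel ratio: {pixels_4k / pixels_1080p:.0f}x")

# Raw sample data for 10-bit HDR versus 8-bit SDR grows by only 25 percent
# (an illustrative upper bound; actual compressed-bitrate overhead varies).
print(f"10-bit / 8-bit sample ratio: {10 / 8:.2f}x (+{(10 / 8 - 1) * 100:.0f}%)")
```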
So you can go out and buy a 4K TV today if you want. But I think the most important advances in digital TV will be coming later this year and next, with the availability of HDR and WCG capabilities and content.
