By Jeffrey Krauss, President of Telecommunications and Technology Policy

It's been about eight years since the United States adopted MPEG-2 as the video coding standard for digital television. But now a better algorithm has come along–actually, two better algorithms. Let's look at how these new technologies might work their way into video distribution and into consumer products.

In 1993, the Grand Alliance was formed by the companies that developed the four competing digital TV systems. They decided to use the MPEG-2 video coding algorithm as part of the combined Grand Alliance system, partly because MPEG-2 was part of a documented international standard. Another feature of the Grand Alliance system was a modulation method that allowed broadcasters to pack 19 Mbps into their 6 MHz channel. And it was determined that MPEG-2 coding at a 19 Mbps rate gave acceptable HDTV picture quality. Standard definition picture quality requires about 4 Mbps with MPEG-2 coding.

Fast-forward to today. Many consumers have purchased HD-ready TV displays. Some consumers have DTV tuners to watch digital broadcasts. Many don't have tuners but use the sets to display DVD movies. Many U.S. broadcasters have installed digital TV transmitters and are delivering HDTV programming, and the rest will soon follow. Both the DVDs and the DTV broadcasts use MPEG-2 video coding. Digital cable programming services use MPEG-2 coding for both HDTV and standard definition.

Counting DVD players, DTV receivers and digital cable boxes, there are around 80 million MPEG-2 decoders deployed today.

But within the last year or so, two new digital video coding methods have emerged. One is referred to as AVC, for Advanced Video Coding, and has two formal names: H.264 (its ITU-T standards designation) and MPEG-4 Part 10 (its ISO standards designation). The other is WM9V, for Windows Media 9 Video. WM9V will eventually have a formal name as well, because Microsoft has submitted it to SMPTE to become a standard.

Instead of HDTV at 19 Mbps and standard definition video at 4 Mbps, both of the new coding methods seem to provide good HDTV picture quality at around 5 Mbps, and SDTV at around 1 Mbps. A four-to-one improvement in capacity makes the engineers pay attention, and the business executives, too. The question is how to achieve these rates while still providing programming to viewers who bought legacy MPEG-2 receivers.
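The capacity arithmetic behind that four-to-one figure can be sketched from the bit rates cited above. This is a rough back-of-the-envelope calculation, not measured codec performance; the 19 Mbps channel payload and the per-program rates are the approximate figures quoted in this column:

```python
# Illustrative capacity arithmetic using the approximate bit rates cited
# in the column -- not measured codec performance.

ATSC_PAYLOAD_MBPS = 19  # payload of a 6 MHz digital broadcast channel

BITRATES_MBPS = {
    "MPEG-2":     {"HDTV": 19, "SDTV": 4},
    "AVC / WM9V": {"HDTV": 5,  "SDTV": 1},  # rates claimed for the new codecs
}

def programs_per_channel(codec: str, fmt: str) -> int:
    """Whole program streams of the given format that fit in one channel."""
    return ATSC_PAYLOAD_MBPS // BITRATES_MBPS[codec][fmt]

for codec in BITRATES_MBPS:
    print(codec,
          "- HDTV:", programs_per_channel(codec, "HDTV"),
          "SDTV:", programs_per_channel(codec, "SDTV"))
```

On these numbers, a broadcaster goes from one HDTV program (or four SDTV programs) per channel with MPEG-2 to three HDTV programs (or nineteen SDTV programs) with the new codecs, which is roughly the four-to-one gain the executives are noticing.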

The broadcast industry has decided on a first step. Broadcasters are in the process of defining an "enhanced" modulation method that lets a portion of their 19 Mbps be delivered with increased error coding, and correspondingly decreased payload capacity, to reach viewers beyond the coverage of the current signal, and perhaps viewers using portable hand-held PDAs. That enhanced signal, receivable only by viewers with "new" DTV sets, could carry programming encoded with the "new" coding method. Viewers with legacy DTV receivers would continue to receive the remainder of the 19 Mbps, which carries legacy MPEG-2 programming.
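The trade-off in that enhanced mode can be sketched numerically. The split fraction and the extra-coding rate below are hypothetical illustrations; the column says only that the robust portion gives up payload capacity in exchange for error protection:

```python
# Hypothetical split of a 19 Mbps channel between the legacy service and a
# more robust "enhanced" stream. The 25% share and the 1/2-rate extra coding
# are assumptions for illustration; the actual parameters are still being
# defined by the broadcast industry.

TOTAL_MBPS = 19.0

def split_payload(robust_share: float, robust_code_rate: float = 0.5):
    """Return (legacy_payload, robust_payload) in Mbps.

    robust_share: fraction of the channel carried with extra error coding.
    robust_code_rate: fraction of that share left for payload after the
    added coding overhead.
    """
    robust_raw = TOTAL_MBPS * robust_share
    legacy_payload = TOTAL_MBPS - robust_raw
    robust_payload = robust_raw * robust_code_rate
    return legacy_payload, robust_payload

legacy, robust = split_payload(robust_share=0.25)
print(f"legacy MPEG-2 payload: {legacy:.2f} Mbps, "
      f"robust enhanced payload: {robust:.3f} Mbps")
```

Under these assumed numbers, giving a quarter of the channel rate-1/2 extra coding leaves 14.25 Mbps for legacy MPEG-2 programming and about 2.4 Mbps of robust payload, which would still carry one or two advanced-codec SDTV streams at the roughly 1 Mbps rate cited above.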

Under the cable industry's old way of doing business, advanced video coding would have been easy to deploy because the cable operators owned all of the set-top boxes. If an MSO wanted to deploy advanced coding (or any advanced service) in a market, the MSO could simply move the legacy boxes to another market and deploy the "new technology" boxes in the roll-out market. With FCC rules allowing customer ownership of set-top boxes, and with FCC approval of the Plug-and-Play agreement between the cable and consumer electronics industries, MSOs have lost that kind of control. The cable industry has been studying the new video coding technology, but so far as I know, does not have a roadmap for deploying it.

The consumer electronics folks don't seem to have any problems with building DTV receivers that include both MPEG-2 and advanced video decoders, so long as only one advanced method is chosen. They want the broadcast industry to choose between AVC and WM9V, because building in two new decoders costs more than building in one.

But the broadcast industry doesn't really have a mechanism for making that selection. There hasn't been any comparative picture quality testing yet, for example, and some think it's needed. Before that, a decision is needed on whether the primary use of advanced video coding will be extending the coverage area of TV stations, delivering low bit rate, real-time video to next-generation hand-held PDAs, or simply increasing capacity to TV receivers within the existing coverage area. One algorithm might perform better for one purpose but not the others.

And there are still unanswered questions about licensing and royalty fees. If the picture quality of the two contenders is equal, and if their decoder circuits have equal cost and complexity, then licensing fees might be the deciding factor. But standards bodies normally make their decisions on technical criteria, not business factors. So there are still challenges, but a shift to new video coding technology will take place over the next 5 to 10 years, much as the MPEG-2 transition did. Now is the time to start planning, particularly for the cable industry.

Have a comment? Contact Jeff via e-mail at: