Local television stations are entitled to be carried on cable systems. The Communications Act defines a “local commercial television station” as one that delivers a “signal of good quality” to a cable system. In 2001, the FCC adopted a technical specification for a good quality ATSC 1.0 digital TV signal. It was a simple and uncontroversial exercise. That won’t be the case for ATSC 3.0.

In the days of analog television broadcasting, a good quality signal was specified as –45 dBm for UHF signals (off-air channels 14 and above) and –49 dBm for VHF signals (off-air channels 2-13). If the TV station did not deliver a signal of that level to the cable headend, it was not entitled to be carried on the cable system. 

In the very early days of cable, of course, cable operators erected tower-mounted antennas on mountaintops to bring in distant signals that might have lower signal strength. But as more TV stations went on the air, circumstances changed. To mitigate the capacity squeeze on smaller cable systems, cable operators only had to carry channels that delivered a good quality signal.

In 1998, well into the transition to digital television, the FCC tackled a number of questions dealing with cable carriage of digital television signals. One of them was “whether new parameters for good signal quality should be established.” But, the FCC said in its 2001 decision, “no commenters addressed this issue.” So the FCC did its own calculation, and came up with –61 dBm as the required signal level.

That value was based on the single “operating point” that the industry had chosen, namely a signal-to-noise threshold of 15.2 dB and a channel capacity of 19.39 Mbps in a 6 MHz channel, using a modulation known as 8VSB. The signal-to-noise threshold determines how robust the channel is, and hence its coverage area.

The choice could have been different. A lower signal-to-noise threshold would produce a more robust signal and a greater coverage area at the same transmitter power, but at a lower data capacity.
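As a rough illustration of that tradeoff, a short calculation (my sketch, not part of the FCC’s or ATSC’s actual derivation) shows the Shannon capacity ceiling of a 6 MHz channel at the signal-to-noise levels discussed in this article:

```python
import math

def shannon_capacity_mbps(bandwidth_hz, snr_db):
    """Shannon channel capacity C = B * log2(1 + SNR), in Mbps."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear) / 1e6

B = 6e6  # 6 MHz broadcast channel
for snr_db in (10, 15.2, 20):
    mbps = shannon_capacity_mbps(B, snr_db)
    print(f"{snr_db:>5} dB SNR -> {mbps:.1f} Mbps theoretical ceiling")
```

At 15.2 dB the theoretical ceiling is about 30.6 Mbps: ATSC 1.0’s 19.39 Mbps sits well below it, while ATSC 3.0’s roughly 25 Mbps at the same threshold comes much closer, which is what “more efficient modulation and error correction” buys.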

So there were possible tradeoffs, but the broadcast industry chose a single operating point for ATSC 1.0. Not so for ATSC 3.0.

ATSC 3.0 will use modulation and error correction techniques that are both more efficient and more robust than ATSC 1.0’s. And rather than picking a single operating point, it will allow broadcasters to make individualized tradeoffs among signal-to-noise threshold, data rate, and coverage area.

So, for example, if a broadcaster wanted to retain the same coverage area as today, based on the 15.2 dB signal-to-noise threshold, he could achieve a data rate of about 25 Mbps. Since the signal-to-noise threshold is the same as today, the FCC’s calculation would produce the same –61 dBm as the required “good quality” signal level.

But if the broadcaster wanted to retain the same 19.39 Mbps capacity, he could operate at a signal-to-noise threshold of about 10 dB, which would produce a larger coverage area. In that case, the required signal level for a good quality signal would be about –66 dBm.

Or the broadcaster could decide to deliver 30 Mbps, which would require a signal-to-noise threshold of 20 dB, and the result would be a smaller coverage area than today, maybe so small that the cable headend is too far away to receive the signal.
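The pattern in these three examples can be sketched as a simple noise-floor calculation. The noise floor here is an assumption backed out from the article’s own numbers (–61 dBm required at a 15.2 dB threshold implies roughly –76.2 dBm); the FCC’s actual derivation may bundle in receiver noise figure and other terms:

```python
# Noise floor implied by the figures above: -61 dBm required at a
# 15.2 dB signal-to-noise threshold (an assumption, not the FCC's
# published derivation).
NOISE_FLOOR_DBM = -61.0 - 15.2  # about -76.2 dBm in a 6 MHz channel

def required_signal_dbm(snr_threshold_db):
    """Minimum received level needed to clear the chosen SNR threshold."""
    return NOISE_FLOOR_DBM + snr_threshold_db

scenarios = [(15.2, "same coverage as today, ~25 Mbps"),
             (10.0, "keep 19.39 Mbps, larger coverage"),
             (20.0, "30 Mbps, smaller coverage")]
for snr_db, note in scenarios:
    print(f"{snr_db:>5} dB threshold -> {required_signal_dbm(snr_db):6.1f} dBm  ({note})")
```

The first two rows reproduce the article’s –61 dBm and (about) –66 dBm figures; by the same arithmetic, the 20 dB case would push the required “good quality” level up to roughly –56 dBm.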

So far, that’s pretty straightforward. But ATSC 3.0 has another feature, the notion of Physical Layer Pipes, which adds complication. The 6 MHz channel can be divided into subchannels with different levels of robustness. The content can be separated into elements that travel in the more robust pipe (larger coverage area) and other elements that are carried in the less robust pipe.

So, for example, the signal can consist of an HD picture in the more robust pipe (larger coverage area) and an “enhancement” signal that, when added to the HD picture, produces a 4K UHD picture. But the 4K UHD quality is only available to viewers close to the transmitter (smaller coverage area).

If the cable headend is located near the edge of coverage, and can only receive the HD resolution picture, is that “good enough” signal quality? In that case, cable subscribers throughout the cable system would be deprived of the signal quality that is delivered to the close-in off-air TV viewers.

But that’s only one of the tradeoffs that ATSC 3.0 provides. Some broadcasters want to deliver video to handheld receivers like cell phones. That requires a more robust signal because those receivers have tiny antennas compared to fixed receivers in homes, with a signal-to-noise threshold perhaps in the 4 to 8 dB range. But maybe that signal is lower resolution than HD, since the displays are smaller. Is that a “signal of good quality”?

Anyway, ATSC 3.0 will offer a lot more flexibility than today’s ATSC 1.0 technology. But with that flexibility comes a whole raft of new cable carriage questions, including what constitutes a “signal of good quality.”