Like the proverbial freight train that's thundering down the mountain, cable TV's rush into the digital era has begun in earnest. Led by Tele-Communications Inc.'s deployment of hundreds of digital headends and hundreds of thousands of digital set-tops, cable installers and technicians are setting up and troubleshooting digital equipment every day.
While this day was inevitable, it's still a bit scary for some, especially those who have long histories of dealing with analog video signals only. In public, the word is that digital signals work just fine over a good cable network. But to prove that, affordable test gear has to be developed and new testing methods have to be deployed.
Already, new terms have to be learned when measuring the signals in a digital network. Cable troubleshooters will need to learn and understand modulation error ratio (MER), bit error rate (BER), estimated noise margin (ENM) and even the old desired-to-undesired signal ratio (D/U) in order to quantify their system's performance. Test gear that employs constellation diagrams and other pertinent digital information will become as important as today's analog signal level meter.
The following story illustrates just how much there is to learn when it comes to making the transition to digital signal delivery. The article shows how bit error rate testing can be used to provide the performance data a cable technician will need as he tests his system. But it also points out that, so far, the test methods are impractical in the real world and acknowledges that equipment vendors must begin providing useful BER data to customers as well as provide practical test methods so that field alignment and testing can be performed.
The proliferation of Internet Service Providers (ISPs) has gained the interest of cable TV system operators who can provide cable modems, and of companies that manufacture turnkey cable data platforms. Bolstered by the hundreds of thousands of miles of network plant already in place, these enterprises see themselves as viable alternatives to existing ISPs and telephone providers. To provide the service that will engender growth in their customer bases, they must quantify just how well their return paths can perform. The author evaluates two measurement techniques: 6-T channel analog distortion tests, and Bit Error Rate (BER) tests performed on a single return amplifier. He then compares the data gathered from the two test methods.
We've heard much about the current information revolution: droves of data consumers rushing onto the Internet in ever-increasing numbers. The hunger for instant information is, in turn, creating keen demand for technology that provides access to it, and the faster, the better. MSOs, armed with the speed and reliability of cable modems and the miles of plant already in place, stand poised to wrench considerable market share from the grips of Internet access providers. To ensure a loyal and growing customer base, however, MSOs need dependable methods to predict the performance of their return path network.
Accuracy is crucial. We evaluated two common measurement techniques, the 6-T channel analog distortion test and the Bit Error Rate (BER) test, by applying them to a single return amplifier.

The 6-T channel analog test
The 6-T channel analog test involves supplying a device under test (DUT) with a specific analog channel input. By specifying parameters (for example, input level, gain, output level and interstage equalization), we can measure carrier-to-noise ratio (CNR), composite second order (CSO), composite triple beat (CTB), and cross-modulation (X-MOD). These tests are recognized as the industry-wide standard for several reasons:
- We well understand the relationships between changes to input levels and corresponding effects on distortion products;
- These tests require a minimum of equipment;
- They can be completed in a short time period; and
- They can be performed on an existing and operational cable TV plant.
In all our analog distortion tests, we've assumed that the return system uses the 5–42 MHz bandwidth.

Analog testing: single amp
Equipment specification sheets define amplifier performance at specific input levels with a certain number of channels. Essentially, we have a very static measurement with several points, all valid for a single input condition.
For qualification testing of return path devices, MSOs typically require that the device be set up to provide a specific gain and output level. T-channels are then supplied to the input of the device at a specific level per channel, and the distortion products are measured at the output of the device. Once the single set of distortion products is tabulated, the measurement is complete.
In reality, however, levels are dynamic and vary over time. A more complete characterization would measure the CNR, CSO and CTB performance for a range of different channel levels. Consider the performance of a single return amplifier tested over a range of input levels.
Figure 1 shows distortion products at the amplifier output over a range of input levels. The plot is not specific to either CSO or CTB. The active region is narrow, spanning only 20 dB. The return amplifier's thermal noise floor limits measured distortions for low-level inputs. At input levels greater than +50 dBmV/ch., the return amplifier is well outside its linear performance region. Modifying the test setup could increase this plot's active region.
The plot in Figure 2 demonstrates the measured CNR as a function of input level per channel. At low-level inputs, the return amplifier's thermal noise floor again limits the measured CNR. At input levels greater than 30 dBmV/ch, we are limited by the dynamic range of the spectrum analyzer. This plot gives a somewhat wider range over which performance can be measured, but the question remains: how does this relate to the device's actual performance when loaded with digital channels? In our tests, we haven't been able to predict a return system's BER performance using measured CSO, CTB and CNR performance.

Bit Error Rate (BER) testing
BER characterizes the limitations of active devices to support digital data. David R. Smith, in Digital Transmission Systems, defines BER as "the ratio of errored bits to the total transmitted bits in some measurement interval."
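Smith's definition reduces to a simple ratio. A minimal sketch in Python (the 2-second interval and the 1.544 Mb/s T-1 line rate in the example are illustrative figures, not from the article):

```python
def bit_error_rate(errored_bits, total_bits):
    """BER = errored bits / total transmitted bits over a measurement interval."""
    if total_bits <= 0:
        raise ValueError("total_bits must be positive")
    return errored_bits / total_bits

# e.g., 3 errored bits caught in a 2-second interval at the T-1 rate of 1.544 Mb/s
print(f"{bit_error_rate(3, 2 * 1_544_000):.1e}")
```

Note that the measurement interval matters: at T-1 rates, confirming a BER near 1 x 10^-9 requires observing billions of bits, i.e., intervals of many minutes.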
BER differs from traditional analog testing: the measurement is not static, and the device is tested over the entire operational range of input signals. At low-level inputs, noise figures for amplifiers and sporadic noise for lasers limit performance; at high-level inputs, compression for amplifiers and clipping for lasers restrict performance. By plotting the measured BER vs. input level, we create a plot representing the operational or dynamic range over which the device will operate error free.
In the example, all plots corresponding to measured BER performance (as illustrated in Figure 6) refer specifically to T-1 rate, 1.5 MHz-wide Quadrature Phase Shift Key (QPSK) data.
The BER is plotted as a function of the "total RF power" present at the input of a device: "total RF power" along the X axis, measured BER along the Y axis. "Total RF power" refers to the total integrated power due to the 23 QPSK channels plus any parasitic noise (presumably very low) measured at the DUT's input. Twenty-three evenly spaced QPSK channels fill the 5–42 MHz bandwidth, as shown in Figure 3. Thus, the conversion from a "power per channel" to "total RF power" is given by:
Total RF power = [10*log(23) + N] dBmV, where N = power per channel in dBmV

For example, given 23 QPSK channels at +15 dBmV each:

Total RF power = [10*log(23) + 15] dBmV = [13.6 + 15] dBmV = 28.6 dBmV
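The conversion can be checked with a few lines of Python (the function name is ours; the formula is the one given above):

```python
import math

def total_rf_power_dbmv(power_per_channel_dbmv, n_channels=23):
    """Convert per-channel power to total RF power:
    total = 10*log10(N) + power_per_channel, all in dBmV."""
    return 10 * math.log10(n_channels) + power_per_channel_dbmv

# 23 QPSK channels at +15 dBmV each
print(round(total_rf_power_dbmv(15), 1))
```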
We use this conversion to "total RF power" for several reasons. In functional systems, channels originating from different locations can have significantly different individual channel levels measured at a common point within the system like the input to a return transmitter. This makes it difficult or even impossible to calculate power on a "power per channel" basis. Another reason is ingress, for which a power "budget" can't be assigned on a per-channel basis because most ingress is random in event strength and duration.
Defining the BER operational window based on total RF power is practical because it can be measured with an RF power meter, a tool to which system operators typically have access during return system alignment.

Additional equipment requirements
A data transmission analyzer (DTA) measures the actual BER, which is calculated from detected pattern errors. An arbitrary waveform synthesizer (AWS) provides 22 simulated QPSK carriers to fully load the 5–42 MHz return. The channels have the correct peak-to-average power ratio for a QPSK signal, and every simulated channel can be demodulated into a constellation or eye diagram format using a vector signal analyzer (VSA). The only difference between the 22 simulated channels from the AWS and the single live channel at 19.3 MHz is that the simulated channels carry no data.
The channel carrying the data used in the BER test has a marker placed at the peak.
Generally, active devices will be limited for low-level inputs either by the thermal or a "colored" noise floor. Thermal noise is sometimes called "white noise" because it's random in power, frequency and phase. "Colored" or sporadic noise sometimes describes the noise floor behavior of a Fabry-Perot (FP) laser, where the noise floor can be a combination of random and impulsive noise.

Noise-side BER test
The data transmission analyzer provides clock and data to the modulator (see Figure 4). The live channel at 19.3 MHz is combined with the 22 simulated QPSK channels from the AWS, and an attenuator after the combiner sets the appropriate signal level at the DUT input. At the DUT output, a bandpass filter centered on 19.3 MHz keeps excess RF power out of the demodulator's input. The demodulator input is held at a constant level so that its AGC circuit doesn't become a variable in the measurement.
For a return amplifier powered by a 24V DC power supply, it's safe to say that this setup has a predictable BER performance for low signal input levels. When the same return amp is installed in a housing with a forward amplifier and receives its 24V DC from the 60V (or 90V) AC line power, the noise floor may not be white. Power supply switching noise, forward channels bleeding into the return, and other interactions between amplifiers and power supplies can affect the noise-side BER performance of the return amplifier.

Compression/clipping-side BER test
Devices such as RF amplifiers can experience gain compression as input increases. According to Morris Engelson's Modern Spectrum Analyzer Measurements, the 1 dB compression point is the amplitude where a circuit's output departs by 1 dB from a 1:1 relationship with the input. In an amplifier operating near its compression point, a signal level increase at the input will cause little or no increase in the output level. In an extreme case, the amplifier output level can actually drop. For QPSK data, an amplifier operating in gain compression can be tolerated to some extent.

Clipping
For devices such as optical transmitters, clipping limits BER performance at high-level inputs. The RF modulation at the laser input will exceed the device's limitations. In effect, you'll attempt either to create negative photons when biasing below threshold or create more light output than is possible from the device. This is an important distinction because within the RF amplifier cascade, you can tolerate some gain compression and still operate. Within the optical link, however, you can't operate if clipping occurs.
Figure 6 displays noise side and compression side data on the same plot to simplify interpretation of the relationship between input level and BER performance.
The dynamic range of this return amplifier, as tested with T-1 rate QPSK data, is on the order of 75 dB wide at a 1 x 10^-6 BER. For low-level inputs, we are still limited by the thermal noise floor of the device; at high-level inputs, by the intermodulation products falling under the live channel, which diminish the signal-to-noise ratio at the input to the demodulator.
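The dynamic range can be read off a BER-vs-level plot as the distance between the two points where the curve crosses the target BER. A sketch of that calculation, using made-up data points shaped like the Figure 6 curves (the levels and BER values below are illustrative, not measured):

```python
import math

# Hypothetical (total RF power in dBmV, measured BER) pairs: BER falls as the
# level rises out of the noise floor, then climbs again toward compression.
noise_side = [(-40, 1e-3), (-35, 1e-5), (-30, 1e-7)]
clip_side = [(40, 1e-7), (45, 1e-5), (50, 1e-3)]

def crossing(points, threshold=1e-6):
    """Interpolate linearly in log10(BER) to find the input level
    where the measured BER crosses the threshold."""
    for (x0, b0), (x1, b1) in zip(points, points[1:]):
        lo, hi = sorted((b0, b1))
        if lo <= threshold <= hi:
            t = (math.log10(threshold) - math.log10(b0)) / (
                math.log10(b1) - math.log10(b0))
            return x0 + t * (x1 - x0)
    return None

low = crossing(noise_side)   # level where errors drop below the target BER
high = crossing(clip_side)   # level where errors climb back above it
print(f"dynamic range = {high - low:.1f} dB")
```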
Analog testing of return components and return systems is practical for determining performance limitations under analog channel loading.
Bit Error Rate testing provides accurate representation of device performance over the entire operational range of the device, and is appropriate in lab environments and in limited field environments.
Bit Error Rate testing does require extensive specialized equipment.
Knowledge of individual device BER performance combined with solid systems design practices can be used to predict overall system BER performance.
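One rough way to combine device-level figures: if each device's errors are treated as small and statistically independent, the cascade BER is approximately the sum of the individual BERs. This is only a first-order sketch (strictly valid for regenerated digital links; in a purely analog return cascade, noise accumulates and errors occur at the demodulator, so a system-level model is needed there):

```python
def system_ber(device_bers):
    """Probability that at least one device errors a given bit, assuming
    small, independent error probabilities: 1 - product of (1 - p_i).
    For small p_i this is approximately sum(p_i)."""
    product = 1.0
    for p in device_bers:
        product *= (1.0 - p)
    return 1.0 - product

# e.g., three return-path actives each characterized at 1e-9:
print(f"{system_ber([1e-9, 1e-9, 1e-9]):.1e}")
```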
In the near future, a simplified field test procedure based on BER test data will come into use, as researched and documented in the paper entitled "Return System Validation and Verification: An Enlightening Look at Competing Techniques."