The past year saw many gains in engineers' understanding of the performance parameters of hybrid fiber/coaxial networks, but the technical cost/benefit equation that is the key to the digital future remains nearly as elusive as ever.

"The key pacing item is our ability to upgrade our networks," says David Woodrow, senior vice president for broadband services at Cox Communications. "That's a bigger issue than the availability of equipment to support new services."

Woodrow adds: "My sense is it will be 12 to 18 months before you see full-scale launch of telephone services beyond the early markets. Data could be a little ahead, but, either way, you need to have the two-way plant in place."

Putting that plant in place will require knowing a lot more than is known now about the performance requirements that market demand is likely to impose on the cable network, executives note. "We can't wait for every issue to be resolved before we make our networks ready for telephony," says Ron Cooper, executive vice president of Continental Cablevision, "but we don't want to pour huge amounts of capital into the effort without knowing more than we know now."

While results from early technical field trials in '95 lend strong support to the conviction that the HFC network is up to meeting the full-service agenda, these trials have also shown that converting today's networks to meet the required level of performance is not going to be as painless as some MSO executives had hoped. "From the two-foot perspective, as opposed to the 30,000-foot perspective we started out with, there are a lot of technical details we have to resolve which we couldn't deal with until we got to where we are now, and these things take time," says Andy Paff, president and CEO of Integration Technologies Inc., the new joint venture between Antec Corp. and Northern Telecom.

"In the initial market trial and rollout phases, there's going to be a certain amount of overkill in terms of making absolutely sure everything works," Paff adds. "But, going forward, people will figure out which solutions achieve the necessary performance levels at the lowest costs, and that will be what they do in full-scale deployment."

The network availability benchmark

No company has a more ambitious agenda than Tele-Communications Inc., which plans to thoroughly test market demand for wireline and wireless telephone services, high-speed data connections and digital video this year. But TCI, like everyone else, is searching for the right benchmarks before unleashing the upgrade juggernaut at full scale.

"Our research shows HFC networks can provide high levels of reliability," says Tony Werner, vice president of engineering for Tele-Communications Inc. "But as we add advanced services, we're going to need a higher and higher degree of reliability."

Werner, who discussed network availability requirements at the recent SCTE Emerging Technologies Conference in San Francisco, suggests industry strategists must think of digital video and multimedia entertainment services as the most daunting performance benchmarks, rather than telephony.

Noting that household use of telephone service represents a fraction of the time spent on watching TV, Werner says, "Bellcore (in Technical Advisory 000909) makes the point that a video viewer is 10 times more likely to experience an outage watching TV than a person who is on a phone call."

While telcos talk of 53 minutes of downtime per customer per year as the maximum allowable network failure rate, the full multimedia service network might need to aim for a tougher benchmark, Werner says, citing various industry sources for projections on new service usage patterns. With customer time devoted to new transactional, digital video and data services added to the seven-and-a-half hours per week spent watching today's analog TV, "the customer could have 20 times as strong a chance of experiencing an outage as someone using the telephone," he notes.

Werner says a computer model of an HFC network operating with four amplifiers in cascade beyond the node puts total annual downtime per user at 41.33 minutes, not counting the contributions of the network interface unit, the drop and the bandwidth manager. He estimates these would bring the figure to close to 47 minutes.

But Werner believes the number that will be required in the full multimedia service environment is somewhere between 20 and 30 minutes of downtime per customer per year. The key to getting there, he says, is better maintenance, faster repair times and stronger surge protection. This may be no small feat, given that Werner's estimates of projected HFC performance were based on the best mean-time-to-repair figures registered among three unnamed large systems that were the focus of his SCTE paper, co-authored with Oleh Sniezko, TCI director of transmission engineering.
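For readers who want to relate these downtime figures to the availability percentages telcos typically quote, the conversion is simple arithmetic. The sketch below uses only the round numbers cited in the article; the function name and labels are illustrative, not part of Werner's model.

```python
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def availability(downtime_min_per_year):
    """Fraction of the year the network is in service."""
    return 1 - downtime_min_per_year / MINUTES_PER_YEAR

# Figures quoted in the article (illustrative round numbers):
benchmarks = [
    ("telco 53-minute target",        53),
    ("HFC model, 4-amp cascade",      41.33),
    ("plus NIU, drop, bw manager",    47),
    ("full-service goal (midpoint)",  25),
]

for label, mins in benchmarks:
    print(f"{label}: {mins} min/yr -> {availability(mins):.5%} available")
```

The 53-minute telco benchmark works out to roughly "four nines" (99.99 percent) availability, which is why shaving the target to 20–30 minutes is a meaningful step up rather than a rounding error.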

Optoelectronic uncertainties

Beneath the overarching uncertainty about network availability requirements in the digital era are a host of uncertainties about virtually every facet of network design, from the optical electronics array, to the performance requirements of amplifiers, to the reliability of drops, to the barriers imposed by customer premises wiring. Where optoelectronics is concerned, the questions apply to signals in both directions. In the downstream, the biggest issue right now appears to be whether the industry should continue depending on the splitting of wideband distributed feedback (DFB) lasers over three paths in the 50–550 MHz range, while relying on dedicated, low-cost lasers to add 200 MHz of digital signals for telephony, data and video on a per-node basis.

"The costs of DFBs are falling to the point where it's possible to think of using one laser per node, incorporating the analog and digital tiers on a dedicated basis," says Larry Stark, marketing vice president for Ortel Corp., a leading manufacturer of DFBs for HFC applications. "But there are a lot of unknowns that have to be factored in before you can make the decision."

One of the unknowns is the slope of the cost curve into the future. Right now, Paff notes, costs favor the three-way split/dedicated laser combination, given that wideband DFBs cost in excess of $7,000. On a per-node basis, combining one-third the cost of an $8,000 wideband DFB with the $3,000 cost of a dedicated 200 MHz DFB and $2,500 in transceiver and combining costs brings the optoelectronics price tag to about $8,000 per node. The link cost for a single laser and transceiver would be roughly the same if the DFB price falls to $6,000 for a lower-power unit covering the full 50–750 MHz spectrum, he says.
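Paff's arithmetic can be checked directly. The sketch below uses the round dollar figures he cites as default parameters (assumptions for illustration, not vendor quotes), and shows why the two architectures land within a few hundred dollars of each other once the full-band DFB drops to $6,000.

```python
def split_dfb_cost(wideband_dfb=8_000, split_ways=3,
                   dedicated_200mhz=3_000, transceiver_and_combining=2_500):
    """Per-node cost: share of a three-way-split wideband DFB
    plus a dedicated 200 MHz digital laser and combining gear."""
    return wideband_dfb / split_ways + dedicated_200mhz + transceiver_and_combining

def single_dfb_cost(full_band_dfb=6_000, transceiver=2_500):
    """Per-node cost: one lower-power 50-750 MHz DFB dedicated to the node."""
    return full_band_dfb + transceiver

print(f"split/dedicated combo: ${split_dfb_cost():,.0f} per node")
print(f"single full-band DFB:  ${single_dfb_cost():,.0f} per node")
```

At the quoted prices the combination comes to about $8,167 per node against $8,500 for the dedicated full-band laser, which is the rough parity Paff describes.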

A slightly different way of looking at the tradeoffs is reflected in a new line of optical gear now in shipment from Scientific-Atlanta. The key to reaching parity with standard approaches is recognizing the new fiber penetration target is 500 homes, as opposed to 1,500 or more, which means a laser split three ways that once served 6,000 customers would only be serving 1,500 customers in the upgraded environment.

"Today, you have transmitters serving two or more links reaching serving areas as large as 2,000 homes," says Andy Meyer, director of marketing for transmission systems at S-A. "As the industry extends fiber to the 500-home serving area, people are looking for cost-effective ways to narrow the number of homes served per laser to 1,500 or 2,000."

Such lasers, split over links with loss budgets ranging between 3 dB and 8 dB per link, will carry the full 50–750 MHz downstream signal load, allowing the 550–750 MHz digital segment to be allocated to telephony, data, on-demand and broadcast digital services, Meyer says. The dedicated service segments would be allocated as if they were going out to a total serving area of 1,500 or 2,000 households, and contention levels would be figured accordingly.

But cost tradeoffs aren't the only question that must be resolved as operators decide how to allocate optical budgets, Paff notes. There's also uncertainty as to what level of signal power will be required for digital video, which, owing to the demands of advanced modulation techniques, might be much higher than originally thought.

"If digital video requires carriers at -10 dB or -6 dB below analog carrier levels, rather than -15 dB as previously assumed, there will be an impact on linearity requirements for lasers and on the RF amplification hybrids as well," Paff says.

Telephony further complicates matters, Paff adds, raising a question of cost tradeoffs between redundancy on the one hand and deeper fiber penetration on the other. With relatively low fiber penetration, fully redundant electronic and optoelectronic backup is required to achieve network availability targets. Deeper fiber penetration, with anywhere from 48 to 150 customers served per node, depending on whose benchmarks are used, lowers the number of users vulnerable to an outage to the point where network availability targets can be met without adding redundant electronics.

"We need more information in all these areas," Paff says.

Drops and premises wiring

Drops, too, raise unanswered questions. Right now, Werner notes, TCI studies indicate that one-third of all outage calls are drop-related, and that 10 percent of the drop calls are "hard failures" requiring replacement.

The advent of telephony and digital services adds new pressure on today's F-connected drops, with powering to the network interface unit required in the case of telephony, and high levels of protection against impedance mismatches required in the case of advanced modulation for digital data and video. "We may be at the point where the drop system should be rethought completely," Paff says.

Another factor adding uncertainty, despite continuing efforts to better understand it, is the impact of premises wiring on digital services. Cable Television Laboratories has spent more time than anyone looking at digital signal performance over cable plant, but remains uncertain about the home wiring side, notes Claude Baggett, staff specialist for consumer interface and conditional access at CableLabs.

"Home wiring requires testing on an individual system basis," Baggett says. "If other TV sets cause problems, you might have to isolate the set getting the digital signal."

Rogers Cablesystems has begun looking at the performance of advanced modulation techniques in real-world situations, says Nick Hamilton-Piercy, senior vice president of engineering and technology at the Canadian MSO. "So far, we're getting good results with 64 QAM (quadrature amplitude modulation), 256 QAM and 16 VSB (vestigial sideband), but we're just getting underway."

Hamilton-Piercy notes that a large portion of homes in the MSO's serving areas have been newly wired or rewired within the past five years. But more testing will be necessary to confirm early results, especially since the MSO hasn't looked at the impact of second outlet terminations on the digital signals.

As research at Bellcore has demonstrated, microreflections from such unused termination points can wreak havoc on digital signals (see CED, 10/95, page 80).

Equally important, as many engineers have noted, ingress from all kinds of sources in the home can add up over the coaxial bus to create a disastrous impact on upstream signals from any given serving area.

"If we get to the ugly conclusion that we need traps to isolate in-home wiring, we might have to think about going to smaller node sizes," Paff says.

Arguing further for deeper fiber penetration, he adds, is the possibility that, with bandwidth in the upstream at a premium, techniques such as frequency hopping that are meant to overcome noise problems might not be adequate for long-term needs.

Finding power solutions

Another area of continuing uncertainty is the means by which operators will provide power and backup power to network components, given the lifeline requirements that come with telephony. Here, too, there is an impact on the drop issue, insofar as most operators in field trials are relying on a separate twisted-pair connection from a power tap to deliver electricity to the home, rather than using the drop itself, thereby adding significant costs to the system.

"Do we want to power down that drop?" asks David Large, a principal in the consulting firm of Media Connections Group, who clearly believes the answer is no. "The answer isn't .412 (hardline coax) to the house, but we need to do something."

Both Large and Paff point to eliminating the F-connector and going to hard wiring as a possible solution. "We have to come up with a migration path from the current tap system," Paff says, noting that re-use of the existing drop cable will be possible only "if the power can be distributed down the drop center conductor."

Backup power adds further complications, given the environmental and maintenance hassles of keeping lead-acid batteries at every fiber node. Here the industry is searching for new solutions and is evincing some preliminary support for a new technology based on the principle of flywheel energy storage.

Satcon Technology Corp., a startup based in Cambridge, Mass., plans to have flywheel power units available for field trials by the fourth quarter of this year, with commercial production slated to get underway next year, says Richard Hockney, vice president and CTO of Satcon. The units, designed for underground installation at pedestal- or utility pole-mounted node distribution sites, are capable of generating one kilowatt of power for up to two hours in the event of a power outage, he says. "This is a good idea," says Dan Pike, vice president of engineering for Prime Cable. "There's a lot of interest in the technology."

Forging ahead

While MSOs want answers, they also understand they can't wait forever to push forward with the upgrades required to support full-service ambitions. While companies are hesitant to plow more money into recently upgraded systems operating at the 550 MHz, 2,000-homes-per-node level, they understand that fibering aggressively is the name of the game.

Glenn Jones, chairman and CEO of Jones Intercable, counsels an aggressive approach to expansion, noting that "we can't wait for the RBOCs to find another way into the home with high bandwidth services before we get there."

"You can't go wrong by fibering the networks," Jones says. "That's a winner no matter how it goes with regard to data and telephony."

Continental has the same view, having embarked on a company-wide upgrade to 750 MHz that is to be completed by the end of '98, with most of those systems being activated for two-way communications as they are upgraded.

"The transition to fiber-rich HFC will add the capacity to do more with pay-per-view, service multiplexing and adding new tiers to the core business, no matter where we go with data and telephony," Cooper says. "And fiber gives us the signal quality and service reliability we need to make us more competitive with satellite services."

In fact, the core business imperatives behind fibering the networks are cable's best guarantee that it will be in a position to launch voice and data services. As Cooper puts it, "Ultimately, the engine for entry into new business is cash flow from our core business."