New power strategies arise in the wake of
foul weather and lifeline VoIP services

The harsh, bitter cold of Barrow, Alaska, and South Florida's menacing hurricanes are two powerful forces fueling next-generation powering strategies and back-up systems. Those systems are now being outfitted to cope with nature's unpredictable, and at times cruel, assault on cable networks and the communities they serve.

Mix in the added power needed to serve the Yankee Group's predicted 18 million cable voice subscribers by 2009, and the pressure to cost-efficiently sustain and back up power systems 24/7 (all while satisfying 911 and emergency requirements) becomes a high-voltage issue.

And the stakes are rising. Total consumer spending in communications services—voice, cable TV, dial-up and broadband—was $115 billion in 2004, with 50 percent of it for voice services, states a report from In-Stat, a sister company to CED.

The addition of VoIP, data and IP-based services, along with the severe weather lessons learned in Barrow and South Florida, are prompting more cable operators to re-evaluate their powering and back-up power strategies and push for more reliable, cost-effective powering systems, including network-based power.

"Operators now recognize they are competing against the telcos and others, but weren't sure of the revenue opportunities of VoIP upfront. Now, with significant VoIP customer projections, it's easier for them to get ROI for network-based powering strategies," says Lindsey Schroth, senior analyst, broadband access technology for the Yankee Group.

The cost of batteries and eMTA (embedded multimedia terminal adapter) equipment, Schroth notes, is still relatively high. Yet the migration to a more expensive network-based power system remains a long-term decision based on economies of scale. "It's more expensive for a network-powered base, but easier to justify when penetration levels rise for VoIP. But, we're not past the battery back-up phase yet, especially in severe weather regions, where powering is even more of a challenge."

Challenge, indeed. In 2004, hurricanes Frances and Jeanne blasted ashore on Florida's Treasure Coast north of Miami just two weeks apart, stressing Comcast Cable's network there to the max and exposing just how vulnerable networks are to foul weather—from lightning strikes and ice storms to hurricanes.

And more hurricanes are expected, with seven to nine predicted this season, according to the National Oceanic and Atmospheric Administration (NOAA).

"Power was the key from day one. We looked at each node to determine if the power was out, and from the first day, 90 percent of our plant was down and all commercial power was impacted. We had standby batteries for several hours and emergency installed generators at each plant, but with an event of this magnitude (Frances and Jeanne), it was nowhere near enough," recalls Sue Reinhold, vice president of operations for Comcast South Florida.

With no standard number of emergency batteries or generators in the powering/emergency strategy, and with fuel to operate the generators at a premium, or non-existent, Reinhold and Comcast found themselves far short of the equipment needed to quickly recover from the hurricanes' force. Repairing the damage and replacing batteries and power supplies, Reinhold says, takes a full two quarters.

"In the aftermath, we thought we had enough. We didn't. But we learned a valuable lesson, and have incorporated into our revised emergency plan a 100 percent battery backed-up system with several additional generators. We know exactly where they are and where the fuel is for generators. Our emergency plan is now creative and responsive," she explains.

Comcast has since deployed a three-pronged emergency plan that stresses preparation, internal and external contact information and updated powering strategies, and post-hurricane deployments. "There just wasn't enough continuity once the system started coming back, so the first step is to assess the damage and stay in communication," concludes Reinhold.

"Welcome To Barrow"

Adelphia Communications' West Palm Beach cable system was hit hard as well, and has since added a 275 kW trailer-housed generator that can quickly power up its remote hub sites. And by year's end, the system will have launched VoIP service along with VOD, adding to its power and back-up system requirements.

"We're pushing for monitoring all the power supplies remotely at the headends to switch and re-evaluate the power, and preparing for powering loads for VoIP. That is critical. When we can monitor real-time information, that's the key. It's a battle to justify added power, but when it goes down, like in the hurricanes, that's the payoff," says Skip Buck, system engineer for Adelphia of West Palm Beach.

All of Adelphia's generators are diesel, Buck explains, since diesel fuel is easier to obtain than propane during hurricane season, and 90 percent of its power supplies are checked at least twice a year, with the focus on hospitals, community centers and other critical services. Its master headend has two 400 kW generators.

"Each system has a power supply team, which monitors run cycle times, battery replacements and load voltages. We also do quarterly load tests, when we shut down commercial power and switch the load to generators to check them. If the generators don't start, we can run 45 minutes of UPS, which is loaded to carry about 55 percent of the powering load," adds Buck.
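Buck's UPS figures, 45 minutes of run time carrying about 55 percent of the powering load, imply a rough scaling rule. A minimal sketch, assuming run time varies inversely with load (a simplification that ignores Peukert-effect losses at heavier draw):

```python
# Scale a measured UPS runtime to a different load fraction, assuming
# runtime is inversely proportional to load. Real batteries deliver
# less than this at high draw (Peukert effect), so treat the result
# as an optimistic estimate.

def estimate_runtime(known_minutes, known_load, target_load):
    """Estimate runtime (minutes) at target_load from one known point."""
    if target_load <= 0:
        raise ValueError("load fraction must be positive")
    return known_minutes * known_load / target_load

# Buck's operating point: 45 minutes at 55 percent load.
print(estimate_runtime(45, 0.55, 0.55))  # same load: 45 minutes
print(estimate_runtime(45, 0.55, 1.0))   # full load: about 24.75 minutes
```

The same arithmetic explains why shedding non-critical load during an outage stretches standby time.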

Squeezing every minute of additional run time from battery back-up is an on-going challenge as well, Buck notes. "If we can increase run time, it gives us more time to respond and make sure the system is up and running."

The growing awareness of increased power and back-up power needs is not lost on cable's vendors. More power and battery manufacturers are devising methods of squeezing additional run times into their batteries to help deal with severe weather emergencies, while attempting to keep the costs down, which is admittedly a tricky proposition.

"Hurricanes, ice storms and bitter cold are difficult, so operators are asking us to provide more intelligence on the run time side in the design. This is not earth-shaking technology, but if we improve a battery's run time by 15 minutes, it can reduce the number of technicians and generators. But it costs about 10 percent more to get those 15 extra minutes, so we have to keep the cost down," says John Hewitt, vice president of cable sales for Alpha Technologies.
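Hewitt's trade-off, roughly 10 percent more battery cost for 15 extra minutes of run time, can be framed as a break-even count of outages. The dispatch cost and avoidance rate below are hypothetical figures for illustration, not from the article:

```python
# Break-even analysis for a battery-cost premium: how many outages
# until avoided generator dispatches / truck rolls pay for the extra
# cost? All dollar figures here are assumed, not sourced.

def breakeven_outages(battery_cost, premium_pct, dispatch_cost, avoided_per_outage):
    """Outages at which avoided dispatches cover the battery premium.

    avoided_per_outage: fraction of outages in which the extra run
    time prevents a dispatch entirely.
    """
    premium = battery_cost * premium_pct
    savings_per_outage = dispatch_cost * avoided_per_outage
    return premium / savings_per_outage

# A $100 battery (a figure cited later in the article), a 10 percent
# premium, and an assumed $50 dispatch avoided in 1 of 5 outages:
print(breakeven_outages(100, 0.10, 50, 0.2))  # about 1 outage
```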

Alpha recently added a 28-generator trailer, stocked with additional cable gear, to its power supply line.

Adds Hewitt: "There's a push to add more batteries as voice is deployed, but when a storm hits, network reliability and customer expectations are still crucial. So, status monitoring will continue to grow, and we must continue to improve at less cost and with more reliability."

More back-up power is needed as well, says Dave Hebert, vice president of operations for Supply Performance Testers Inc. "The major engineering standards called for eight hours of back-up power time and maintenance for networks. Now, cable has its rebuilds completed, but not eight hours built into the networks. They're not designed for that, so the standards are changing to monitor power supplies."

The standard, Hebert insists, must include a better managed battery program. "MSOs can get a lot of additional life from batteries—five to eight years—with a good maintenance program. With batteries costing around $100, and with a thousand services for six-battery packages, the 'swapping-out batteries' mentality is going by the wayside. And when you add logic circuitry to interface with status monitoring, more can go wrong, especially in harsh weather conditions."
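Hebert's point about managed battery programs can be put in annualized terms. A sketch using his roughly $100 battery price; the three-year swap-out baseline and the per-year maintenance figure are assumptions for illustration:

```python
# Annualized cost of one battery: purchase price spread over its
# service life, plus yearly maintenance spend. Hebert cites ~$100
# batteries and five to eight years of life under good maintenance;
# the 3-year swap-out life and $5/yr maintenance cost are assumed.

def annual_cost_per_battery(unit_cost, life_years, maint_per_year=0.0):
    return unit_cost / life_years + maint_per_year

swap_out = annual_cost_per_battery(100, 3)                   # replace-early mentality
managed = annual_cost_per_battery(100, 6, maint_per_year=5)  # maintained program
print(round(swap_out, 2), round(managed, 2))  # 33.33 21.67
```

Even with a maintenance line item, the longer service life wins, which is why the "swapping-out batteries" mentality is fading.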

And Alaska should know about harsh conditions (see sidebar below). From Juneau, with its wildly fluctuating temperatures of 50 below zero to 100 above, to frigid Barrow, GCI's 16 cable systems are spread across Alaska and routinely deal with treacherous weather, outages and battery back-up/powering issues.

"We're adding status monitoring and using Lithium-Metal-Polymer (LMP) batteries, which we expect to last 8 to 10 years. By getting a view of the network, especially with three product lines, status monitoring has helped more than anticipated, and batteries are smarter. But if they're not maintained, they'll fail," says Gary Haynes, vice president of operations for cable and entertainment for GCI.

Ice storms in Kentucky will cause power outages and batteries to fail as well. And quickly. "We're in the belt where ice storms happen (Illinois, Indiana, Kentucky, Ohio) and now include in our business plan alarm systems and status monitoring. That's been the biggest improvement in our power system," explains Jerry Knights, vice president of telephone engineering for Insight Communications.

Insight, Knights adds, is testing its VoIP network using generators and back-up battery units, which he says have performed admirably. "They have performed better than expected, giving us about 18 hours of back-up time. We change one-third of our battery base each year. Even in generator plants, we have three standby batteries to transition power until generators come up. The batteries are much better than before."
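Insight's one-third-per-year replacement amounts to a three-year rotation. A minimal cohort sketch (assigning supplies by ID modulo three is illustrative, not Insight's actual scheme):

```python
# Three-year battery rotation: partition power supplies into cohorts
# so that one-third of the base is replaced each year. The modulo
# assignment below is a stand-in for however an MSO actually groups
# its supplies.

def replacement_cohort(supply_id, cycle_years=3):
    """Year (0..cycle_years-1) in the cycle when this supply gets new batteries."""
    return supply_id % cycle_years

supply_ids = range(9)
due_this_year = [i for i in supply_ids if replacement_cohort(i) == 0]
print(due_this_year)  # [0, 3, 6]
```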

Most operators agree back-up battery systems and generators are better than ever, but are they enough to cost-effectively satisfy the growing power and back-up needs of future-proofed triple-play networks?

"Back-up power systems have definitely improved, and MSOs want heat-resistant, longer-lasting, reliable batteries, but there's only so much power you can stretch, and cost is a concern. Does it pay to carry a product that produces revenue with off-the-shelf battery back-up? That's where the economics are—for both manufacturers and MSOs," says Farah Saeed, program manager for back-up power solutions for Frost & Sullivan, a media research and analyst firm.

Moving to network power is the ultimate answer, most experts admit, but that option will only become viable once certain VoIP economies of scale are reached—about 30 percent penetration, industry observers estimate.
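The 30 percent figure reflects a fixed-versus-variable cost crossover: network powering spreads a large fixed upgrade cost across voice subscribers, while CPE battery back-up costs scale one-for-one with them. The dollar amounts below are invented purely to show the shape of the curve:

```python
# Toy crossover model for network power vs. per-subscriber battery
# back-up. All figures (upgrade cost, homes passed, battery price)
# are hypothetical; only the shape of the comparison matters.

def cost_per_sub_network(fixed_cost, homes_passed, penetration):
    """Fixed network-power cost amortized over voice subscribers."""
    subs = homes_passed * penetration
    return fixed_cost / subs

BATTERY_PACK = 90  # assumed CPE battery back-up cost per subscriber

# Hypothetical node: $25,000 of network-power upgrades over 1,000 homes.
for pen in (0.10, 0.30, 0.50):
    net = cost_per_sub_network(25_000, 1_000, pen)
    print(f"{pen:.0%}: ${net:.2f}/sub, network power cheaper: {net < BATTERY_PACK}")
```

With these assumed numbers the lines cross a little below 30 percent penetration, which is the kind of threshold the observers cite.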

"Standby power protection is being driven by VoIP and weather-related issues, so portable generators and battery back-up systems are still being sold. And we're seeing a migration to DOCSIS standby monitoring equipment," says John Precopio, senior product manager for broadband power systems at American Power Conversion Corp.
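A status-monitoring transponder ultimately feeds alarm logic like the sketch below. The thresholds and field names are illustrative; a real deployment would poll the supply's SNMP MIB over the DOCSIS return path rather than read a local dictionary:

```python
# Minimal alarm logic for a monitored plant power supply. The 60-minute
# and 87 V thresholds are assumed examples (cable plant supplies are
# typically ferroresonant units with ~90 V output), not vendor values.

def check_supply(status):
    """Return the list of conditions that should page a power-supply team."""
    alarms = []
    if status["on_battery"]:
        alarms.append("commercial power lost; running on battery")
    if status["battery_minutes_left"] < 60:
        alarms.append("under one hour of standby run time remaining")
    if status["output_volts"] < 87:
        alarms.append("output voltage low")
    return alarms

# Simulated reading from one node's supply during an outage:
reading = {"on_battery": True, "battery_minutes_left": 45, "output_volts": 89}
for alarm in check_supply(reading):
    print(alarm)  # prints the on-battery and low-run-time alarms
```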

In the meantime, the mainstays of power and back-up power systems are likely to remain batteries, generators, fuel cells and other technologies, with subtle changes in run times and output to handle the added power requirements of services such as VoIP and the demands of severe weather.

"The issue now is protecting VoIP service from lightning and other weather issues. How do you plan for battery back-up for phone service that's out for five days?" asks John Chamberlain, president of Broadband Telecommunications, a provider of VoIP lightning protection equipment.

There are few answers to that question once the fury of a hurricane or dangerous ice storm is in full force. No protection, however, isn't the answer. Concludes Schroth: "Clearly [having] no battery back-up is not an option. The key is finding the tipping point to cost-effectively manage back-up power with all the new subscribers."


Northern exposure

Barrow, Alaska, sits far above the Arctic Circle on the frigid shores of the Arctic Ocean, a three-hour flight from Anchorage.

When GCI, an MSO serving 135,000 subscribers in 16 systems across Alaska, acquired the Barrow cable system, it brought a whole new meaning to the terms maintenance and power supplies.

With winter temperatures routinely dropping to 50 below zero, just negotiating the bone-freezing streets of Barrow can be an adventure. For 82 days during the winter, the sun never rises above the horizon, and in summer, it never sets.

So it's no far stretch to say keeping the system's power on and backed-up in Alaska's most northern outpost is a chilling proposition. "We can only work the plant from June to September; then it's frozen. It's an arctic desert," says Gary Haynes, vice president of operations for cable and entertainment at GCI.

Walk-out crews need heaters to keep the hydraulics working in their vehicles, and at least eight hours of back-up power is essential to keep the system up and running, along with hardened power schemes and transformers, Haynes says.

In the unforgiving, brutal climate of Barrow, a little innovation is needed as well, Haynes adds. "During our walk-out, we saw an old oil burner on the side of a lift-van. It was belching smoke to keep the hydraulics going. They had modified old coal burners. Later, a polar bear casually strode through town. Our technician struggled to pay attention to the lines he was checking," Haynes relates with a laugh.