Subscribers merely want to feel like they’re there.
QoE is the way to get them there
There’s been a lot of talk, talk, talk about quality of experience in the last few years, but now the industry is beginning to walk the walk. Operators have been educating themselves about, and are beginning to implement, technologies and processes to provide QoE, and that is encouraging more vendors to support those efforts.
The traditional standard for video quality has been making sure video signals reliably get to customers, making sure packets get from Point A to Point Z – a process that came to be called quality of service, or QoS.
The shortcoming of that approach is that picture quality, what customers actually see on their screens, is beyond the purview of QoS. Picture quality has always mattered, but until recently, service providers could get away with merely adequate picture quality, which meant QoS was adequate for the job.
The advent of high-definition video is forcing the issue higher up the list of priorities, however. For customers who have dropped a couple grand on a plasma widescreen and are shelling out a C-note or more per month for HD, just getting a signal is no longer anywhere near good enough. Those folks are investing in video and are unlikely to put up with anything but the most fleeting of problems in picture quality.
Competition is also a factor that is rendering QoS less than adequate. Ten years ago, dissatisfied customers might churn, but if an operator so desired, it could offer those former customers inducements to churn back. Now with the market approaching saturation, coupled with the advent of sticky bundles, losing a customer is a more final, and therefore less tolerable, proposition.
|A key element of assuring quality of experience (QoE) is mapping network performance data to degradations in video quality that are actually detectable by customers. The next step is creating a scale for the severity of the degradation, which in turn can be used to set triggers for alerts when video quality deteriorates below a threshold that can be set by the service provider. Source: Mixed Signals.|
And the competitors know it and show it. Several cable companies tout their quality, and Verizon and DirecTV both do marketing based on uptime and quality.
“I can’t say it’s a large problem,” said Gregg McEntee, director of network services for Comcast California. “But even if a problem affects only 1 percent of customers, that’s a million phone calls.”
Marketing quality video is one thing. Delivering it is another. Monitoring equipment companies like IneoQuest, Mixed Signals and Symmetricom anticipated the concern, have been talking up QoE and report that most major MSOs are beginning to embrace the new ethic.
“MSOs are only beginning to get their arms around what QoE means,” said Mixed Signals CEO Eric Conley, an observation echoed by most sources for this article. “Look at the top-six major MSOs in the U.S. I’d say two of them have a handle on it. On average, the rest don’t. Everyone is still relying on suppliers to tell them what the deal is.”
Comcast is one of the two, Conley said.
Comcast’s McEntee has been working with several QoE vendors. One challenge of assuring QoE is extracting usable information from a consternating surfeit of data from the network. “We can detect a lot of things,” he explained. “But what’s meaningful? We’re still in the infancy of this.”
Evaluating volumes of data and mapping the results to what viewers can actually perceive is an exhausting process that is far, far from complete. It’s a critical need, and multiple vendors are doing it. That’s two steps forward, but there is no standard means for the process or for presenting the results (scores or grades), which is one step back.
RGB Networks uses monitoring systems to evaluate the performance of its own products, not only to make sure that its bandwidth optimization, grooming, ad insertion and other equipment works, but also to ensure that the company isn’t becoming a source of video degradation itself.
Tools from different vendors “can come up with different variables. It can all be valuable information,” said Ramin Farassat, RGB’s vice president of product marketing.
“Yes, the tools are helpful,” Farassat said, “but at the end of the day you still want to do side-by-side comparisons, done objectively, with a bunch of guys in a semi-dark room.”
Mixed Signals, for example, said it can look at video streams and identify errors that will result in specific problems, such as frozen video, tiling, black screen and audio level disruptions. The company has developed a Perceptual Video and Audio Scoring System, which provides a score that reflects subscribers’ TV-viewing experience. An operator can set a threshold for quality, and can set alerts if video quality degrades below that level.
“The key is thresholds,” McEntee said, “setting them to where you consider it an impairment.”
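Mixed Signals' actual scoring algorithm is proprietary, but the threshold mechanic McEntee describes can be sketched generically: the monitor keeps a per-channel perceptual score, and the operator sets the floor below which a score counts as an impairment and fires an alert. The 0-100 scale, channel names and alert routine here are illustrative assumptions, not the vendor's system.

```python
# Hedged sketch of operator-settable quality thresholds and alerting.
# The 0-100 perceptual scale and channel names are assumptions.

IMPAIRMENT_THRESHOLD = 70  # operator-chosen floor on a 0-100 scale

def check_channels(scores, threshold=IMPAIRMENT_THRESHOLD):
    """Return the channels whose perceptual score fell below the floor."""
    return {ch: s for ch, s in scores.items() if s < threshold}

# Latest scores from the monitoring probes (simulated):
latest_scores = {"ESPN-HD": 92, "HBO-HD": 61, "CNN": 88}
for channel, score in check_channels(latest_scores).items():
    print(f"ALERT: {channel} scored {score}, below {IMPAIRMENT_THRESHOLD}")
```

The point of keeping the threshold a parameter, rather than a constant, is exactly what McEntee describes: each operator decides for itself where "impairment" begins.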
Comcast is taking its metrics and turning around and sharing them with its network equipment suppliers – “the BigBands and Terayons,” McEntee said – so that they have actual network data they can use to tweak their systems to perform better.
Implementing QoE is starting to catch on with vendors of encoders and multiplexers, and it is likely to move to QAMs. Most other network systems simply transport video, though, so adding QoE support there might be unjustifiable.
Imagine Communications has been talking in terms of video quality from its inception. It recognized early on that video quality would become a competitive differentiator, explained President and CEO Jamie Howard.
|Although consumers are somewhat more forgiving of quality impairments in data and voice, expectations of quality are rising, especially as video is more frequently being transmitted through the data channel. Deep packet inspection is one tool that will contribute to managing quality.|
Video quality is “very complicated for service providers to manage,” Howard said. “The vendor community has responded with better encoding, leveraging human visual models, mezzanine encoding,” – encoders at content providers’ sites – “and bandwidth efficiency across the plant, as well as monitors and remote probes.”
(Source encoding is a critical concern. One of the first things Comcast realized is that it can’t improve on the quality it receives, so it’s been turning to its video sources and making sure they conform to minimum video quality standards, McEntee observed.)
Imagine has taken a systemic approach to video, anticipating that service providers will have to serve video to multiple devices – TVs, laptops, cell phones – at different quality levels appropriate for each, while considering bandwidth available for each.
Because of the complexity of managing video through different paths, the company developed a quality manager utility that allows Imagine systems to measure both inputs and outputs for quality consistency.
Howard also offered a glimpse of where it all can go next: healing. Howard said Imagine is not doing forward error correction, but instead is doing something in the MPEG layer that is analogous to FEC. He declined to offer details but promised Imagine will debut the development early next year.
Harmonic, in April, jumped into the fray with its Iris video quality monitoring and optimization software. Although the company couches Iris’ capabilities in terms of QoS, the fact that Iris can monitor source quality, examine individual channels, evaluate signal degradations and – significantly – provide an optimization score argues that it has as much to do with QoE as with QoS.
The flip side of the challenge of having enormous volumes of network data to sift through is having a paucity of data from the last mile – specifically, from customers’ homes.
The problem in subscribers’ homes is often noise. The source might be something intermittent, or might be something that the consumer introduced, perhaps even unwittingly.
The data required is often there, just sitting in the set-top box, but it’s too often inaccessible from a remote location. New set-tops with DOCSIS or DOCSIS set-top gateway (DSG) signaling or OCAP can be polled for data. McEntee said his operation is now rigging a system to poll the Motorola boxes used in Comcast’s California markets. “We’re on the precipice of change,” he said.
McEntee also said that SNMP (the Simple Network Management Protocol) is becoming a prerequisite for the equipment Comcast buys.
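What such a last-mile polling loop might look like can be sketched in outline: query each set-top for its counters, and flag the boxes whose error rates suggest a noise problem worth a truck roll. The fetch function below stands in for a real SNMP or DSG query, and the counter names and threshold are illustrative assumptions, not Comcast's system.

```python
# Sketch of a last-mile polling loop: gather per-set-top counters and
# flag boxes whose uncorrectable-error count suggests a noise problem.
# fetch() stands in for a real SNMP/DSG query; the counter names and
# the error threshold are illustrative assumptions.

def poll_boxes(box_ids, fetch, max_uncorrectables=100):
    """fetch(box_id) returns a dict of counters, e.g.
    {"snr_db": 33.1, "uncorrectables": 12}. Returns impaired boxes."""
    flagged = []
    for box_id in box_ids:
        counters = fetch(box_id)
        if counters["uncorrectables"] > max_uncorrectables:
            flagged.append((box_id, counters))
    return flagged

# Simulated counters in place of live set-top data:
fake_data = {
    "stb-001": {"snr_db": 35.0, "uncorrectables": 4},
    "stb-002": {"snr_db": 24.5, "uncorrectables": 950},
}
flagged = poll_boxes(fake_data, fake_data.get)  # flags stb-002 only
```

In a real deployment the fetch step would be an SNMP get against the box's management agent, which is precisely why McEntee says SNMP support is becoming a purchasing prerequisite.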
Another element of QoE, not involving engineering but absolutely critical, is getting the information to call centers. It’s all about customer service, McEntee said. If the network operations center detects a problem in video quality, “we get that info to the call center, so if a customer calls we can say, ‘Yep, we know about it; we’re working on it.’ Whether you can get the problem corrected before you get too many calls – that’s the $64 million question.”
QoE is heavily skewed toward video, but what about the data channel – Internet access and VoIP? There seems to be a difference of opinion on whether QoE is needed there or not, and the differences seem to hinge on perceptions of consumer behavior and expectation.
McEntee said modems and E-MTAs can be polled for plenty of data, and noted that the data channel already has error correction built in. Besides, he said, customers still have a tolerance for service degradations in broadband: “The average consumer might not even recognize it as deterioration of service, where with video, it’s smack-dab in front of you.”
“The telephone has been around for 100 years, and people have expectations of how that’s supposed to work,” said Jonathon Gordon, Allot’s director of marketing. “They’ve been watching TV, and they have expectations for how that’s supposed to work. For them, it doesn’t matter if they turn on the TV or the PC – the expectations are similar. That puts the onus on the infrastructure providers to provide that quality.”
Providing quality will depend on identifying the application or service being offered and identifying what equipment the consumer has, and perhaps also on the tier of service the consumer has subscribed to. That argues for putting a lot more intelligence in the network than is currently there, Gordon said. Everyone is going to have to either manage or support things like Skype.
Don Bowman, Sandvine’s CTO, believes consumers still have a greater tolerance for compromises in quality with PC-based services. Problems, he said, “may be compensated by the convenience.” That said, consumer expectation of quality is a moving target, he noted.
The difficulty in the U.S., said both Gordon and Bowman, is that the network neutrality debate is forcing service providers to beg off using deep packet inspection (DPI) tools, which elsewhere are being used successfully for traffic management and, profitably, to aid in service creation.
The concept of scoring or grading video quality is migrating to the data channel. Sandvine is working on developing a mean opinion score. “It’s still eluding us, but we’re working toward it,” Bowman said.
Imagine’s Howard noted that MSOs are experimenting with offering cable content to cable subscribers through the DOCSIS channel. As far as customers are concerned, they’ll be paying for that content. “If you’re working on the buffer model, and the flow of bits into playout is faster than the flow into the buffer, you get a pause, and that’s disruptive. People are going to complain if they’re paying for that.”
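Howard's buffer arithmetic is easy to make concrete: if playout drains bits faster than the network refills the buffer, the buffer eventually hits zero and playback pauses. A toy model of that race, with made-up rates and buffer size:

```python
# Toy model of Howard's point: when the playout rate exceeds the
# fill rate, the buffer drains to zero and playback stalls.
# All rates and the starting buffer level are illustrative.

def seconds_until_stall(buffer_bits, fill_bps, playout_bps):
    """Return seconds until an underrun, or None if the buffer holds."""
    drain = playout_bps - fill_bps  # net bits lost per second
    if drain <= 0:
        return None  # network keeps up; no stall
    return buffer_bits / drain

# 8 Mb buffered, stream plays out at 6 Mbps but arrives at only 4 Mbps:
stall = seconds_until_stall(8_000_000, 4_000_000, 6_000_000)
# net drain of 2 Mbps empties the buffer in 4 seconds
```

A bigger buffer only postpones the stall; as long as the fill rate trails the playout rate, the pause Howard warns about is inevitable.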
Providing video through the traditional MPEG channel, through the DOCSIS channel, plus transporting video from third parties – Hulu, YouTube, etc. – will be “challenging,” Howard said.