Intelligent broadband networks and fair share for subscribers
The challenges facing today's broadband networks are a result of technical and business decisions made early in the evolution of public data networks. There is constant contention between users and operators, between applications and networks, and between regulation and flexibility.
Today's broadband service providers are exploiting the latest application and user-aware, policy-based network management systems to ensure that every user receives their fair share of bandwidth.
In the beginning
Since the dawn of the information age, voice — and later data — has been delivered over shared networks. The public switched telephone network (PSTN) was built with more access points (phones) than actual switching capacity. Operators designed their networks around realistic peak usage and developed queueing mathematics (e.g., the Erlang distribution) to help them model those peak times.
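That queueing mathematics is compact enough to reproduce. The sketch below computes Erlang's blocking formula (Erlang B) — the probability that a new call finds all circuits busy — using the standard recurrence B(0) = 1, B(n) = aB(n-1) / (n + aB(n-1)); the function name and example figures are illustrative, not from the article:

```python
def erlang_b(offered_load, circuits):
    """Probability that a call is blocked on `circuits` trunks carrying
    `offered_load` erlangs of traffic, via the Erlang B recurrence."""
    b = 1.0  # B(0) = 1: with zero circuits, every call is blocked
    for n in range(1, circuits + 1):
        b = offered_load * b / (n + offered_load * b)
    return b

# 10 erlangs of offered traffic on 15 circuits blocks roughly 3.7% of calls.
print(erlang_b(10.0, 15))
```

Dimensioning a switch then becomes a matter of adding circuits until the blocking probability drops below a target grade of service.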
The PSTN was managed with call admission control - a call was not admitted to the network unless end-to-end capacity existed to handle it. This was a perfectly acceptable management model for voice circuit-switched calls, but it would not hold up in a modern network with today's voice and data demands.
Data access emerged
Data access brought a new complexity to the network. Access to data or network resources was no longer determined by slow, human-driven needs such as making a phone call. The destinations for voice and data grew exponentially, and so did the number of paths.
Managing this data network became even more complex when applications converged onto one network. Networks experienced the stress of the triple play: converged voice, data and video delivered over one pipe to a vast number of destinations.
The emergence of new applications, each with unique quality and timeliness guarantees, added another dimension and complexity to the network. An admission control model was no longer sufficient. The end-to-end network changed dynamically with mobile IP and other applications competing for the same limited bandwidth.
Traffic management was born
In the earliest days of consumer data access, traffic optimization was given little priority. Congestion was straightforward, limited by the number of ports and the switching capacity of the PSTN.
As dial-up access matured, subscribers occupied PSTN lines with long-lived modem sessions, causing contention in dial-up access equipment and phone switches. Content started to move from proprietary forums to the Web, and broadband emerged as a new means to increase capacity and lower cost for providers.
The introduction of cable and DSL broadband access meant that Internet access grew more mainstream and users developed an always-on behavior. Internet use followed a largely client-server paradigm: users consumed content produced by a small set of publishers. The term user-generated content did not yet exist; what little there was amounted largely to e-mail.
Soon enough, tech-savvy users with fast computer hardware discovered how to make digital copies of music and learned to 'rip' music from compact discs. Music sharing then entered the digital age with new peer-to-peer (P2P) technology that powered free music-sharing services like Napster. As Napster and other file-sharing networks such as Gnutella, Kazaa and WinMX emerged and grew in popularity, bandwidth rates per subscriber soared.
Broadband service providers added access capacity as fast as possible to meet subscriber growth and implemented access limits on the TCP port numbers used by these bandwidth-intensive applications.
Millions of new subscribers around the world connected to the Internet. Drawn by popular applications such as P2P file sharing, voice-over-IP (VoIP) services like Skype, online gaming and digital media such as YouTube, bandwidth consumption soared.
With a growing number of applications, each with its own unique characteristics and delivery demands competing for available bandwidth, packets were easily dropped and quality of service (QoS) suffered. As a result, a very small number of users could cause quality problems for a wide range of popular applications.
With quality of service being threatened, service providers invested in intelligent network tools to get a better understanding of subscriber and application traffic on their networks. Network intelligence was the first step in balancing the competing network demands and establishing reasonable network management practices.
Models of traffic optimization
As broadband adoption continued to grow worldwide, service providers started to leverage their policy-management infrastructure to improve operational efficiencies in areas such as network security. Traffic optimization remained relatively static as long as the service provider left sufficient capacity for consumers to access the content they wanted.
However, the rise of mobile data also meant increasingly expensive and scarce access resources were being shared by unknown and varying numbers of users. In response, service providers added user-based management technologies to ensure fairness and provide a consistent quality of experience (QoE) for all users.
Models currently in use
Application-based: this model uses the properties of each network protocol to provide the minimum bandwidth that guarantees acceptable quality. Bulk file-transfer applications are given the lowest priority since they are typically non-interactive and long-lived. For example, a one-way, non-interactive bulk application, such as a file download, receives the lowest priority; one-way streaming media, such as YouTube, may come next; and an interactive application such as VoIP receives the highest priority. As the network becomes heavily congested, this prioritization matters: without it, every application degrades indiscriminately. Application-based optimization delivers excellent overall quality and subscriber satisfaction.
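The tiered ordering described above can be sketched as a strict-priority scheduler; the class name, tier labels and packet strings below are illustrative assumptions, not a vendor implementation:

```python
from collections import deque

# Hypothetical priority tiers, highest first, mirroring the text's ordering:
# interactive (VoIP) > streaming media > bulk transfer.
PRIORITY = {"voip": 0, "streaming": 1, "bulk": 2}

class PriorityScheduler:
    """Strict-priority dequeue: under congestion, bulk transfers yield
    to streaming, which yields to interactive traffic such as VoIP."""
    def __init__(self):
        self.queues = [deque() for _ in PRIORITY]

    def enqueue(self, app_class, packet):
        self.queues[PRIORITY[app_class]].append(packet)

    def dequeue(self):
        # Serve the highest-priority non-empty queue first.
        for q in self.queues:
            if q:
                return q.popleft()
        return None

sched = PriorityScheduler()
sched.enqueue("bulk", "p2p-chunk")
sched.enqueue("voip", "rtp-frame")
print(sched.dequeue())  # the VoIP packet leaves first
```

Real equipment typically uses weighted rather than strict priority so that bulk traffic is never starved entirely, but the ordering principle is the same.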
User-based: this model allocates bandwidth per subscriber, measured over relatively short time periods. It gives the service provider a strong tool for ensuring consistent quality on an individual-subscriber basis. However, a strictly user-based model can be unfair to the heaviest users, as their traffic is treated indiscriminately regardless of the application they are using. A better solution combines the application- and user-based models, allowing users to maintain their overall bandwidth behavior while controlling which of their applications are affected during periods of congestion.
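The per-user fairness underlying this model is often formalized as max-min fair sharing: capacity is split evenly, and whatever a light user leaves unused is redistributed among heavier users. A minimal sketch, with hypothetical subscriber demands:

```python
def max_min_share(capacity, demands):
    """Max-min fair allocation of `capacity` across users whose traffic
    demands are given in `demands` (same units, e.g. Mbit/s)."""
    alloc = {u: 0.0 for u in demands}
    remaining = dict(demands)
    cap = float(capacity)
    while remaining and cap > 1e-9:
        share = cap / len(remaining)  # equal split among unsatisfied users
        for user, demand in list(remaining.items()):
            grant = min(share, demand)
            alloc[user] += grant
            cap -= grant
            if grant >= demand - 1e-9:
                del remaining[user]       # fully satisfied: frees capacity
            else:
                remaining[user] = demand - grant
    return alloc

# A light user keeps their 10; the heavy users split the rest evenly.
print(max_min_share(100, {"alice": 10, "bob": 200, "carol": 200}))
```

Note how the light user is never penalized for the heavy users' demand — that is the fairness property the text describes.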
Application- and user-based:
In this method, control over bandwidth is shared between the service provider and the end-user. The provider enforces a fair user-to-user allocation, and the end-user controls how their own traffic is prioritized within that allocation. For example, one user may wish to rank their VPN access above their HTTP traffic, while another may make online gaming their top priority. During periods of network congestion, the application- and user-based model ensures that one end-user's prioritized application does not overly impact another's.
This traffic optimization model would increase subscriber satisfaction by offering a personalized service that gives end-users more control over their own priorities. It might be implemented as a 'quota' of QoS points, or as a Web page on which the subscriber assigns specific weightings per application or per application class. No change in billing plans is needed to operate this service, which makes it very feasible given today's technology and consumers' familiarity with such controls. This model is optimal because it provides a network-neutral, consumer-transparent sharing of bandwidth.
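A rough sketch of how a subscriber's configured weightings might be applied within their enforced fair share — the function name, weight scheme and the reading of 'QoS points' as relative weights are illustrative assumptions, not a described implementation:

```python
def apportion_user_share(user_share_kbps, app_weights, active_apps):
    """Split one subscriber's fair-share allocation across their active
    applications in proportion to the weights they configured.
    Apps without an explicit weight default to 1; inactive apps
    contribute no weight, so their share flows to active traffic."""
    weights = {app: app_weights.get(app, 1) for app in active_apps}
    total = sum(weights.values())
    return {app: user_share_kbps * w / total for app, w in weights.items()}

# One user prioritizes VPN over Web browsing, 3-to-1.
print(apportion_user_share(2000, {"vpn": 3, "http": 1}, ["vpn", "http"]))
# {'vpn': 1500.0, 'http': 500.0}
```

Because the split happens inside each subscriber's own allocation, one user's choices cannot degrade a neighbor's service — which is the neutrality property claimed above.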
Back to the future
Internet traffic optimization has come a long way from the early days of dial-up access, both in terms of demand and complexity. End-user controls provide an enforced inter-user fairness that gives subscribers the ability to prioritize their applications as they see fit - effectively removing any bias that the service provider may impose upon applications.
Once traffic optimization reaches the stage where the needs of the end-user and the needs of the operator are effectively balanced, it will evolve once again. The new model may resemble an economic free market, ensuring fairness by aligning every party's interests. Transparency will be the overriding factor in determining the best possible network solution wherever quality of experience is concerned.
Email Don Bowman at: email@example.com