Network Neutrality – Overgovernance in the Digital Age
The Internet today is the backbone of American and world culture and economics. There is almost no limit to the content available today. Any person with an idea and access to the Internet can share that idea with the world more quickly than at any other time in human history. But the idea that this “network of networks” is open to everyone is under attack. There is a heated debate raging today under the title of network neutrality. Unfortunately, the debate is clouded by misconceptions about the factors that drive decisions in each of those interconnected networks.
Network neutrality should be about how the last mile of each network, the part between the customer and the service provider, is used and understood, and not about hypothetical plans by corporate Internet service providers. Understanding the networks, what governs their use and the goals of the service providers will lead to a better understanding of the debate.
An important prerequisite for this debate is the definition of neutrality. Neutral is defined as “not causing or reflecting a change in something.” From a network management perspective, neutrality would be complete indiscrimination in the routing of IP packets, with no regard to the source, destination, nature of the packet or the application for which the packet is intended.
In practice, neutrality has almost never been a function of network management. Discrimination has been exercised with respect to the type of traffic that the network owner wished to allow across varying parts of the network. The motivation for this discrimination is two-fold: to keep the network running efficiently and to protect the network from intrusion or attack.
The discriminatory management of networks dates back to the first publicized instances of “hacking.” A good example of ethical discrimination is a network prioritizing voice traffic over traffic for a Web page so that the phone call does not suffer voice quality problems. That the general public believes the network is entirely neutral is a testament to how diligently administrators have worked to discriminate ethically.
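The voice-over-Web prioritization described above can be sketched as a simple priority scheduler. The class names, DSCP values, and payloads here are illustrative assumptions, not any vendor's implementation; real routers classify on header fields such as the DiffServ code point (EF = 46 marks expedited, real-time traffic).

```python
import heapq
import itertools

EF = 46          # Expedited Forwarding: real-time voice (RFC 3246 DSCP value)
BEST_EFFORT = 0  # default class: web pages, e-mail

class PriorityScheduler:
    """Toy two-class scheduler: voice packets are served before best-effort."""

    def __init__(self):
        self._queue = []
        self._counter = itertools.count()  # preserves FIFO order within a class

    def enqueue(self, dscp, payload):
        # Lower rank is served first; EF outranks best-effort.
        rank = 0 if dscp == EF else 1
        heapq.heappush(self._queue, (rank, next(self._counter), payload))

    def dequeue(self):
        return heapq.heappop(self._queue)[2]

sched = PriorityScheduler()
sched.enqueue(BEST_EFFORT, "web page fragment")
sched.enqueue(EF, "voice sample")
sched.enqueue(BEST_EFFORT, "e-mail chunk")

# The voice packet is served first even though it arrived second.
print(sched.dequeue())  # voice sample
```

Within a class the scheduler stays first-in, first-out, so the “discrimination” is only between traffic types, exactly the ethical kind described above.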
Advocates of network neutrality have a variety of goals that fall under the umbrella of an “open and free Internet.” The first is that all traffic in all circumstances should be treated exactly the same, with no regard for the source, destination, nature of the traffic or intended application. The second is that there exists a duopoly in the ISP industry. The advocates further maintain that the cable and telephone companies own the vast majority of the infrastructure that allows content providers and Internet users to connect to the Internet. Consequently, network neutrality advocates maintain that consumer choice is severely restricted and puts the cable and telephone companies in an unfair position of advantage.
Thirdly, those who promote network neutrality contend that, as a result of the duopoly, ISPs will use anti-competitive tactics and manipulate Internet traffic in order to maximize profits at the expense of the content owners and the Internet users. There is some anecdotal evidence to support this claim, as in the cases of Madison River Communications and Comcast in the recent past. And a final argument for network neutrality asserts that a network owner’s ability to inspect the traffic places at risk consumers’ privacy and freedom of speech.
The case for neutral management of network traffic has many flaws. First, proponents of network neutrality contend that until now, all traffic has been treated “neutrally” and, as a result, innovation on the Internet has burgeoned.
The mistake in this argument has already been outlined: ethical discriminatory management practices are almost as old as the networks themselves. Legislation that takes away a network administrator’s ability to inspect and manage traffic would effectively take away the administrator’s ability to protect the network and keep it running efficiently.
The result would actually make the network less open as more bandwidth-intensive applications, such as video services, would begin to absorb so many network resources that users of less bandwidth-intensive applications, such as e-mail and Web browsing, would suffer significant degradation of their service.
Imagine a scenario where 95 percent of the users on a particular network are simply browsing a variety of websites for information, and the remaining 5 percent are streaming videos. If those 5 percent are demanding equal prioritization of traffic, 95 percent of the users could experience a noticeable delay in their browsing for the duration of the streaming video. Conversely, if prioritization of traffic allows the low-bandwidth browsing through first, only 5 percent of users would experience a delay, and that delay will be negligible when compared with the experience of viewing the video, especially as most software-driven video players buffer many of the packets in the stream anyhow.
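The arithmetic behind the 95/5 scenario can be sketched with a back-of-the-envelope queueing model. The job sizes (1 unit for a page view, 100 for a stream) are invented for illustration; the point is only the relative ordering.

```python
# One shared link serves jobs one at a time: 95 short browsing jobs and
# 5 long streaming jobs. Sizes are illustrative units of link time.
browse = [1] * 95
stream = [100] * 5

def mean_wait(jobs):
    """Mean time each job spends waiting for the jobs ahead of it."""
    waits, elapsed = [], 0
    for size in jobs:
        waits.append(elapsed)
        elapsed += size
    return sum(waits) / len(jobs)

# "Neutral" worst case: the streams happen to arrive first, strict FIFO.
fifo = stream + browse
# Prioritized: short browsing jobs are let through ahead of the streams.
prioritized = browse + stream

print(mean_wait(fifo))         # browsers all wait behind 500 units of video
print(mean_wait(prioritized))  # browsers barely wait; streams wait slightly
```

Under these numbers the mean wait drops by roughly a factor of nine when the short jobs go first, while each stream is delayed by only about one unit relative to the length of the video itself.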
As early as 1996, there were thousands of ISPs across the country, but over time the largest providers acquired the smaller ones. As recently as 2006, around 94 percent of the subscribers on the Internet received services from either telephone or cable companies.
In the same year, Paul Misener, vice president of global public policy for Amazon.com, testified to the Committee on the Judiciary that there was no foreseeable competition in the near future to cable and telco.
In the four years since that statement, the Federal Communications Commission has reported that high-speed Internet service from cable modems and DSL now comprises only about half of all high-speed lines, and there is no sign of the trend slowing. This is a significant indicator of the emergence of new technologies for delivering high-speed Internet access, such as wireless “air” cards and the latest generations of Palm and BlackBerry devices.
One of the primary arguments offered by supporters of network neutrality is exemplified in remarks by Tim Wu, a Columbia Law School professor and one of the earliest voices in this debate. Based on his analysis of the future plans of broadband ISPs, Wu has consistently stated that these corporations will inevitably choose to use their control over the gateways to the Internet in a manner that squeezes more and more revenue from the content providers.
In essence, he has repeatedly suggested that every ISP will unquestionably make the decision to force large content providers, such as Google and Amazon.com, to pay a premium for high-quality access to the consumer. This argument is predicated on the assumption that all large corporations will always choose the path of increased revenue, even in the face of ethical and legal concerns.
This generalization is difficult to support in that many corporations have consistent track records of corporate and social responsibility. A prime example is AOL’s handling of Nazi hate websites in 1998. When complaints began about the websites, AOL immediately closed the sites, rather than taking a chance on offending some in the public.
Misener has made assertions along the same tack as Wu. In a written statement to Congress in 2006, he claimed that the phone and cable companies would “fundamentally alter the Internet in America unless Congress acts to stop them.”
In the wake of FCC and Supreme Court decisions the prior year, he stated that the ISPs would act to impede traffic on the Internet in the interest of profit. This is despite the fact that in the four years since his statement, very few ISPs have implemented network management policies that fit Misener’s description. And the implication that the FCC’s decision forces it to allow such behavior doesn’t hold water in light of the FCC’s actions in two cases of illicit network management practices: Comcast and Madison River.
These two cases highlight the specific claims made by sponsors of network neutrality. In 2005, Madison River was found to have been blocking VoIP traffic across its network, presumably as a way to obstruct competition. The FCC stepped in and secured agreement from Madison River that it would allow the traffic to flow.
In 2007, Comcast was accused of severely degrading – and in some cases blocking – traffic for a file-sharing application called BitTorrent. In this case, the FCC determined that Comcast had been “substantially impeding consumers’ ability to access [BitTorrent] content and to use the applications of their choice.” Further, the FCC ruled that Comcast had to disclose its network management practices to both its customers and the FCC and end the discriminatory practices. (Comcast has since been legally vindicated, winning a suit that upheld its contention that the FCC lacked the authority to manage data services.)
One of the pioneers of the Internet, Vinton Cerf, often referred to as the father of the Internet, also added his concerns to the debate: “As long as we have clumsy or consumer-unfriendly regimes for network management, we will see problems for some reasonable applications, and thus some inhibition of innovative new services.”
Although intended to fortify the case for network neutrality, Cerf’s statement cuts both ways. When companies keep their customers in the dark and attempt network management practices without thoroughly examining the possible consequences, concerns about the future of the Internet crop up.
When a company such as Comcast chooses to expose its network management practices to the light of day and keeps the interests of its customers at the forefront of decisions, the same concerns tend to die down. This is a strong example of how public scrutiny, along with appropriate intervention from the FCC, can keep concerns about unethical discrimination at bay.
Endorsers of network neutrality also raise concerns about consumer privacy risks. One of the common practices in network management today is to use a process known as deep packet inspection to “peer inside” the IP packets that traverse a network to see what applications the packets are destined for.
The general intent is to keep networks flowing smoothly and to make certain that packets are routed in the most logical manner possible.
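Deep packet inspection, in its simplest form, means looking past the IP and TCP headers into the payload to guess the application. The sketch below is a toy illustration, not any vendor's engine; the HTTP method strings are real, and the leading byte 0x13 followed by “BitTorrent protocol” is the actual opening of a BitTorrent handshake.

```python
# Toy DPI classifier: match the start of a packet payload against
# known application signatures. Signature list is deliberately tiny.
SIGNATURES = {
    b"GET ": "http",
    b"POST": "http",
    b"\x13BitTorrent protocol": "bittorrent",
}

def classify(payload: bytes) -> str:
    """Return an application label for a raw packet payload."""
    for magic, app in SIGNATURES.items():
        if payload.startswith(magic):
            return app
    return "unknown"

print(classify(b"GET /index.html HTTP/1.1\r\n"))  # http
print(classify(b"\x13BitTorrent protocol" + b"\x00" * 8))  # bittorrent
```

A router using labels like these can route or prioritize each flow sensibly, which is the legitimate purpose described above; the privacy concern arises when the same labels are logged per subscriber.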
Some advocates of network neutrality believe that there is a danger beyond ISPs choosing to block or impede certain traffic. Their concern is that the data about who watches what will be collected and used, possibly in violation of subscriber privacy.
David Clark, a senior research scientist at the Massachusetts Institute of Technology, puts it bluntly: “What we are going to be fighting about … is who has the right to observe everything you do.” Clark goes on to explain that ISPs might use the information learned from inspecting a stream of packets to model the behavior of an Internet user, and potentially sell that information to any interested parties.
Recent history has shown that congressional and public forces will act to decry any behavior that looks or feels like invasion of privacy.
Carnegie Mellon professor David Farber notes that there has been some activity on the business front of modeling user behavior, but companies that have purchased these services have “usually backed off rapidly because of the noise they were getting.”
Although it is common practice for computer cookies to be used to track Internet user behavior, it is important to note that content providers, not service providers, track this behavior.
What remains is to reasonably address the arguments in favor of network neutrality with an eye on practical applications and their implications. Proper network management dictates ethical discrimination of traffic on an IP network. If a network isn’t designed and managed with this principle in mind, preventing delays in low-bandwidth browsing applications will be difficult. The principle is to manage the traffic in a way that negatively affects the fewest number of users for as short a time as possible.
One of the greatest strengths of network and Internet security is the ability of routers to distinguish between different types of traffic. A savvy network administrator will be able to identify harmful traffic and prevent it from accessing important network elements through the policies enabled on a gateway router. One common method of attack on a network is a denial of service attack – a brute force method of throwing as much dummy traffic as possible at a network, with the goal of tying up network resources to the point that legitimate traffic is severely degraded or unable to pass at all.
Traffic management features of routers allow them to drop all packets that meet this description in order to protect the network. Indiscriminate routing of all packets would render this security method unusable.
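One common defense against the flood described above is a per-source rate limit: count packets from each source within a time window and drop the excess. This is a minimal sketch with an invented threshold, not a real router's ACL syntax, but it shows why indiscriminate forwarding would forfeit the protection.

```python
from collections import defaultdict

class FloodFilter:
    """Drop packets from any source that exceeds its per-window budget."""

    def __init__(self, threshold=100):
        self.threshold = threshold          # max packets per source per window
        self.counts = defaultdict(int)

    def allow(self, src_ip: str) -> bool:
        """Return True if the packet should be forwarded."""
        self.counts[src_ip] += 1
        return self.counts[src_ip] <= self.threshold

    def new_window(self):
        """Reset all counters at each time-window boundary."""
        self.counts.clear()

f = FloodFilter(threshold=3)
verdicts = [f.allow("203.0.113.9") for _ in range(5)]
print(verdicts)  # the flooding source is dropped after its third packet
```

A neutrality rule that forbade the router from treating packets differently by source would make exactly this kind of filter illegal to apply.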
On the matter of network discrimination described by Wu above, the issue of corporate governance has been discussed. Wu noted to the Committee on the Judiciary that there is a long history of network discrimination in this country, dating back to the 1860s. He described an agreement between Western Union, the telegraph company, and The Associated Press in which Western Union consented to only carry news on its wires from AP. This left all other news outlets in the dark.
This obvious content discrimination is an inaccurate comparison to today’s information services, because there were no other mechanisms for delivering content back then. Today, if information is blocked on one avenue, there are countless other mass media avenues through which it will inevitably disseminate. The recent social media phenomenon during elections in the Middle East demonstrates this idea conclusively.
Wu’s other prime example of network discrimination hits closer to home. He pointed out that the original Bell telephone companies created a rule in the 1960s that did not allow any device other than a Bell telephone to be connected to the telephone network. He went on to explain that the D.C. Circuit Court and the FCC had to step in and enforce the right of the public to connect any device to the network that didn’t pose a danger to that network. Note that this scenario didn’t require creation of additional legislation. The circuit court and the FCC simply acted to enforce rules that already existed. This is exactly the means of correction proposed by opponents of network neutrality legislation.
The arguments that follow the example of Misener and Cerf also suffer the same fate when examined closely. No legislation has been implemented since the decisions by the Supreme Court and the FCC in 2005, yet market forces and public intolerance for inappropriate behavior have kept the ISPs in check. The Comcast and Madison River examples of unethical treatment of Internet traffic were quickly corrected by the FCC and have not been repeated. The lessons that have been learned since the Internet’s earliest days continue to be learned as the Internet evolves.
The privacy issues raised have, to date, proven to correct themselves through the same means. NebuAd, a company dedicated to analyzing and selling the profiles of Web users to ISPs, ceased operations in the U.S. after Congress began to investigate its business.
Moving into the future, the debate over network neutrality should keep its focus on the relevant issues: the way the networks were designed to operate. The business models are built on the principle of oversubscription, selling subscribers more aggregate capacity than the network can carry at once, on the expectation that they will not all peak simultaneously. That model has proven to be the most favorable means of implementing broadband access in terms of return on investment. It is precisely that kind of innovation, unchecked by regulation or legislation, that has helped lead to the explosion of the Internet to date.
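Oversubscription, selling more aggregate capacity than a shared link can carry at once on the expectation that subscribers rarely peak simultaneously, is easy to see numerically. The subscriber counts and speeds below are invented for illustration.

```python
# Illustrative numbers only: an access network shared by broadband users.
subscribers = 500
plan_mbps = 100          # speed sold to each subscriber
uplink_mbps = 2_000      # actual capacity of the shared uplink

# Ratio of total sold capacity to real capacity.
ratio = (subscribers * plan_mbps) / uplink_mbps
print(f"oversubscription ratio: {ratio:.0f}:1")  # 25:1

# The model holds as long as simultaneous demand stays under the uplink.
# Suppose 4 percent of subscribers run at full speed at the same moment:
concurrent_demand = 0.04 * subscribers * plan_mbps
print(concurrent_demand <= uplink_mbps)  # True: the link holds
```

When simultaneous demand does exceed the uplink, the traffic-management practices discussed throughout this essay are what decide whose packets wait, which is why the design of those practices, not their mere existence, is the relevant question.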
Modern media attention has created an informal system of backlash against any organization that chooses to do business in an unethical fashion. There is no monopoly on information in the United States; if there were, the network neutrality debate would not exist. The absence of any modern case of collusion like the one between Western Union and AP demonstrates as much, and no public discourse on such a matter was even possible back then. Corporations can no longer hide from public scrutiny under the pretense that the general public is too stupid to keep up. The market simply will not bear behavior inconsistent with the new vision of a globally connected world. The FCC continues to maintain that it has the authority to keep ISPs in check, and it has asserted that it plans to do so.
Christopher Yoo, professor of law and communication at the University of Pennsylvania Law School, sums up the role of the FCC now and in the future in his written statement to Congress in 2008: “The vigor with which the FCC has pursued allegations of improper network management suggests that the regulatory structure may already be in place to ensure that consumers are both protected and able to enjoy the Internet’s tremendous promise in the future.”