QUESTIONS TO ASK REGARDING INTERNET REGULATION
COMMISSIONER ROBERT MCDOWELL
INSTITUTE FOR POLICY INNOVATION (IPI) COMMUNICATIONS SUMMIT
THURSDAY, NOVEMBER 12, 2009
11:15 a.m. to 11:45 a.m.
Reserve Officers Association (ROA) Headquarters
Minuteman Memorial Building
Minuteman Ballroom, 5th Floor
One Constitution Avenue, NE
Washington, DC

Thank you, Bart, for that kind introduction. And thank you to IPI for inviting me to this Summit. You have an interesting and full agenda today, which I know will generate some productive and thoughtful dialogue.

It is an exciting time to be an FCC Commissioner. In August, we were restored to our full complement of five commissioners. Three are new commissioners, and all three bring a wealth of differing experiences, personal and professional, public sector and private sector, that will be invaluable during the course of our future decision-making. And we will have many important decisions to make in the coming months.

By some estimates, what we do at the FCC directly affects one-sixth of our economy and indirectly influences up to forty percent. Another way of looking at what we do reveals that the FCC touches the lives of virtually every American every day. Hasn't almost every American made a phone call, watched TV or listened to the radio? Haven't most Americans used a wireless device, sent a text message or an email? Isn't the Internet now part of our daily lives, whether we realize it or not? Even those without direct Internet access are affected by it. What would life in America be like without these technologies? The FCC's regulations affect all of these aspects of our society. In fact, shortly after becoming a Commissioner, I saw a sticker on my son's toy light saber and discovered that, apparently, the Commission regulates that too. I had to ask our chief engineer to explain that one to me.

So what we do has an enormous effect on nearly every aspect of our society. That's why it is important to pay close attention to what the FCC does. But don't just watch us; get involved. Tell us what you think.

Next year promises to be the FCC's busiest in a long time. The Stimulus Act[1] mandates that the Commission present to Congress a National Broadband Plan by February 17. We kicked off our fact and opinion gathering with a Notice of Inquiry in April. Since then, our Broadband Plan team has held dozens of workshops with scores of panelists, issued numerous public notices and absorbed over 50,000 pages of ideas. Soon it will be time to crystallize what we have learned and start debating what will hopefully be constructive ideas. I would prefer that the Plan be flexible, iterative and not carved in stone.

In some ways, we are trying to land an airplane on a distant foggy runway without having all of the needed navigation gear. That is to say, Congress also mandated detailed mapping of our nation's broadband facilities and services, but that mapping isn't due until 2011. Yes, you heard that right: perhaps the most important piece of data needed to create a Broadband Plan, knowing where the facilities are and what services are being provided over them, won't be available until a year after the Plan is due. Welcome to Washington. Our Broadband Plan team works ahead undaunted, however. They have already produced one key estimate.
Depending on how fast we want broadband speeds to be in the future, what kind of facilities we want deployed and where we want them placed, broadband ubiquity in America could cost anywhere between $20 billion and $350 billion, according to our Broadband Plan team. At an October 1st hearing on capital formation in the broadband sector that I chaired, Chairman Genachowski expressed his preference for those capital expenditures to come from the private sector. I agree. So that begs the question: how do we provide incentives for such massive amounts of private sector investment? If our October 1st hearing taught us only one thing, it was that one way to provide a disincentive for investment is to create regulatory uncertainty.

Our work on the National Broadband Plan has been somewhat eclipsed by our Notice of Proposed Rulemaking (NPRM) that bears the moniker "Preserving the Open Internet," known elsewhere as our "net neutrality" NPRM. At the Commission's October meeting, I concurred in the act of initiating a rulemaking. That is, I concurred in starting a process so the Commission could open a record for fact-gathering. I dissented, however, on the factual and legal predicates supporting the document.

By way of background, during my time as a Commissioner thus far, I have never flat-out voted against opening an NPRM, because I believe that we should be led by the facts and the law in the course of probing into worthy debates. Of course, it always helps if we have jurisdiction and authority to act. My skepticism on this point led to my dissent on the substantive foundation put forth by the majority. However, it is important to be part of the process in the course of keeping an open mind. It is my hope that my colleagues subscribe to the same philosophy and that all of our minds can be changed purely on the basis of the facts and law. In that spirit, I praised the Chairman for including a wide variety of questions in the NPRM, because so many abound. So let's examine some threshold questions that I have, and that I hope will be answered as we move forward.

The first question we should ask is: Is the Internet broken? The good news is that the government has already looked into that – recently, in fact. In 2007, the FCC launched a Notice of Inquiry[2] (NOI) into the status of the broadband market. We asked for anyone who had evidence of systemic market failure to give it to us. None was offered. That same year, the Federal Trade Commission issued a report on the state of the broadband market after a lengthy and thorough review. The FTC concluded, in a bipartisan and unanimous 5-0 vote, that market failure to the degree that would warrant new regulations did not exist. In fact, the FTC took the unusual step of going much further by warning, "[W]e suggest that policy makers proceed with caution in evaluating calls for network neutrality regulation …. No regulation, however well-intended, is cost-free, and it may be particularly difficult to avoid unintended consequences here, where the conduct at which regulation would be directed largely has not yet occurred."[3]

Keep in mind that this report was issued only 28 months ago. So it is imperative that we answer the next question as well before going further: What market conditions have changed in the past two years that would justify a dramatic change in policy that could withstand appellate scrutiny? The Commission's NPRM contains no market analysis.
I hope that any future rules issued by the Commission would be buttressed by a thorough and honest market analysis that concludes a concentration of market power exists and that the players who possess such market power are abusing it to the point of harming their competitors. Any order based on evidence that falls short of being clear and convincing on this point would probably not survive appeal.

Some advocates of regulation in this market point to fewer than a handful of troublesome actions – some several years old – by a small number of market players as sufficient evidence to justify a new regulatory regime. An important fact missing from this debate is that once these actions were brought to light, all were resolved without imposing new regulations. Additionally, given the context of the uncountable number of Internet communications that occur every day, are these few incidents enough evidence to prove that the Internet is breaking to the point of needing more regulation?

Over the past several years, the Commission has analyzed the broadband services market several times. Each time, the Commission has determined that every aspect of the broadband services market is sufficiently competitive to warrant removing those services from the heavily regulated realm of common carriage under Title II of the Communications Act. Instead of foisting an old-fashioned Ma Bell-style monopoly regulation regime on these emerging new services, we have chosen, correctly, to classify broadband as largely unregulated Title I information services. The Supreme Court reinforced our approach in 2005 when it carefully examined and upheld our classification of cable modem service as an information service in the Brand X case.[4] Since Brand X, every vote by the Commission to classify broadband as an unregulated information service has been without dissent.

And what were the public interest benefits of liberating these new and enlightening services from the shackles of century-old monopoly regulation? Well, let's look at the statistics. According to a recent Pew Internet & American Life Project study, in 2003, before broadband was deregulated, only about 15 percent of American adults had access to broadband at home.[5] Today, a large majority of American adults – over 60 percent – have broadband at home.[6]

But there's more: new broadband platforms are emerging as well. Wireless is the fastest growing segment of the broadband market. In 2003, wireless broadband was rarely mentioned in policy debates. By December of 2005, however, there were already 3.3 million wireless broadband subscribers.[7] And, as of just two months ago – September 2009 – that number had grown to 94 million wireless broadband subscribers.[8] So, it is easy to imagine that by next year, there could be 100 million wireless broadband subscribers. This trend should accelerate as more and more spectrum bands are built out: BRS/EBS, AWS-1 and 700 MHz.
And, recently, the Fiber-to-the-Home Council announced the results of a study conducted by RVA Market Research, which found that the current number of fiber-to-the-home subscribers in America is more than 5.3 million.[9] Just as positive, the study's data also illustrate that the number of homes passed by fiber increased from only one percent five years ago to roughly 15 percent today.[10]

Precisely because we are Americans, we always want to do better and are never satisfied with the status quo. More Americans should have access to fatter and faster broadband pipes. But we should ask ourselves: did we get this far as the result of government regulation, or have we seen an explosion of new offerings and increasing consumer adoption because government stepped out of the way and encouraged the construction of new delivery platforms? As we produce our National Broadband Plan and conduct the network management regulation proceeding, I hope thorough and honest answers to that question will emerge.

And that begs the second category of questions in my analytical flow chart for the Open Internet NPRM: If the Internet is broken, is government the best tool to fix it? As I have said many times, the Internet is perhaps the greatest deregulatory success story of all time. Although it was originally a government creation, it became the fastest-penetrating phenomenon invented by humans not through command-and-control government industrial policy, but because it was privatized in 1994. Since the early days of the state-run ARPANET, network management and Internet governance initiatives have migrated further away from government regulation, not closer to it. This evolution away from government intervention has been the most important ingredient in the Internet's success.

Since its early days, the Internet has had to overcome a plethora of threats: denial-of-service attacks, viruses and unanticipated network congestion, just to name a few. The 'Net has not merely survived these challenges; it has prevailed. Early efforts to keep the Internet open and free ignited the creation of loosely knit and non-state-controlled Internet governance entities staffed by volunteer engineers, academics and software developers, among others. For example, the Internet Society (ISOC), an umbrella organization founded in 1992, is home to the Internet Engineering Task Force (IETF), which develops technical standards for the Internet. It is a non-profit corporation with a board of trustees consisting of, and funded by, individuals and organizations in the Internet community, operating virtually free from government influence. Several other organizations work with ISOC on a variety of Internet governance issues. Among them are the Internet Engineering Steering Group (IESG), the Internet Research Task Force (IRTF), the Internet Research Steering Group (IRSG), and the Internet Architecture Board (IAB). The P4P Working Group, which works on peer-to-peer congestion issues, is similar.

These collaborative bodies have never failed to resolve major network management challenges. Will we conclude that the government could do better? Will the government be able to replicate the billions of decisions that are made each day in the Internet's ecosystem? Can the Commission really respond to cyber challenges in Internet time?
I have the highest regard for each of my four colleagues on the Commission, but not one of us is an engineer. Do you really want us making these highly technical decisions?

Furthermore, before moving forward with a new regulatory regime in this space, we should be mindful of how closely the international community watches the FCC's actions. While I was participating in the International Telecommunication Union's conference in Geneva last month, it became obvious to me that some foreign regulators are waiting for the U.S. to assert more government authority over the Internet to justify an increased state role over the Internet's affairs in their own countries. Let us be aware that foreign governments may have a definition of the "public interest" that is far different from ours. Even if our intentions are pure, their intentions may be a bit more nefarious. So if the Commission acts further, we should be careful to avoid inadvertently giving political cover or false precedents to strong-arm regimes that wish to turn back the clock on liberty's progress.

An additional concern regarding the proposed net neutrality rules is that they attempt to draw a line between applications and networks precisely at a time when the market is sparking unprecedented convergence between the two. For instance, many proponents of network management regulation speak of unfettered innovation at the "edge" of networks – such as on consumers' personal computers and wireless devices – while arguing that the freedom to innovate "in the middle" of networks should be more limited due to concerns regarding potential anticompetitive conduct by network operators. It is my view, however, that constructive public policy should subscribe to the philosophy that unfettered innovation should be encouraged equally at all points of the network – at the edge and in the core. As a practical matter, it is fast becoming impossible to separate the two.

Consumers are telling the marketplace that they don't always want networks that operate merely as "dumb pipes." Sometimes they want the added value that comes from intelligence inside a network's core as well. For instance, Cisco produces routers that contain over 28 million lines of code. Should the government attempt to determine which line is intended to serve an operating function versus one that may offer some other kind of value – all in the name of preventing anticompetitive conduct? I will keep saying this throughout this debate: Those who oversimplify this matter as a zero-sum scenario between a dumb pipe and smart edge versus a smart pipe and dumb edge offer only a false choice that does not reflect the realities of today's market.

Finally, during the course of this debate, many have blurred the important distinction between "discriminatory" conduct and "anticompetitive" conduct. But the reality is that the Internet can function only if engineers are allowed to discriminate among different types of traffic. The word "discriminate" carries with it negative connotations, but to network engineers it means "network management." Discriminatory conduct, in the network management context, does not necessarily mean anticompetitive conduct. The public interest would be better served if the debate focused more on this important distinction.

In conclusion, regardless of the outcome of this debate, I hope that the Commission will play a leadership role in helping to spotlight instances of market failure and convey them to appropriate non-governmental collaborative bodies for review and action.
This model, supported by strict enforcement of our antitrust laws, could very well provide the benefits sought by proponents of new rules without incurring the unexpected costs of a new regulatory regime. After all, this way of doing business has worked quite well thus far. In the meantime, the best antidote to potential anticompetitive behavior is more competition. Let's hope that all future FCC policies encourage more competition in lieu of regulation and rationing.

Thank you again, Bart, for inviting me to participate in this Summit. I look forward to taking some questions from your guests today.

Notes

[1] American Recovery and Reinvestment Act of 2009, Pub. L. No. 111-5, 123 Stat. 115, § 6001(k) (2009).
[2] Broadband Industry Practices, WC Docket No. 07-52, Notice of Inquiry, 22 FCC Rcd 7894 (2007).
[3] See Federal Trade Commission, Broadband Connectivity Competition Policy, at 155 (2007).
[4] See NCTA v. Brand X, 545 U.S. 967 (2005).
[5] See John Horrigan, Pew Internet & American Life Project, Home Broadband Adoption 2009, at 11 (2009).
[6] See id.
[7] See comScore M:Metrics MobiLens Market Viewer – United States (accessed Nov. 11, 2009).
[8] See id.
[9] See North American Fiber to the Home Connections Surge Past Five Million, Press Release, FTTH Council North America (Sept. 29, 2009).
[10] See id.