PUBLIC NOTICE

Federal Communications Commission
445 12th Street, S.W.
Washington, D.C. 20554

News Media Information: 202/418-0500
Internet: http://www.fcc.gov
TTY: 1-888-835-5322

DA 14-1499
Release: October 16, 2014

WIRELINE COMPETITION BUREAU, WIRELESS TELECOMMUNICATIONS BUREAU, AND THE OFFICE OF ENGINEERING AND TECHNOLOGY SEEK COMMENT ON PROPOSED METHODOLOGY FOR CONNECT AMERICA HIGH-COST UNIVERSAL SERVICE SUPPORT RECIPIENTS TO MEASURE AND REPORT SPEED AND LATENCY PERFORMANCE TO FIXED LOCATIONS

WC Docket No. 10-90

Comments: (30 days after date of publication in the Federal Register)

I. INTRODUCTION

1. In this Public Notice, the Wireline Competition Bureau (WCB), the Wireless Telecommunications Bureau (together, the Bureaus), and the Office of Engineering and Technology (OET) seek to further develop the record on how compliance with speed (also referred to as bandwidth) obligations should be determined for recipients of high-cost support that deploy broadband networks to serve fixed locations. In addition, we seek comment on whether the same testing methodologies adopted for price cap carriers accepting model-based Phase II support should be applied to other recipients of support to serve fixed locations, such as rate-of-return providers and those that are awarded Connect America support through a competitive bidding process. 1 Finally, we seek comment on the circumstances that would trigger an audit of the speed and latency metrics.

II. BACKGROUND

2. In the USF/ICC Transformation Order, the Commission required that as a condition of receiving high-cost universal service support, eligible telecommunications carriers (ETCs) offer broadband service in their supported areas that meets certain basic performance requirements. 2 In general, as a condition of receiving support for voice telephony, ETCs subject to broadband performance obligations currently are required to “offer broadband at actual speeds of at least 4 Mbps downstream and 1 Mbps upstream, with latency suitable for real-time applications, such as VoIP, and with usage capacity

1 This would include entities that will receive Connect America funding through the rural broadband experiments. Connect America Fund, ETC Annual Reports and Certifications, WC Docket Nos. 10-90, 14-58, Report and Order and Further Notice of Proposed Rulemaking, 29 FCC Rcd 8769, 8779-80, paras. 24-29 (2014).

2 See Connect America Fund et al., WC Docket No. 10-90 et al., Report and Order and Further Notice of Proposed Rulemaking, 26 FCC Rcd 17663, 17705-06, para. 109 (2011) (USF/ICC Transformation Order and FNPRM), aff’d, 753 F.3d 1015 (10th Cir. 2014). As in the USF/ICC Transformation Order, we use the term high-cost support or high-cost funding to include all existing high-cost universal service mechanisms as well as the Connect America Fund. See id. at 17695 n.126. Competitive ETCs are not subject to broadband performance obligations with the exception of those that receive Mobility Fund Phase I support. For Mobility Fund Phase I and Tribal Mobility Fund Phase I, high-cost support mechanisms specifically dedicated to mobile services, the Commission adopted different performance benchmarks. See id. at 17791-93, paras. 359-368.
reasonably comparable to that available in comparable offerings in urban areas.” 3 In addition, the Commission required that recipients of high-cost universal service support test their broadband networks for compliance with speed and latency metrics, certify and report the results to the Universal Service Administrative Company (USAC) and the relevant state or tribal government on an annual basis, with those results subject to audit. 4 In the 2011 USF/ICC Transformation FNPRM, the Commission sought comment on the specific methodology ETCs should use to measure the performance of their broadband services subject to these general guidelines, and the format in which funding recipients should report their results. 5 It directed the Bureaus and OET to work together to refine the methodology for implementation. 6

3. Subsequently, in October 2013, WCB further defined the service obligations of price cap carriers that accept Phase II model-based support through the state-level commitment process. 7 It concluded that price cap carriers must be prepared to demonstrate a provider round-trip latency of 100 milliseconds (ms) or less to meet the Commission’s latency requirement. To show compliance with latency obligations, a price cap provider must certify that 95 percent or more of all peak-period measurements (also referred to as observations) of network round-trip latency, measured between the customer premises and the nearest designated Internet core peering interconnection point (IXP), are at or below 100 ms. The measurements must be conducted over a minimum of two consecutive weeks during peak hours for at least 50 randomly selected customer locations within the census blocks of each state for which the provider is receiving model-based support, using existing network management systems, ping tests, or other commonly available network measurement tools. Alternatively, carriers participating in the Measuring Broadband America (MBA) program may use the results from that testing to support their certification that they meet the latency requirement, so long as they deploy at least 50 whiteboxes to customers within the Phase II-funded areas within each state and certify that 95 percent or more of the measurements taken during peak periods over a period of two weeks were at or below 100 ms. The provider is responsible for the hardware and administrative costs of these whiteboxes to the extent such whiteboxes are in addition to those deployed as part of the MBA testing.

4. In its recent Connect America Fund Further Notice of Proposed Rulemaking, the Commission sought comment on a number of proposals regarding broadband providers’ service obligations. The Commission proposed that the downstream speed requirement be increased from 4 Mbps to 10 Mbps for the Connect America Fund. 8 The Commission also proposed extending the latency standards previously implemented for price cap carriers accepting Phase II model-based support to rate-of-return carriers and ETCs that receive Connect America support through a competitive bidding process. 9

3 USF/ICC Transformation Order, 26 FCC Rcd at 17726, para. 160. These requirements do not apply to Mobility Fund recipients.

4 Id. at 17705-06, para. 109.

5 Id. at 18045-46, paras. 1013-1017.

6 Id. at 17708, para. 112; 47 C.F.R. § 54.313(a)(11).

7 Connect America Fund, WC Docket No. 10-90, Report and Order, 28 FCC Rcd 15060 (Wireline Comp. Bur. 2013) (Phase II Price Cap Service Obligation Order).

8 Connect America Fund et al., WC Docket No. 10-90 et al., Report and Order, Declaratory Ruling, Order, Memorandum Opinion and Order, Seventh Order on Reconsideration, and Further Notice of Proposed Rulemaking, 29 FCC Rcd 7051, 7100-03, paras. 138-148 (2014).

9 Id. at 7103-04, paras. 149-152.
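The certification standard described in paragraph 3 reduces to a simple percentile check: pool the peak-period round-trip observations collected from the tested locations and verify that at least 95 percent of them fall at or below 100 ms. The sketch below is offered only as an illustration of that arithmetic; the sample values and function name are hypothetical, and it is not a prescribed testing tool.

```python
# Illustrative check of the latency certification standard described above:
# a provider certifies compliance if at least 95 percent of all peak-period
# round-trip latency observations (customer premises to the designated IXP)
# are at or below 100 ms. The sample values below are hypothetical.

LATENCY_THRESHOLD_MS = 100.0
REQUIRED_FRACTION = 0.95

def meets_latency_standard(observations_ms):
    """Return True if at least 95% of round-trip observations are <= 100 ms."""
    if not observations_ms:
        raise ValueError("no peak-period observations collected")
    compliant = sum(1 for rtt in observations_ms if rtt <= LATENCY_THRESHOLD_MS)
    return compliant / len(observations_ms) >= REQUIRED_FRACTION

# Hypothetical peak-period samples pooled across the tested customer locations.
sample = [42.0, 55.3, 61.8, 97.2, 104.5, 38.9, 72.4, 99.9, 88.1, 47.6]
print(meets_latency_standard(sample))  # False: only 9 of 10 samples are <= 100 ms
```

The same structure applies whether the observations come from a provider's network management system, from ping tests, or from MBA whitebox results.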
III. MEASURING COMPLIANCE WITH SERVICE OBLIGATIONS

A. Speed Performance Measurement

5. The record received in response to the 2011 USF/ICC Transformation Order FNPRM on the methodology to be implemented for testing compliance with service obligations was not well developed. We now seek to refresh the record on the methodology to be used for demonstrating compliance with the speed obligation for ETCs that receive high-cost support to deploy broadband networks to fixed locations. 10 Should internal network management system (NMS) tools be used to measure speed performance? Alternatively, should external measurement tools, such as Ookla's Speedtest or the Network Diagnostic Test (NDT) from M-Lab, be used instead? Are there better and more reliable methods of measuring speed?

6. Internal NMS tools vary among providers. How can the Commission ensure that internal NMS tool measurements are valid? Will such tools account for multiple transmission control protocol (TCP) streams, TCP window sizes, TCP slow start, and other factors in speed measurement? How would measurements from such tools be verified? Are these types of tools too burdensome or complex for speed measurements? Would such tools have any effect on customer service if used during peak periods? If external testing is adopted, how would measurements be verified? Are there better external measurement tools than those identified above?

7. What testing parameters should be used for speed testing? Should they be different for internal and external testing?

8. What testing parameters should be used to measure broadband performance for wireless providers offering service at a given address? Should the testing parameters be different if the service utilizes a fixed attachment to the building?

9. We propose to require all ETCs subject to broadband performance obligations to serve fixed locations to utilize testing parameters for speed similar to those already adopted for latency for price cap carriers. Specifically, we propose to adopt a methodology that would require measurements to be made once hourly during peak periods (7:00 pm to 11:00 pm local time) daily over four consecutive weeks, require 95 percent of the observations to be at or above the specified minimum speed, define the endpoints for the measurement as the customer premises and Commission-designated IXP locations, require testing to occur at least annually, and require a minimum of 50 randomly selected customer locations to be tested within the geographic area being funded in a given state. 11 To the extent parties argue that the process adopted for latency testing should be adjusted and used for speed testing, they should describe with specificity what changes should be made. We also seek comment on whether the data usage in the proposed tests would have a significant effect on consumers and, if so, how such effects could be mitigated. Should any data caps or monthly usage limits be adjusted to prevent the testing from affecting consumers?
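Read together, the parameters proposed in paragraph 9 amount to a sampling plan, a measurement schedule, and a pass criterion. The following sketch is one possible reading of that proposal, not an adopted methodology; the location identifiers, the start date, and the use of the proposed 10 Mbps downstream minimum are illustrative assumptions.

```python
# Illustrative sketch of the proposed speed-testing parameters: randomly select
# at least 50 funded customer locations, run one test per hour during the
# 7:00 pm - 11:00 pm peak period for four consecutive weeks, and require
# 95 percent of observations to be at or above the minimum speed.
# Location names, dates, and the 10 Mbps figure are hypothetical assumptions.

import random
from datetime import datetime, timedelta

MIN_DOWNSTREAM_MBPS = 10.0      # downstream minimum proposed in the 2014 FNPRM
REQUIRED_FRACTION = 0.95
SAMPLE_SIZE = 50
PEAK_HOURS = (19, 20, 21, 22)   # hourly tests at 7, 8, 9, and 10 pm local time
TEST_DAYS = 28                  # four consecutive weeks

def select_test_locations(funded_locations):
    """Randomly choose the minimum number of customer locations to test."""
    return random.sample(funded_locations, SAMPLE_SIZE)

def peak_period_schedule(start_midnight):
    """Yield one local test time per peak hour per day over the test window."""
    for day in range(TEST_DAYS):
        for hour in PEAK_HOURS:
            yield start_midnight + timedelta(days=day, hours=hour)

def meets_speed_standard(observed_mbps):
    """Return True if at least 95% of observations meet the minimum speed."""
    passing = sum(1 for speed in observed_mbps if speed >= MIN_DOWNSTREAM_MBPS)
    return passing / len(observed_mbps) >= REQUIRED_FRACTION

# Example with hypothetical inputs: 50 locations drawn from 200 funded locations,
# and a 28-day schedule of 112 hourly peak-period tests per location.
locations = select_test_locations([f"location-{n}" for n in range(1, 201)])
schedule = list(peak_period_schedule(datetime(2015, 1, 5)))
print(len(locations), len(schedule))  # 50 112
```

A structure like this could also help commenters estimate the data volume the proposed tests would consume when responding to the data-cap question above.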
10. We propose to allow ETCs, including but not limited to price cap carriers, the option of testing compliance with speed requirements through the MBA program, similar to what WCB adopted for latency obligations. If we were to do so, could we apply the same conditions and parameters as adopted for latency testing? Would any changes be needed?

11. Should the testing options and parameters be the same for rate-of-return carriers and providers awarded support through the Phase II competitive bidding process as for price cap carriers? If not, what should they be and why?

10 The discussion that follows does not address how to test compliance with relevant requirements for the Mobility Fund.

11 We propose that FCC-designated IXP locations will be the same as those identified for purposes of latency testing. See Phase II Price Cap Service Obligation Order, 28 FCC Rcd at 15071 n.63.

12. We seek to augment the record received in response to the 2011 USF/ICC Transformation Order FNPRM based on the considerations outlined above. Specifically, parties such as AT&T 12 and Alaska Communications Systems 13 argued that the testing mechanism should not require measuring service at all end-user locations. A testing mechanism for speed similar to that adopted for latency would require testing at only a certain number of locations. Frontier advocated that the Commission provide a choice of measurement test options. 14 A speed-testing mechanism similar to that adopted for latency would provide two options for testing. A number of rural associations stated that the Commission should not impose measurement requirements until technically feasible, less burdensome testing procedures were available. 15 A speed-testing mechanism similar to that adopted for latency should be easily manageable for even very small carriers. We seek comment on these tentative conclusions.

B. Latency Performance Testing for Rate-of-Return Carriers and Providers Awarded Connect America Support Through Competitive Bidding

13. We seek comment on whether the two methods adopted to test price cap carrier compliance with latency service obligations should also be used to test compliance with latency service obligations for other recipients of high-cost support with a broadband public interest obligation to serve fixed locations. If so, should the testing parameters for rate-of-return providers and those that are awarded Phase II support through a competitive bidding process be the same as those adopted for price cap carriers? If not, what should those parameters be and why?

14. The latency-testing options adopted for price cap carriers should provide at least one readily achievable method suitable for small, rural carriers. We seek comment on this tentative conclusion. In response to the 2011 USF/ICC Transformation FNPRM, rural carriers argued that broadband performance should be measured only for those portions of the network controlled by the provider or its commonly controlled affiliates. 16 We note that in the Phase II Price Cap Service Obligation Order, WCB rejected this argument for price cap carriers because (1) testing only part of the network will not demonstrate the quality of service being provided to the end user and (2) carriers have a number of options to influence the quality of service from their transit and/or peering providers. 17 Would that same reasoning be applicable to other providers, such as rate-of-return carriers and non-traditional providers that may receive support through a competitive bidding process?

C. Use of MBA Program for Testing and Reporting

15. The MBA program developed out of a recommendation by the National Broadband Plan to improve the availability of information for consumers about their broadband service.
12 Comments of AT&T, WC Docket No. 10-90 et al., at 20 (filed Jan. 18, 2012) (AT&T Comments).

13 Comments of Alaska Communications Systems Group, Inc., WC Docket No. 10-90 et al., at 4 (filed Jan. 18, 2012).

14 Comments of Frontier Communications Corporation, WC Docket No. 10-90 et al., at 4-5 (filed Jan. 18, 2012).

15 Initial Comments of the National Exchange Carrier Association, Inc.; National Telecommunications Cooperative Association; Organization for the Promotion and Advancement of Small Telecommunications Companies; and the Western Telecommunications Alliance, WC Docket No. 10-90 et al., at 33 (filed Jan. 18, 2012).

16 Id. at 34.

17 Phase II Price Cap Service Obligation Order, 28 FCC Rcd at 15073-74, paras. 31-32. In that Order, the Bureau specifically identified IXPs to be used for testing but allowed providers in noncontiguous areas of the United States to conduct their latency network testing from the customer location to a point at which traffic is consolidated for undersea cable transport to an IXP in the continental United States. Id. at 15071 n.63, 15075, para. 35.

The program examines service offerings from the largest broadband providers—which collectively account for over 80 percent of all U.S. wireline broadband connections—using automated, direct measurements of broadband performance delivered to the homes of thousands of volunteer broadband subscribers. 18 The methodology for the program focuses on measuring broadband performance of an Internet service provider’s network, specifically performance from the consumer Internet access point, or consumer gateway, to a close major Internet gateway point. A collaborative process involving Commission staff, industry representatives, and academics was used to determine the test suite and operations for the MBA program.

16. The MBA program uses whiteboxes deployed to individual consumers, called panelists, to collect data on service levels. These whiteboxes perform periodic tests to determine the speed and latency of the service at a particular panelist’s location, and the results of the tests are automatically sent to and recorded by an independent vendor. Panelists are selected via a process that allows for consumer registration and verification by the service provider, followed by activation as a testing panelist. More than 13,000 whiteboxes have been shipped since the MBA program began. 19

17. Currently, the MBA program tests wireline offerings of 15 large broadband providers and one satellite-based provider. If we were to adopt a regime in which ETCs subject to broadband public interest obligations could demonstrate compliance with broadband testing requirements through their MBA results, would that encourage additional providers, including smaller providers, to seek to join the MBA? Could the MBA accommodate a large number of additional participants? Is it feasible for smaller providers to participate in the MBA, particularly if they must pay the administrative and hardware costs of the whiteboxes? Are these costs likely to be greater or less than the cost of performing ping-type tests from 50 locations for latency and the testing that will be required to verify speed? Would allowing additional providers to join the MBA provide more detailed and more accurate information on provider performance at lower total cost?

18. If additional providers join the MBA program for performance testing, should their data be made public and reported in the annual MBA reports, as is done for other MBA providers?
Should the MBA program consider creating a separate category of membership for providers that want to limit testing to Connect America-supported areas?

19. We seek comment on these and any other issues surrounding additional provider participation in the MBA program.

18 The number of broadband providers and percentage of broadband connections has varied during the MBA program, with generally 12 to 13 providers and between 84 and 86 percent of connections. See Federal Communications Commission, Office of Engineering and Technology and Consumer and Governmental Affairs Bureau, Measuring Broadband America, A Report on Consumer Wireline Broadband Performance in the U.S. at 3 & n.10 (Aug. 2011), http://transition.fcc.gov/cgb/measuringbroadbandreport/Measuring_U.S._-_Main_Report_Full.pdf (Measuring Broadband America 2011 Report); Federal Communications Commission, Office of Engineering and Technology and Consumer and Governmental Affairs Bureau, Measuring Broadband America, A Report on Consumer Wireline Broadband Performance in the U.S. at 8 & n.12 (July 2012), http://transition.fcc.gov/cgb/measuringbroadbandreport/2012/Measuring-Broadband-America.pdf (Measuring Broadband America 2012 Report); Federal Communications Commission, Office of Engineering and Technology and Consumer and Governmental Affairs Bureau, Measuring Broadband America, A Report on Consumer Wireline Broadband Performance in the U.S. at 8 (Feb. 2013), http://transition.fcc.gov/cgb/measuringbroadbandreport/2013/Measuring-Broadband-America-feb-2013.pdf (Measuring Broadband America 2013 Report); Federal Communications Commission, Office of Engineering and Technology and Consumer and Governmental Affairs Bureau, Measuring Broadband America, A Report on Consumer Wireline Broadband Performance in the U.S. at 8 (Feb. 2014), http://data.fcc.gov/download/measuring-broadband-america/2014/2014-Fixed-Measuring-Broadband-America-Report.pdf (Measuring Broadband America 2014 Report).

19 Measuring Broadband America 2014 Report, Technical Appendix at 10 n.12.

D. Commission-Developed Testing Mechanism

20. In the event that joining the MBA program proves infeasible for additional providers, we seek comment on whether the Commission should implement a performance testing platform specifically for Connect America-supported broadband services. 20 One possibility is to implement an oversight mechanism that would be similar to the MBA program. Like the MBA program, this could be a hardware-based test infrastructure administered by one or more service vendors, with whiteboxes deployed to consumers throughout Connect America-supported areas. Having a single entity, such as USAC, procure the necessary vendor and infrastructure to administer this program would minimize the overall cost of the program as well as the costs to participating providers. We seek comment on whether such a program would be feasible. If so, should it be similar to the MBA program, or is there a better way to measure broadband performance?

21. If the Commission were to implement such a testing mechanism, should all ETCs subject to broadband public interest obligations to serve fixed locations be required to participate? To the extent commenters argue that any ETCs should be exempt, they should identify with specificity the costs and benefits of requiring them to participate, and identify alternative means of achieving the Commission’s oversight objectives.
22. We estimate that the total costs for an MBA-type performance oversight program for ETCs receiving high-cost support to serve fixed locations would be approximately $4.2 million in the first year, which would include the necessary hardware and software as well as an initial allocation of 5,000 whiteboxes, and approximately $3.9 million each year thereafter (which incorporates an additional 5,000 whiteboxes per year). Our total cost calculation was based on the following estimates: 21

                                                          Year 1 Expenses    Annual Expenses After Year 1
Whiteboxes (client testing devices)                       $1.2 million       $1 million
Core Servers                                              $1.7 million       $1.65 million
Program Administrative Expenses
  (could be performed by USAC)                            $1.3 million       $1.3 million
Total Cost                                                $4.2 million       $3.9 million

The cost estimates above are based on having a single entity contract for the necessary hardware and services to minimize costs through streamlined administration and bulk hardware purchases. If the Commission were to implement such a centralized testing program, should these costs be borne by participating providers or by USAC as part of its oversight over the universal service fund? Should USAC pay the costs of the core servers, with participating providers paying the costs of the whiteboxes deployed in their service areas? If USAC were to pay all of the equipment costs, including the whiteboxes, we anticipate that the primary remaining cost for providers would be verifying the services of the panelists selected in a particular provider’s service territory.

20 In prior comments, Verizon argued that the Commission should pattern the process for measuring performance on that developed for the MBA. Comments of Verizon, WC Docket No. 10-90 et al., at 21-22 (filed Jan. 18, 2012). However, AT&T argued that creating such an independent mechanism would be costly and complex. AT&T Comments at 23.

21 The cost estimates are based upon a number of assumptions: (1) a whitebox costs $100; (2) 5,000 whiteboxes will be purchased each year, with a goal of deploying 20,000 operational whiteboxes, with additional whiteboxes for replacements as needed; (3) whiteboxes will be distributed based upon the number of supported lines and/or amount of funding received by the provider and the number of whiteboxes available; (4) administrative costs for deployed whiteboxes will be $500,000 per year; (5) there will be 45 test nodes (servers at the designated Internet Exchange Points) at a cost of $2,000 per node for initial installation ($90,000 total), plus $22,500 per year for replacement, upgrade, or update costs; (6) test node annual support costs will be $36,000 per node per year (this includes system administration and the significant data center bandwidth charges); (7) USAC administrative costs will total $575,000 initially, and $315,000 per year thereafter; and (8) $725,000 per year will be needed for USAC audits (including additional whiteboxes needed for audit results and subsequent analysis).
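Assumption (3) in the footnote above leaves the distribution formula open. As one purely hypothetical illustration, the sketch below apportions an annual purchase of 5,000 whiteboxes among providers in proportion to their supported lines; the provider names and line counts are invented, and the notice does not adopt any particular allocation method.

```python
# Illustrative sketch only: proportionally allocate an annual purchase of
# whiteboxes among providers by supported-line counts, one possible reading of
# assumption (3) in the cost footnote. Provider names and line counts are
# hypothetical placeholders.

ANNUAL_WHITEBOXES = 5000

supported_lines = {            # hypothetical supported-line counts per provider
    "Provider A": 120_000,
    "Provider B": 45_000,
    "Provider C": 9_500,
}

total_lines = sum(supported_lines.values())

# Initial proportional shares, rounded down.
allocation = {p: ANNUAL_WHITEBOXES * n // total_lines
              for p, n in supported_lines.items()}

# Hand any leftover boxes to the providers with the largest rounding remainders.
leftover = ANNUAL_WHITEBOXES - sum(allocation.values())
by_remainder = sorted(supported_lines,
                      key=lambda p: (ANNUAL_WHITEBOXES * supported_lines[p]) % total_lines,
                      reverse=True)
for p in by_remainder[:leftover]:
    allocation[p] += 1

for provider, boxes in allocation.items():
    print(f"{provider}: {boxes} whiteboxes")
```

Commenters proposing a different basis, such as funding amounts or a per-provider minimum, could describe how it would change an allocation of this kind.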
23. If the Commission were to adopt such an approach, how many whiteboxes should be deployed in each supported area? Should the number be the same for all providers, vary based on the number of customers in the supported area, or be based on some other calculation? Should individual consumers or consumer groups located in areas served by a Connect America-supported provider be allowed to participate in such an MBA-type mechanism by purchasing their own whiteboxes? Such “citizen testing” would allow interested individuals to evaluate the quality of their services while providing additional testing data. 22

24. We seek comment on the initial performance measurement test suite that should be used if the Commission were to implement an MBA-type testing mechanism. The MBA’s current test suite includes 13 tests that measure various aspects of network performance with respect to speed and latency and was developed on a consensus basis by academics, regulators, and industry participants. 23 Would the MBA’s test suite be appropriate for a Connect America testing mechanism, or should it be modified in some fashion? What aspects of the MBA test suite are necessary to meet the Commission’s objective of ensuring that ETCs meet their broadband public interest obligations?

25. The MBA program has found that allowing consumers with whiteboxes (referred to as panelists) access to their testing data is an incentive for obtaining a high number of volunteers. Should a Commission-designed testing mechanism for high-cost recipients allow end-user participants access to their own testing data? MBA results are currently made publicly available via the Commission’s website. Should the Commission publish test results? Making such data public would allow consumers and policy makers to evaluate whether ETCs are meeting their service obligations and allow comparisons of service quality among providers. Is there any reason that such performance results should be kept confidential? If so, should the results be treated as confidential for a particular period of time?

IV. AUDITING SPEED AND LATENCY

26. In the USF/ICC Transformation Order, the Commission concluded that the results of speed and latency metric testing “will be subject to audit.” 24 We seek to further develop the record on procedures for implementing this requirement for all recipients of Connect America funding. In particular, we seek comment on how to incorporate this requirement into the existing Beneficiary Compliance Audit Program (BCAP), and whether additional audits specifically focused on broadband performance should be implemented outside of BCAP.

22 Citizen testing stakeholders could include individual consumers, public interest groups, academics and academic organizations, research organizations, standards organizations, Tribal authorities, state regulatory agencies, and additional government organizations.

23 A more detailed description of the test suite in the MBA program can be found in the MBA documentation. See Measuring Broadband America 2014 Report, Technical Appendix, section 3.D.

24 USF/ICC Transformation Order, 26 FCC Rcd at 17705-06, para. 109.

27. High-cost recipients today are subject to random and for-cause USAC audits. We seek comment on the circumstances that would warrant examining broadband performance for cause. In particular, what events should trigger a for-cause audit of speed and latency metrics? For example, failure to file a certification that service obligations are being met, or the filing of a certification that standards are not being met, would likely require an immediate audit. Similarly, because MBA results are publicly available, should MBA test results that demonstrate a failure to meet service obligations trigger an audit? Should consumer or other credible complaints regarding the quality of service result in an audit? If customer complaints are used to initiate an audit, we seek comment on how this should be done.
Should complaints to state/local regulatory agencies, the Commission, and/or public watchdog organizations trigger audits? If so, how many complaints, over what time period, and what types of complaints should be triggering events for a performance audit? Should requests from local, state, or tribal authorities be sufficient to trigger an audit? Are there other events that should trigger an audit? Proposed audit triggers should both ensure that performance standards are met and minimize administrative costs.

28. In addition, we seek comment on whether a provider whose audit demonstrates a need for ongoing monitoring should be required to pay the costs of this additional monitoring. Should results of audits be made publicly available? If not, what justifications support keeping such results private, and for how long?

V. PROCEDURAL MATTERS

29. Initial Regulatory Flexibility Act Analysis. The USF/ICC Transformation Order included an Initial Regulatory Flexibility Analysis (IRFA) pursuant to 5 U.S.C. § 603, exploring the potential impact on small entities of the Commission’s proposal. 25 We invite parties to file comments on the IRFA in light of this additional notice.

30. Initial Paperwork Reduction Act of 1995 Analysis. This document seeks comment on a potential new or revised information collection requirement. If the Commission adopts any new or revised information collection requirement, the Commission will publish a separate notice in the Federal Register inviting the public to comment on the requirement, as required by the Paperwork Reduction Act of 1995, Public Law 104-13 (44 U.S.C. §§ 3501-3520). In addition, pursuant to the Small Business Paperwork Relief Act of 2002, Public Law 107-198, the Commission seeks specific comment on how it might “further reduce the information collection burden for small business concerns with fewer than 25 employees.” 26

31. Filing Requirements. Pursuant to sections 1.415 and 1.419 of the Commission’s rules, interested parties may file comments on or before the dates indicated on the first page of this document. 27 Comments may be filed using the Commission’s Electronic Comment Filing System (ECFS). 28

• Electronic Filers: Comments may be filed electronically using the Internet by accessing the ECFS: http://fjallfoss.fcc.gov/ecfs2/.

• Paper Filers: Parties who choose to file by paper must file an original and one copy of each filing. If more than one docket or rulemaking number appears in the caption of this proceeding, filers must submit two additional copies for each additional docket or rulemaking number.

32. Filings can be sent by hand or messenger delivery, by commercial overnight courier, or by first-class or overnight U.S. Postal Service mail. All filings must be addressed to the Commission’s Secretary, Office of the Secretary, Federal Communications Commission.

25 USF/ICC Transformation Order, 26 FCC Rcd at 18364-95, App. P; Federal Communications Commission, Connect America Fund; A National Broadband Plan for Our Future; Establishing Just and Reasonable Rates for Local Exchange Carriers; High-Cost Universal Service Support, 76 Fed. Reg. 78384, 78430-42 (Dec. 16, 2011).

26 44 U.S.C. § 3506(c)(4).

27 47 C.F.R. §§ 1.415, 1.419.

28 See Electronic Filing of Documents in Rulemaking Proceedings, GC Docket No. 97-113, Report and Order, 13 FCC Rcd 11322 (1998).

• All hand-delivered or messenger-delivered paper filings for the Commission’s Secretary must be delivered to FCC Headquarters at 445 12th St., SW, Room TW-A325, Washington, DC 20554.
The filing hours are 8:00 a.m. to 7:00 p.m. All hand deliveries must be held together with rubber bands or fasteners. Any envelopes and boxes must be disposed of before entering the building.

• Commercial overnight mail (other than U.S. Postal Service Express Mail and Priority Mail) must be sent to 9300 East Hampton Drive, Capitol Heights, MD 20743.

• U.S. Postal Service first-class, Express, and Priority mail must be addressed to 445 12th Street, SW, Washington, DC 20554.

33. People with Disabilities: To request materials in accessible formats for people with disabilities (Braille, large print, electronic files, audio format), send an e-mail to fcc504@fcc.gov or call the Consumer & Governmental Affairs Bureau at (202) 418-0530 (voice), (202) 418-0432 (TTY). In addition, one copy of each pleading must be sent to each of the following:

(1) Alexander Minard, Telecommunications Access Policy Division, Wireline Competition Bureau, 445 12th Street, S.W., 5-B442, Washington, D.C. 20554; e-mail: alexander.minard@fcc.gov.

(2) Suzanne Yelen, Industry Analysis and Technology Division, Wireline Competition Bureau, 445 12th Street, S.W., 6-B115, Washington, D.C. 20554; e-mail: suzanne.yelen@fcc.gov.

34. The proceeding shall be treated as a “permit-but-disclose” proceeding in accordance with the Commission’s ex parte rules. 29 Persons making ex parte presentations must file a copy of any written presentation or a memorandum summarizing any oral presentation within two business days after the presentation (unless a different deadline applicable to the Sunshine period applies). Persons making oral ex parte presentations are reminded that memoranda summarizing the presentation must (1) list all persons attending or otherwise participating in the meeting at which the ex parte presentation was made, and (2) summarize all data presented and arguments made during the presentation. If the presentation consisted in whole or in part of the presentation of data or arguments already reflected in the presenter’s written comments, memoranda, or other filings in the proceeding, the presenter may provide citations to such data or arguments in his or her prior comments, memoranda, or other filings (specifying the relevant page and/or paragraph numbers where such data or arguments can be found) in lieu of summarizing them in the memorandum. Documents shown or given to Commission staff during ex parte meetings are deemed to be written ex parte presentations and must be filed consistent with rule 1.1206(b). In proceedings governed by rule 1.49(f) or for which the Commission has made available a method of electronic filing, written ex parte presentations and memoranda summarizing oral ex parte presentations, and all attachments thereto, must be filed through the electronic comment filing system available for that proceeding, and must be filed in their native format (e.g., .doc, .xml, .ppt, searchable .pdf). Participants in this proceeding should familiarize themselves with the Commission’s ex parte rules.

- FCC -

29 47 C.F.R. §§ 1.200 et seq.