Report: September 28, 2016 Nationwide EAS Test

April 2017

Public Safety and Homeland Security Bureau
Federal Communications Commission
445 12th Street, SW • Washington, DC 20554

TABLE OF CONTENTS

Heading                                                                     Page
I. EXECUTIVE SUMMARY ........................................................ 3
II. BACKGROUND .............................................................. 5
   A. The Emergency Alert System (EAS) ...................................... 5
   B. Lessons Learned from the 2011 Nationwide EAS Test and Adoption of the
      EAS Sixth Report and Order ............................................ 5
III. THE 2016 NATIONWIDE EAS TEST ........................................... 6
   A. The Parameters of the 2016 Nationwide EAS Test ........................ 6
   B. Participation in the Nationwide EAS Test .............................. 6
   C. Participants by EAS Designation ....................................... 9
   D. Participant Monitoring of IPAWS ...................................... 10
IV. NATIONWIDE EAS TEST RESULTS ............................................ 10
   A. Breakdown of Test Performance by EAS Participant Type ................ 10
   B. Source of Alert ...................................................... 11
   C.
Language of Alert ...................................................... 12
VI. ANALYSIS OF MOST SIGNIFICANT PROBLEMS .................................. 12
   A. Equipment Configuration .............................................. 15
   B. Equipment Failures ................................................... 15
   C. Failure to Update Software ........................................... 15
   D. Audio Quality Problems ............................................... 15
   E. System Clock Errors .................................................. 16
VII. RECOMMENDATIONS ....................................................... 16
   A. Policy Recommendations ............................................... 16
   B. Operational Recommendations .......................................... 17
VIII. CONCLUSION ........................................................... 18
APPENDIX A: HOW EAS WORKS .................................................. 19
   A. The Emergency Alert System ........................................... 19
   B. Legacy EAS Structure ................................................. 20
   C.
Alerting via IPAWS ...................................................... 20

I. EXECUTIVE SUMMARY

On September 28, 2016, at 2:20 p.m. Eastern Daylight Time (EDT), the Federal Emergency Management Agency (FEMA), in coordination with the Federal Communications Commission (Commission or FCC) and the National Weather Service (NWS), conducted the second nationwide test of the Emergency Alert System (EAS) (2016 Nationwide EAS Test). The primary purpose of the 2016 Nationwide EAS Test was to assess the reliability and effectiveness of FEMA's Integrated Public Alert and Warning System (IPAWS) distribution architecture, which delivers content-rich EAS alerts over a secure Internet gateway directly to EAS Participants. The IPAWS test message included English and Spanish versions of the test alert, high-quality digital audio, and text files that could be used to create an accessible video crawl. The secondary purpose of the 2016 Nationwide EAS Test was to assess the effectiveness of measures the Commission, its Federal partners, and other stakeholders took to address the problems uncovered by the 2011 nationwide EAS test. Among other measures to improve the EAS, the Commission introduced, prior to the 2016 Nationwide EAS Test, a national test event code, a national location code, and an online test reporting system, the EAS Test Reporting System (ETRS).

Overall, the 2016 Nationwide EAS Test demonstrated that the Internet-based distribution of alerts via IPAWS has modernized the EAS and greatly improved the quality, effectiveness, and accessibility of EAS alerts:

• Over 20,000 broadcasters, cable operators, and other EAS Participants took part in the 2016 Nationwide EAS Test, totaling 95% of EAS Participants (a 25% improvement over the 2011 test). The vast majority of these EAS Participants received and retransmitted the National Periodic Test (NPT).
•
The results further show that the IPAWS version of the alert delivered superior digital sound and successfully delivered non-English alerts to those EAS Participants that wished to distribute them.

Test data also reveals, however, that a range of operational and technical issues remains that affected nationwide EAS test performance across all states:

• Almost half of test participants received the test over the air rather than from IPAWS, and as a result these participants were unable to deliver the CAP-formatted digital audio, the Spanish-language version of the alert, and the accompanying text files.
• Additionally, some EAS Participants failed to receive or retransmit alerts because of erroneous equipment configuration, equipment readiness and upkeep issues, and confusion regarding EAS rules and technical requirements.
• Finally, some EAS Participant groups, particularly Low Power broadcasters, had low participation rates.

The following report provides an analysis of the 2016 Nationwide EAS Test results, as well as recommended next steps to continue to improve the EAS. In particular, the Public Safety and Homeland Security Bureau (PSHSB) recommends that the Commission take the following measures to improve the quality of information available in emergency alerts:

• In light of the additional capabilities offered by IP-based alerts, the Commission should facilitate and encourage the use of IPAWS as the primary source of alerts nationwide, while preserving over-the-air alerting as a redundant, resilient, and necessary alternate alerting pathway.
• The Commission should examine how to improve and expand the content included in IPAWS alerts to bridge the gap between today's alerting systems and future next-generation alerting. For example, the Commission, its Federal partners, and other EAS stakeholders should encourage and facilitate the expanded use of multiple languages, text files, and other mechanisms to make EAS alerts accessible to a greater portion of the public.
•
The Commission should take specific operational actions to ensure that the EAS remains an authoritative, efficient, and trustworthy source of emergency information. In this regard, PSHSB recommends actions that would address EAS Participant errors and other anomalies uncovered in its analysis of the 2016 Nationwide EAS Test:

o Take measures to improve EAS Participants' compliance with and understanding of the part 11 rules (e.g., by partnering with FEMA and the State Emergency Communications Committees (SECCs) to conduct targeted outreach to Low Power broadcasters and other EAS Participants with poor performance in the 2016 Nationwide EAS Test, and by updating the EAS Operating Handbook to provide additional guidance on the roles and responsibilities of EAS Participants); and

o Take measures to improve the quality of the EAS and, where appropriate to strengthen the EAS, allow limited sharing of ETRS data with state partners.

II. BACKGROUND

A. The Emergency Alert System (EAS)

The EAS provides the President with the capability to communicate to the public during a national emergency via a live audio transmission.1 FCC rules require EAS Participants to have the capability to receive and transmit Presidential Alerts disseminated over the EAS. There are two methods by which EAS alerts may be distributed. Under the traditional, broadcast-based "legacy" structure, the EAS transmits an alert through a pre-established hierarchy of broadcast, cable, and satellite systems. EAS alerts also may be distributed over the Internet through the Integrated Public Alert and Warning System (IPAWS). Common Alerting Protocol-formatted (CAP-formatted) alerts initiated through IPAWS can include audio, video, or data files, images, multilingual translations of alerts, and links providing detailed information.2 Appendix A contains additional information about the EAS.

B.
Lessons Learned from the 2011 Nationwide EAS Test and Adoption of the EAS Sixth Report and Order

The first nationwide EAS test was held on November 9, 2011, at 2:00 p.m. EST, and was a "live code" test of the Emergency Action Notification (EAN), the code that would be used for an actual Presidential Alert. Although a large majority of EAS Participants successfully received the EAN and retransmitted it to other EAS Participants,3 the test uncovered several complications that disrupted the ability of some EAS Participants to receive or retransmit the EAN. These included widespread poor audio quality; the lack of a Primary Entry Point (PEP) station in Portland, Oregon; EAS Participants' undisclosed use of alternatives to PEP-based EAN distribution; and anomalies in EAS equipment programming and operation.4 Some members of the public who observed the test were confused by the discrepancies between the audio and video portions of the test, by the use of the Washington, D.C. location code, and by the manner in which some EAS equipment displayed the visual portion of the alert.

In June 2015, the FCC adopted rules that addressed the technical complications and public confusion experienced during the test.5 These rules required EAS Participants to use equipment capable of processing a National Periodic Test (NPT) event code and "six zeroes" (000000) as the national location code; to file test-related data in the Commission's electronic EAS Test Reporting System (ETRS); and to comply with accessibility rules to ensure that EAS visual messages are readable and accessible to all members of the public, including people with disabilities. Each of these rules went into effect prior to the 2016 Nationwide EAS Test.

1 See 47 CFR § 11 et seq.; Review of the Emergency Alert System, EB Docket No. 04-296, Second Further Notice of Proposed Rulemaking, 25 FCC Rcd 564, 565, para.
2 (2010); FEMA, Integrated Public Alert & Warning System, https://www.fema.gov/integrated-public-alert-warning-system (last visited Oct. 4, 2016).
2 EAS Participants can deliver to the public the rich data contained in a CAP-formatted message received directly from the IPAWS Internet feed, but once the alert is rebroadcast over the daisy chain, the CAP data is lost, and EAS Participants receiving the alert for the first time over the air cannot deliver CAP-based features, such as digital audio or multiple languages, to the public.
3 See FCC, PSHSB, Strengthening the Emergency Alert System (EAS): Lessons Learned from the Nationwide EAS Test at 3 (2013), http://www.fcc.gov/document/strengthening-emergency-alert-system (2011 EAS Nationwide Test Report).
4 Id. PEP stations are the initial broadcasters of a Presidential or national EAS message. They cooperate with FEMA to collectively reach over 90% of the American populace.
5 See Review of the Emergency Alert System, EB Docket No. 04-296, Sixth Report and Order, 30 FCC Rcd 6520, 6526-44, paras. 15-50 (2015) (Sixth Report and Order); see also 47 CFR §§ 11.31(f), 11.51, 11.52, 11.61(a)(iv)(3).

FEMA also took steps to strengthen the EAS following the 2011 nationwide test, such as correcting the FEMA PEP technical configurations to eliminate message duplication; working with EAS device manufacturers to correct potential technical anomalies; and expanding its PEP coverage in Oregon to include both Eugene and Portland.6

III. THE 2016 NATIONWIDE EAS TEST7

A. The Parameters of the 2016 Nationwide EAS Test

1. Alert Initiation. Unlike the 2011 nationwide EAS test, which FEMA initiated by transmitting a live EAN event code over a secure telephone connection to the PEPs, which then transmitted the EAN to the public over the broadcast-based "daisy chain," the 2016 Nationwide EAS Test was initiated by FEMA, which provided a "National Periodic Test" code on its Internet-based IPAWS feed.
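The CAP format referenced above is an XML standard (OASIS CAP v1.2). The fragment below is a simplified, hypothetical sketch of how a single CAP message can carry both language versions of a test alert and a digital-audio reference; a real IPAWS message contains many additional required elements (digital signature, SAME parameters, and so on), and the identifier, sender, and resource values shown are invented for illustration:

```python
# Parse a simplified, hypothetical CAP 1.2 fragment and list the
# languages and audio content it carries. Not an actual IPAWS message.
import xml.etree.ElementTree as ET

CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"

sample = f"""<alert xmlns="{CAP_NS}">
  <identifier>NPT-EXAMPLE-1</identifier>
  <sender>example-sender@ipaws.example</sender>
  <sent>2016-09-28T14:20:00-04:00</sent>
  <status>Test</status>
  <msgType>Alert</msgType>
  <scope>Public</scope>
  <info>
    <language>en-US</language>
    <event>National Periodic Test</event>
    <resource>
      <resourceDesc>EAS Broadcast Content</resourceDesc>
      <mimeType>audio/mpeg</mimeType>
    </resource>
  </info>
  <info>
    <language>es-US</language>
    <event>Prueba Periodica Nacional</event>
  </info>
</alert>"""

root = ET.fromstring(sample)
# One <info> block per language version of the alert.
languages = [info.findtext(f"{{{CAP_NS}}}language")
             for info in root.findall(f"{{{CAP_NS}}}info")]
# Digital audio travels as a <resource> attachment.
has_audio = any(
    res.findtext(f"{{{CAP_NS}}}mimeType", "").startswith("audio/")
    for res in root.iter(f"{{{CAP_NS}}}resource")
)
print(languages, has_audio)  # ['en-US', 'es-US'] True
```

Because these elements exist only in the CAP message, a participant that first hears the alert over the air (EAS Protocol only) has no way to recover them, which is why first-source matters in the results below.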
Each EAS Participant then would receive the alert either directly from IPAWS by polling its Internet feed, or indirectly via a re-broadcast of the IPAWS alert by the source that it monitored in the "daisy chain."8 EAS Participants that first obtained the alert via IPAWS received a CAP-formatted alert with high-quality digital audio, a detailed text file that could populate a video crawl, and English and Spanish versions of the test alert that EAS Participants could transmit to the public in accordance with the configuration of their equipment. EAS Participants that first obtained the alert over the air from a monitored broadcast station rather than from IPAWS received the alert in the simpler EAS Protocol rather than in the data-rich CAP format, and so received an English-only alert that lacked digital audio.

2. Event and Location Codes. The 2011 nationwide EAS test used the EAN, the code for a "live" Presidential Alert, because the EAN had never previously been tested. The test also used the Washington, D.C. location code because no nationwide location code existed at that time. The use of the EAN and the Washington, D.C. location code required the Commission to conduct extensive outreach to ensure that the public would not confuse the broadcast of the live code with an actual alert. That same level of outreach was unnecessary for the 2016 Nationwide EAS Test, however, because it used a test event code (NPT) and a nationwide location code, rather than the Washington, D.C. code. The use of a dedicated test code helped ensure that the test alert closely resembled the Required Monthly Tests with which the public is already familiar, greatly reducing the risk of public confusion.

B.
Participation in the Nationwide EAS Test

There are approximately 27,408 EAS Participants in the United States and its territories.9 This estimate includes analog and digital radio broadcast stations (including AM, FM, and Low Power FM (LPFM) stations); analog and digital television broadcast stations (including Low Power TV (LPTV)); analog and

6 2011 EAS Nationwide Test Report at 14.
7 On December 28, 2016, the FCC released a Public Notice based on preliminary data announcing an EAS Participant success rate of 94%. Public Safety and Homeland Security Bureau Releases its Initial Findings Regarding the 2016 Nationwide EAS Test, PSHSB Docket No. 15-94, Public Notice, 31 FCC Rcd 13482 (PSHSB 2016). Today's report is based on final FCC data.
8 Participants' EAS equipment polls the IPAWS server to check for new alerts at regular intervals. If an EAS Participant receives an over-the-air alert before it checks IPAWS, the over-the-air alert is retransmitted.
9 This total consists of the 17,381 "licensed" and "licensed and silent" radio broadcasters in the FCC's Consolidated Database System, the 4,162 "licensed" and "licensed and silent" television broadcasters in the FCC's Licensing and Management System database, the 5,861 headends nationwide reported by SNL Kagan, and the number of Direct Broadcast Satellite and SDARS facilities. This methodology likely overestimates the number of radio and television broadcasters that participate in the EAS, as some are exempted from the part 11 rules. For example, if a hub station satisfies the EAS requirements, an analog or digital broadcast satellite station that rebroadcasts 100 percent of the hub station's programming would not be required to file in ETRS. See 47 CFR § 11.11(b).
digital cable systems; wireless cable systems; wireline video systems (e.g., FiOS and U-verse); Direct Broadcast Satellite (DBS) services; and Sirius/XM, the only Satellite Digital Audio Radio Service (SDARS).10

Table 1 summarizes the participation rate in the 2016 Nationwide EAS Test.11 EAS Participants submitted 27,818 filings in 2016.12 More than 6,000 of these filings duplicated facilities for which EAS Participants had already filed.13 Excluding duplicate filings, EAS Participants made 21,365 unique filings (78.0%). This result is a significant improvement over the 2011 nationwide EAS test, for which EAS Participants made only 15,897 unique filings. Radio broadcasters had an above-average participation rate of 87.3%, while television broadcasters had a significantly worse participation rate of 69.0%. Cable and wireline video system participants had the lowest participation rates, with only slightly more than half of those participants filing in ETRS as required by the Commission's rules.

Table 1. Overview of Filings Received in ETRS

EAS Participant Type    | Filings Expected | Filings Received | Unique Filings Received | Participation Rate
Radio Broadcasters      | 17,381           | 16,372           | 15,175                  | 87.3%
Television Broadcasters |  4,162           |  3,119           |  2,873                  | 69.0%
Cable Systems           |  5,861           |  7,842           |  3,003                  | 52.9%
Wireline Video Systems  |  —               |    270           |     99                  | —
Other14                 |  N/A             |    215           |    215                  | N/A
Total                   | 27,404           | 27,818           | 21,365                  | 78.0%

Table 2 provides an overview of the completeness of the filings submitted to ETRS. Only 89.2% of filers completed Forms One, Two, and Three, as required by the Commission's rules. 8.0% of filers submitted

10 47 CFR § 11.11(a). Test data filed in ETRS are presumed to be confidential. See Sixth Report and Order, 30 FCC Rcd at 6533, n.90; Review of the Emergency Alert System, EB Docket No. 04-296, Third Report and Order, 26 FCC Rcd 1460, 1488, para. 73 (2011).
Tables 1 through 9 of this report do not include tabulated data on the performance of the DBS, SDARS, or wireless cable service providers that participated in the test. Because there were three or fewer participants in each of those categories, the Bureau is concerned that tabulating their data could disclose confidential information.
11 A small number of EAS Participants provided their EAS Participant Type incorrectly. Those errors have been corrected for the purposes of this report.
12 There may be minor differences between the data reported here and in FEMA's 2016 National IPAWS EAS Test Report. FEMA's report is based on an extract of ETRS data taken in early December, whereas this report reflects additional filings and edits made by EAS Participants before the close of ETRS filing on December 31, 2016.
13 Most duplicate filings were submitted for cable systems. EAS Participants that filed for cable systems shortly after ETRS's launch were required to file separate forms for each community they served, even if those communities were served by the same headend. Although ETRS was updated to allow the submission of one form for several communities served by a single headend, many filers continued to file separately for each community. To the extent that EAS Participants' filings indicate that a headend serves alerts using multiple, independent sets of EAS equipment, each set of equipment is considered a unique headend in this report.
14 "Other" includes "IPTV providers," "non-cable multichannel video programming distributors," "cable resellers," and other entities reported in ETRS but not defined as EAS Participants in the EAS rules.

"day of test" results, but failed to submit the detailed test results required by Form Three. 2.8% of filers failed to submit any test results, filing only the identifying information required by Form One.
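The headline percentages in Tables 1 and 2 are simple ratios of the tabulated totals; for example:

```python
# Reproducing the headline filing figures from Tables 1 and 2.
expected_filings = 27404   # Table 1: total filings expected
unique_filings = 21365     # Table 1: unique filings after removing duplicates
complete_filings = 19057   # Table 2: filers that completed Forms One through Three

participation_rate = unique_filings / expected_filings
completion_rate = complete_filings / unique_filings

print(f"Participation: {participation_rate:.1%}")  # Participation: 78.0%
print(f"Completion: {completion_rate:.1%}")        # Completion: 89.2%
```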
Cable system filers had the best form completion rate (96.1%),15 while wireline video system filers had the worst (78.8%).

Table 2. Overview of Filings Received in ETRS by Form Type

EAS Participant Type    | Unique Filings | Form One Only (# / %) | Forms One and Two Only (# / %) | Forms One, Two, and Three (# / %)
Radio Broadcasters      | 15,175         | 487 / 3.2%            | 1,454 / 9.6%                   | 13,234 / 87.2%
Television Broadcasters |  2,873         |  58 / 2.0%            |   141 / 4.9%                   |  2,674 / 93.1%
Cable Systems           |  3,003         |  35 / 1.2%            |    81 / 2.7%                   |  2,887 / 96.1%
Wireline Video Systems  |     99         |  10 / 10.1%           |    11 / 11.1%                  |     78 / 78.8%
Other                   |    215         |   5 / 2.3%            |    26 / 12.1%                  |    184 / 85.6%
Total                   | 21,365         | 595 / 2.8%            | 1,713 / 8.0%                   | 19,057 / 89.2%

While preparing for the 2016 Nationwide EAS Test, PSHSB received numerous phone calls from Low Power radio and television broadcasters suggesting that many of those broadcasters were not aware that they were required to participate in the EAS. Table 3 compares the participation of Low Power broadcasters to that of all broadcasters. Low Power FM (LPFM) participation in the test (61.0%) was significantly lower than that of radio broadcasters overall (87.3%), and Low Power television (LPTV) participation (43.4%) was significantly lower than that of television broadcasters overall (69.0%). Further, the low participation rate of Low Power broadcasters appears to have significantly reduced the overall participation rate of all broadcasters. Of the 2,206 radio broadcasters that were expected to file but failed to do so, 723 were LPFM broadcasters. Of the 1,289 television broadcasters that were expected to file but failed to do so, 1,115 were LPTV broadcasters.

15 PSHSB and FEMA use different methodologies to count the number of cable systems that participated in the nationwide test. While FEMA's 2016 National IPAWS EAS Test Report separately counted each filing that reported a unique set of monitoring assignments, PSHSB excluded duplicate filings reporting identical results for the same headend.
Consequently, FEMA's approach is best understood to represent the performance of cable system encoder/decoders, while this report's approach is best understood to represent the performance of cable headends.

Table 3. Overview of Filings Received From Low Power Broadcasters

EAS Participant Type        | Filings Expected | Unique Filings | Participation Rate | Form One Only (# / %) | Forms One and Two Only (# / %) | Forms One, Two, and Three (# / %)
All Radio Broadcasters      | 17,381           | 15,175         | 87.3%              | 487 / 3.2%            | 1,454 / 9.6%                   | 13,234 / 87.2%
LPFM Broadcasters           |  1,852           |  1,129         | 61.0%              | 119 / 10.5%           |    96 / 8.5%                   |    914 / 81.0%
All Television Broadcasters |  4,162           |  2,873         | 69.0%              |  58 / 2.0%            |   141 / 4.9%                   |  2,674 / 93.1%
LPTV Broadcasters           |  1,970           |    855         | 43.4%              |  24 / 2.8%            |    70 / 8.2%                   |    761 / 89.0%

C. Participants by EAS Designation

ETRS Form One asked EAS Participants to identify the EAS designations assigned to them by their State EAS Plans. Table 4 provides the reported EAS designations of all test participants by participant type.16 A large number of test participants incorrectly reported their EAS designation. For example, 567 test participants reported that they served as National Primary (Primary Entry Point) stations, which are the source of EAS Presidential messages.17 However, according to FEMA, there are only 77 Primary Entry Point stations nationwide. Similarly, 344 test participants reported that they served as State Primary stations, when that number was only 94 at the time of the 2011 nationwide test.18 Under the legacy EAS structure, cable systems serve only as Participating Nationals, yet cable systems submitted more than 1,000 responses reporting that they serve another role. This data provides strong evidence that many test participants do not understand their roles in the EAS structure and are unfamiliar with the State EAS Plans that inform them of those roles.

Table 4.
EAS Designation by Participant Type

EAS Participant Type    | National Primary (NP) | State Primary (SP) | State Relay (SR) | Local Primary 1 (LP1) | Local Primary 2 (LP2) | Participating National (PN)
Radio Broadcasters      | 318                   | 200                | 1,180            | 1,180                 | 971                   | 12,183
Television Broadcasters |  50                   |  41                |   190            |   126                 | 109                   |  2,538
Cable Systems           | 182                   |  79                |    90            |   581                 | 256                   |  2,313
Wireline Video Systems  |  14                   |  11                |    12            |    28                 |  24                   |     62
Other                   |   3                   |  13                |     5            |     8                 |   5                   |    195
Total                   | 567                   | 344                | 1,477            | 1,923                 | 1,365                 | 17,291

16 For this report, a "test participant" is a unique EAS Participant that completed, at a minimum, ETRS Forms One and Two. Unless otherwise specified, the analyses hereafter consider only filings made by test participants.
17 47 CFR § 11.18(a).
18 2011 EAS Nationwide Test Report at 8.

D. Participant Monitoring of IPAWS

All EAS Participants are required to interface with IPAWS.19 ETRS Form One asked EAS Participants to confirm whether the facility's equipment complied with this requirement. Table 5 shows that 94.0% of test participants represent that they are complying with the IPAWS monitoring requirement.20 Of the larger test participant types, cable systems have the lowest IPAWS monitoring rate (91.4%). Of the smaller test participant types, only 68.5% of wireline video systems are monitoring IPAWS.

Table 5. IPAWS Monitoring by Participant Type

EAS Participant Type    | Test Participants | Monitoring IPAWS (#) | Monitoring IPAWS (%)
Radio Broadcasters      | 14,688            | 13,878               | 94.5%
Television Broadcasters |  2,815            |  2,668               | 94.8%
Cable Systems           |  2,968            |  2,712               | 91.4%
Wireline Video Systems  |     89            |     61               | 68.5%
Other                   |    210            |    206               | 98.1%
Total                   | 20,770            | 19,525               | 94.0%

IV. NATIONWIDE EAS TEST RESULTS

A. Breakdown of Test Performance by EAS Participant Type

ETRS Form Two asked EAS Participants whether they had successfully received and retransmitted the test alert on September 28, 2016. Table 6 shows test participants' success rates for alert receipt and retransmission.21 This data indicates that 95.4% of test participants successfully received the alert.
This is a significant improvement over the 82% success rate observed in 2011.22 Test participants experienced additional complications with retransmitting the alert to the public and other EAS Participants, but still achieved a respectable success rate of 85.8%. The most successful identified participant type was radio broadcasters, 95.5% of which successfully received the alert and 88.0% of which successfully retransmitted it.23

19 47 CFR § 11.52(d)(2).
20 Possible explanations for test participants reporting that they do not monitor IPAWS include a lack of broadband access, lack of familiarity with EAS equipment functions, and noncompliance with the Commission's rules. PSHSB is reviewing its current waivers of the CAP requirement and reaching out to test participants to investigate the issue.
21 Tables 6 through 9 exclude 381 EAS Participants that reported being silent pursuant to a special temporary authorization granted by the Commission.
22 2011 EAS Nationwide Test Report at 8.
23 The "other" category, which aggregates several participant types, had the highest overall rates of alert receipt and retransmission.

Table 6. Test Performance by Participant Type

EAS Participant Type    | Test Participants | Successfully Received Alert (# / %) | Successfully Retransmitted Alert (# / %)
Radio Broadcasters      | 14,521            | 13,872 / 95.5%                      | 12,771 / 88.0%
Television Broadcasters |  2,601            |  2,532 / 97.3%                      |  2,218 / 85.3%
Cable Systems           |  2,968            |  2,758 / 92.9%                      |  2,263 / 76.2%
Wireline Video Systems  |     89            |     79 / 88.8%                      |     44 / 49.4%
Other                   |    210            |    206 / 98.1%                      |    198 / 94.3%
Total                   | 20,389            | 19,447 / 95.4%                      | 17,494 / 85.8%

Table 7 shows the performance of Low Power broadcasters in the 2016 Nationwide EAS Test. LPFM broadcasters had an alert receipt success rate of 89.0%, more than 6 percentage points below the rate of all radio broadcasters, and an alert retransmission success rate of 74.2%, more than 13 percentage points below the rate of all radio broadcasters.
LPTV broadcasters performed only slightly worse than television broadcasters generally: 94.6% of LPTV broadcasters successfully received the alert, only 2.7 percentage points below the rate of all television broadcasters.

Table 7. Test Results of Low Power Broadcasters

EAS Participant Type        | Test Participants | Successfully Received Alert (# / %) | Successfully Retransmitted Alert (# / %)
All Radio Broadcasters      | 14,521            | 13,872 / 95.5%                      | 12,771 / 88.0%
LPFM Broadcasters           |    977            |    869 / 89.0%                      |    725 / 74.2%
All Television Broadcasters |  2,601            |  2,532 / 97.3%                      |  2,218 / 85.3%
LPTV Broadcasters           |    625            |    591 / 94.6%                      |    521 / 83.4%

B. Source of Alert

On ETRS Form Three, EAS Participants identified the first source from which they received the test alert. Table 8 compares the sources from which the different types of test participants received the test alert. A slight majority (56.5%) of test participants first received the alert over the air, while the remainder (43.5%) first received it from IPAWS. Cable system participants received over-the-air alerts first more frequently than other participant types (61.6%). Wireline video system participants received over-the-air alerts first much more frequently than other participant types (74.7%), but this result may be a consequence of the small sample size.

Table 8. Source of Alert by Participant Type

EAS Participant Type    | Test Participants that Received Alert | First Received From IPAWS (# / %) | First Received Over-the-Air (# / %)
Radio Broadcasters      | 13,872                                | 6,149 / 44.3%                     |  7,723 / 55.7%
Television Broadcasters |  2,532                                | 1,141 / 45.1%                     |  1,391 / 54.9%
Cable Systems           |  2,758                                | 1,058 / 38.4%                     |  1,700 / 61.6%
Wireline Video Systems  |     79                                |    20 / 25.3%                     |     59 / 74.7%
Other                   |    206                                |    87 / 42.2%                     |    119 / 57.8%
Total                   | 19,447                                | 8,455 / 43.5%                     | 10,992 / 56.5%

C. Language of Alert

Form Three asked EAS Participants to report the languages in which they received and retransmitted the test alert.
Table 9 shows the language of the alerts that test participants received and retransmitted.24 70 test participants received the test alert in Spanish, and 74 retransmitted it in Spanish; the vast majority of these participants were radio and television broadcasters.

Table 9. Language of Alert by Participant Type

EAS Participant Type    | Received Alert (English / Spanish / English and Spanish) | Retransmitted Alert (English / Spanish / English and Spanish)
Radio Broadcasters      | 12,078 / 35 / —                                          | 11,025 / 30 / —
Television Broadcasters |  2,348 / 30 / 1                                          |  2,047 / 44 / 1
Cable Systems           |  2,726 /  4 / —                                          |  2,476 /  — / —
Wireline Video Systems  |     68 /  — / —                                          |     34 /  — / —
Other                   |    179 /  1 / —                                          |    172 /  — / —
Total                   | 17,399 / 70 / 1                                          | 15,754 / 74 / 1

24 FEMA made the 2016 Nationwide EAS Test available only in English and Spanish. Accordingly, test results from EAS Participants that reported receiving the alert in other languages were disregarded for purposes of Table 9.

VI. ANALYSIS OF MOST SIGNIFICANT PROBLEMS

Test participants reported complications with the test that included equipment configuration issues, equipment failures, failure to update equipment software, audio quality issues, source issues, clock errors, and, in some cases, noncompliance with the part 11 rules. EAS Participants reported the complications they experienced in two ways. First, ETRS Form Three provided a series of checkboxes that allowed EAS Participants to assign categories to the issues they experienced. These categories were based on the complications observed during the 2011 nationwide EAS test, which included audio quality issues, the receipt of duplicative EAS messages from the same source, equipment performance issues, and user error. Table 10 shows the categories of complications reported by test participants that completed Form Three and demonstrates that the EAS has been strengthened considerably since the 2011 nationwide test. 80.2% of Form Three filers reported no complications.
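Percentages of this kind are simple tallies over the Form Three filers; the sketch below illustrates the computation with hypothetical checkbox data (the field names and records are invented for illustration and do not reflect the actual ETRS export format):

```python
# Tallying Form Three-style complication categories, as in Table 10.
# The filing records below are hypothetical examples.
from collections import Counter

filings = [
    {"complications": []},                                        # no complications
    {"complications": ["Audio Quality Issues"]},
    {"complications": ["Equipment Performance Issues", "User Error"]},
    {"complications": []},                                        # no complications
]

counts = Counter()
for filing in filings:
    if not filing["complications"]:
        counts["No Complications"] += 1
    for category in filing["complications"]:
        counts[category] += 1

total_filers = len(filings)
for category, n in counts.most_common():
    # Percentages are taken over all Form Three filers, as in note 27.
    print(f"{category}: {n} ({n / total_filers:.1%})")
```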
2.6% of filers reported experiencing audio quality issues, a significant decline from the 11% of test participants that received no audio in 2011.25 2.1% of filers reported equipment performance issues, a significant decline from the 6% that reported similar issues in 2011.26

Table 10. Complication Categories Experienced by Test Participants27
Complication | Experienced During Receipt (# / %) | Experienced During Retransmission (# / %)
No Complications | 16,226 / 85.1% | 15,298 / 80.2%
Audio Quality Issues | 490 / 2.6% | N/A
Duplicate Messages | 171 / 0.9% | N/A
Equipment Performance Issues | 397 / 2.1% | 442 / 2.3%
User Error | 124 / 0.7% | 131 / 0.7%
Other | 743 / 3.9% | 800 / 4.2%

Form Three also allowed EAS Participants to report complications by describing them in "explanation" text fields. Table 11 categorizes the responses received in those text fields.28 The most notable complications reported by test participants include equipment configuration issues, equipment failures, failure to update equipment software, audio quality issues, and system clock errors.

25 2011 EAS Nationwide Test Report at 18.
26 Id.
27 Table 10 reflects the percentage of the 19,064 Form Three filers that experienced the specified complications.
28 PSHSB notes that FEMA's 2016 National IPAWS EAS Test Report uses a different methodology to categorize test complications. FEMA independently categorized complications based on an engineering analysis of the dominant failure mode at each participant facility. This report categorizes complications by all failure modes reported by test participants, which results in some explanations being placed in multiple categories.

Table 11. Explanations Reported by Test Participants
Specific Cause of Complication | Participants that Experienced Complication (# / %)
Configuration issue | 773 / 4.1%
Equipment failure | 546 / 2.9%
Update required | 510 / 2.7%
Licensed, but silent | 381 / 2.0%
Audio quality | 361 / 1.9%
Source issues | 341 / 1.8%
No audio | 264 / 1.4%
Antenna / reception | 189 / 1.0%
Clock error | 151 / 0.8%
Failure to comply with EAS rules | 133 / 0.7%
Connectivity | 59 / 0.3%
Middleware | 57 / 0.3%
EOM error | 56 / 0.3%
Wiring errors | 54 / 0.3%
Equipment out for repairs / replacement | 47 / 0.3%
Duplicate messages | 45 / 0.2%
Obsolete equipment | 45 / 0.2%
Display quality issues | 38 / 0.2%
Equipment not set up | 37 / 0.2%
Equipment installation error | 33 / 0.2%
Logging error | 32 / 0.2%
Automation system issues | 22 / 0.1%
Power failure | 22 / 0.1%
Unexpected alert language | 22 / 0.1%
User error | 16 / 0.1%
Interference | 16 / 0.1%
Equipment struck by lightning | 15 / 0.1%
Late alert receipt | 13 / 0.1%
No tones | 13 / 0.1%
Firewall issues | 10 / 0.1%
Wrong monitoring source | 10 / 0.1%
EAS box stolen | 1 / <0.1%

29 Table 11 reflects the percentage of the 19,064 Form Three filers that experienced the specified complications.

A. Equipment Configuration

Of the 773 test participants that provided detailed explanations of complications related to equipment configuration, 278 were radio broadcasters, 129 were television broadcasters, and 338 were cable providers. Many of these test participants reported that they had recently updated their EAS equipment's software, but that the alert failed to retransmit because the nationwide location code or the NPT event code was not properly configured. Test participants also reported that their equipment was not configured to monitor the correct EAS sources. A few test participants indicated that their equipment's audio levels were set too low or too high for the equipment to activate properly, or that the equipment was not configured to auto-forward the alert.
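The configuration failures described above are mechanical enough to check programmatically. The sketch below is purely illustrative: the field names and the audio-level range are hypothetical, not any vendor's actual EAS device schema. The checks themselves mirror the report's findings: the 2016 test used the NPT event code with the national location code (000000), the part 11 rules require monitoring at least two EAS sources, and some devices failed because audio levels were set too low or too high.

```python
# Hypothetical sketch of pre-test configuration checks that would have
# caught the issues reported above. Field names are illustrative only.

NATIONAL_LOCATION_CODE = "000000"  # location code covering the entire country

def check_eas_config(config: dict) -> list[str]:
    """Return a list of human-readable configuration problems."""
    problems = []
    # The 2016 test used the NPT event code with the national location code.
    if "NPT" not in config.get("forwarded_event_codes", []):
        problems.append("NPT event code not set to auto-forward")
    if NATIONAL_LOCATION_CODE not in config.get("location_codes", []):
        problems.append("national location code (000000) not configured")
    # Part 11 requires monitoring at least two EAS sources.
    if len(config.get("monitored_sources", [])) < 2:
        problems.append("fewer than two monitoring sources configured")
    # Audio levels set too low or too high kept some devices from activating.
    # The -20..0 dB window here is an invented, illustrative threshold.
    level = config.get("input_audio_level_db", 0)
    if not -20 <= level <= 0:
        problems.append(f"input audio level {level} dB outside expected range")
    return problems

config = {
    "forwarded_event_codes": ["EAN", "RWT", "RMT"],  # NPT missing
    "location_codes": ["024031"],                    # county code only
    "monitored_sources": ["WXYZ-FM"],                # only one source
    "input_audio_level_db": -35,
}
for issue in check_eas_config(config):
    print("-", issue)
```

Run against the sample configuration, the sketch flags all four classes of problem; a correctly configured device would return an empty list.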
The majority of test participants that reported complications related to equipment configuration also reported that they had successfully identified and corrected the cause of those complications.

B. Equipment Failures

Of the 546 test participants that provided detailed explanations of complications related to equipment failures, 345 were radio broadcasters, 128 were television broadcasters, and 68 were cable providers. Test participants reported equipment failure issues across all makes of EAS equipment. Those issues included failure to boot on the day of the test and equipment losing its programming. One test participant reportedly discovered that its EAS equipment had been infected by malware. Some test participants also experienced failures in other kinds of equipment, such as receivers and FM tuners. Some cable systems reported failures of specialized equipment located downstream from their EAS encoder/decoder, such as the comb generators that produce substitution channels for analog cable systems. While many test participants indicated that they were continuing to work with EAS equipment manufacturers to identify the cause of the equipment failures, some participants reported that they replaced the EAS equipment used during the test with more recently manufactured models.

C. Failure to Update Software

Of the 510 test participants that provided detailed explanations of complications related to failure to update EAS equipment software, 360 were radio broadcasters, 64 were television broadcasters, and 74 were cable providers. The impact of failing to install recent software updates varied: some test participants reported that the omission prevented their equipment from receiving the alert, while others reported that they were unable to successfully retransmit it.
It is likely that these test participants experienced failures because their EAS equipment was unable to process the NPT or nationwide location codes as required by the Commission's rules. Most test participants that reported issues with outdated software also reported that they have since made the necessary updates.

D. Audio Quality Problems

Of the 361 test participants that provided detailed explanations of audio quality complications, 300 were radio broadcasters, 40 were television broadcasters, and 13 were cable providers. The audio quality issues reported included background noise, static, distortion, echoing, low volume, and slow audio playback. Some test participants attributed their issues to a weak signal from the over-the-air sources they were monitoring. Many test participants that first received the alert via IPAWS reported that the alert featured excellent audio quality. Some test participants reported that they could not take advantage of the high-quality digital audio provided by the CAP-formatted alert from IPAWS because they received the over-the-air alert first. Table 12 shows the sources from which the test alert was received by those test participants that reported audio quality problems using the appropriate checkbox on Form Three. 83.9% of these test participants received the alert from an over-the-air source, a share far larger than the 56.5% of all test participants that first received the alert over the air. This suggests that EAS Participants are much less likely to experience audio quality complications if they receive and retransmit the CAP-formatted version of the alert provided via IPAWS.

Table 12. Audio Quality Problems by Source of Alert
First Source Received | # of Participants Experiencing Audio Quality Problems | % of All Audio Quality Problems
Broadcast | 411 | 83.9%
IPAWS | 79 | 16.1%

E. System Clock Errors

Of the 151 test participants that provided detailed explanations of complications related to the system clocks of their EAS equipment, 91 were radio broadcasters, 12 were television broadcasters, and 46 were cable providers. Some test participants reported that they did not succeed in retransmitting the alert because the clock and date on their EAS equipment were set improperly, causing the alert either to "expire on arrival" or to retransmit much later in the day. Other test participants did not report any problems receiving or retransmitting the alert, but noted that they could not determine from their logs the times at which receipt and retransmission occurred because the clock and date features of their EAS equipment were turned off. Most test participants reported that they corrected their system clocks following the test.

VII. RECOMMENDATIONS

A. Policy Recommendations

The 2016 Nationwide EAS Test demonstrated that IPAWS can deliver an alert with high-quality audio, multiple languages, and, overall, more detailed and content-rich information than an over-the-air alert. Accordingly, PSHSB recommends that the Commission consider ways in which it can leverage IPAWS capabilities to improve the quality of information available in emergency alerts themselves and, over the long term, explore new ways to bridge the gap between today's alerting systems and the future alerting systems that emerging technologies will make possible. To achieve these ends, PSHSB recommends that the Commission do three things.

1. Promote the Use of IPAWS

First, the Commission should facilitate the use of IPAWS as the primary source of alerts nationwide while preserving over-the-air alerting as a redundant and resilient alternative alerting pathway. Roughly half of the test participants reported that they received the test over the broadcast-based "daisy chain," rather than from IPAWS.
Accordingly, some of these test participants experienced poor-quality audio or were unable to deliver the Spanish-language alert. If the Commission were to allow EAS Participants to check IPAWS for a high-quality, IP-based alert whenever they receive an over-the-air alert, IPAWS would evolve into the primary alerting source nationwide, while over-the-air alerting would remain a redundant and resilient alternate alerting pathway.

2. Leverage IP-Based Capabilities to Improve the Content of Alerts

Second, the Commission, its Federal partners, and other EAS stakeholders should seek to leverage the capabilities of IPAWS to initiate alerts that include multiple languages, text files, and other advanced features. By taking advantage of these additional capabilities, state and local EAS initiators can create alerts that address their own communities' particular needs and are accessible to a greater portion of the public. Commenters have consistently called for expanded use of non-English languages in the EAS and have equally called for alerts to be fully accessible to people with disabilities.30 The 2016 Nationwide EAS Test has shown that IPAWS has the capability to provide such accessible alerts.

3. Re-imagine Alerting

Third, the Commission should examine alerting anew by asking how developing technologies can provide even more current, useful, and life-saving alerts to the public, while imposing minimal regulatory and financial burdens on broadcast, cable, satellite, and other EAS Participants. Specifically, the Commission should focus on encouraging innovation in light of emerging technologies, modernizing the EAS in light of consumer needs, minimizing burdens on stakeholders, ensuring detailed and high-quality alert content, and providing greater alert accessibility to people with disabilities and people who communicate in languages other than English.

B. Operational Recommendations

To facilitate this transition while ensuring that the EAS remains an authoritative, efficient, and trustworthy source of emergency information, PSHSB also makes the following operational recommendations, based on the categories of EAS Participant error observed in the 2016 Nationwide EAS Test. PSHSB recommends that these recommendations be adopted early enough that EAS Participants can take corrective action before the next nationwide EAS test.

1. Take Measures to Improve Compliance with and Understanding of the Part 11 Rules

The test results indicate that the Commission can take several measures to improve EAS Participants' compliance with and understanding of the part 11 rules. First, the 2016 Nationwide EAS Test results indicate performance shortfalls among different types of EAS Participants. Despite participation being required by the Commission's rules, low participation rates were observed among cable providers and Low Power broadcasters. PSHSB recommends partnering with FEMA, the SECCs, and EAS equipment vendors to conduct outreach to EAS Participants that did not participate or that underperformed. The goals of this outreach would be to ensure that all EAS Participants are aware that they are required to participate in nationwide tests, to correct technical issues observed during the test, and to help EAS Participants comply with the Commission's rules without enforcement action. To better identify gaps in EAS participation, PSHSB recommends the development of a method for using the Commission's databases to generate an accurate list of all entities that are required to participate in the EAS. Further, during preparations for the 2016 Nationwide EAS Test, PSHSB received numerous phone calls and e-mails from EAS Participants that indicated confusion over basic elements of the EAS, such as EAS Participants' designations, operating areas, and monitoring assignments.
PSHSB proposes that the EAS Operating Handbook be revised to include additional guidance on the roles and responsibilities of EAS Participants.31 Specifically, the handbook would cite the relevant rules and explain how to access ETRS and EAS State Plan information to obtain proper designations, monitoring obligations, and operational area information.

30 See, e.g., Notice, 31 FCC Rcd at 627-29, paras. 69-74.
31 "The EAS Operating Handbook states in summary form the actions to be taken by personnel at EAS Participant facilities upon receipt of an EAN, an EAT, tests, or State and Local Area alerts. It is issued by the FCC and contains instructions for the above situations." 47 CFR § 11.15.

2. Improve the Quality of ETRS Data

The Commission should also improve the clarity of ETRS by creating additional categories for the classification of new voluntary participants. During preparations for the 2016 Nationwide EAS Test, several entities contacted PSHSB to ask whether they should be considered EAS Participants under part 11 of the Commission's rules. From those interactions, PSHSB has identified two types of entities that offer video programming services to the public but are not discussed in part 11: cable resellers, and IPTV providers that do not utilize a wireline video system.32 PSHSB recommends that these entities be added to ETRS as new categories of voluntary test participants.

VIII. CONCLUSION

The 2016 Nationwide EAS Test was largely a success, demonstrating that the national EAS has been significantly strengthened since the 2011 nationwide test. The test also highlighted several areas in which the EAS can continue to be improved. PSHSB will continue to work with FEMA, EAS Participants, and other EAS stakeholders to address these problems and to ensure that the EAS can deliver timely and accurate national alerts and critical emergency information to the public.
APPENDIX A: HOW EAS WORKS

A. The Emergency Alert System

The EAS is designed primarily to provide the President with the capability to communicate with the public via a live audio transmission during a national emergency.33 The EAS is the successor to prior national warning systems: Control of Electromagnetic Radiation (CONELRAD), established in 1951, and the EBS, established in 1963.34 The FCC, in conjunction with FEMA and the NWS, implements the EAS at the federal level.35 The respective roles these agencies play are defined by a 1981 Memorandum of Understanding among FEMA, NWS, and the FCC;36 a 1984 Executive Order;37 a 1995 Presidential Statement of EAS Requirements;38 and a 2006 Public Alert and Warning System Executive Order.39 As a general matter, the Commission, FEMA, and NWS all work closely with radio and television broadcasters, cable providers, and other EAS Participants and stakeholders, including state, local, territorial, and tribal governments, to ensure the integrity and utility of the EAS. FCC rules require EAS Participants to have the capability to receive and transmit Presidential alerts disseminated over the EAS, and generally govern all aspects of EAS participation.40 However, a Presidential alert has never been issued, and prior to the 2011 nationwide EAS test, the national alerting capability of the EAS had never been tested. Although EAS Participants also voluntarily transmit thousands of alerts and warnings issued annually by the NWS and state, tribal, and local governments, these alerts typically address severe weather threats, child abductions, and other local emergencies.
As discussed in more detail below, non-Presidential EAS alerts do not require EAS Participants to open a live audio feed from the alerting source; rather, they deliver prerecorded messages that can be transmitted at the discretion of the EAS Participant, rendering non-Presidential alerts (and their related testing procedures) inappropriate for end-to-end testing of a national alert.41

33 See Review of the Emergency Alert System, EB Docket No. 04-296, Second Further Notice of Proposed Rulemaking, 25 FCC Rcd 564, 565, para. 2 (2010).
34 CONELRAD was not an alerting system per se, but rather a Cold War emergency system under which most radio and television transmission would be shut down in case of an enemy missile attack, to prevent incoming missiles from homing in on broadcast transmissions. The radio stations that were allowed to remain on the air, the CONELRAD stations, would remain on the air to provide emergency information. See Defense: Sign-off for CONELRAD, Time Magazine, July 12, 1963.
35 FEMA acts as Executive Agent for the development, operation, and maintenance of the national-level EAS. See Memorandum, Presidential Communications with the General Public During Periods of National Emergency, The White House (Sept. 15, 1995) (1995 Presidential Statement).
36 See 1981 State and Local Emergency Broadcasting System (EBS) Memorandum of Understanding among the Federal Emergency Management Agency (FEMA), Federal Communications Commission (FCC), the National Oceanic and Atmospheric Administration (NOAA), and the National Industry Advisory Committee (NIAC), reprinted as Appendix K to Partnership for Public Warning Report 2004-1, The Emergency Alert System (EAS): An Assessment.
37 See Assignment of National Security and Emergency Preparedness Telecommunications Functions, Exec. Order No. 12472, 49 Fed. Reg. 13471 (1984).
38 See 1995 Presidential Statement.
39 See Public Alert and Warning System, Exec. Order No. 13407, 71 Fed. Reg.
36975 (June 26, 2006) (Executive Order).
40 See 47 CFR Part 11.
41 See 2011 EAS Nationwide Test Report at 7 n.13.

B. Legacy EAS Structure

There are two methods by which EAS alerts may be distributed. Under the traditional "legacy" structure, illustrated in Figure 1 below, the EAS is designed to cascade the EAN through a pre-established hierarchy of broadcast, cable, and satellite systems. FEMA initiates a nationwide, Presidential alert by using specific encoding equipment to send the EAN code to the PEPs over a secure telephone (wireline) connection.42 Upon receipt of the code, the PEPs open a live audio channel to FEMA and broadcast the EAN throughout their listening areas. A group of selected EAS Participants in each PEP's broadcast area, known as Local Primary (LP) stations, monitor these PEP stations. When LP stations receive the EAN, they, in turn, open an audio channel to FEMA via the PEP and broadcast the EAN in their listening areas. The remaining 22,500 broadcasters, cable television facilities, and other EAS Participants located in each LP's broadcast footprint receive the alerts from the LP stations, deliver the alerts to the public (or, in the case of cable, to customers' set-top boxes), and open the audio channel to FEMA through their PEP and LP.

Figure 1. EAS Architecture

C. Alerting via IPAWS

EAS alerts also may be distributed over the Internet through the Integrated Public Alert and Warning System (IPAWS), illustrated in Figure 2 below.43 As of June 30, 2012, EAS Participants are required to be able to receive EAS alerts formatted in the Common Alerting Protocol (CAP)44 from authorized emergency alert initiators over the Internet via IPAWS. CAP-formatted alerts can include audio, video, or data files, images, multilingual translations of alerts, and links providing more detailed information than what is contained in the initial alert (such as streaming audio or video).45 An EAS Participant that receives a CAP-formatted message can utilize the CAP-formatted content to generate messages in synchronous audio and visual formats, which can then be broadcast to local viewers and listeners.46 CAP also provides each alert with a unique alert identifier and supports alert authentication through the provision of a digital signature and an encryption field that enables greater protection of the CAP message.47

Figure 2. Common Alerting Protocol (CAP) Alert Distribution Architecture

42 The EAN and other EAS codes are part of the Specific Area Message Encoding (SAME) protocol used both for the EAS and for NOAA Weather Radio. See National Weather Service, NOAA Weather Radio All Hazards, http://www.nws.noaa.gov/nwr/ (last visited Apr. 21, 2017).
43 FEMA, Integrated Public Alert & Warning System, https://www.fema.gov/integrated-public-alert-warning-system (last visited Oct. 4, 2016).
44 See Review of the Emergency Alert System; Independent Spanish Broadcasters Association, the Office of Communication of the United Church of Christ, Inc., and the Minority Media and Telecommunications Council, Petition for Immediate Relief; Randy Gehman Petition for Rulemaking, EB Docket No. 04-296, Fourth Report and Order, 26 FCC Rcd 13710, 13719, para. 20 (2011) (Fourth Report and Order). CAP is an open, interoperable standard developed by the Organization for the Advancement of Structured Information Standards (OASIS), and it incorporates an XML-based language developed and widely used for web documents. See Review of the Emergency Alert System; Independent Spanish Broadcasters Association, the Office of Communication of the United Church of Christ, Inc., and the Minority Media and Telecommunications Council, Petition for Immediate Relief; Randy Gehman Petition for Rulemaking, EB Docket No. 04-296, Fifth Report and Order, 27 FCC Rcd 642, 648, para. 10 (2012). CAP messages contain standardized fields that facilitate interoperability between and among devices, and are backwards-compatible with the EAS Protocol. See id.
45 See id. However, any data contained in a CAP-formatted message beyond the EAS codes and audio message (if present), such as enhanced text or video files, can be utilized locally by the EAS Participant that receives it, but cannot be converted into the EAS Protocol and thus cannot be distributed via the daisy-chain process, as reflected in the part 11 rules. See, e.g., 47 CFR § 11.51(d), (g)(3), (h)(3), (j)(2).
46 See 47 CFR § 11.51(d), (g)(3), (j)(2).
47 See OASIS, Common Alerting Protocol Version 1.2 (2010), available at http://docs.oasis-open.org/emergency/cap/v1.2/CAP-v1.2-os.html (last visited Sept. 29, 2016).
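As an illustration of the CAP structure described in this appendix, the following sketch assembles a minimal CAP-style alert as XML. The element names and namespace follow the OASIS CAP v1.2 standard; everything else (the identifier, sender, timestamps, and headline text) is hypothetical, and a real IPAWS message would carry additional fields and a digital signature. The two `<info>` blocks show how a single CAP alert can carry English and Spanish versions of the same message.

```python
# Illustrative sketch of a CAP v1.2-style alert, built with the standard
# library. Not a conformant IPAWS message; element names per OASIS CAP 1.2,
# message content hypothetical.
import xml.etree.ElementTree as ET

CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"
ET.register_namespace("", CAP_NS)  # serialize with CAP as the default namespace

def element(parent, tag, text=None):
    """Append a namespaced child element with optional text."""
    el = ET.SubElement(parent, f"{{{CAP_NS}}}{tag}")
    if text is not None:
        el.text = text
    return el

alert = ET.Element(f"{{{CAP_NS}}}alert")
element(alert, "identifier", "EXAMPLE-NPT-0001")    # unique alert identifier
element(alert, "sender", "alerts@example.gov")      # hypothetical originator
element(alert, "sent", "2016-09-28T14:20:00-04:00")
element(alert, "status", "Test")
element(alert, "msgType", "Alert")
element(alert, "scope", "Public")

# One <info> block per language: English and Spanish versions of the alert.
for lang, headline in [("en-US", "This is a National Periodic Test."),
                       ("es-US", "Esta es una Prueba Periódica Nacional.")]:
    info = element(alert, "info")
    element(info, "language", lang)
    element(info, "category", "Other")
    element(info, "event", "National Periodic Test")
    element(info, "urgency", "Unknown")
    element(info, "severity", "Minor")
    element(info, "certainty", "Unknown")
    element(info, "headline", headline)

print(ET.tostring(alert, encoding="unicode"))
```

Because the multilingual text rides in structured fields rather than in an audio stream, a CAP-capable EAS device can select the `<info>` block matching its audience, which is the capability the report's language findings turn on.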