Emergency Alert System faces its own (unnecessary) emergency
In the world of emergency alerts, we’ve come a long way from the iconic “Emergency! Everybody to get from street!” issued multiple times on the streets of the fictional Gloucester Island by Russian Lt. Yuri Rozanov (Alan Arkin) and his fellow sailors in the 1966 comedy “The Russians Are Coming, the Russians Are Coming.”
We’re far more advanced. We’ve got the Emergency Alert System (EAS), which can instantly send alerts to smartphones, TVs, and radio stations throughout a county, a state, or the entire nation. So much better.
Turns out the current emergency isn’t a Russian sub accidentally running aground off the Massachusetts coast, but major software vulnerabilities in equipment used for the EAS itself, which operates under the Federal Emergency Management Agency (FEMA) within the Department of Homeland Security (DHS). Ken Pyle, a security researcher at CYBIR, started probing those vulnerabilities in 2019 and was on the agenda to present his findings at the recent DEFCON security conference in Las Vegas.
He told security blogger Brian Krebs that the vulnerabilities — for which patches have been available since shortly after he alerted the vendor in 2019 — could allow a hacker to compromise an EAS station and send out alerts locally that could be picked up by other EAS systems and retransmitted across the nation.
And he told Bleeping Computer that multiple vulnerabilities confirmed by other researchers haven’t been patched for several years, collectively creating a major flaw.
By exploiting the vulnerabilities, “I can easily obtain access to the credentials, certs, devices, exploit the web server, send fake alerts via craft[ed] message, have them [validate/pre-empt] signals at will. I can also lock legitimate users out when I do, neutralizing or disabling a response,” he said.
Beyond business risk
It’s long been said that a software risk is a business risk. Which is true and important. But software is used for much more than running businesses. Exploitable software vulnerabilities can also pose health and safety risks. In the case of the EAS, there’s the risk of a criminal hacker or simply a prankster causing a national panic.
A statewide illustration of that already happened four years ago in Hawaii. In January 2018, an early morning emergency alert sent by mistake to cellphones across the state warned of an incoming ballistic missile attack, with the exhortation, “Seek immediate shelter. This is not a drill.”
The alert came at a time when Hawaiians were already on edge because of tensions between the U.S. and North Korea. It generated widespread panic during the 38 minutes it took to revoke the warning. Officials said it wasn’t the work of a hacker but simply a mistake by a worker who “clicked the wrong thing on the computer.”
Of course, the vulnerabilities Pyle discovered would allow a hacker to click the wrong thing on purpose.
Pyle told Krebs that he first discovered vulnerabilities in encoder/decoder devices used to broadcast EAS warnings when he started collecting old EAS equipment in 2019. He notified the vendor, Digital Alert Systems (DAS), formerly Monroe Electronics, Inc., and the company said in a recent security advisory that it had issued patches for DASDEC/One-Net software in 2019, shortly after hearing from Pyle.
According to that advisory, DAS “released the version 4 series of software, which addressed these vulnerabilities along with a host of new features.”
But as is often the case with complex systems, applying a patch or update to DASDEC/One-Net is not as simple as tapping an icon on your smartphone.
The problem is that some of the more than 1,700 EAS “alerting authorities” are using versions 2 and 3 of that software, which are listed as “deprecated, end of life” in 2016 and 2019 respectively. That means they no longer receive support or security updates. And according to Pyle, while both the EAS and DAS have urged all users to install the updates, the older models apparently don’t support the new software.
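The support status described above can be sketched as a simple triage function. The version-string format and function name here are illustrative assumptions; the end-of-life facts (version 2 in 2016, version 3 in 2019, patches in the version 4 series) come from the DAS advisory cited in this article.

```python
# Triage sketch for DASDEC/One-Net software versions, based on the
# lifecycle described in the DAS advisory. The version-string format
# ("major.minor") is an assumption for illustration.

def dasdec_version_status(version: str) -> str:
    """Classify a software version string like '3.1' by support status."""
    major = int(version.split(".")[0])
    if major <= 2:
        return "end-of-life (2016): no security updates"
    if major == 3:
        return "end-of-life (2019): no security updates"
    # Version 4 series addressed the reported vulnerabilities.
    return "supported: verify the 2019 patches are applied"
```

A fleet operator could run every deployed unit’s reported version through a check like this to find the devices that can only be fixed by hardware replacement.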
Persistent patch problems
That’s no surprise to many security experts, including Travis Biehn, technical strategist with the Synopsys Software Integrity Group. “There’s a long tail of software used in industries that aren’t exactly considered high tech, and it’s left out-of-date just because it works well enough,” he said. “These were installed and established long ago — they are entrenched. If history provides any lessons, it tells us that only when the pain becomes clear does a problem like this get fixed.”
That is apparently true of the EAS, even though it is presumably high tech — an information technology (IT) system backed by the money and power of the federal government.
Michael Fabian, principal security consultant at Synopsys, said it is a familiar story. “Departments that manage these systems are not IT departments and pay very little attention to them, due to the old adage ‘this is important, don’t touch it’,” he said. “There are a multitude of problems here — the vendor’s notification of fixes and how important they were, and legacy equipment that cannot support the newest versions.”
And failure to patch isn’t the only problem. Pyle told Krebs that many EAS participants are making it easy for hackers to get access to the equipment by ignoring basic security advice from DAS — change default passwords, keep the devices behind a firewall and not connected to the public internet, and restrict access to trusted hosts and networks.
Pyle also said the system is automated, so if a hacker gets access to a single local EAS station, a fake alert could go nationwide.
“There’s no centralized control of the EAS because these devices are designed such that someone locally can issue an alert,” Pyle told Krebs, “but there’s no central control over whether I am the one person who can send or whatever. If you are a local operator, you can send out nationwide alerts. That’s how easy it is to do this.”
All of which points to two rather obvious needs: a better patch/update system, especially for things as critical as the EAS, and more pressure on participants to implement security basics, which are the digital equivalent of locking the doors and windows of a house.
How can that get done?
More pressure, please
For starters, this is one time when the cliché “there oughta be a law” applies. It’s federal agencies that deploy the EAS, including FEMA and the National Oceanic and Atmospheric Administration.
According to FEMA, Wireless Emergency Alerts (WEA) provide “authenticated emergency and life-saving information” to the public via mobile phones, radio, and television. The agency said that in the past decade more than 1,700 alerting authorities have sent more than 70,000 WEA warnings.
An advisory from FEMA’s Integrated Public Alert & Warning System (IPAWS) does address the need to update the EAS encoder/decoder devices and to take other security measures. But it says only that those measures are “strongly encourage[d],” not required.
Why not required? According to FEMA, it doesn’t have the authority. A FEMA spokesperson said in a statement that “The Federal Communications Commission (FCC) regulates the private sector communications entities that participate in delivering emergency alerts and information to the public from authorized public safety agencies.”
“FEMA partners with the FCC in developing rules for participating systems, but FEMA does not have regulatory authority for the private-sector system providers.”
The FCC didn’t respond to a request for comment.
But it’s hard to argue that it should be optional for EAS participants to correct what a security researcher said are obvious flaws, because if the public can’t count on “authenticated” alerts actually being authenticated, trust in the system will evaporate.
Lack of trust
It already has, according to some who commented on a story in Ars Technica about the problem. A reader with the handle “gavron” declared that “everyone I know has all alerts turned off — even weather alerts. Those that don’t have them disabled. You see this out in public — the alert hits everyone about the same time and they all roll their eyes and rush to silence the damn phone.”
According to gavron, that’s because both the previous Emergency Broadcast System and the current EAS suffer from design flaws as well as security vulnerabilities. “You can’t have 330 million people searching for Amber, taken in an SUV. All cars are SUVs now. Amber alerts are 17% effective, 5% hoaxes, and 78% annoyance,” he or she wrote.
Gavron also contended that DHS has done nothing to monitor or evaluate the effectiveness of the system. “Weather alerts have been called 20,000 times in 20 years starting [in] 1996. There are no numbers for how effective this is at all.”
Fabian would like to see such numbers as well, noting some of the same things gavron did — that there have been a number of mistaken or fraudulent alerts, and overall too many of them. “It would be interesting to see statistics on how many people have these alerts disabled on their phones because they are intrusive,” he said. “I can understand the need for EAS to broadcast messages out to the public in a rapid and automatic fashion, but there have been some misuses of the system, even intentionally.”
Fundamentally, this is about fundamentals. Any alerting authority should be required to comply with the security measures issued by both FEMA and DAS:
- Keep the software up-to-date. If the equipment is too old to be updated, buy new equipment.
- Change default passwords.
- Don’t connect the encoder/decoder to the public internet.
- Keep it behind a firewall.
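Taken together, that checklist amounts to a simple configuration audit. The sketch below is illustrative only: the field names and the default-credential list are assumptions for the example, not DAS’s actual configuration schema or credentials.

```python
import ipaddress

# Hypothetical default credentials an auditor might flag; real audits
# would use the vendor's documented defaults.
KNOWN_DEFAULTS = {("admin", "admin"), ("root", "root")}

def audit_device(config: dict) -> list[str]:
    """Return checklist violations for one encoder/decoder's settings."""
    findings = []
    # 1. Keep the software up-to-date (version 4 series carries the patches).
    if config.get("software_major_version", 0) < 4:
        findings.append("outdated software: update or replace the unit")
    # 2. Change default passwords.
    if (config.get("username"), config.get("password")) in KNOWN_DEFAULTS:
        findings.append("default credentials still in use")
    # 3. Don't connect the device to the public internet.
    if ipaddress.ip_address(config["ip_address"]).is_global:
        findings.append("device is reachable from the public internet")
    # 4. Keep it behind a firewall.
    if not config.get("behind_firewall", False):
        findings.append("no firewall in front of the device")
    return findings
```

An audit like this returns an empty list for a locked-down device and a finding per violated item otherwise; the point is that every item on the checklist is mechanically checkable, so there is little excuse for leaving any of them undone.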
Because there is no good reason that, three years after Pyle discovered those vulnerabilities in a national government alert system, he should be telling an audience at a security conference that they still exist because of failure to patch.
Biehn isn’t optimistic. “I’m ready for it to get much worse before it gets any better,” he said.
“For those who want to stay out of the headlines and away from the pain, take serious steps to find those systems that have functioned well for the past five or more years and get them updated, replaced, or at the very least, give them a security-sniff-test.”