Push is on to secure critical infrastructure, but hurdles remain

Taylor Armerding
Nerd For Tech
Apr 5, 2022

It hasn’t generated the avalanche of daily headlines that the slap heard ‘round the world has. But it should. Because a recent “special publication” from the National Institute of Standards and Technology (NIST) on improving the cybersecurity of U.S. critical infrastructure (CI) highlights a problem that’s much more significant to the physical security of the nation, and won’t be easy to solve.

It’s titled “Protecting Information and System Integrity in Industrial Control System Environments: Cybersecurity for the Manufacturing Sector.” Or, by its much shorter but far less explanatory title, “NIST SP 1800-10.”

The new guidebook for CI cybersecurity is a public/private collaboration between NIST’s National Cybersecurity Center of Excellence (NCCoE) and the MITRE Corporation, along with Microsoft, VMware, Tenable, Dragos, Dispel, Forescout, OSIsoft, TDi Technologies, and GreenTec.

There probably aren’t many documents for which the need is so obvious. If we in the general population didn’t know it before, last year’s ransomware attacks on a major pipeline and a food distributor, as well as an attempt to poison the water supply of a Florida community, should have made it clear: U.S. CI is an attractive target for cyber criminals.

The reasons are obvious. If all attackers want is money, they’re more likely to get it if the attack means the target can’t provide a critical need — like energy, food, or water — to millions of people. If attackers have political goals, they can damage a major portion of what a society needs to function without using bombs, missiles, or bullets. No need for an air force to travel thousands of miles when everything on the planet is accessible with computer keystrokes.

So far, nothing approaching the “cyber Pearl Harbor” that a number of high government officials have warned about for decades has occurred. But the interest in such an attack from bad actors is clear. There are 16 CI sectors in the U.S. with thousands of providers, most in the private sector. That’s a large attack surface.

The FBI’s recently released “Internet Crime Report 2021” noted 649 ransomware attacks last year on U.S. CI operators. But that number is almost certainly low by at least half, since the FBI’s Internet Crime Complaint Center didn’t start breaking out CI attacks until June. And the total only included attacks that had been reported. The agency and most experts agree that many are not reported.

So, no surprise, the FBI said it “anticipates an increase in critical infrastructure victimization in 2022.”

Not a new problem

Meanwhile, the security of industrial control systems (ICS) used by most CI operators remains dangerously porous. And this is not a new problem.

Five years ago, in March 2017, Joel Brenner, a former senior counsel and inspector general at the National Security Agency, in a report titled “Keeping America Safe: Toward More Secure Networks for Critical Sectors,” wrote that “the digital systems that control critical infrastructure in the United States and most other countries are easily penetrated and architecturally weak, and we have known it for a long time.”

In a later blog post, Brenner declared, “The White House has been issuing ineffective directives addressing critical networks like clockwork since the ’90s.”

So, will the NCCoE effort be any different? A game-changer? There’s some debate about that.

Given the attack surface and a heightened threat level at a time of international unrest, you might think the figurative emergency lights would be flashing at the federal level. And they are, to some extent, but only in an advisory way. The NIST publication is a guide, not a mandate.

On the positive side, it is aimed at the major cybersecurity problem with ICS: operators have increasingly connected their operational technology (OT) systems to their information technology (IT) systems. There are incentives to do so — it improves productivity and makes operations more efficient, and thus more competitive.

It’s just that “enterprise-wide connectivity … has also provided malicious actors, including nation states, common criminals, and insider threats a fertile landscape where they can exploit cybersecurity vulnerabilities to compromise the integrity of ICS and ICS data,” the NIST guidebook said.

Hence the extensive manual to help operators address those vulnerabilities — 396 pages in three volumes: an executive summary; an approach, architecture, and security characteristics section; and how-to guides.

A blog post from cybersecurity firm Dragos, one of the collaborating companies, noted that the guidance is “based on lab-tested analysis of several essential manufacturing system testbeds … built to mimic real-world manufacturing environments. The scenarios focused on known cyber challenges.”

And then the downsides

But downsides remain. There are multiple hurdles to implementing what may be very good, real-world advice. The CI world is populated by a wide range of organizations, many of which don’t have the staff, expertise, or money to implement that advice.

“Many ICS operators have a low level of security maturity,” said Jonathan Knudsen, head of global research within the Synopsys Cybersecurity Research Center. “Following this guide would be a good starting point but building a real software security initiative takes time and often requires external help.”

It would also take a lot of money because much of the nation’s critical infrastructure is what security experts euphemistically label “legacy.” That’s not a compliment. It means that while it was designed to function safely and for a very long time — decades — it wasn’t designed to be connected to the internet.

In the past, a low-tech but effective way to deal with that was the air gap — keeping the operational part of ICSs off the internet and disconnected from any other computers that are on the internet, like IT networks.

While an air gap doesn’t guarantee security — the notorious Stuxnet attack on the Iranian nuclear program overcame the air gap with a simple USB thumb drive — it makes it a lot harder for malware to jump from one system to another.

But the air gap has been eroding or disappearing entirely, as NIST reported, due to the convergence of OT and IT. And with the air gap gone, ICS vulnerabilities are more easily exploited remotely by hackers.

In short, instead of security being “built in” to the convergence of IT and OT, the NCCoE effort is now to patch it on after the fact. Boris Cipot, senior security engineer with the Synopsys Software Integrity Group, asked: given “all the weaknesses we see in the supply chain, all the security breaches we know about and the vast complexity we are trying to get under control, are we still confident that an IT/OT convergence is the right way forward?”

“Undoubtedly, there is a way to make this convergence possible,” he said, “but it must come from a more restricted perspective. It must lead with procedures and processes that push for the awareness of the complexity of the threat itself.”

Necessary, but not sufficient

Beyond that, Joe Weiss, managing partner at Applied Control Systems, said the guidebook doesn’t cover the whole OT security landscape, because it can’t — the process sensors that OT systems depend on don’t have security capabilities.

He noted that NIST acknowledges in the guidebook that “many of the device cybersecurity capabilities may not be available in modern sensors and actuators.”

That means the scenarios and recommendations in the guidebook are “necessary, but not sufficient,” he said.

In a recent blog post, Weiss wrote that in a presentation he gave at the U.S. Air Force War College, he told the audience that “good cyber hygiene doesn’t apply to insecure process sensors” because those sensors “have no capability for passwords, multifactor authentication, encryption, keys, signed certificates, etc. Despite the lack of any cybersecurity, these devices are the 100% trusted input to OT networks and manual operation.”

Weiss acknowledges that many IT experts disagree with him. “They say network security will take care of it. It won’t,” he said. “So again, what NIST is recommending is good and necessary but it’s not sufficient. Sensors have no authentication, and we need to be able to say physically that something is coming from my sensor, not China.”
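To illustrate what sensor-level authentication could look like — a capability Weiss says today’s process sensors lack — here is a minimal sketch in Python using a keyed HMAC over each reading. All names, keys, and parameters are hypothetical, not drawn from the NIST guidebook or any real sensor protocol.

```python
import hmac
import hashlib
import struct
import time

# Hypothetical per-device secret, provisioned at manufacture (illustrative).
SENSOR_KEY = b"per-device-secret-key"

def sign_reading(sensor_id: str, value: float, timestamp: float, key: bytes) -> bytes:
    """Produce an HMAC-SHA256 tag over a sensor reading.

    The tag lets the receiving OT network verify that the reading came from
    the holder of the key, and the timestamp resists simple replay attacks.
    """
    payload = sensor_id.encode() + struct.pack("!dd", value, timestamp)
    return hmac.new(key, payload, hashlib.sha256).digest()

def verify_reading(sensor_id: str, value: float, timestamp: float,
                   tag: bytes, key: bytes, max_age: float = 5.0) -> bool:
    """Check the tag and reject stale (possibly replayed) readings."""
    expected = sign_reading(sensor_id, value, timestamp, key)
    if not hmac.compare_digest(expected, tag):
        return False
    return (time.time() - timestamp) <= max_age

# Example: a pH reading is signed at the sensor and verified downstream.
ts = time.time()
tag = sign_reading("ph-sensor-07", 7.2, ts, SENSOR_KEY)
assert verify_reading("ph-sensor-07", 7.2, ts, tag, SENSOR_KEY)
# A tampered value fails verification.
assert not verify_reading("ph-sensor-07", 11.0, ts, tag, SENSOR_KEY)
```

The catch, as Weiss notes, is that existing sensors have no compute, key storage, or firmware support for even this minimal scheme — which is why network-level controls alone can’t establish where a reading actually came from.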

Why not just update and patch ICS vulnerabilities and replace process sensors with new ones that have authentication? That’s complicated too — vastly more complicated and expensive than downloading a free patch for an app on your phone or laptop. It usually means getting the vendor of the system to install the patch and retest the system to make sure it works. Many will charge for it.

It can also be a scheduling nightmare — preplanning as much as six months ahead to take down a system in a very narrow window of time.

And sometimes patches can be incompatible with an older operating system.

Sammy Migues, principal scientist with the Synopsys Software Integrity Group, said late last year that ICS security is a hard problem in large part due to a massive amount of “technical debt” — the money it would take to bring ICSs up to date.

He said he’s not blaming small ICS operators who do the best they can with limited resources. “But if we want more security in OT that actually monitors and/or controls kinetic events, we need to invest more in making that stuff modern, workable, and manageable,” he said, adding that in the case of a water supply attack, “we need to invest in software between the IT and OT that double-checks what the physics and chemistry are saying, but not unilaterally override it when the software thinks it’s smarter than a gurgling sump pump.”
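A minimal sketch of the kind of sanity layer Migues describes — software between IT and OT that checks commands against known physical and chemical limits — might look like the following Python. The channel names and thresholds are illustrative assumptions, not real water-treatment parameters.

```python
# Illustrative safe operating envelope for a dosing channel.
SAFE_RANGE = {"naoh_ppm": (0.0, 500.0)}   # absolute lye dosing limits (assumed)
MAX_STEP = {"naoh_ppm": 50.0}             # max plausible change per command (assumed)

def check_setpoint(channel: str, current: float, requested: float) -> bool:
    """Return True if a requested setpoint passes plausibility checks.

    A False result should trigger operator review — per Migues's point,
    the layer flags implausible commands rather than silently overriding
    the control system.
    """
    lo, hi = SAFE_RANGE[channel]
    if not (lo <= requested <= hi):
        return False                      # outside the physical envelope
    if abs(requested - current) > MAX_STEP[channel]:
        return False                      # implausibly large jump
    return True

# A routine adjustment passes...
assert check_setpoint("naoh_ppm", 100.0, 120.0)
# ...while a jump like the one attempted in the Florida water attack
# (sodium hydroxide raised from 100 to 11,100 ppm) is flagged.
assert not check_setpoint("naoh_ppm", 100.0, 11100.0)
```
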

Slow going

All of which means that if this new NIST guidebook does change the cybersecurity of the CI world, it will take time — years.

“Some will read it and try to deploy [updates] but many will fail because they don’t have the staff, expertise, or money, so there will be half solutions implemented in the beginning,” Cipot said. “Then, I hope not, but maybe something like Stuxnet will happen, and recommendations like this might evolve to a standard that will say if you have OT/IT convergence, then you must also fulfil X security requirements.”

“Why it’s not that way from the start? Well, again it will be ‘staff, expertise, or money to implement’ vs. ‘it would be nice to have’ vs. ‘calculated risk.’ All those play a role in whether the NIST recommendations will become a standard sooner or later,” he said.

Finally, motivation is a factor, since these are recommendations, not mandates. As Knudsen put it, “You can lead a horse to water …”



I’m a security advocate at the Synopsys Software Integrity Group. I write mainly about software security, data security and privacy.