The Biden cybersecurity EO: A push for better software
Presidential executive orders (EOs) on cybersecurity are nothing new. You could even argue they’re getting old. Five presidents have issued them, starting with Bill Clinton in 2000, which means the stack has been growing for more than two decades.
And all of them, while well intentioned and filled with worthy advice, exhortations, and even mandates, have failed to tip the scales in any major way against cybercrime. If anything, it’s more rampant than ever, as nearly daily headlines remind us, with attacks ranging from identity theft to espionage to major disruptions of critical supply chains.
Just this past week, the federal Cybersecurity and Infrastructure Security Agency (CISA) reported three ransomware attacks against water treatment plants during the past year, and the Sinclair Broadcast Group, owner of dozens of TV stations in the U.S., said some of its servers and workstations had been hit with ransomware.
So why should the latest on the list — President Joe Biden’s executive order on “Improving the Nation’s Cybersecurity” (more bureaucratically known as EO 14028), issued May 12 — be any different?
Perhaps it won’t be. As the cliché wisely puts it, time will tell, and the deadlines for action on the EO stretch well into 2022. But a number of cybersecurity experts believe this one has some potential.
Robert Chesney and Trey Herr, writing on the Lawfare blog just after the EO was issued, said it “deserves your attention. It contains concrete measures tailored to respond to lessons learned from recent crises, especially the SolarWinds and Microsoft Exchange compromises.”
Tim Mackey, principal security strategist within the Synopsys Cybersecurity Research Center, and David London, managing director for cybersecurity at the security risk management firm the Chertoff Group, noted in a recent webinar that the EO contains “74 directives within its 15 pages,” a number of which “raise the bar for secure software development.”
Of course, directives aren’t mandates — at least not directly — for the private sector. Imposing mandates requires passing laws, which is the domain of Congress.
But, as Herr and Chesney put it, an EO conveys policy directives that “can be used to require federal entities to take actions that would be useful from a cybersecurity perspective.”
Leverage without a law
Which means, among other things, that a president can use an EO for procurement leverage — requiring federal agencies to purchase only products or services that meet a certain standard. That means private-sector vendors who want to sell to the government will have to meet that standard, legislation or not. And if they spend the money and time to make those improvements to get government contracts, the expectation is they’ll do it for everybody else. Call it trickle-down quality, safety, and security.
“What you’ll see is a potential flow down of expectations from the EO and U.S. government to other critical infrastructure,” London said.
The EO covers multiple issues, including sharing of threat information, securing the cloud, and better detection and response to cyber incidents.
But software, which runs every system, network, application, and device in the online world, is at the heart of improving cyber security. Section 4 of the EO, titled “Enhancing Software Supply Chain Security,” seeks to address vulnerabilities that have enabled attackers to jeopardize not just private companies’ bottom lines but also the U.S. economy and critical infrastructure.
It calls for a definition of “critical software,” which the federal National Institute of Standards and Technology (NIST) has since issued. In a white paper released earlier this month, NIST said critical software is “any software that has, or has direct software dependencies upon, one or more components with at least one of these attributes:”
- Is designed to run with elevated privilege or manage privileges;
- Has direct or privileged access to networking or computing resources;
- Is designed to control access to data or operational technology;
- Performs a function critical to trust; or,
- Operates outside of normal trust boundaries with privileged access.
London said that since that definition was issued, NIST has published a set of security requirements for critical software that align with standards already available from NIST and other organizations, covering measures like access control, data protection, backups and logging, and testing.
A number of those testing measures “are already considered best practices, such as threat modeling, a blend of static and dynamic analysis, review of hard-coded secrets, and black box testing,” he said.
How will this affect the world of business? That was the major focus of the webinar given that, as has been said many times, every company today is a software company — those that don’t build it still have to buy it and use it.
Those who build it “need to understand the software they produce, and if it falls within the category of critical software by understanding its critical functionality within the client environment,” London said.
Both builders and buyers of software need to know “where software code is coming from, where it originates, who authored it, why it’s there, and how it was tested,” Mackey said.
How to do that? One of the major ways, and soon a mandatory one (at least for vendors to the federal government), is a software bill of materials (SBOM).
“One of the things the EO recognizes pretty early and consistently is a requirement to understand where code is coming from,” Mackey said.
An SBOM that complies with the EO must include the following minimum elements (a sketch of a compliant record follows the list):
- Supplier name: the name of the entity that creates, defines, and identifies components.
- Component name: designation assigned to a unit of software defined by the original supplier.
- Version of the component: identifier used by the supplier to specify a change in software from a previously identified version.
- Other unique identifiers: other identifiers that are used to identify a component or serve as a lookup key for relevant databases.
- Dependency relationship: characterizing the relationship that an upstream component X is included in software Y.
- Author of SBOM data: the name of the entity that creates the SBOM data for this component.
- Timestamp: record of the date and time of the SBOM data assembly.
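For illustration only, here is a rough sketch of a single component record carrying those minimum elements. The field names and values are hypothetical, and real SBOMs are normally exchanged in a standard format such as SPDX or CycloneDX rather than ad hoc JSON.

```python
import json
from datetime import datetime, timezone

# Hypothetical component record carrying the minimum SBOM elements listed above.
# Field names and values are illustrative only, not any particular SBOM standard.
sbom_entry = {
    "supplier_name": "Example Widgets Inc.",                    # entity that creates and identifies the component
    "component_name": "libwidget",                              # name assigned by the original supplier
    "component_version": "2.4.1",                               # identifies a change from a prior version
    "other_unique_identifiers": ["pkg:pypi/libwidget@2.4.1"],   # lookup keys for relevant databases
    "dependency_relationship": {"included_in": "example-app"},  # upstream component X included in software Y
    "sbom_author": "Example Widgets Inc. build pipeline",       # entity that created this SBOM data
    "timestamp": datetime.now(timezone.utc).isoformat(),        # when the SBOM data was assembled
}

print(json.dumps(sbom_entry, indent=2))
```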
The need to maintain information like that might seem obvious — how can an organization manage, protect, or vet the quality of something it doesn’t even know it’s using? But SBOMs are still not mainstream.
“Historically there has been a lot of opacity around software ingredients, especially for open source and third-party code,” London said. “There are obvious implications here on security.”
Endless rabbit holes
One of the likely reasons software is opaque and SBOMs aren’t mainstream is that compiling and maintaining them can be dizzyingly complex. Tracking the number of components in a single app can feel like trying to navigate an endless network of rabbit holes.
Mackey spoke of a single application that, on the surface, has just eight “dependencies” or components from other suppliers. But just one of those eight has 15 dependencies of its own. And one of those 15 has 30 more.
“So, when you peel back the onion on this, (the app) actually has 133 separate components in it that go eight levels deep,” Mackey said. “The decision to use it means I now have 133 components that are part of my overall supply chain to power this application.”
“Trying to figure out all those dependencies can be a really difficult challenge, but it’s also a scale problem,” he said.
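As a toy illustration of that scale problem, the sketch below walks an invented dependency graph and counts everything an application actually pulls in. The package names are made up; a real tool would read lockfiles, manifests, or binaries instead.

```python
from typing import Dict, List, Set

# Invented dependency graph: only three direct dependencies, but each can pull
# in dependencies of its own. Real applications fan out far more than this.
DEPS: Dict[str, List[str]] = {
    "my-app": ["lib-a", "lib-b", "lib-c"],
    "lib-a": ["util-1", "util-2"],
    "lib-b": ["util-2", "util-3"],
    "util-3": ["low-level-x"],
}

def walk(component: str, seen: Set[str], depth: int = 0) -> int:
    """Visit every transitive dependency once; return the deepest level reached."""
    deepest = depth
    for child in DEPS.get(component, []):
        if child not in seen:
            seen.add(child)
            deepest = max(deepest, walk(child, seen, depth + 1))
    return deepest

seen: Set[str] = set()
levels = walk("my-app", seen)
print(f"{len(seen)} transitive components, {levels} levels deep")
```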
That’s not the only challenge. Mackey noted that if an application is going to be packaged in a container, “I have some additional attack profiles that I need to address, so when the build process for that Docker image is run, it’s actually referencing the correct artifact, and the images it’s using for a base image are actually hardened to some form of industry standard.”
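One small, concrete example of the kind of build-time check that addresses part of this, assuming the build uses a Dockerfile at a known path: the sketch below flags any FROM line whose base image is not pinned to an immutable digest, so the build references exactly the artifact that was vetted. Digest pinning is only one piece of base-image hardening, not the whole of it.

```python
import re
from pathlib import Path

# Matches FROM lines pinned to an immutable digest, e.g. FROM ubuntu@sha256:<64 hex chars>
PINNED = re.compile(r"^FROM\s+\S+@sha256:[0-9a-f]{64}\b", re.IGNORECASE)

def unpinned_base_images(dockerfile: Path) -> list:
    """Return FROM lines in the Dockerfile that do not pin an image digest."""
    froms = [
        line.strip()
        for line in dockerfile.read_text().splitlines()
        if line.strip().upper().startswith("FROM ")
    ]
    return [line for line in froms if not PINNED.match(line)]

# Hypothetical path; point this at whatever Dockerfile your build actually uses.
for offending in unpinned_base_images(Path("Dockerfile")):
    print(f"Base image not pinned to a digest: {offending}")
```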
Then there is the end-of-life problem. If the creators of a component abandon it, putting it in a “deprecated” state, that means patches and updates will no longer be issued if vulnerabilities are discovered, which will make it an increasingly attractive target for hackers.
The results of that kind of complexity are predictable. Mackey cited the Open Source Security and Risk Analysis, an annual Synopsys report on the use and security of open source software, which now makes up the large majority of most codebases. The latest report found that the average commercial codebase had 528 components.
Also, no surprise, it found that 75% of organizations take longer than a week to resolve high-severity vulnerabilities.
That means, even with the compilation of an SBOM, there is plenty of work still to do. “The SBOM is just the beginning,” Mackey said. “You need to define your process.
“That includes having implementation plans for what you’re going to do when you receive an SBOM. How are you going to deal with legacy systems? Do you have any compliance issues that are going to be at the regulatory level? How are you automating all of this? What is your vulnerability response plan for anything related to a supply chain issue?”
Automate!
Obviously, there’s no way to do all that manually. So the solution is “automation, automation, automation,” he said.
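As a minimal sketch of one automatable step, assuming an incoming SBOM arrives as a JSON list of component records using the same hypothetical field names shown earlier, the check below flags any entry missing the EO/NTIA minimum elements before it enters the rest of your process.

```python
import json
import sys

# The minimum elements listed earlier in this article, using the same
# hypothetical field names as the example SBOM record above.
REQUIRED_FIELDS = {
    "supplier_name", "component_name", "component_version",
    "other_unique_identifiers", "dependency_relationship",
    "sbom_author", "timestamp",
}

def missing_fields(sbom_path: str) -> dict:
    """Map each component (by name or index) to the minimum elements it lacks."""
    with open(sbom_path) as f:
        entries = json.load(f)
    problems = {}
    for i, entry in enumerate(entries):
        missing = REQUIRED_FIELDS - set(entry)
        if missing:
            problems[entry.get("component_name", f"entry #{i}")] = sorted(missing)
    return problems

if __name__ == "__main__":
    issues = missing_fields(sys.argv[1])
    for name, fields in issues.items():
        print(f"{name}: missing {', '.join(fields)}")
    sys.exit(1 if issues else 0)
```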
Fortunately, automated tools are available. Mackey noted that Black Duck, a Synopsys software composition analysis tool, automates finding open source software components along with any known vulnerabilities and possible licensing conflicts. He said Black Duck is now able to export an SBOM that complies with the EO “as articulated by the NTIA [National Telecommunications and Information Administration] on July 12. It will generate an SBOM you can use as part of any requirements that your organization has around the EO.”
All of which suggests that a year from now the world of software security really may be different, whether the catalyst is prodding from the EO or market pressure.
Sammy Migues, principal scientist with the Synopsys Software Integrity Group, said in a recent Forbes story on tech trends for the coming year that “more people will demand to know what their software is made of. Whether it’s a ‘nutrition label,’ ‘bill of materials’ or something similar, organizations will demand that vendors account for all the software used in apps and devices, where the software came from, how it was built and tested, and how it’s being maintained.”
“In a few years, selling opaque software will be the exception rather than the rule,” he said.
Which would be very good for the good guys and bad for the bad guys.