Attention, makers of lousy software. Biden is coming after you

Taylor Armerding
Published in Nerd For Tech
Feb 13 · 7 min read


It was the late stand-up comic Professor Irwin Corey who injected some humor into an obvious truth: “You can get a lot more with a kind word and a gun than you can with a kind word.”

As in, sometimes force works better than persuasion. And it looks like the federal government will soon attempt to apply a version of that philosophy to cybersecurity. The message from the Biden administration’s impending “National Cybersecurity Strategy,” expected to go public later this month, is that since organizations have failed to practice rigorous cybersecurity voluntarily — when the feds requested it — the request is about to turn into a command, beginning with critical infrastructure but likely spreading beyond that.

As the Washington Post put it, the strategy “for the first time calls for comprehensive cybersecurity regulation of the nation’s critical infrastructure, explicitly recognizing that years of a voluntary approach have failed to secure the nation against cyberattacks.”

According to the Post, which cited a draft copy of the strategy, the “gun” will be to shift liability for cyberattack damages onto organizations “that fail to take reasonable precautions to secure their software.”

The Post quoted National Cyber Director Chris Inglis saying at a cyber conference last September that if “self-enlightenment and market forces take us [only] so far […] then we have to go a little bit further as we have for cars, or airplanes, or drugs and therapeutics.”

According to Paul Rosenzweig, founder of Red Branch Consulting and a senior advisor to The Chertoff Group, it’s about time.

“This is truly revolutionary. It is risky. It is ambitious. It is new and different,” he wrote on the Lawfare blog, adding that even if it gets watered down by inevitable lobbying from businesses that will be the targets of proposed regulations, “one has to admire the ambition. Kudos to the Biden administration for putting the issue on the table.”

Actually, it’s been on the table within the cybersecurity industry for quite a while. Bruce Schneier, blogger, author and chief of security architecture at Inrupt, testified before the House Committee on Energy and Commerce in 2016, arguing that government intervention was needed to make the Internet of Things more secure.

The market wouldn’t do it, Schneier said, contending that the lack of security is “a form of invisible pollution. And, like pollution, the only solution is to regulate.”

In 2017 he wrote on his blog that “the [high tech] industry is filled with market failures that, until now, have been largely ignorable. As computers continue to permeate our homes, cars, businesses, these market failures will no longer be tolerable. Our only solution will be regulation, and that regulation will be foisted on us by a government desperate to ‘do something’ in the face of disaster.”

His recommendation in that blog post reads like the template for the Biden strategy, six years later. “We need government to ensure companies follow good security practices: testing, patching, secure defaults — and we need to be able to hold companies liable when they fail to do these things,” he wrote.

That doesn’t mean the world of cyber is currently a lawless free-for-all. The major credit card brands created the Payment Card Industry Data Security Standard, better known as PCI DSS, which is “a set of security standards designed to ensure that ALL companies that accept, process, store, or transmit credit card information maintain a secure environment.” Failure to comply can lead to fines and sanctions.

And state laws like the California Consumer Privacy Act can apply to cybersecurity as well, since unsecure software can lead to data breaches, which compromise privacy.

You can get sanctioned, fined, sued, and more for writing buggy software, for failing to patch known vulnerabilities, or for failing to install available updates and patches.

But Jane Chong, former managing editor of the Lawfare blog, wrote in a post three years ago that liability for those failures tends to be random, inconsistent, and disjointed. She called it “a body without bones: big occasional settlements that strike fear in the hearts of vendors, but paired with little substantive development of the law to reliably guide vendors’ development, monitoring, and patching practices.”

And there is general agreement within the cybersecurity industry that software makers too often get a free pass when they do lousy work. Indeed, their licensing agreements and terms of service (TOS), filled with endless, dense legalese, basically tell users that they (the companies) bear no responsibility for software defects or any damage arising from those defects.

“Since its inception, the software industry has enjoyed an unprecedented and unrivaled ability to fend off product liability and externalize those risks and costs through its licensing agreements,” said Emile Monette, director of government contracts and value chain security at Synopsys.

“They don’t have absolute liability protection of course, but in many ways software consumers operate in a caveat emptor kind of environment.”

And most consumers are far more interested in a software product’s features than they are wary of its possible security defects, so they click “agree” without even reading the TOS document.

The Finland-based cybersecurity company F-Secure famously offered proof of that nearly a decade ago when it created a spoof TOS that required users to turn over their first-born child, and if there were no children, “your most beloved pet will be taken instead. The terms of this agreement stand for eternity.”

Of course, people agreed to it. F-Secure or any other company could run the same experiment today and get the same result. Because nobody reads that stuff.

So clearly the current model isn’t working. But there are still multiple reasons — legal, political, and technological — why federal regulation of cybersecurity didn’t happen six years ago and why it will be difficult to make it happen now.

First, there looks to be significant wiggle room in a term like “reasonable precautions.” Especially when nothing is perfect. Even the most advanced software security programs won’t make organizations or connected devices bulletproof. So enforcement could devolve into lawyers fighting over what “reasonable” means.

Monette said that’s not an insurmountable problem. “Reasonableness is an established legal concept and term — it’s based on what a reasonable individual would do in similar circumstances — so determining it would not be impossible,” he said.

But he acknowledged every case would be “a fact-specific inquiry and the standard would shift over time. For sure, CISA [Cybersecurity and Infrastructure Security Agency] and other agencies charged with implementing some part of the strategy will need to promulgate guidance and possibly even regulations to effect the goals of the strategy. So there will very likely be more detailed information after this is released,” he said.

Rosenzweig said there are plenty of existing standards that could provide the detail necessary to define reasonable. “They might define it by reference to standards like the NIST [National Institute of Standards and Technology] cybersecurity framework, which are pretty detailed, or they might let the common law develop them through experience,” he said, but added that simply requiring some basics that should be obvious would help.

“Some stuff like two-factor authentication is low-hanging fruit, but lots of folks still don’t do it. So start with something that is totally unreasonable, and punish that,” he said.
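Two-factor authentication really is low-hanging fruit in the technical sense as well: the time-based one-time passwords (TOTP) that authenticator apps generate are an open standard, RFC 6238, that fits in a few lines of code. As a rough sketch (not from the article, and using only Python’s standard library; the key shown is the RFC’s published test key, not a real secret):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, when=None, step=30, digits=6):
    """Generate an RFC 6238 time-based one-time password (HMAC-SHA1)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # Moving factor: the number of `step`-second intervals since the Unix epoch.
    counter = int((time.time() if when is None else when) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): take 4 bytes at an offset given by the last nibble.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B test key: the ASCII string "12345678901234567890" in base32.
RFC_TEST_KEY = "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"
```

Checked against the RFC’s published vectors, `totp(RFC_TEST_KEY, when=59, digits=8)` yields `"94287082"`. That the core of a widely deployed security control is this small underlines Rosenzweig’s point: the barrier to adopting it is organizational, not technical.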

Then there is politics. Rosenzweig acknowledged in his blog that opposition would be “intense.” Indeed, that opposition would be coming from some of the biggest, most powerful players in the tech world, who would likely argue that, among other things, the prices for software products would spike if government required them to spend much more on security while also putting them in legal jeopardy.

Monette agreed. “This kind of erosion of the industry’s legal footing will be met with fierce opposition,” he said.

And Schneier said in an interview that while the goal of the strategy is “great,” he is also enough of a realist to know that “lobbying will be heavy by a lot of the rich and powerful, who tend to get what they want.”

Finally, there is the reality that cybercriminals adapt and evolve rapidly, while government doesn’t. Setting and maintaining security standards for software products is not as simple as mandating seatbelts or airbags in vehicles.

Not to mention that government is notoriously poor at securing its own operations. Its catastrophic security failures are well known.

  • The Office of Personnel Management couldn’t protect the personally identifiable information of more than 22 million current and former federal employees.
  • The National Security Agency couldn’t protect its own stash of so-called zero-day vulnerabilities that it hoped to use to spy on, or attack, hostile nation-states or terrorist groups. Instead, the stash ended up in the hands of the hacker group known as the Shadow Brokers.

But Schneier said the expertise part of the solution can be left to the market. “The trick is to regulate outcomes, not mechanisms,” he said. “If government just says you can’t pollute, that leaves how to do it up to the innovation of society. If government says to banks that if they let someone’s money get stolen they have to pay it back, it’s left up to the banks to figure out how to avoid that.”

Monette said he thinks the goal of the strategy is laudable. “The idea that software companies should enjoy greater product liability protection than other industries is problematic,” he said, “and if we make the risk of penalties greater for selling shoddy software, it might cause some companies to pay more attention to security. But that cost would have to go up significantly to get the bulk of the market to move.”

Schneier won’t be holding his breath, given the exemptions and loopholes that the congressional process could create. “If I were betting today, I’d say if it passes it won’t be useful,” he said. “But I’d love to be surprised.”

Rosenzweig sees at least the potential for the glass to end up half full. “Realistically this will be a tough sell,” he said, “but you gotta start somewhere.”




I’m a security advocate at the Synopsys Software Integrity Group. I write mainly about software security, data security and privacy.