The Start of Accountability or the End of Software?

What’s the Ukrainian word for “scapegoat”?

The small Ukrainian tax software company accused of being patient zero in a damaging global cyberepidemic is under investigation and will face charges, the head of Ukraine’s national Cyberpolice unit suggested Monday. Col. Serhiy Demydiuk said in an interview with The Associated Press that employees of Kiev-based M.E. Doc had blown off repeated warnings about the security of their information technology infrastructure. “They knew about it,” he told the AP at his office. “They were told many times by various anti-virus firms. … For this neglect, the people in this case will face criminal responsibility.”

For the software liability crowd this is going to come as good news. “Finally!” they cry, “people will start to be held accountable for their carelessness and negligence!” That seems all well and good; who doesn’t want to hold people accountable for preventable problems? But is this really the path we want to go down?

Everyone gets behind liability in the physical world because the failures that trigger it are usually linked to deaths, often a lot of them in a short period of time. If there is one thing we learn from, it is death.

Here is the problem with that sort of thinking as it migrates to cyberspace: victims have minimal control over how hard or easy it is to defend themselves. Beyond issues related to basic functionality, they don’t know what’s going on inside the operating systems they use, they don’t know what the chips inside their computers are doing, they don’t know what the (commercial) software they’re running is doing, and they don’t have any control over the bytes that flow into their enterprise.

Bad guys know far more about all of the aforementioned technology because they’re happy to violate terms of service, license agreements, and so forth in order to learn where and how things are exploitable. There is money in breaking things, and they’re good at it to the tune of billions of dollars a year.

The flip side of the coin is that there is money in running an efficient business as well, which means focusing on functionality, not security, because the vast majority of people using a given piece of software or service do so for legitimate, authorized, legal reasons. This applies both to the users of such technology and to those who make it. Everyone is responding to market forces, and if there is one thing more powerful than the righteousness of someone’s security-based argument, it’s the invisible hand.

Holding victims liable for not addressing vulnerabilities — absent clear and compelling evidence of negligence — is blaming the victim. The online version of ‘she was asking for it because look how she was dressed.’ If that last sentence strikes you as indefensible in meat-space, you shouldn’t be entertaining it as a viable excuse in cyberspace.

Initial media reports suggest the victim in this case received repeated warnings to address these issues but failed to do so. If that’s true, it’s a compelling argument to hold them at least partially responsible for what happened, because no one operating online today can claim ignorance of risks or threats. But let’s be clear about where this leads us if this becomes a precedent:

Software gets a lot more expensive. A LOT more expensive. How much more? I’m guessing orders of magnitude more. That’s not just the direct cost of checking and re-writing code, but also the cost of complying with new regulations that will inevitably come down the pike.

Software is still going to be exploitable. Even after all that cost and effort, there will still be weaknesses. Nothing operates in a vacuum. Do you know how many discrete “<standard> compliant” technologies stop being compliant the moment they are installed in some environment that is not lab-perfect? Pretty much all of them.

Progress takes a holiday. Every technology-related advance we’re counting on today — both on a personal and societal level — comes to a grinding halt. Software will stop eating the world and go into hibernation. The need for programmers will go up, but it’ll be because they’re needed to fix old stuff, not develop new stuff. Everything everyone is hoping becomes the next big thing immediately transitions from unicorn to potato bug.

Security is not the issue security practitioners and proponents think it is. It just isn’t. This doesn’t mean we stop trying to make things better; it does mean we need to start thinking of ways to accomplish what we want without impeding progress, causing immediate harm in a quest for long-term good, or tanking the economy.
