Published in Nerd For Tech

Will banning the shame game on hacking victims improve cyber information sharing?

Nobody likes being shamed. That’s why, in many areas of life, it’s essentially banned. Especially when it comes to things largely out of people’s control, like body type or addiction. There’s nothing redeeming or constructive about it — it’s simply cruel and destructive. And you can get fired for doing it.

But the line isn’t as clear in areas like cybersecurity, where organizations do have some control over how vulnerable they are to online attackers. If they are breached by hackers, are they helpless victims deserving only of sympathy? Or, if they’ve failed to implement security fundamentals they’ve heard about for years, do they bear some responsibility and deserve some criticism?

Maybe shaming, or the fear of it, would get them to do what they should have been doing all along.

There’s no broad consensus on that. Some organizations argue that it doesn’t really matter whether their cybersecurity is rigorous or not because a determined hacker or hostile nation state will eventually get through no matter what they do. Avoiding a major breach is more a matter of luck, they contend, than whether one has implemented the “layered security” that has been preached in the industry for decades.

Indeed, a post in The Hill from more than five years ago argued that expecting organizations to be secure from cyberattacks would be like expecting your local corner grocery to have “a fortified perimeter — barbed wire, tanks, and guard towers looming while helicopters and fighter planes scream overhead.”

The counterargument is that while nothing can make you bulletproof online, it is possible to become a much harder target.

Joseph T. Carrigan, senior security engineer at the Johns Hopkins University Information Security Institute, said carrying the analogy to extremes misses the point: nobody expects a corner grocery to have a military presence for security, but nobody expects the door to be left unlocked every night, either.

“There are best practices in loss prevention, and they don’t involve fighter planes,” he said. “There are also best practices in cybersecurity that can be affordably implemented by most companies.”

Undermining community

But another argument is that shaming, even for security failures that enable a breach, undermines a sense of community and information sharing that could improve security overall.

In a recent interview with TechRadar Pro, SolarWinds CEO Sudhakar Ramakrishna called for an end to cyberattack “victim shaming” because it makes companies unwilling to share vital intelligence with other potentially vulnerable organizations about a breach.

Ramakrishna has cause to be more sensitive about the topic than most, given that SolarWinds was the victim of one of the most notorious cyberattacks of 2020. Hackers believed to be directed by the SVR, Russia's foreign intelligence service, slipped malware into an update of Orion, the company's IT performance monitoring platform.

So besides the usual “bug fixes, performance enhancements, and other improvements” that come with a normal update, between March and June of 2020 the corrupted update spread malware to an estimated 18,000 SolarWinds customers when they did what experts are forever telling them to do: Keep your software up-to-date!

It was a stark example of the risks of a vulnerable software supply chain. Instead of having to hack into thousands of organizations separately, the attackers just compromised one vendor and let supply chain connections take care of the rest, giving them access to the data and networks of those SolarWinds customers.

Among the downstream victims were tech giants Microsoft, Intel, and Cisco. Among the federal agencies compromised were the departments of State, Justice, Commerce, and Treasury, plus NASA, the FAA, the National Institutes of Health, and the National Nuclear Security Administration.

The list even included the Cybersecurity and Infrastructure Security Agency, within the Department of Homeland Security, whose job it is to protect federal computer networks from cyberattacks.

The blame game

So was this the result of weak security or an example that nobody is bulletproof? Apparently some of both. The attackers were highly sophisticated and used a technique that was unprecedented and floored some of the experts who sought to analyze it after the fact. One described it as “phenomenal tradecraft.”

Ramakrishna wasn’t in charge at SolarWinds when the hack occurred; it began in September 2019 and did most of its damage in mid-2020, and he didn’t become CEO until January 2021. But he told TechRadar that the result of victim shaming is that “companies often end up fixing problems without saying anything about them. There is definitely hesitation to speak up.”

“Each one of us is defending against an attacker. But on one side is a coordinated army with a singular purpose, to attack, and on the other is a set of fragmented soldiers. In the event of an incident, it’s important to leverage help from the community,” he said. “We need to make people aware of issues faster; that mindset needs to establish itself in software security.”

But Carrigan said holding organizations accountable isn’t gratuitous shaming. “If you sell a network management tool to large swaths of the federal government, the security of that product must be a primary concern,” he said, noting that both the New York Times and Bloomberg reported after the breach was disclosed that SolarWinds had been warned in 2017 by one of its internal experts that its lax security could lead to a catastrophic cyberattack.

Also, the company apparently wasn’t eager to share details of the attack with the “community,” at least when it was discovered. National Public Radio (NPR) reported in April 2021 that the company’s first move was to hire a lawyer.

“One of the first things companies tend to do after cyberattacks is hire lawyers, and they put them in charge of the investigation,” NPR reported. “They do this for a specific reason — it means everything they find is protected by attorney/client privilege and typically is not discoverable in court.”

Carrigan added that, in the company’s defense, when the attack became public, “SolarWinds was very transparent about the incident. That transparency helped identify those affected.”

Still, he said he doubts that if victim shaming stopped, more organizations would share information about being breached. “I think many companies would be perfectly willing to never disclose any incident like this,” he said. “I also think there are many incidents that go unreported because the company was either successful in keeping it quiet or was unaware that an incident occurred.”

That would indicate that it’s not so much the threat of public shaming that makes companies hesitant to share the details of an attack — it’s more the risk of brand damage, potential legal liability, and regulatory sanctions for security failures.

The one-way street

Ramakrishna didn’t respond to a request for comment. But information sharing on cyberattacks has been a contentious issue for decades, and most of the complaints haven’t been about shaming.

Instead, privacy advocates have argued for years at major security conferences that the government version of information sharing is a “one-way street,” in which the private sector is expected to share while the government doesn’t.

And private organizations are understandably loath to share anything about being the victim of a cyberattack since it could bring them even more misery than trying to recover from the attack itself — bad publicity and legal liability.

There is, or should be, a major difference between the response to an organization that is breached despite rigorous security and one that is breached because it was careless. The notorious 2017 breach of credit reporting giant Equifax was enabled by the company’s failure to install a patch that had been available for several months.

After that debacle, nobody had much sympathy for then-CEO Richard Smith when he was called before a congressional committee for some rhetorical flogging.

John Tapp, senior security consultant within the Synopsys Software Integrity Group, said whatever shaming and scolding happens “is at its worst when it appears as though the victim is behind the curve and unable to reset their security posture from a reactive one.”

And Carrigan said some shaming or criticism can serve a useful purpose — better security. Media reports of lax security at SolarWinds “led to an internal change to integrate security into the corporate culture at SolarWinds. I don’t think that happens if the organization is not taken to task and asked the hard questions,” he said.

No more choice

Whether shaming inhibits information sharing or not, it looks like the debate over it is on its way to becoming irrelevant, at least in critical infrastructure industries. Information sharing is becoming mandatory. Starting May 1, banking organizations and their service providers will be required to notify federal regulators within 36 hours of any computer-security incidents of a certain severity.

Yes, required — not recommended. It’s a rule, issued jointly last November by the Federal Deposit Insurance Corporation, the Federal Reserve, and the Office of the Comptroller of the Currency.

While the agencies oversee different components of the financial industry, the rule is essentially the same for all of them for an attack that rises to the level of a “notification incident.”

That means any event that has “materially disrupted or degraded, or is reasonably likely to materially disrupt or degrade,” a banking organization’s ability to operate or deliver its products and services.

Examples of notification incidents include “a major computer-system failure; a cyber-related interruption, such as a distributed denial of service or ransomware attack; or another type of significant operational interruption.”

In addition, President Biden signed legislation last month that for the first time in internet history will require operators of U.S. critical infrastructure (CI) to report “significant” cyber incidents and any ransomware payments to the federal government.

The Cyber Incident Reporting for Critical Infrastructure Act of 2022 was adopted as part of the 2,741-page, $1.5 trillion omnibus spending bill approved in March. There have been information-sharing initiatives in the past, but they have been voluntary. This one isn’t. And it covers 16 CI sectors including food and agriculture, information technology, energy, healthcare, water supply, finance, nuclear, and transportation.

So while it could take more than three years for the rulemaking process to be completed, it looks like cyberattack information sharing will no longer be optional.

That, Carrigan said, is the way it should be. “It’s important for the community to be helpful,” he said. “By and large I think all of us in cybersecurity are willing to do what we can. That doesn’t stop us from being frustrated by companies that experience breaches.”



Taylor Armerding

I’m a security advocate at the Synopsys Software Integrity Group. I write mainly about software security, data security and privacy.