“Our cybersecurity ‘industry best practices’ keep allowing breaches” makes some good points.
Nah I’m kidding. But the title is true — why?
Infosec had a hilarious, near-universal meltdown when a business school IT prof, Allen Gwinn, somehow got a security opinion piece published in an inside-the-Beltway political blog. Ok, I guess? It was… not wholly accurate.
Well, someone’s gotta take this one on — and really, who better to comment than a twice-canceled social pariah former-tech-journalist-slash-former-mediocre-sysadmin on what really causes breaches?
Breaches and bugs are not an inevitable part of every complicated thing we make — no one has yet taken over a Mars rover for a joyride, or used the ISS to mine Bitcoin. But there’s a reason that when I talk about reliable software I tend to pick the easy route and talk about old NASA work or the like. Space agencies have the luxury of taking things slow and throwing gobs of work hours at projects before they ever leave the door, much less go to space. Industry doesn’t, and at least in the foreseeable future, won’t get to do that.
But, you see, racing every release out the door on an obsolescence schedule (often the only thing that drives the high revenue streams both billionaire owners and nihilistic markets demand) *kinda* precludes a lot of slow and careful process reflection.
And it is process that’s the problem. Whether it’s actual insider threats or single dads smuggling all your corporate data onto file-sharing sites so they can keep their jobs while looking after a sick kid from home, your company data ends up all over Pastebin because your people can’t, or don’t know how to, do what they need to in ways that are safe.
This is all doubly true for ransomware — a problem that was efficiently solved decades before it was born, like buffer overflows on websites. (The first is solved with backups, the second by using a memory-safe language. You’re welcome! For actual implementation, hire an actual professional and pay them actual money.) (Also, you have to listen to them after that. How do I have to say this part in 2021? But you do actually have to listen to them and implement the fixes, or your shit stays broken.)
Almost all security problems can be solved by slowing your roll, auditing your systems and code, and taking real time to educate your workforce. But you’re not going to do that because it costs real money and time, not just “competitive salaries,” and no one ever fires the board and CEO for the mere fact of sucking at their jobs. You fire your sysadmin when the CEO sucks at their job, obviously.
At least insurance companies are getting wise, and are becoming the unlikely heroes of public security. Which says something about what counts as a “hero” in this system. Insurance companies are beginning to require the shit they insure get fixed, bless them. I cannot tell you how many times I’ve heard that some red team friend of mine got a contract, happily pwned everything, and submitted their report — to which the CXOs replied by buying more insurance. This, by the way, made every single one of my red team friends deeply unhappy, and is why I’ve heard these stories over booze so often. Insurance companies have started to hear these stories too, and have somehow become our allies by requiring more auditing and evidence that problems get fixed? It’s a weird world these days.
Seriously management, you are the problem. If there’s a security breach, sure, your sec staff might be incompetent, or maybe you sent them into the breach armed with a pointy stick. And a six figure salary, since that’s what you think makes things secure.
If you can’t audit your code, audit your processes, segregate your network and functions, and — how am I having to say this in 2021? — back up your shit, then yeah, it’s all getting owned, ransomware is an industry, and society’s infrastructure is just going to get more expensive while being increasingly shit from the word go.
Software engineering isn’t. And while I love to blame programmers and industry norms for absolutely everything being shit as much as the next systems administrator (and former UI person… programmers, I have so many reasons to hate you!), I will grudgingly admit that the larger problem for the security people and the programmers is that leadership truly, really gives not a single flaming shit until that flaming shit is giant and covering a significant portion of global infrastructure.
At that point it’s definitely someone else’s fault: bad hackers, bad security, bad something-or-other that isn’t management, government, or some other part of society that demands too much, too fast, with far too little investment.
Why do “Our cybersecurity ‘industry best practices’ keep allowing breaches”? Because you get what you pay for.