Software development and security: Still an uneasy alliance

There are two crucial goals for people who develop software products: First, make sure they do what they’re supposed to do. Second, make sure nobody can make them do what they’re not supposed to do.

Or, more briefly, build both quality and security into software.

Those tasks are equally important. If an application won’t work, it’s useless. But it doesn’t really matter how well an app works or how many cutting-edge bells and whistles it has if users can’t trust it to keep them secure from hackers, who are forever trying to exploit bugs and other defects in software to do everything from stealing identities and money to threatening the physical safety of their victims.

So you might think that anybody who has the technical savvy to write and/or assemble software code components would also have the savvy to know how to keep software secure.

Some of them do. But many don’t, not because they’re incompetent but because it’s a different skill set. As Michelle Drolet, CEO of the cybersecurity firm Towerwall, put it in Forbes, “Security requires high expertise, making it nearly impossible to be competent at everything.”

Sammy Migues, principal scientist within the Synopsys Software Integrity Group, said having “also understands software security” on a developer’s resumé is “more like a great-to-have second skill set — a specialization — rather than something core to what it means to be a developer.”

“If you’re trying to make every developer in your organization also be an architect, a compliance expert, a developer, a tester, and an operations person, and each of those with some security savvy, you’re doing it wrong and you’re not paying those people enough,” he said.

That doesn’t mean they can’t take direction from security experts. Migues noted that most carpenters can’t explain how alarm systems are bypassed and most automobile assembly line workers can’t explain how cars are stolen. “But they can all follow best practices,” he said. “Similarly, most developers can’t explain how software is attacked and exploited, but they can follow best practices.”

Best-practices conundrum

This isn’t always the case, however. The ongoing success of cyberattacks that exploit bugs and other software defects is due, in significant measure, to the pressure on developers to produce — fast. Facebook, on Android alone, runs between 50,000 and 60,000 builds each day. Amazon reportedly deploys new software to production every second — that’s 86,400 deployments a day.

The reality is that speed usually trumps security. If using security best practices slows developers down, they are apt to let it slide.

Developers have been up front about that. Migues has been a coauthor of the “Building Security In Maturity Model” (BSIMM), an annual Synopsys report on how organizations in various industries improve their software security, for more than a decade. In the 2020 BSIMM report, developers told researchers, “We’d love to have security in our value streams if you don’t slow us down.”

Indeed, Chai Bhat, security solutions manager at the Synopsys Software Integrity Group, said that given the popularity of the so-called DevOps methodology of creating software projects, “developers have metrics on how fast and frequently they develop/deploy software. Security can easily become an afterthought in the DevOps age.”

And Steve Sims, faculty fellow at the cybersecurity training firm SANS Institute, said that “it takes more design and planning, and often more code, to do things securely.”

“Development teams are often under immense pressure and doing things like training, even in small chunks, as well as secure design, threat modeling, code review, dynamic analysis, etc. are all time-consuming.”

Beyond that, Sims said, there is some institutional inertia. “Newer developers tend to be more security-minded, but legacy developers, who have more business maturity and experience, might be less flexible with changing. Also, there’s a lot of legacy codebases out there. Rewriting everything securely is incredibly time-consuming.”

But there is danger — real danger — in underestimating the stakes of leaving security as an afterthought, or no real thought at all. Software doesn’t just run your email, social media accounts, and online shopping. Today it runs just about everything, from your car to your appliances to your home security to the critical infrastructure you rely on — water, electricity, sewer, traffic signals, energy, the financial system, and more.

Existential implications

Which means that building security and trust into software projects has existential implications. And doing so requires guidance and training for developers in those best practices Migues cited.

Some of that guidance can be delivered through technology. Governance-as-code is the practice of configuring the platforms developers are using to prevent them from doing risky things or making common mistakes. Migues has in the past likened it to parents setting rules in the house about roasting a hot dog: you don’t let the kids build a fire in the living room; you make them do it the safe way, using the stove in the kitchen.

“Focus on preventive measures that make it extremely difficult for someone to do something wrong unnoticed,” he said.
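
To make that concrete, here is a minimal sketch of what governance-as-code can look like in a build pipeline, assuming Python and a hypothetical deny list maintained by the security team (the file name, packages, and reasons below are illustrative, not drawn from any real policy): a check that fails the build whenever a pinned dependency is on the list, so the risky choice can’t merge unnoticed.

```python
# Minimal governance-as-code sketch (hypothetical policy and file names).
# Fails the build if a dependency pinned in requirements.txt appears on a
# deny list maintained by the security team.
import sys

# Hypothetical deny list: pinned package -> reason it is banned.
DENY_LIST = {
    "requests==2.5.0": "known vulnerability, upgrade to a patched release",
    "pyyaml==3.12": "unsafe load() behavior, upgrade required",
}

def check_requirements(path: str = "requirements.txt") -> int:
    violations = []
    with open(path) as fh:
        for line in fh:
            pin = line.strip()
            if pin in DENY_LIST:
                violations.append(f"{pin}: {DENY_LIST[pin]}")
    for v in violations:
        print(f"POLICY VIOLATION: {v}")
    # A non-zero exit code makes the CI job fail, so the change cannot
    # merge until the banned dependency is replaced.
    return 1 if violations else 0

if __name__ == "__main__":
    sys.exit(check_requirements())
```

In a real pipeline the same idea shows up as branch-protection rules, pre-commit hooks, or policy engines; the point is that the check runs automatically, not that it lives in any particular script.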

There are also regular exhortations at security conferences to “make the secure way the easy way” to develop software, which would also make it the fast way.

One way to do that is with intelligent orchestration — automated software security testing that runs the right test at the right time throughout the software development life cycle (SDLC), without missing real issues (false negatives) and without overwhelming developers with false positives or flags on trivial defects that affect neither the security nor the quality of the product.
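
As a rough sketch of that idea (using hypothetical scan names and pipeline stages, not any particular product’s behavior), the orchestration logic boils down to running fast, incremental checks where developers are waiting and pushing heavier scans off the critical path:

```python
# Illustrative "right test at the right time" logic. The scan names, stages,
# and triggers here are assumptions for the sake of example.
from typing import List

def select_scans(changed_files: List[str], stage: str) -> List[str]:
    scans = []
    if stage == "pull_request":
        # Fast, incremental checks that keep the developer feedback loop short.
        scans.append("static analysis (changed files only)")
        if any(f.endswith(("requirements.txt", "package.json")) for f in changed_files):
            scans.append("software composition analysis")
    elif stage == "nightly":
        # Slower, deeper checks run off the critical path.
        scans += ["full static analysis", "dynamic analysis", "fuzzing"]
    return scans

print(select_scans(["app/views.py", "requirements.txt"], "pull_request"))
```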

Finally there is developer training, which can elevate DevOps into DevSecOps, where security is embedded into the SDLC from planning to production and everybody on the team is aware of security best practices.

There is a wealth of information online about secure coding — enough to overwhelm most developers. But there are summaries that can make learning best practices more manageable. Among them is the Open Web Application Security Project’s (OWASP) Secure Coding Practices — Quick Reference Guide.

Even a summary will take time to absorb. The OWASP guide covers 14 practices that include advance planning — for example, an external application will need more and different security than an internal one — access control, ways to validate user input, and authentication and password management.

But while it won’t turn developers into security experts overnight, it will raise their security awareness over time.
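
To give a flavor of what those practices look like in code, here is a short Python sketch of two of them, input validation and password management, under assumed conditions (the username rule, table layout, and iteration count are illustrative choices): input is checked against a whitelist before use, it reaches the database only through a parameterized query, and only a salted hash of the password is stored.

```python
# Hedged sketch of two secure coding practices: input validation and
# password storage. Table layout and username rule are illustrative assumptions.
import hashlib
import os
import re
import sqlite3

USERNAME_RE = re.compile(r"^[A-Za-z0-9_]{3,32}$")  # whitelist, not a blacklist

def hash_password(password: str, salt: bytes = b"") -> tuple:
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def create_user(conn: sqlite3.Connection, username: str, password: str) -> None:
    if not USERNAME_RE.fullmatch(username):
        raise ValueError("invalid username")  # reject bad input, don't "fix" it
    salt, digest = hash_password(password)
    # Parameterized query: user input never becomes part of the SQL text.
    conn.execute(
        "INSERT INTO users (name, salt, pw_hash) VALUES (?, ?, ?)",
        (username, salt, digest),
    )

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, salt BLOB, pw_hash BLOB)")
    create_user(conn, "alice_01", "correct horse battery staple")
    print("stored a salted hash, not the password itself")
```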

Awareness training

There are also security awareness training programs specifically aimed at developers. The SANS Institute offers one that covers the OWASP Top 10 list of critical web application security risks.

Beyond that, software developers are likely aware that most of their projects involve assembling software components more than writing code, with the large majority of those components coming from commercial third parties or open source projects.

Open source is free, which is one of the reasons it’s so popular — it’s included in nearly all codebases (98%) and makes up the majority (75%) of most of them. But it usually comes with licensing restrictions and needs to be carefully tracked because patches or updates for known open source vulnerabilities aren’t “pushed” to users. They have to be “pulled” from whatever volunteer community is maintaining the component.

So if developers don’t maintain an inventory, known as a software bill of materials (SBOM), of every component they’re using, they’re unlikely to know when one of them needs to be patched. One of the mantras in software security is “you can’t protect what you don’t know you have.” The corollary is that an unknown, vulnerable component can’t be patched, and it can’t protect you either.

An automated software composition analysis tool can help with creating an SBOM by tracking open source components, their licensing requirements, and any known vulnerabilities. But a complete SBOM can get complicated, because the “surface” components of a software project aren’t all there is. Most components depend on other libraries or packages to function, and those dependencies frequently depend on still others, meaning the supply chain of dependencies can go multiple levels deep and exponentially increase the number of components to track in a single application.
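
A toy example shows how quickly that multiplication happens. The dependency graph below is hypothetical and hard-coded; a real software composition analysis tool would resolve it from lockfiles and package registries. Even so, two declared components turn into several times as many once their transitive dependencies are followed:

```python
# Toy illustration of transitive dependencies inflating an SBOM.
# The graph is hypothetical; real tools resolve it from lockfiles/registries.
DEPENDENCIES = {
    "web-framework": ["template-engine", "http-client"],
    "http-client": ["tls-lib", "url-parser"],
    "template-engine": ["markup-escaper"],
    "report-generator": ["pdf-lib", "http-client"],
}

def resolve(declared):
    """Return the full set of components reachable from the declared ones."""
    seen = set()
    stack = list(declared)
    while stack:
        pkg = stack.pop()
        if pkg in seen:
            continue
        seen.add(pkg)
        stack.extend(DEPENDENCIES.get(pkg, []))  # follow transitive dependencies
    return seen

declared = ["web-framework", "report-generator"]
print(f"declared: {len(declared)}, total to track in the SBOM: {len(resolve(declared))}")
```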

That means developers need to be careful about the open source components they choose to use. Jonathan Knudsen, head of global research within the Synopsys Cybersecurity Research Center (CyRC), wrote recently that “managing risk in the software supply chain means security must be considered at the time of component selection. The development process needs to have some safeguards so that when developers choose components, they base that choice on risk and not solely on functionality.”

Ultimately, there is no way around the reality that good security requires an investment of time and money. Sims noted that security technology is progressing — newer programming languages like Rust are fast but also offer “strong memory management and type safety built in.”

But using it requires “learning new things and rewriting old code, all while keeping up with feature requests and new projects. It’s challenging to break old habits — stop copying and pasting old pieces of code, cutting corners, etc.,” he said.

So the challenge for security advocates is to convince developers that those investments will pay dividends. It is always easier and cheaper to fix defects early in the SDLC than at the end. Security awareness will likely slow down the development process, Migues said, “but it might speed up the overall value stream by preventing some rework.”

Preventing rework could be music to a developer’s ears.

Taylor Armerding

I’m a security advocate at the Synopsys Software Integrity Group. I write mainly about software security, data security and privacy.