There are now $2.41 trillion reasons to improve your software security
To have any hope of succeeding at something, you have to get better at it. By that measure, the drive for better-quality software is sliding down the ladder of success. Things are getting worse, not better.
According to “The Cost of Poor Software Quality in the US: A 2022 Report” by the Consortium for Information and Software Quality (CISQ), the collective bill in the U.S. for defective software this year is an estimated $2.41 trillion, up almost 16% from 2020’s $2.08 trillion. That’s more than the GDP of all but a dozen countries.
And it doesn’t even count an estimated $1.52 trillion in “technical debt” (TD) — accumulated software vulnerabilities in applications, networks, and systems that have never been addressed. That debt is also up 16% since 2020. It isn’t added to the $2.41 trillion because it applies to future costs, but that debt will have to be paid eventually.
Those and other findings illuminate a continuing and alarming state of apparent denial among organizational leaders who know, or ought to know, that software can make or break them. If it’s high quality, with security “built in” throughout the software development life cycle, software can streamline operations, protect assets, and help create and deliver products and services that can make a business prosperous.
If it is written or maintained poorly, with little attention paid to security, software can make an organization an easy target for online attackers who can exploit its vulnerabilities to steal intellectual property, money, and customers’ personal and financial information. It can destroy reputations, leave businesses on the hook for legal and regulatory liabilities, and ultimately put them out of business.
Given that, you might think the vast majority of organizations would make the quality and security of their software a very high priority. Who wouldn’t want to be on the “make” side of make or break?
The biennial report, cosponsored by Synopsys, found that the major reasons for the cost of poor-quality software (CPSQ) continuing to increase are:
- Existing vulnerabilities. While the losses for 2022 haven’t yet been totaled (there’s still a month to go, after all), they continue to increase. Note that these aren’t so-called “zero-day” vulnerabilities that nobody knew about until they were exploited. Existing vulnerabilities are known. In almost all cases there are patches or updates available. They just aren’t being applied.
- Complex problems with the software supply chain. In 2021, 77% of organizations reported an increase in the use of open source software. But meanwhile, the number of failures due to weaknesses in the open source components in software supply chains increased by 650% between 2020 and 2021. In short, organizations are using it more but protecting it less.
- The growing impact of rapidly accumulating TD. The report describes this as “the biggest obstacle to making any changes to existing codebases.”
That’s because its impact is similar to that of growing credit card debt. When it gets too large, borrowers get caught in a downward spiral of paying only interest and never paying down the principal. According to the report, a late-2019 forecast was that by 2025, 40% of IT budgets will be spent simply maintaining technical debt. Already the number of weekly hours an average developer at a company spends on addressing technical debt is 13.5 out of 41.1, or 33%.
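The credit-card dynamic described above can be made concrete with a toy simulation. The figures below (a 500-hour backlog, the 13.5 weekly hours from the report) are illustrative assumptions, not data from the report:

```python
# A toy illustration of the credit-card analogy: if the hours spent on
# technical debt only cover its "interest" (newly accrued debt), the
# principal never shrinks.

def simulate_debt(principal, accrual_per_week, paydown_per_week, weeks):
    """Track a technical debt backlog (in developer-hours) week by week."""
    for _ in range(weeks):
        principal += accrual_per_week - paydown_per_week
    return principal

# A developer spends 13.5 of 41.1 weekly hours on debt (~33%), but if
# new debt accrues at the same rate, the backlog stays flat forever.
flat = simulate_debt(principal=500, accrual_per_week=13.5,
                     paydown_per_week=13.5, weeks=52)
print(flat)       # 500.0 -- interest-only payments

shrinking = simulate_debt(500, accrual_per_week=13.5,
                          paydown_per_week=16.0, weeks=52)
print(shrinking)  # 370.0 -- actually paying down principal
```

The point of the sketch is the inequality, not the numbers: unless remediation effort exceeds the rate at which new debt accrues, the backlog never shrinks.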
What to do about all that?
The overall goals of the CISQ report, according to its author Herb Krasner, retired professor of software engineering at the University of Texas, Austin, and member of the CISQ advisory board, are not simply to document how bad things are, but also to recommend solutions including quality standards, tools for finding and fixing defects, and artificial intelligence/machine learning tools.
But to start, as you might expect, Krasner had no trouble finding examples of the need for those solutions. He cited more than a dozen of the “biggest operational software failure(s) of the last two years,” including
- The 2020 SolarWinds/Orion supply chain attack, which cost the company an estimated $90 million and the affected victims an average of $12 million
- The 2021 T-Mobile hack that compromised the personal data of about 76.6 million customers
- The 2021 Colonial Pipeline ransomware attack that took down 45% of the fuel supply to the U.S. East Coast and caused a major spike in gas prices
- The Log4Shell group of vulnerabilities in the open source logging library Log4j, another software supply chain defect that left millions of web servers vulnerable to hackers
Each of these incidents, and others, was labeled at the time of its discovery a “wake-up call” that would jolt organizations into making the security of their software a higher priority. But the numbers in the report confirm that they weren’t; organizations have been hitting the figurative snooze button.
The report also lists the hottest cybercrime trends of the past two years, with ransomware at the top. The average downtime from those attacks in 2020 was 21 days, the average time to recover was 287 days, the average payment was $312,493 (up 171% from 2019), and total payments were $350 million, up 311% from 2019. The federal Cybersecurity and Infrastructure Security Agency reported in February 2022 that there have been ransomware incidents against 14 of the 16 U.S. critical infrastructure sectors.
Also on the trend list are cryptojacking, deepfakes, videoconferencing attacks, IoT and OT attacks, supply chain / open source software attacks, extended detection and response solutions, and critical infrastructure attacks.
So given the depth and breadth of the threats, much of the remainder of the report is an attempt to prod organizations into staying awake to the dangers of poor-quality software and using that awareness to reap the benefits of improving it. The report offers recommendations in the following areas:
Secure the software supply chain
This is especially true for open source components, which are a prime attack surface. The annual Synopsys Open Source Software and Risk Analysis (OSSRA) report has documented that open source software components are in virtually every codebase and make up 78% of them. Krasner noted in the report that even a medium-sized application has 200 to 300 third-party components in it.
The latest OSSRA report, based on a survey of 2,409 commercial codebases, also found that 88% of the open source components in use were not the latest version, that 85% had components that were more than four years out-of-date, and 88% had not had any development activity in more than four years.
That means far too many organizations are ignoring the key to maintaining the security of those components — keeping an inventory of what they are, who made them, what version is being used, whether they have any known vulnerabilities, and whether they are still being supported.
The way to do that is well established. Tracking open source requires an automated software composition analysis tool, which can then help create a software Bill of Materials (SBOM).
As Synopsys vice president and CISQ board member Anita D’Amico put it, “Just because a component you add to an application is secure today doesn’t mean that it will be tomorrow. That is largely due to the complexity of the software supply chain.”
“Creating an SBOM allows organizations to proactively gather a comprehensive inventory of the components used to make up a piece of software,” she said. “As new and existing vulnerabilities are identified, organizations are fully equipped to act quickly to remedy those issues.”
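The inventory-keeping described above can be sketched in a few lines. In practice the component records would come from an automated software composition analysis scan feeding an SBOM; the hand-typed records, four-year age threshold, and `flag_risky` helper below are illustrative assumptions:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical component records; a real SBOM inventory would be
# generated by an automated software composition analysis (SCA) tool.
@dataclass
class Component:
    name: str
    version: str
    supplier: str
    last_release: date      # date of the component's most recent release
    known_cves: list[str]   # known vulnerabilities, if any

def flag_risky(components, today=date(2022, 12, 1), max_age_years=4):
    """Flag components with known CVEs or no release in max_age_years."""
    risky = []
    for c in components:
        age_years = (today - c.last_release).days / 365.25
        if c.known_cves or age_years > max_age_years:
            risky.append(c.name)
    return risky

inventory = [
    Component("log4j-core", "2.14.1", "Apache", date(2021, 3, 12),
              known_cves=["CVE-2021-44228"]),   # Log4Shell
    Component("left-pad", "1.3.0", "npm", date(2018, 4, 2), known_cves=[]),
    Component("openssl", "3.0.7", "OpenSSL", date(2022, 11, 1), known_cves=[]),
]

print(flag_risky(inventory))  # ['log4j-core', 'left-pad']
```

Even this minimal inventory captures the fields the report emphasizes: what the components are, who made them, what version is in use, whether they have known vulnerabilities, and whether they are still being maintained.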
Address technical debt
TD is rampant because of short-term thinking. What Krasner called “a more exhaustive, long-term solution” to get software components to a “desired level of maintainability and evolvability” may cost more up front, but allowing TD to go unaddressed “comes with substantial, initially hidden costs that organizations must pay later.”
Part of the problem, according to the report, is that there haven’t been advanced ways even to measure TD, let alone fix it. “We believe it will take several more years before TD management grows from adolescence to adulthood,” Krasner wrote.
But there are automated tools available, including static code debt analyzers, to start paying off both the principal and interest of that debt.
In short, as in finance, debt doesn’t have to destroy you, but if it gets out of control, it could.
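One way such analyzers quantify the debt is a SQALE-style “technical debt ratio”: estimated remediation effort divided by estimated development effort. The issue types, per-issue effort figures, and grading threshold below are hypothetical, chosen only to illustrate the shape of the metric:

```python
# A minimal sketch of a SQALE-style technical debt ratio of the kind
# reported by static code debt analyzers. All figures are assumptions.

REMEDIATION_MINUTES = {        # assumed effort to fix each issue type
    "code_smell": 10,
    "duplicated_block": 20,
    "missing_test": 30,
}

def debt_ratio(issues, lines_of_code, minutes_per_line=0.3):
    """Remediation effort divided by estimated development effort.
    Under SQALE-like grading, a ratio of 5% or less is typically 'A'."""
    remediation = sum(REMEDIATION_MINUTES[kind] for kind in issues)
    development = lines_of_code * minutes_per_line
    return remediation / development

issues = ["code_smell"] * 12 + ["duplicated_block"] * 3 + ["missing_test"] * 2
ratio = debt_ratio(issues, lines_of_code=10_000)
print(f"{ratio:.1%}")  # 8.0%
```

Expressing the debt as a ratio rather than a raw count is what makes the finance analogy workable: it tells a team whether the “interest” is growing faster than the codebase itself.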
Set quality standards and then conform to them
A quality standard already exists. ISO 5055, created under the leadership of CISQ, defines source code quality measurements in four categories: reliability, performance efficiency, security, and maintainability. It is based on the Common Weakness Enumeration list maintained by the MITRE Corp., which codifies more than 800 known software weaknesses.
But, of course, it has to be used to deliver any benefit. As Krasner put it, development teams can use the standard to “evaluate the structural quality of software ahead of each release, thus preventing dangerous flaws from being delivered into operational settings, where they will be orders of magnitude costlier to find and fix.”
“If all new software were created without those known vulnerabilities and exploitable weaknesses, the CPSQ would plummet,” Krasner noted.
Bring on the machines
Most software products are now assembled from existing components rather than written. That means “the difficulty has now shifted to understanding what the components do and how they interact/depend on each other. And so, working with pre-existing codebases that the development team probably did not write themselves is the new normal,” Krasner noted.
But artificial intelligence (AI), which includes machine learning (ML), deep learning, and natural language processing, “creates some unprecedented possibilities to transform the software development process,” Krasner wrote.
Those possibilities, collectively, can help developers work both better and faster, meaning both speed and quality can improve.
All these measures take time and money to implement. But the amount of time is decreasing, thanks to intelligent orchestration of software testing tools, an automated way to help developers do the right security tests at the right time without overwhelming them with false positives or irrelevant defect notices.
The promise of advances in AI/ML could make debugging software even more efficient.
Most important, better software security will yield multiple financial benefits including lower cost of ownership, higher profitability, better human performance levels, increased innovation, and more effective mission-critical IT systems.
Or, as noted at the start of this post, an investment in high-quality software that you and your customers can trust can “make” your entire business.