A recent report by the Consortium for Information & Software Quality (CISQ) revealed the economic impact of poor software quality in the US. Below, we outline our favorite takeaways and recommended solutions.
Following a year where digital experiences were thrust into the spotlight, CISQ has released The Cost of Poor Software Quality in the US: A 2020 Report, an update to their 2018 analysis. The report combines publicly available online data with expert software quality knowledge to calculate the economic consequences of poor software quality in the United States.
Herb Krasner, the report’s author and an Advisory Board Member of CISQ, recently hosted a webinar to present highlights from the report, alongside Joe Jarzombek, Director for Government and Critical Infrastructure Programs at Synopsys and Governing Board Member at CISQ, and Dr. Bill Curtis, Executive Director of CISQ. Below we’ve gathered the most salient findings from the webinar and report.
Just the Tip of the Iceberg
The report estimates the total cost of poor software quality (CPSQ) in the U.S. in 2020 at close to $2.08 trillion, compared to CISQ’s revised 2018 total of $2.1 trillion. Yes, trillion. This is roughly 10% of the United States’ gross domestic product and twice the total IT labor base (including C-suite and gig economy workers).
Herb uses an iceberg model to illustrate the true impact of these costs. Just as only a small part of an iceberg is visible above the surface of the ocean, the cost of poor quality software comprises both direct (or observable) and indirect (or hidden) costs. Stock loss, service outages, and lost revenues are examples of direct costs, while indirect costs include delays, overtime, and fixing bugs.
Astoundingly, the hidden costs are 6 to 50 times that of the observable costs.
How Did This Happen?
Getting to the root of this software quality problem is complex, but at the highest level, the agility-stability paradox is to blame. Herb explains, “The result of increased digitalization through software has created a balancing act trying to deliver value at high speed without sacrificing quality or security. As it turns out, we are not very good at balancing.” The huge CPSQ total bears this out.
Digging a bit deeper, there are three major contributing factors:
Operational Software Failures
The category of operational software failures can include data breaches, ransomware attacks, IT outages, and cyberattacks. The estimated 2020 cost is $1.56 trillion, up from $1.275 trillion in 2018, mainly due to the escalating frequency of cybersecurity incidents and software defects.
A 2018 study from Tricentis found 606 significant software failures reported by English-language news sources. These resulted in a loss of $1.7 trillion in assets at 314 companies, averaging $2.8 billion per failure. With software development continuing to grow at its current pace, we can expect even more failures in 2021.
Unsuccessful IT/Software Projects
Failed projects make up another portion of the poor software quality cost. This category includes those canceled before completion and those challenged in execution, accounting for $260 billion, up from $177 billion in 2018, though project failure rates have remained consistent for a number of years.
The Standish Group’s latest report, CHAOS 2020: Beyond Infinity, states that 19% of projects are canceled before completion, and 47% are challenged (late, over budget, and with low software quality). Only 35% of all projects are actually successful (on time, on budget, and meeting quality expectations). These project costs are a classic example of what lies hidden below the waterline in Herb’s iceberg model.
Poor Quality in Legacy Systems
Rounding out the major areas of cost is the classic burden of operating legacy systems, defined by CISQ as “large software systems that we don’t know how to cope with but are vital to our organization.” The cost in this category is $520 billion, down significantly from $635 billion in 2018, possibly due to resource reallocation and senior IT staff retirements.
Over time, legacy systems become increasingly difficult to support, and their software quality issues harder to resolve. As Herb remarks, “Modernization is not always straightforward.” Legacy system costs also often consume 70% to 75% of a company’s IT budget and can make up 80% of the total cost of ownership for software, posing a risk of failure for any business.
Additional Costs of Cybersecurity and Technical Debt
The report and webinar considered a fourth category separately from the three above, as the components do not contribute to the $2.08 trillion total for CPSQ. This category includes cybersecurity, a subset of the costs of operational software failures, and the future cost of technical debt.
The FBI Internet Crime Complaint Center (IC3) estimates U.S. cybercrime costs at $10.2 billion, although other studies estimate a much higher total. In addition, according to a 2019 Accenture study, cybercrime costs for each company affected were $13 million, 12% higher than in 2018. Rapid software growth and vulnerable software systems give cybercriminals opportunities to carry out these attacks.
The often overlooked cost of technical debt, which resides in severe defects in need of correction, is $1.31 trillion (without accumulated interest), up from approximately $1.145 trillion in 2018. With approximately 100 billion new lines of code written each year, including an average of 25 bugs introduced per 1,000 lines of code, technical debt continues to grow rapidly over time, posing long-term challenges for many organizations.
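To see how quickly this debt accumulates, here is a back-of-the-envelope sketch using the two figures cited above (100 billion new lines of code per year, 25 bugs per 1,000 lines). The per-defect remediation cost is a hypothetical placeholder for illustration only, not a figure from the report:

```python
# Illustrative estimate of annual defect injection and added technical debt.
# LOC and defect-density figures come from the report; the remediation cost
# per defect is a hypothetical assumption chosen for this sketch.

NEW_LOC_PER_YEAR = 100_000_000_000   # ~100 billion new lines of code per year (report figure)
DEFECTS_PER_KLOC = 25                # average bugs per 1,000 lines of code (report figure)
COST_PER_DEFECT = 500                # hypothetical average remediation cost in USD

# Defects injected across all new code written in a year
new_defects = NEW_LOC_PER_YEAR // 1_000 * DEFECTS_PER_KLOC

# Hypothetical debt added in a year if every defect eventually needs fixing
added_debt = new_defects * COST_PER_DEFECT

print(f"Defects injected per year: {new_defects:,}")   # 2,500,000,000
print(f"Hypothetical added debt:   ${added_debt:,}")
```

Even with a modest assumed cost per defect, the result lands in the trillion-dollar range, which is consistent with the scale of the report's technical debt estimate.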
What Can We Do About It?
According to Herb, “In 2018 we concluded that the key strategy for reducing CPSQ is to find and fix problems and deficiencies as close to the source as possible, or better yet, prevent them from happening in the first place.”
He adds, “Our general recommendations for 2020 continue to emphasize prevention. The next best approach is to address weaknesses and vulnerabilities in software by isolating, mitigating, and correcting them as closely as possible to where they were injected to limit the damage done.”
His top three specific recommendations for achieving these goals are:
- avoiding low quality development practices and adopting secure coding practices
- recognizing the inherent difficulties of developing software and using effective tools to help deal with them
- ensuring early and regular analysis of source code to detect violations, weaknesses, and vulnerabilities
These weaknesses and vulnerabilities grow right along with software systems as they become larger and more complex, leaving those systems at risk of exploitation.
Herb notes that “Traditionally, software development does not include quality concerns or robust testing until the end of the development cycle, meaning that code is often produced to be functional but not necessarily of high quality.” This can cause weaknesses and vulnerabilities to be overlooked, especially when security is not recognized as a key part of quality.
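As a minimal illustration of what "early and regular analysis of source code" can look like in practice, the sketch below uses Python's standard ast module to flag calls to eval() and exec(), a classic code-injection weakness. This is a toy rule, not a substitute for the full static analyzers the report has in mind:

```python
# Minimal sketch of an early, automated source-code check, the kind that
# could run in a pre-commit hook or CI step. It flags calls to eval() and
# exec(), a well-known injection weakness. Real static analyzers apply
# hundreds of such rules; this illustrates only the basic idea.
import ast

RISKY_CALLS = {"eval", "exec"}

def find_risky_calls(source: str) -> list[tuple[int, str]]:
    """Return (line_number, function_name) for each flagged call."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id in RISKY_CALLS):
            findings.append((node.lineno, node.func.id))
    return findings

# Hypothetical snippet under review: eval() on user input is flagged
sample = "x = eval(input())\nprint(x)\n"
for line, name in find_risky_calls(sample):
    print(f"line {line}: call to {name}() - possible injection risk")
```

Running a check like this on every commit catches the weakness where it was injected, rather than at the end of the development cycle.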
What Else Can You Do?
In addition to preventing and fixing deficiencies close to the source and addressing weaknesses and vulnerabilities throughout the development stages, organizations can use automated tools to help tackle software quality problems. Tools supporting static and dynamic code analysis techniques and those with “security domain checkers under the hood,” as Joe describes them, are especially valuable.
Ready to begin addressing the critical software issues contributing to CPSQ within your organization? Try OverOps for free and start identifying, preventing, and resolving software deficiencies today.