The ROI of UX in Enterprise Software — Part 2
In Part 1 of this breakdown, I provided some fairly solid, calculable methods to show how investing in user experience from the start of a project can impact not just the bottom line but, more importantly, a project's overall adoption and acceptance.
That said, one item I mentioned there, the 'Rule of 10x', deserves a closer look. In case you are just joining me and don't want to go back:
Changes to project requirements increase by a factor of 10 at each stage in the project’s lifecycle. Correcting a problem at the development stage is 10x more expensive than fixing the same problem during the user-centered design phase.
That rule seemed anecdotal at best. Sure, it sounds good, and why wouldn't it? It justifies our existence, right? Anyone practicing UX could tell you that our efforts, when done properly, can save development time and money.
My problem, and I am sure the problem of many stakeholders, is that the 'Rule of 10x' doesn't seem to have any data to back it up. Now, there are things in this world that you shouldn't need data to prove. I can make a dozen edits to my Sketch or Photoshop files according to user feedback in a matter of minutes. Depending on the severity, the same changes could take days of development and testing.
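To make the rule concrete, here is a quick back-of-the-envelope sketch. The $100 baseline and the stage names are my own illustrative assumptions, not figures from any of the studies:

```python
# Illustrative only: compounding fix costs under the 'Rule of 10x',
# where the cost of correcting a problem multiplies by 10 at each
# later stage of the project life cycle.
STAGES = ["design", "development", "testing", "release"]

def fix_cost(stage, base_cost=100):
    """Cost to fix one issue caught at the given stage (assumed $100 in design)."""
    return base_cost * 10 ** STAGES.index(stage)

for stage in STAGES:
    print(f"Caught in {stage}: ${fix_cost(stage):,}")
# Caught in design: $100
# Caught in development: $1,000
# Caught in testing: $10,000
# Caught in release: $100,000
```

Even if the real multiplier per stage is smaller than 10, the compounding is what hurts: a problem that slips two stages costs two multiplications, not one.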
But, again no hard data.
Or so I thought.
In 2004, NASA began reviewing error cost escalation through the project life cycle of its own software. The agency first reviewed the findings of seven independent studies, then assessed its own software development projects to measure what an error costs at each stage of the process. The numbers were, well…
Not to be outdone, another study was published in 2003 by Alan M. Davis, author of 201 Principles of Software Development, voted by Association for Computing Machinery members as one of the 20 classic computer science books.
Davis's findings didn't look any better for error-fixing costs.
In 2007, IBM's System Sciences Institute reported that the cost to fix an error found after release was 4–5x that of an error discovered during design, and up to 100x more than one identified in the maintenance phase.
"All this data is 'old'."
Indeed it is, and I started out by saying that we are looking for some ‘hard data’. So what about that?
Let’s take a quick look at the application development landscape in 2004 when these studies were done and published.
What are the odds that the results of these studies, done back in 2004, have gotten anything but worse for cost effectiveness, considering how exponentially more complex the devices our software needs to work on (and with) have become?
We no longer have just a desktop or laptop with a couple browser choices.
There are 11,868 distinct devices that an Android app alone has to work on.
For the sake of argument, let's say that across all these devices, IF all goes well, you only have to worry about the SIX supported screen densities:
- ldpi (low) ~120dpi
- mdpi (medium) ~160dpi
- hdpi (high) ~240dpi
- xhdpi (extra-high) ~320dpi
- xxhdpi (extra-extra-high) ~480dpi
- xxxhdpi (extra-extra-extra-high) ~640dpi
Android's six screen densities alone are double the three screen resolutions that accounted for over 90% of monitors in 2004.
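Those density buckets aren't just labels; Android scales layouts through a simple documented formula, px = dp × (dpi / 160). The sketch below applies it to a 48dp touch target (Android's recommended minimum) across the buckets listed above; the approximate dpi values are the ones from the list:

```python
# A sketch of Android's density scaling: px = dp * (dpi / 160).
# 48dp is Android's recommended minimum touch-target size.
DENSITIES = {"ldpi": 120, "mdpi": 160, "hdpi": 240,
             "xhdpi": 320, "xxhdpi": 480, "xxxhdpi": 640}

def dp_to_px(dp, dpi):
    """Convert density-independent pixels to physical pixels."""
    return round(dp * dpi / 160)

for name, dpi in DENSITIES.items():
    print(f"{name}: a 48dp target is {dp_to_px(48, dpi)}px")
# ldpi: a 48dp target is 36px ... xxxhdpi: a 48dp target is 192px
```

The same "one" design element spans a 36px-to-192px range of physical pixels, which is exactly the kind of variability that makes late-stage fixes multiply in cost.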
Oh, and I didn’t even mention a little company called Apple and their devices.
One last thing…
In the last year, we have seen 'wearable' technology begin to be adopted while voice devices like Amazon's Echo make headway. Oh, and something called VR is just behind that. The Rule of 10x is already grossly underestimating the cost of not investing in UX upfront to determine what engineers and developers should focus their efforts on and, more importantly, what they should not be reworking after the fact.
Error Cost Escalation Through the Project Life Cycle (PDF), by J.M. Stecklein
The Cost of Requirement Errors, by Peter Gordon
Thank you all for taking the time to read and share Part 1. I hope you find Part 2 as helpful. If you did, hit that little heart and help others find it as well.