Mind the gap between business and technology: connect the dots with data

Olivier Liechti
Avalia Systems
Feb 5, 2019 · 7 min read

We often hear people talking about “the gap between business and software development”. But what does this actually mean? Can we better characterize this broad statement and make it more concrete with examples? And with this understanding, can we sketch out solutions to improve the situation?

In this article, we discuss these questions by looking at data-driven software engineering approaches. We examine the particular case of software assessments, where the goal is to understand the current state of a software organization (technology, processes and people). We share our experience with the Goal Question Metric method, which has proven very effective in these situations.

Why is it so hard to bridge the gap between business and technical domains? One reason is that it is difficult to establish measurable causal relationships between the two domains. Will we increase revenue if we do more automated testing? Why should we invest resources to reduce technical debt? Are we going to reduce costs if we move towards continuous delivery? Even when business stakeholders understand this jargon, even when they understand the rationale of software engineering practices, they have a hard time quantifying their impact. As a result, investment decisions are not easy to make.

The challenge of connecting the dots between business and technology

Models, conversations and shared understanding

To improve the situation, we argue for data-driven approaches that make explicit connections between the technical and business domains. The general strategy is to capture relationships in a model, to feed data into the model and to collaboratively analyze the results. Often overlooked, the last step is fundamental: the conversations enabled by data-driven models produce rich insights, which are the most valuable outcome of the process.

We have applied this approach in different types of organizations (startups, large companies), across geographies and in various situations (M&A due diligences, internal assessments). We have seen consistent benefits in every one of these projects. Data-driven software assessments are effective in triggering and framing conversations. They do a lot to improve mutual understanding and decision making.

How does it improve a technical due diligence?

The stakeholders who drive an overall due diligence process are familiar with formulating questions and identifying risks in the commercial, financial and legal domains. They are familiar with measurement models that are specific to these domains.

However, they usually have a hard time specifying what to assess in the technical domain. Yes, they know that “scalability”, “security”, “quality” and “the dev team” are dimensions that need to be evaluated. But how? Even more difficult for them is grasping how the technical domain impacts the business domain.

As an example, the links between 1) specific DevOps practices, 2) the frequency of software releases, 3) the ability to respond quickly to market changes and, ultimately, 4) an increase in revenue are not obvious to non-technical stakeholders.

When we deliver due diligence projects, we address this problem by applying the data-driven approach described above:

  • we first create a model that logically connects business concerns with technical practices,
  • we then collect data from different sources, feed the data into the model and create visual representations of the results (a minimal sketch of these first two steps follows the list),
  • we finally use this material to provide context for interviews and deep-dive sessions with various stakeholders (management, product owners, engineers, etc.).
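
To make the first two steps concrete, here is a minimal sketch in Python. Everything in it is invented for illustration: the business concerns, the metric names, the weights and the values do not come from an actual assessment.

```python
# A toy version of steps 1 and 2. Every concern, metric, weight and
# value below is invented for illustration.

# Step 1: the model. Each business concern is linked to a set of
# weighted technical metrics, all expressed as scores in [0, 1].
model = {
    "time to market":   {"release frequency": 0.6, "build automation": 0.4},
    "operational risk": {"test coverage": 0.5, "incident rate": 0.5},
}

# Step 2: data collected from different sources (CI server, coverage
# reports, incident tracker, ...), already normalized to [0, 1].
metrics = {
    "release frequency": 0.30,
    "build automation":  0.80,
    "test coverage":     0.55,
    "incident rate":     0.70,
}

# Combine the metrics into one score per business concern and render
# a crude text bar chart, standing in for richer visualizations.
for concern, weights in model.items():
    score = sum(metrics[name] * weight for name, weight in weights.items())
    print(f"{concern:18s} {'#' * round(score * 20):20s} {score:.2f}")
```

The point is not the toy arithmetic, but the shape of the model: every business concern is traceable to named, measurable technical signals.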

Compared to traditional methods, this allows us to get a deeper, more insightful understanding of the situation. We can look at it from various angles and better grasp the strengths of the company under evaluation, as well as the areas that need improvement and where the acquirer or investor can bring value. This is a learning experience for everyone involved in the process, and the feedback from both ends has always been very positive.

Research in empirical software engineering

Now that we have a sense for the general data-driven approach, we can look at some techniques that are helpful in guiding the process. There are numerous examples in the empirical software engineering literature, with foundational work dating back to the seventies. Research results are now making their way into commercial products and services at an increasing pace. The term software analytics is often used to describe this broad topic.

Software analytics is the analysis of heterogeneous software engineering data, with the goal of understanding a situation and making better decisions.

The premise is that software engineering is an activity that generates huge amounts of data, and that analyzing this data can help make organizations more efficient. Think about what goes on in version control systems, bug trackers and collaboration spaces. The analysis of traces left by people in these systems can reveal insights about the code, the processes and even the culture of an organization. This feedback is most helpful for organizations that have a continuous improvement mindset, in line with core agile values.
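
As a small illustration of the idea, the sketch below computes one of the simplest traces available in a version control system: the number of commits per author. It assumes git is installed and that the script runs inside a repository; the bus-factor reading in the comments is one possible interpretation, not a rule.

```python
# A first taste of software analytics: count commits per author in a
# local git repository. Assumes git is installed and that the script
# runs inside a repository; only the standard library is used.
import subprocess
from collections import Counter

# Ask git for one author name per commit, one name per line.
log = subprocess.run(
    ["git", "log", "--format=%an"],
    capture_output=True, text=True, check=True,
).stdout

commits_per_author = Counter(log.splitlines())

# Even a trace this simple can feed a conversation: a codebase
# dominated by one author may hint at a bus-factor risk.
for author, count in commits_per_author.most_common(10):
    print(f"{count:5d}  {author}")
```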

The idea of analyzing software engineering data to understand a given context is not new. Decades ago, Meir Manny Lehman formulated the “laws of software evolution” based on quantitative studies of software data. Among other things, he looked at the frequency of releases and at system complexity over time, and derived general principles that apply to the evolution of any large-scale software system. Many of the observations he formalized at the time resonate with the rationale of agile approaches.

The Goal Question Metric method

One particular outcome of this research is the Goal Question Metric (GQM) method. The result of pioneering work by Victor Basili, GQM is a structured approach that can be used to characterize a situation, to make predictions and to guide improvements. When applying the method, one starts by defining a set of high-level goals. Every goal is then split into a set of more concrete questions. Finally, one identifies metrics that make it possible to give quantitative answers to the questions.

For instance, our goal might be to understand how well we serve our customers, which is fairly abstract. We might then come up with more concrete questions such as: how quickly do we respond to their needs? How often do we cause issues that prevent them from working? Is the frequency of such situations decreasing or increasing? How happy are they with our products and services? To answer each of these questions, we might look at very precise metrics such as average response time, lead time for bug fixes per severity, number of new critical bugs per week, Net Promoter Score, churn rate, etc. For every question, one needs to define a formula that combines multiple metrics and produces a measure.
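
To show what such a model can look like, here is a minimal Python sketch of the goal discussed above. The questions follow the example, but the metric values, targets, weights and the scoring formula are all assumptions made for illustration.

```python
# A toy GQM model for the goal "understand how well we serve our
# customers". Metric values, targets, weights and the scoring formula
# are all assumptions made for illustration.

# Raw metrics, as they might come from a support desk and a bug tracker.
metrics = {
    "avg_response_time_hours": 6.0,   # average first response time
    "bug_fix_lead_time_days": 4.5,    # lead time for critical bug fixes
    "new_critical_bugs_per_week": 1.2,
    "net_promoter_score": 34,         # NPS, on a -100..100 scale
}

def score(value, target, lower_is_better=True):
    """Normalize a metric against a target so that the result lands in
    [0, 1], where 1 means at or better than target."""
    ratio = target / value if lower_is_better else value / target
    return max(0.0, min(1.0, ratio))

# Each question is answered by a formula combining one or more metrics.
questions = {
    "How quickly do we respond to customer needs?":
        score(metrics["avg_response_time_hours"], target=4.0),
    "How often do we prevent customers from working?": 0.5 * (
        score(metrics["bug_fix_lead_time_days"], target=3.0)
        + score(metrics["new_critical_bugs_per_week"], target=1.0)
    ),
    "How happy are customers with our products and services?":
        score(metrics["net_promoter_score"], target=50, lower_is_better=False),
}

for question, measure in questions.items():
    print(f"{measure:.2f}  {question}")
```

Normalizing every metric against a target keeps the measures comparable across questions, which is what makes the roll-up from metrics to questions, and ultimately to the goal, meaningful.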

The Goal Question Metric method is one way to build a model that connects high-level, qualitative outcomes with concrete, measurable elements. It enables informed conversations about business impact based on facts, some of which are measured in the technical domain.

Putting it in practice: data-driven due diligence

We have seen that data-driven models spanning business and technical boundaries provide valuable insights, especially when they are used in conversations and interviews. We have also seen that GQM is one method to build these models, and that it helps connect abstract outcomes (goals) with concrete outputs (metrics).

But what effort is required to apply this technique during a software assessment? What are the skills, time and resources required to get the job done?

First of all, the team leading the process must have deep expertise in both the business and technical domains. This is necessary to build sound models, but more importantly to moderate and facilitate the discussions between parties. The team must be able to deal with strategic questions, but also to drill down to a very concrete operational level. It must also have hands-on experience with software engineering practices, ideally spanning several decades: many assessments are done when legacy systems are being transformed into modern cloud-based systems.

Then, the use of tools can make the process time-efficient. We have seen that the process involves 1) the collection of raw data from heterogeneous systems, 2) the analysis of the raw data (this sometimes relies on advanced techniques, such as machine learning), 3) the creation of visual and interactive models, 4) the documentation of insights and 5) the review and presentation of findings. There is no shortage of tools that address one or another of these steps. However, it is not trivial to build a cohesive platform that integrates some of these tools, augments them with advanced capabilities and provides a streamlined, end-to-end experience to the due diligence team.

Key takeaways

  • Data-driven models can help bridge the gap between business and technical domains. These models should make explicit connections between technical outputs and business outcomes. They should strive to make these connections measurable.
  • The Goal Question Metric method comes from research in empirical software engineering. It is one method to build such data-driven models. It is well suited to the task because it helps link broad qualitative outcomes with concrete measurable elements.
  • Software assessments are one situation where the approach is effective, whether they are done as part of a due diligence or for internal purposes. Our experience is that these models are very useful to provide context in conversations and interviews. They have repeatedly helped us gain deeper insights and a complete picture of the situation.
  • Applying this approach is not trivial and requires both expertise and time. An integrated toolset can address the second concern, providing a seamless experience during the entire due diligence process.
  • Reach out at info@avalia.systems for feedback or more information about our data-driven due diligence method, or visit www.avalia.io.

Olivier Liechti is co-founder and CTO at Avalia Systems and a former professor at HEIG-VD.