Democratizing Operational Excellence: Developing Competitive Advantage in Software Forecasting

Dave Thomas
Xpanse Inc
8 min read · Jul 2, 2021


Clarifying the path to the target

Leveraging data-driven frameworks and diversity of opinion to plan and deliver more reliably

Main Ideas

  1. Gain high-quality release forecasts and improved team ownership by shifting the locus of control in release date creation. Moving ownership from the top of the org chart to a diverse group of your direct contributors replaces an extrinsic demand with an intrinsic motivation to build skill and passion for operational excellence in forecasting.
  2. Provide a framework with clear guidelines so your team can deliver high-quality, data-driven forecasts.

Learn the Fundamentals to Build the Passion

I follow an MIT AI research scientist named Lex Fridman, who recently tweeted a quote.

“Learn the rules like a pro, so you can break them like an artist.”

― Pablo Picasso

Like all great quotes, this one dares you to dwell on how it applies to your worldview. For me, it sparked an internal dialogue about how teams can level up their development planning by first working to understand, and then practicing, stronger fundamentals. Once those fundamentals are deeply understood through practice, we can evolve from students of planning, struggling to assemble and apply knowledge, to existing in a state of team flow with the creative activities that make us owners of what we build.

We often treat planning as a burden that impedes us from what we love most about our jobs: building. It feels like an extrinsic demand from management, disconnected from the way the team works. It’s my hope that we can break away from this mindset and get to an intrinsically-driven place where our plans empower and unlock opportunity through the beauty of the data, the insights they reveal, the greater reputations our teams build, and the improved confidence our people gain as software professionals.

“Most enjoyable activities are not natural; they demand an effort that initially one is reluctant to make. But once the interaction starts to provide feedback to the person’s skills, it usually begins to be intrinsically rewarding.”

― Mihaly Csikszentmihalyi, Flow: The Psychology of Optimal Experience

Goals to Drive Better Forecasting

  1. Prioritize operational excellence through better forecasting as a competitive differentiator worth every contributor's time
  2. Empower teams to own the forecasting process and see it as a highly leverageable skill and point of operational control at the team level, rather than a leadership mandate
  3. Learn core principles of effective framework-based forecasting
  4. Understand the underlying concepts and relationships between estimation → capacity → sizing prediction → forecasting, and iteratively feeding back data to improve
  5. Recognize the pitfalls of weak sizing and planning approaches that put teams at risk

Operational Excellence is a Differentiator

Teams often focus their pursuit of competitive advantage on feature differentiation, pricing, and product packaging. An undervalued area is operational excellence, which broadly covers everything from finance to sales to delivery; companies that can out-operate their competitors are frequently the winners.

Forecasting Well is a Path to Operational Excellence

Sizing in software development is the part of planning that estimates the amount of work required to complete a task, deliver a feature, or deliver a collection of features in a software release. The forecasting process is the approach to developing, communicating, and advancing the predictive quality of the plan over the course of a software release. It is used to estimate capacity, monitor assumptions and risks, and forecast the release date.

Doing this better than your competition allows you to outmaneuver them by delivering more predictably and with more agility.

Part I: Core Principles

  1. Great forecasting, initiated by the team, drives higher ownership through inclusion of all people responsible for building the product.
  2. A forecast is a date paired with a triad of related information: sizing, assumptions, and confidence level
  3. The triad can’t be broken: communicating the forecasted date should never separate out any of the component details
  4. Measure what matters: comparing actuals to the sizing is a critical feedback step that improves the next iteration of the forecast
  5. Risk is driven out and confidence improves through an active process of learning from progress made

1. Great forecasting, initiated by the team, drives higher ownership through inclusion of all people responsible for building the product

Most teams treat forecasting “upside down.” By reacting to leadership demands for a plan, they come to see leaders as the forecast’s primary consumers. This frames the forecast as something only leadership uses. It also prevents the team from owning the method and format of the forecast, which often turns the forecast into a proxy for the team’s dependability, judged on its ability to predict the future with unrealistic precision. In this upside-down world, inevitable changes become hard conversations rather than normal iterative learning.

In strong product development cultures, great forecasting is driven by the team, for the team. Just as great development teams find their own bugs and iterate on product learning, they drive iterating on operational excellence topics like forecasting from within, rather than being pressured from above. By improving their approach over time, the world flips back to right side up. Leadership steps back and a virtuous cycle of autonomy and ownership can better flourish.

Inclusion isn’t a touchy-feely tactical bolt-on to forecasting. Research repeatedly shows that a diverse group of right-skilled people, given the agency to contribute, develops higher quality estimates. James Surowiecki’s 2004 book The Wisdom of Crowds is a great read on this topic.

“If small groups are included in the decision-making process, then they should be allowed to make decisions. If an organization sets up teams and then uses them for purely advisory purposes, it loses the true advantage that a team has: namely, collective wisdom.”

― James Surowiecki, The Wisdom of Crowds

When leaders are the ones driving delivery dates, it is a strong signal that the team is running things upside down.

Suggestions

  1. Create incentives that push operational excellence expectations to the builder level of the organization
  2. The people who are doing the work should understand the work before being asked to size it. Onboard the team as early as possible so they can begin to contribute their perspectives. Include all development functions (not just engineering).
  3. Provide training on forecasting methods (found in parts 1 and 2 of this article)

2. A forecast is a date paired with a triad of related information: sizing, assumptions, and confidence level

A forecast predicts a completion date, based on estimates of capacity and scope and on assumptions about risk. At any given point, your confidence level is based on the quality of information at hand. As you advance through a software development cycle, the confidence of your forecast should improve in step with how well you track your actual progress, what you learn about your capacity, and how your assumptions hold up.

By framing a sizing for what it truly is, a forecast attached to assumptions, you keep an appropriate degree of uncertainty paired with your forecast. Assumptions are the key points that you must convert into learning through the work of your development team.

Suggestions

  1. Pulling together core principles 1 and 2, encourage your teams to find intrinsic value in improving what they know about the forecast and to include this in the way they operate their iterative development process.
  2. Teams should have a framework that keeps all components of the forecast connected and builds confidence across the stakeholder community through iterative development
  3. Use a framework for confidence level such that it is consistently applied based on clear criteria. (See Part 2 for a sample confidence level framework.)

3. The triad can’t be broken: communicating the forecast should never separate out any of the component details, especially confidence level

Experienced software managers have witnessed the effects of allowing a forecasted date to be separated from its assumptions and confidence level. Stripped of its constituent supporting data, the forecast is replaced by an inflexible commitment.

Example

Forecast triad: We have 50% confidence we will deliver on August 30th if we onboard staff to plan and get our three key risks covered by June 15th [list the risks and assumptions].

Broken: We will deliver on August 30th.

The forecast triad clearly conveys it is a coin toss whether you’ll hit the date because of the listed risks. In the broken version, your stakeholders assume you are certain you’ll hit the date.
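To make the unbroken triad concrete, here is a minimal sketch in Python of a forecast record whose only rendering includes all of its components. The field names, the 0-to-1 confidence scale, and the example values are illustrative assumptions, not a prescribed format:

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class Forecast:
    """A forecasted date that always travels with its supporting triad."""
    target: date
    sizing_points: int            # total sized work in scope (illustrative unit)
    confidence: float             # 0.0-1.0, per your confidence framework
    assumptions: tuple[str, ...]  # open risks that must be converted to learning

    def __str__(self) -> str:
        # Rendering the forecast always includes confidence and assumptions,
        # so the date can't be quoted without its supporting context.
        return (f"{self.confidence:.0%} confidence we deliver "
                f"{self.sizing_points} points by {self.target:%B %d}, "
                f"assuming: {'; '.join(self.assumptions)}")

print(Forecast(
    target=date(2021, 8, 30),
    sizing_points=120,
    confidence=0.5,
    assumptions=("staff onboarded to plan",
                 "three key risks covered by June 15th"),
))
```

Passing the forecast around as one immutable value, rather than as a bare date, makes breaking the triad an explicit act instead of the default.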

4. Measure for capacity: comparing actuals to the sizing is a critical feedback step

Product development leaders know that measures can inform where to focus learning for iterative product improvement, but few apply that same notion to improving operational tasks such as forecasting. When teams practice, they get better at forecasting. Over time, planning confidence and team ownership improve, and the team communicates and drives forecasting change from the bottom rather than the top.

Suggested approach

  1. Evaluate established sizing approaches and stick with one so your dataset will be useful to analyze. Try story points, a relative Fibonacci-based scale that is a guess of complexity, uncertainty, and effort. (See Part 2 for details on this framework and a sample.)
  2. Size, accumulate, and record the work predicted in a future period. E.g. “We expect to complete 28 story points of work this sprint.”
  3. Record the actual amount of work delivered. E.g. “We delivered 21 story points of work last sprint.”
  4. Keep track of the average, the standard deviation, and the mean deviation of these numbers, normalized per contributor, across your sprints. The standard deviation measures the volatility of the data. The mean deviation (MD) is the average distance between a value and the mean and is helpful for learning about your average accuracy. (See Part 2 for a working example of two teams over 8 sprints, and the sketch after this list for the calculations.)
  5. Get together with the team every sprint and learn from the numbers.
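A minimal sketch of steps 2 through 4, using only Python's standard library. The sprint numbers are hypothetical, purely to illustrate the calculations:

```python
from statistics import mean, pstdev

# Hypothetical sprint history: (points_predicted, points_delivered, contributors).
sprints = [
    (28, 21, 7),
    (26, 24, 7),
    (30, 27, 6),
    (27, 26, 7),
]

# Normalize per contributor so staffing changes don't distort the trend.
delivered = [d / n for _, d, n in sprints]

avg = mean(delivered)
# Standard deviation: the volatility of delivery, sprint to sprint.
volatility = pstdev(delivered)
# Mean deviation: average distance between each value and the mean.
mean_dev = mean(abs(x - avg) for x in delivered)
# Forecast error (actual minus predicted) closes the sizing feedback loop.
error = mean((d - p) / n for p, d, n in sprints)

print(f"avg delivered per contributor: {avg:.2f}")
print(f"std deviation (volatility):    {volatility:.2f}")
print(f"mean deviation (accuracy):     {mean_dev:.2f}")
print(f"avg forecast error per head:   {error:+.2f}")
```

Reviewed together each sprint (step 5), a shrinking mean deviation and a forecast error trending toward zero are the signals that the team's sizing is becoming a dependable forecasting input.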

5. Risk is driven out and confidence improves through an active process of learning about the assumptions

In the software product development world, iterative validated learning has become the standard approach to finding product-market fit and competitive advantage. The concept applies to operations as well: capture data about all aspects of your product development operations, including the feedback loop from sizings to actuals.

Learning bleeds ambiguity out of the plan and drives up confidence in the forecast. Often a lone product manager is expected to be the primary driver of this activity, which misses Core Principle #1. The forecast is far more likely to improve when the data is shared and pondered regularly by the whole team, leveraging its diverse perspectives.

Suggestions

  1. Learning about changes in the components of the forecast isn’t the sole domain of the product manager. In a recurring meeting such as your backlog refinement or operational metrics review, bring the team together and look at the data together. Ask “what are we seeing here?” Don’t anchor the team by opening with a leadership take on the data; doing so can censor and overshadow other, better ideas.
  2. Drive change from learning. People can’t help but find patterns in data.
  3. Not every pattern seen in data is useful. Take a moment to find a skeptic in the group. I have so often been wrong about what I think I’m learning from data. The risk is especially acute when a leader in the group has a strong opinion about what the data is saying. Democratize debate.

Continue reading Part 2: Concepts and Sample Frameworks for Developing Competitive Advantage in Software Forecasting.


Dave Thomas
Xpanse Inc

Engineering and Consumer Platform Leader @Upside; Product design, technology platform strategy, and ground fighting geek in Seattle