How AE Studio built a better software estimation tool on Dash Enterprise

Amélie Beurrier
Nov 30, 2020

Our team of data scientists and designers at AE Studio came together to solve one big industry problem: how do you estimate software projects better?

Software estimation is hard. And when done poorly, the negative impacts are limitless: business planning miscalculations, lack of trust between stakeholders, stress around launches, death marches followed by bugs and technical debt, loss of good talent and difficulty hiring… Those are problems all the pizza and foosball tables in the world won’t solve.

Leveraging the high usability of Dash Enterprise, we were able to build a machine learning tool for product managers to run their software teams in an agile and realistic way.

And here is how we built it, using the AE Studio agile product development framework:

  1. Define who the user is and what problem they are facing.
  2. Find a solution that answers their problem.
  3. Build an MVP (minimum viable product: a version of the product with just enough features to satisfy early customers and provide feedback for future product development).
  4. Interview users to collect feedback and improve on the solution’s design.
  5. Iterate by improving the design and adding more features!
  6. Set success metrics and measure the business outcome.

Step 1: Define the problem

When we asked PMs what is broken in today’s software project management workflow, these were their answers:

“It is very easy to account for and estimate the development time of things that you already have experience with. But when it comes to building new features it’s hard to predict the challenges and account for extra time in advance. You often end up adding an arbitrary buffer to your estimated time with the hope that you won’t be too far from the truth.”
JB Trahin, Senior Product Manager, Adobe

“Software estimation is broken. We tend to be over-optimistic as we want to deliver more value to our users quickly. But longer term, planning is key for the team to not only maintain a sane, productive work environment but to also enable the company to better plan for important releases in the future.”
Emily Tomasiewicz, Senior Product Manager, ConsumerTrack

If you are a PM at a tech company, you have had to answer these questions before:

  • When is work going to be done? For an important release, a product deadline, a public launch, etc.
  • How much work can I get done in a limited time? If you work in agile, you probably ask yourself that question every week.

The uncertainty that comes with product development has consequences not only for your team, but for your entire company, from customer support to finance, PR, and marketing, as well as for your users.

Step 2: The solution

In a nutshell, software projects take longer than you think, and that idea is well illustrated in this blog post by Erik Bernhardsson.

His hypothesis is that developers are bad at estimating the average time to complete a task and tend to underestimate it. We can correct for this bias by asking the user to provide an uncertainty level when estimating the effort of a task, and then using a statistical model to calculate the adjusted effort.

More specifically, we based our work on his hypothesis that software teams will usually estimate tasks based on the median time (middle value) to complete a task, rather than the average (or mean).

“Let’s say you estimate a project to take 1 week. Let’s say there are three equally likely outcomes: either it takes 1/2 week, or 1 week, or 2 weeks. The median outcome is actually the same as the estimate: 1 week, but the mean (aka average, aka expected value) is 7/6 = 1.17 weeks. The estimate is actually calibrated (unbiased) for the median (which is 1), but not for the mean.”
Erik Bernhardsson, CTO, Better.com

In our tool, we ask users to provide an uncertainty level when estimating a task size. This value is used as a proxy for the variance of the distribution and with it, we can calculate the adjusted estimate for the task.

The nature of the skewed distribution means that when a developer estimates a task to be 4 points, the probability of it being 2.2 is the same as the probability of it being 7. Adding more uncertainty means increasing the tails of the distribution, but much more on its right side. In practice, the higher the uncertainty, the higher the adjusted estimate will be. So, for our 4-point task, if the developer adds an uncertainty of 1, it could push the adjusted estimate to 6, while an uncertainty of 3 could lead to an adjusted point estimate of 12.
Luciano Viola, data scientist, AE Studio
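
To make that concrete, here is a minimal sketch of the kind of adjustment described above. It assumes a log-normal distribution whose median is the developer's raw estimate and whose log-space spread grows with the uncertainty level; the `sigma_per_level` constant is a hypothetical placeholder, since the real tool learns this relationship from project data.

```python
import numpy as np

def adjusted_estimate(points, uncertainty, sigma_per_level=0.5):
    """Adjust a raw point estimate using a log-normal model.

    Assumes the developer's estimate is the *median* of a log-normal
    distribution and that each uncertainty level adds `sigma_per_level`
    to the log-space standard deviation (a hypothetical constant here;
    the real tool learns this relationship from project data). The mean
    of that distribution, median * exp(sigma^2 / 2), is the adjusted
    estimate.
    """
    sigma = uncertainty * sigma_per_level
    return points * np.exp(sigma ** 2 / 2)

# A 4-point task with no uncertainty stays at 4 points;
# adding uncertainty pushes the adjusted estimate up.
for u in range(4):
    print(u, round(adjusted_estimate(4, u), 1))
```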

By calculating the sum of adjusted points for all tasks in an iteration, we get insight into the risk of stories not being completed during that iteration. In the best case scenario, the tool also notifies us that we might accomplish more than expected!
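
As a rough illustration of how those per-task adjustments roll up into iteration-level risk, the sketch below sums simulated task durations and compares them against a hypothetical team capacity. The function name, capacity value, and `sigma_per_level` mapping are assumptions, not the tool's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def iteration_risk(tasks, capacity, sigma_per_level=0.5, n_sims=10_000):
    """Estimate the probability that an iteration exceeds the team's capacity.

    `tasks` is a list of (points, uncertainty) tuples. Each task is sampled
    from a log-normal distribution whose median is the raw estimate and whose
    log-space spread grows with the uncertainty level, the same assumption
    as in the single-task adjustment above.
    """
    totals = np.zeros(n_sims)
    for points, uncertainty in tasks:
        sigma = uncertainty * sigma_per_level
        totals += rng.lognormal(mean=np.log(points), sigma=sigma, size=n_sims)
    return float((totals > capacity).mean())

# Hypothetical iteration: four stories, team capacity of 20 points.
tasks = [(4, 1), (8, 2), (3, 0), (5, 3)]
print(f"Risk of not finishing: {iteration_risk(tasks, capacity=20):.0%}")
```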

The machine learning algorithm keeps on learning how uncertainty uniquely affects estimates for a particular project using data from past iterations.
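
One plausible way to learn that relationship is to look at how far past estimates landed from actuals for each uncertainty level. The sketch below assumes a history of completed stories with known actual effort; the data shape and function name are hypothetical.

```python
from collections import defaultdict
import numpy as np

def fit_sigma_per_level(history):
    """Estimate a per-project spread for each uncertainty level.

    `history` is a list of (estimate, uncertainty, actual) tuples for
    completed stories. Under the log-normal assumption, log(actual/estimate)
    has standard deviation sigma for that uncertainty level, so we estimate
    it from the residuals of past iterations.
    """
    residuals = defaultdict(list)
    for estimate, uncertainty, actual in history:
        residuals[uncertainty].append(np.log(actual / estimate))
    return {u: float(np.std(r)) for u, r in residuals.items() if len(r) > 1}
```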

The team at AE Studio was inspired to bring that data to life in an add-on to one of the most widely used software project management tools (and our personal favorite ❤), Pivotal Tracker.

Step 3: Building an MVP

Meet “Better Pivotal Estimator”.

We wanted to give users a way to visually see what might or might not get done within a given timeline, and when everything is likely to be done (burndown).

How did we build the MVP?

To capture the uncertainty levels, we leveraged Pivotal’s labels. Adding a “u2” label to a story would mean it has an uncertainty level of 2.
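
As a sketch, reading that convention back out of a story is straightforward, assuming the story JSON exposes a `labels` list with `name` fields (as Pivotal Tracker's v5 API does):

```python
import re

def uncertainty_from_labels(story):
    """Extract the uncertainty level from a story's labels.

    A label like "u2" means an uncertainty level of 2; stories without a
    "u<N>" label default to 0 (no extra uncertainty).
    """
    for label in story.get("labels", []):
        match = re.fullmatch(r"u(\d+)", label.get("name", ""))
        if match:
            return int(match.group(1))
    return 0

story = {"name": "Add OAuth login", "estimate": 4,
         "labels": [{"name": "backend"}, {"name": "u2"}]}
print(uncertainty_from_labels(story))  # -> 2
```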

To build the app, we chose Dash because it allows data scientists to create dashboards using only Python — and does not require experience with front-end development. Dash Enterprise Design Kit made the process even easier, allowing us to spend less time on HTML and CSS.
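
For illustration, a stripped-down Dash app showing raw vs. adjusted estimates might look like the sketch below. This is plain open-source Dash with hypothetical data, not the actual app, which also relies on Dash Enterprise Design Kit for theming.

```python
from dash import Dash, dcc, html
import pandas as pd
import plotly.express as px

app = Dash(__name__)

# Hypothetical per-story data; the real app pulls this from Pivotal Tracker.
df = pd.DataFrame({
    "story": ["Login", "Search", "Checkout"],
    "estimate": [4, 8, 3],
    "adjusted": [6.6, 13.2, 3.0],
})

app.layout = html.Div([
    html.H1("Better Pivotal Estimator"),
    dcc.Graph(
        id="estimates",
        figure=px.bar(df, x="story", y=["estimate", "adjusted"],
                      barmode="group", template="plotly_dark"),
    ),
])

if __name__ == "__main__":
    app.run_server(debug=True)
```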

Regarding the design of the app, there were a few things we knew were going to make the tool more usable:

  • PMs use a lot of tools already, so we wanted to add something that would integrate seamlessly with their current stack. We liked that Dash would provide a simple, easy-to-use interface and that Pivotal Tracker's API made that integration possible.
  • We knew that PMs could use this tool within their own team (during retrospective meetings) as well as externally (to present progress to clients and stakeholders). We wanted to give them instantly actionable insights, but also make it so that they could take beautiful screenshots to embed in presentations.

We built an easy onboarding flow for users to add their Pivotal Tracker API token and get their projects imported and analyzed in 2 clicks.
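
Under the hood, that onboarding amounts to calling Pivotal Tracker's REST API with the user's token. The sketch below assumes the v5 endpoints and the `X-TrackerToken` header; error handling is kept minimal.

```python
import requests

API_BASE = "https://www.pivotaltracker.com/services/v5"

def fetch_projects(token):
    """List the projects visible to the given Pivotal Tracker API token."""
    resp = requests.get(f"{API_BASE}/projects",
                        headers={"X-TrackerToken": token})
    resp.raise_for_status()
    return resp.json()

def fetch_stories(token, project_id):
    """Fetch a project's stories, including their estimates and labels."""
    resp = requests.get(f"{API_BASE}/projects/{project_id}/stories",
                        headers={"X-TrackerToken": token})
    resp.raise_for_status()
    return resp.json()
```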

We wanted high-impact data to be immediately visible and actionable for the PMs.

  • For instance, showing the average number of days a story takes to move through the different stages of development (from “started” to “delivered” and then “accepted”) gives valuable information to educate stakeholders and clients about how long things are likely to take.
  • It also helps visually show at which step your stories get stuck, and can help diagnose inefficiencies in the deployment or acceptance process (a rough calculation is sketched below).
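
Here is a rough sketch of that stage-duration calculation, assuming we already have each story's state transitions as (state, timestamp) pairs (Pivotal Tracker records these; fetching them is omitted here):

```python
from datetime import datetime
from statistics import mean

def stage_durations(transitions):
    """Average number of days stories spend between workflow states.

    `transitions` maps each story id to a chronological list of
    (state, timestamp) pairs, e.g. ("started", "2020-11-02T10:00:00Z").
    """
    durations = {"started->delivered": [], "delivered->accepted": []}
    for events in transitions.values():
        times = {state: datetime.fromisoformat(ts.replace("Z", "+00:00"))
                 for state, ts in events}
        if "started" in times and "delivered" in times:
            durations["started->delivered"].append(
                (times["delivered"] - times["started"]).days)
        if "delivered" in times and "accepted" in times:
            durations["delivered->accepted"].append(
                (times["accepted"] - times["delivered"]).days)
    return {k: mean(v) for k, v in durations.items() if v}
```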

For the color scheme, we went with a high-contrast dark theme and larger fonts (for legibility & accessibility), but we left users the ability to edit the theme to match their own company branding and to download the plots as PNGs to insert in presentations, communications, etc.
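
With plain Plotly and Dash, a comparable dark, large-font chart with PNG export might be configured like the sketch below; the real app uses Dash Enterprise Design Kit for theming, so treat this as an approximation.

```python
import pandas as pd
import plotly.express as px
from dash import dcc

df = pd.DataFrame({"story": ["Login", "Search"], "adjusted": [6.6, 13.2]})

# High-contrast dark theme with larger fonts for legibility.
fig = px.bar(df, x="story", y="adjusted", template="plotly_dark")
fig.update_layout(font=dict(size=18))

graph = dcc.Graph(
    figure=fig,
    # The modebar's camera button then exports the plot as a PNG for slides.
    config={"toImageButtonOptions": {"format": "png", "scale": 2}},
)
```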

For usability, we decided against labels for this plot. The default labels were tilted, which made them harder for our users to read, so we replaced them with a legend beside the plot. We care about beautiful design & usability, but we always balance them against available technical resources.

Before and after: the former tilted, hard-to-read labels vs. the new legend

Being agile in our approach meant deprioritizing the creation of custom labels and using the default legend feature instead. Freeing up that time allowed us to develop a more important feature: the “See my backlog” button. It is the main CTA of the Better Pivotal Estimator because it enables the user to take direct action based on the insights of the dashboard. It’s the perfect illustration of Dash as an actionable AI/ML tool — not only about insights, but about action.
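
As a minimal sketch, the CTA can be as simple as a link component pointing at the project's backlog; the URL pattern and CSS class here are assumptions, not the tool's actual markup.

```python
from dash import html

def see_my_backlog_button(project_id):
    """Primary call to action: jump straight to the project's backlog."""
    return html.A(
        "See my backlog",
        href=f"https://www.pivotaltracker.com/n/projects/{project_id}",
        target="_blank",
        className="cta-button",  # hypothetical CSS class
    )
```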

Finally, we wanted to give PMs a tool with highly actionable insights that would inform their next decision. To accomplish this, we designed recommendations in the dashboard, which tell the PM if they tended to overestimate or underestimate the work to be accomplished in the next iteration.
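
The recommendation logic itself can be quite simple. The sketch below compares planned vs. completed points from past iterations; the thresholds and wording are hypothetical.

```python
def recommendation(planned_points, completed_points):
    """Tell the PM whether recent iterations were over- or under-planned.

    `planned_points` and `completed_points` are totals from past iterations;
    the 10% thresholds are arbitrary.
    """
    ratio = completed_points / planned_points
    if ratio < 0.9:
        return (f"You completed {ratio:.0%} of what you planned; consider "
                "scheduling fewer points next iteration.")
    if ratio > 1.1:
        return (f"You completed {ratio:.0%} of what you planned; you can "
                "likely take on more work next iteration.")
    return "Your planning has been on target; keep the same pace."

print(recommendation(planned_points=40, completed_points=31))
```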

Step 4: Get feedback from real users and iterate

We then interviewed real PMs to get feedback on the tool, which allowed us to iterate on the design.

  • We went a step further in our recommendations and told users exactly what they needed to do, specifying how many stories were at risk and which ones.
  • We also added a link to their backlog and used the Pivotal Tracker API again to add a marker communicating what might or might not realistically get done (see the sketch below).
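
One way to add such a marker is via the story-update endpoint, sketched below. The `at-risk` label name is a hypothetical convention for this example, and depending on the API's semantics you may need to merge with the story's existing labels rather than replace them.

```python
import requests

API_BASE = "https://www.pivotaltracker.com/services/v5"

def flag_at_risk(token, project_id, story_id):
    """Add an 'at-risk' label to a story so the warning shows up in Tracker.

    Note: updating the `labels` field may replace the story's existing
    labels, so in practice you would merge with the current label list first.
    """
    resp = requests.put(
        f"{API_BASE}/projects/{project_id}/stories/{story_id}",
        headers={"X-TrackerToken": token},
        json={"labels": [{"name": "at-risk"}]},
    )
    resp.raise_for_status()
    return resp.json()
```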

Business impact

Our first goal is adoption by team leads. We will measure this via three success metrics: the total number of accounts created, the total number of Pivotal projects with uncertainty levels, and the percentage of estimated stories with uncertainties.

The business impact this tool aims for is more reliable software estimates, allowing teams to better control their process. We will interview team leads who adopted the product to measure their satisfaction with the accuracy of estimates and the impact on their agile process and company outcomes.

What’s next?

We are currently onboarding all of AE Studio’s customers onto our tool to provide more accurate estimation and planning. This will allow us to collect early feedback and, in line with our agile methodology, iterate to make the tool even more useful for its users.

We are also working on making this a public tool so that any team using Pivotal Tracker will be able to onboard onto Better Pivotal Estimator! More coming soon…

If you are interested in Better Pivotal Estimator, contact AE Studio at luciano@ae.studio.

For more information about Dash Enterprise, the most trusted framework for building machine learning and data science apps, contact info@plotly.com.
