Using JIRA to Solve Agile Problems

Terry Thrasher
Building FreshBooks
May 9, 2017 · 5 min read


As a Scrum Master, my primary responsibility is to help my team grow. That includes measuring how much work we get done, and how consistently we get it done.

In early 2016, my team at FreshBooks was working on a new product. At times it felt like we were making progress slowly, and that our estimates for how long things would take weren’t lining up with reality.

I thought about how to tackle this, and broke it up into three questions:

1) Are we actually moving slowly?

2) Are our estimates reliable enough?

3) Do we need to change anything about the way we work?

I needed answers to the first two questions in order to answer the third. The best way I could think of to approach the problem was to look in our project tracking software, JIRA.

At FreshBooks, all of our product development work is managed via JIRA, typically with Agile boards. There are built-in reports to measure work being done… but each team operates differently and with autonomy, so comparing one team’s reports to another’s wouldn’t shed any light on my questions.

Another Scrum Master suggested that the control chart was a great way to get a high-level view of my team’s work. When I looked at it for the first time, it was clear I would have to learn to use it, because it looked like a Petri dish:

What this chart shows is a timeline with every issue that is or was captured by our Agile board. The dots are issues, and the higher a dot sits, the longer that issue took to finish, from the time it was started until the time it was marked as complete.

Once I understood how to use it, I realized the chart is a treasure trove of useful information. And here’s how it helped me answer those three questions.

1) Are we actually moving slowly?

As a proxy for how quickly our team was working, I focused on how long it took us to complete our tasks in JIRA.

My first step was to reduce the noise by constraining the data shown on the control chart. To do this, I searched for what I was interested in, then saved the search as a filter. For example, early on I realized that including bugs and tasks wasn’t useful: they varied too much and weren’t representative of our team’s momentum. So I wrote a filter to include only the team’s stories. Now the chart was more readable and less of a chemistry experiment.
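For anyone who hasn't built one of these, a saved filter is just a JQL query. Something along these lines does the trick (the project key is a placeholder, and your issue type names may vary):

    project = NEWPROD AND issuetype = Story ORDER BY resolutiondate DESC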

There were a few stories that were clearly outliers. I expected to see our large stories take 5–7 work days to complete, but there were some stories that took 20+ days, even 70 days in one case. Was our planning really that bad?

I looked at each outlier individually, and found some commonalities. When a story took a very long time, it often meant the person working on it had got stuck, or that we had finished the work but had forgotten to mark the issue as complete in JIRA. I created another filter to remove the outliers so that we had the option of looking at our standard results.
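Excluding the outliers is the same idea, with the offending issues listed explicitly (the issue keys here are placeholders):

    project = NEWPROD AND issuetype = Story AND issuekey not in (NEWPROD-42, NEWPROD-87)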

2) Are our estimates reliable enough?

Next I broke up our stories by their estimates.
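Estimates typically live in JIRA's Story Points custom field, so slicing the filter by size is just one more clause (the exact field name depends on how your instance is configured):

    project = NEWPROD AND issuetype = Story AND "Story Points" = 2

One query per size, and the control chart shows only the stories of that estimate.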

We were estimating each item with two, three, or five story points. Two-point stories were expected to be low risk and take a couple of days to complete. Five-point stories involved more risk and took a full one-week sprint to complete. And three-point stories landed somewhere in the middle. After plotting the issues and showing the sizing, here's what I saw:

While we weren’t surprised by the five-pointers, the two- and three-point stories were taking the same amount of time as each other! It was clear that our estimates weren’t lining up very well with the amount of time we were actually taking.

3) Do we need to change anything about the way we work?

Now that I had a bunch of interesting, informative data, I walked everyone through my findings. As we talked through the points and what they represented, the team clearly saw that our estimates weren’t reliable. They also felt that we were taking too long to finish our stories. As a group, we discussed why that was, and what to do about it.

The first insight was that we weren’t breaking up our stories enough before we started working on them. Many of the stories that took a long time involved someone getting stuck, and we might have been able to anticipate the obstacles if we had taken a more granular look at the work we needed to do. While we’d be adding some overhead, the team felt it was worth doing.

The second insight was that if we were going to more effectively break up and estimate our stories, we needed to use a finer points scale. We settled on expanding our range to six numbers (1, 2, 3, 5, 8, 13), and committed as a team to breaking up stories so that they fit one of our new sizes. We also committed to honestly assessing any large stories that might bleed past the end of a sprint, and promised ourselves that if we could break those up, we would.

So did it work?

With general agreement on the team, and a framework in place, the last question was “how will we know if it’s better?”

It was going to take some time to update our backlog to the new standards, and some more to practice our new costing and story-writing habits. After trying it out for a month, everyone on the team agreed it was a good approach. Starting from this point, the data would count.
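One way to draw that line on the JIRA side is to add a date clause to the filter, so the chart only includes work resolved after the cut-over (the date below is a stand-in):

    project = NEWPROD AND issuetype = Story AND resolved >= "2016/06/01"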

Three months later we checked in to see if we’d been successful…

Look at that graph! It’s almost too perfect!

Our estimates had become pretty good predictors of the time we would spend working on stories. The variance was lower than before. We were spending less time per issue overall. We were more consistent about completing what we took on in the expected time frame. We had more time to spend on bugs, research, and paying down technical debt. And as an unexpected bonus, everyone was more mindful about keeping JIRA up to date, so we had fewer outliers.

The role of a Scrum Master is often represented as all soft skills and indirect action. However, doing my job properly sometimes means taking direct, data-driven action. JIRA is a powerful tool for collecting data to help a team understand how they’re working and to target areas of improvement. Sometimes that’s exactly the kind of tool I need.
