I don’t know a single person who likes reporting. Seriously. I have never met a person who said, “I like reporting.” I know a lot of people who demand data for reports from others, but even these people hate it when they have to do it themselves. Reporting comes in different colors and shapes. I think we have all experienced reporting via Excel at some point in our careers: you are asked to fill out some cells with values by a certain due date. It might also be a more informal status report during a team meeting, or the answer to the question “What are you working on?” in a daily scrum. While the latter isn’t exactly reporting in the classic sense, it is still a kind of status sharing.

Hidden intentions of reports

It’s worth spending a thought on why reporting sucks. Reporting can be powerful if done right, and there are tons of tools out there that can help you get better with the help of reports. So why do we so often seem to fail here? Maybe we can all use the insights gained to make reporting suck less in the future.

In the past I had to do a lot of reporting myself. Here’s an overview of what I experienced throughout my career (not necessarily in the companies I’ve been working for):

There might be others, but I think that’s enough for now. It’s obvious that, except for the status report, all other kinds of reports are indicators that something is going wrong in your company. Still, I think the status report should, theoretically, be strong enough to make all the other reports superfluous.

So why do we get bored by a fully legitimate status report our boss demands from time to time? Why does it seem so hard for us to put numbers into fields once a day, once a week, once a month? Why does the time invested in reporting feel like 3 hours when it is really more like 1 to 30 minutes?

Why it’s not the tool

From what I found myself thinking, and from what I heard from colleagues and co-workers, the first conclusion you might draw is that it’s a tooling problem. The reporting tool might be too slow, too big, too confusing, too complicated, not integrated enough, too integrated, or wrongly scoped. I know people complain about bad tooling a lot, and I think I’m one of them. Sure, it is a tremendous waste of time if a tool you use recurrently is designed badly. But while bad tooling certainly could be a problem, I think it is not the root cause of frustration when it comes to reporting.

No effect? Skip the data.

The true problem is that status reporting, which is exactly the kind of reporting that is done with the right intention and should help you get better, lacks effect: no matter what is being reported, nothing happens and no actions are started as a result.

Example: You might know the situation where people are asked for a status in a meeting. People point out that difficulties are mounting and that something might be delayed. What’s the consequence? There’s none. Another example: You put numbers into a list. Sometimes they are smaller, sometimes they are bigger. Sometimes you forget about it. What’s the consequence? There’s none. In an ideal world, people would use the reported data to find out where things might go wrong in the near future and would try to avoid this with countermeasures, like: “Oh, I see a decline in number XYZ. Let’s discuss how we can bring this back on track; maybe we can ask ABC for help here.” In reality, unfortunately, often nothing happens until it’s too late. Reporting is basically reduced to the act of collecting data instead of interpreting and working with it.

Basically, this means that the lack of action based on reported data leads to a lack of trust in, recognition of, and appreciation for reporting. Reporting becomes a meaningless activity that isn’t there to help in the sense of true “status reporting” but to fulfill one of the other intentions mentioned above. Clearly this is something nobody really wants, so people don’t see any value in reporting and start to hate it. The reported data becomes worse because people try to minimize their effort, and as a consequence the actions derived from the reports become even worse as well. And there you are, right in the middle of a vicious circle.

It all comes down to some basics I already pointed out in my article “Thoughts on agility”: while all we do should be common sense, somehow we manage to behave like idiots. It’s our job to make sure we break through the insanity we experience when it comes to reporting. Reporting is there to show what works well and what doesn’t, and to provide a prediction of the future based on facts collected in the past. We have to carefully scope and define the data we want to see in a report so that it really helps us learn and improve our own process. A “collect it all” mentality is only acceptable if you don’t do it manually and if you know how to handle all that data afterwards, so that you can actually use it to learn and get better.

If the data we deliver for reports doesn’t seem to have any effect, it’s our job to question the report. That is part of the “always learning and always improving” mentality every modern company should follow. Otherwise it’s just a waste of time and resources.
