Why Test Developer Productivity On A Single Metric?
Productivity…there must be an infinite number of ways to measure it in a company, on a team, or even at a personal level. But it still seems like there are only a couple of ways companies actually do it. The main way, in my experience, is to use the statistics from their bug tracking software, such as how many issues each developer is closing at a given point in time and how many have been reopened.
Personally, I strongly believe that this type of productivity measurement tells only a very small fraction of what is actually going on.
Don’t get me wrong, though. These types of statistics don’t hurt to know. They can point to a developer who may not know a certain part of the software very well, a developer who is going through something that’s causing them to be unable to focus, or a developer who has too much going on and can’t spend enough time on each task to give it the full attention it needs. But again, these statistics only show a symptom of what’s going on. Finding the actual issue will take a lot more than clicking a button to bring up a report and skimming it before a morning status meeting.
I’ll share an experience I had with this myself, at a fairly big company that measured my productivity exactly as described above: by looking at my bug report statistics.
The product I was working on was a server component for a desktop application, and there were several parts of it I wasn’t familiar with. I didn’t know much about the business itself either and had to learn as I went.
The issue, and honestly what turned out to be the main reason I was taken off the project, was that I had the highest rate of reopened tickets of all the developers. That’s it. Nothing else was looked at, from what I was told. Not any of the documentation I added to help the productivity of the whole team. None of the several new features I built that resulted in very few issues of their own. And certainly not any of the support I gave to other people on the team. You would think at least some of that would be accounted for, but apparently not: all they mentioned was that one fact about the reopened ticket count.
Hell, the only positive feedback I did get was that they liked that I kept asking for more work whenever I ran out. I honestly don’t think they even knew about the other contributions I’d made.
Because of that, them apparently not acknowledging those other contributions, which I believe definitely outweigh the reopened ticket count, I wouldn’t even want to work for them again if they offered.
Again, it’s not good to be the one with the most reopened tickets. It definitely makes me feel like a horrible developer. However, there is more to it than just that. What caused it to happen? Digging a bit deeper into my time there, I believe a good part of it was that I was given issues in areas where I didn’t know any of the business domain. So, naturally, when I submitted a fix, something would be missed and I’d have to go back and handle that scenario. Bear in mind, this was after already writing unit tests for what I did know about and passing code reviews.
What seems like the worst part, to me, is that they didn’t say any of this while I was there. No feedback at all on how I was actually doing. I even asked at one point if they had any feedback, and they mentioned nothing I could improve on. I only heard the above after I was already off the project. I’m sure a good part of it could have been fixed if I had known about it while I was still there and had time to improve.
I don’t mean for the above to be a tirade about the client. Overall, I enjoyed working on the team and learned a lot from it. And I definitely could have done better about checking my fixes before committing to their source control. Mainly, I hope this helps illustrate that it takes more than just looking at metrics from whatever bug tracking software you may be using. Statistics help, but there tends to be more to it than just that.