3 Cognitive Biases I Face As A Product Manager

Aamna Khan
Published in Perpetual Beta
6 min read · Jan 26, 2016

Every now and then, in the flurry of information and opinion hurled at us, we come across something that just sticks. Last year I came across an article from Business Insider that described the various cognitive biases we humans face in our everyday lives. Since reading the article I've often tried to spot these biases, either in my own actions or in those of others. I must admit that it's far easier to recognise them in other people's actions than in my own.

Over time I’ve realised that there are three biases in particular that I often deal with at work.

Information Bias

Information bias is the urge to seek more information in order to make a decision when this extra information is actually irrelevant to the decision.

I have a lot of data to work with. Hotjar lets me record user sessions. Google Analytics lets me slice and dice session data. Localytics lets me create funnels to see where my users are dropping off in the app. There's tons of internal CRM data I look at. Often I'm looking at X data in order to make a decision, realise there's also Y and Z, think maybe those will be useful too, and before I know it I've spent two hours poring over data that has proven useless to the decision-making process.

It’s fun to look at data and draw insights, but done without a clear objective it can take up a significant amount of time (and you won’t realise it because you enjoy it). I’m not against looking at data without a concrete goal in mind; often you’re left with more questions than answers, and that’s not necessarily a bad thing. But when you’re working towards something specific with limited time at hand, it helps to clearly outline your objective. It takes quite a bit of discipline to not get carried away. This discipline, I admit, I’m still working on. One time I was querying our internal data for something and realised I could do a market basket analysis on what people purchased. I did end up doing that, but the information wasn’t immediately useful and I hadn’t started work on the original problem at all.

To prevent this from happening, I now make it a point to spend some time before I begin working on a problem just thinking about what I want to achieve with the data I’m looking for and how I’m going to get to it. This initial clarity helps a great deal in avoiding wasted time later.

Confirmation Bias

Confirmation bias is the tendency to only listen to, look for or remember information that confirms our preconceptions or beliefs.

Knowingly or unknowingly, we all form preconceptions about a lot of things. When we meet someone new, we might already have formed an opinion about them because someone told us something and everything this new person does seems to confirm what we already think of him/her. We run an A/B test and we’re already pretty sure what the outcome is going to favour. We run a new marketing campaign, we’re confident it’s going to work and we’ll find metrics that support our claim.

This is a tough bias to identify and avoid. So many times I’m confident of the outcome of some test we’re running and any other result ‘just can’t be’. These preconceptions can be particularly dangerous. For instance, I track, on a daily basis, the number of people who reach the error screen in our app. In one particular release, we had fixed a bunch of bugs which I had assumed were the reason behind the high percentage of people reaching the error screen. After that release I was confident that the number would decrease. And it did. So I triumphantly mailed my team that we finally had fewer people facing errors in our app. It was then that a colleague pointed out that the overall traffic to our app could have dropped, which would also have decreased the metric. And he was right. Marketing had paused a bunch of campaigns and our DAUs had dropped. Of course, I should have looked at the metric as a percentage of DAUs, but my bias got the better of me.
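A quick back-of-the-envelope check makes the lesson concrete. Here's a minimal sketch (the dates and numbers are made up purely for illustration, not our actual data) of what normalising the error count by DAUs looks like:

```python
# Hypothetical daily numbers -- illustrative only, not real data.
daily_stats = [
    # (date, daily_active_users, users_hitting_error_screen)
    ("2016-01-10", 50_000, 2_500),   # before the bug-fix release
    ("2016-01-17", 35_000, 1_600),   # after the release (campaigns paused, DAUs down)
]

for date, dau, errors in daily_stats:
    error_rate = errors / dau * 100  # normalise by traffic instead of using the raw count
    print(f"{date}: {errors} errors out of {dau} DAUs -> {error_rate:.1f}% hit the error screen")
```

The raw count falls from 2,500 to 1,600, which looks like a big win; the rate only moves from 5.0% to about 4.6%, which is a much more modest claim once the drop in traffic is accounted for.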

I’ve become more conscious of this now. I figured the only way to beat this was to double check all facts and get a second opinion wherever possible. It’s made me slightly skeptical; when data confirms my preconceptions I tend to get suspicious. But I’d rather be skeptical than be biased.

Blind Spot Bias

The failure to recognise your own biases is a cognitive bias in itself.

It’s a natural tendency to think that we aren’t biased. We tend to judge people for their perceptions and label them as biased, while being more inclined to think of ourselves as unbiased and better individuals. In one study, over 600 people were surveyed and only one person said that they were more biased than the average person.

The struggle to be objective is constant. There have been times when I’ve pushed for something I really believed in, whether a feature or a bug fix that I thought should have been prioritised. When things didn’t work in my favour it left me pretty sour and annoyed. But in retrospect, and this has happened an embarrassing number of times, I found that I had severely overestimated the value of my ask and badly underestimated the effort that would go into it. It’s amazing how blindsided we can get.

I’m still working on being less biased (and I don’t think I’ll ever get to a point where I can stop). What about you? How do you deal with your biases?

If you enjoyed this article, do recommend :)

Follow me on Medium — Aamna Khan

Follow me on Twitter — @_aamnakhan
