The Feature Audit
Identifying the right features to improve/remove
It was my first sprint at Housing.com HQ. The theme was growth: increasing the engagement of existing users by improving the product features on Housing.com’s website.
In short, either add new features or improve the existing ones.
Around the same time, the folks at Intercom published an article on something called the Feature Audit. It was like a how-to guide for solving exactly this challenge, and I would like to thank them for writing this amazing piece.
I will refer to and use parts of that piece throughout this article to support the findings and the process.
What are people actually doing in your product?
Are all users using all the features in your product? Of course they’re not. It’s useful to ask “how many people are actually using each of our product’s features?”
The core value of your product is in the top right area, up where the star is, because that’s what people are actually using your product for.
Using MixPanel, the awesome data analytics team at Housing quickly pulled the usage data for each feature on the website.
X axis: Adoption (how many people use a feature)
Y axis: Frequency (how often they use it)
Features sharing one colour fall under the same funnel.
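The chart itself is easy to reproduce from this kind of data. Here is a minimal sketch, with made-up feature names and numbers (not Housing’s actual MixPanel export), that buckets features into the audit’s quadrants:

```python
# Sketch of a feature audit: place each feature on an adoption (x) /
# frequency (y) chart and name its quadrant. All data is illustrative.

def audit_quadrant(adoption, frequency, a_split=0.5, f_split=10):
    """Classify a feature on the adoption/frequency chart.

    adoption  -- fraction of users who use the feature (0..1)
    frequency -- how often they use it (e.g. actions per session)
    """
    x = "high-adoption" if adoption >= a_split else "low-adoption"
    y = "high-frequency" if frequency >= f_split else "low-frequency"
    return f"{x}/{y}"

# Hypothetical features; "open_info_window" plays the role of the star
# in the top-right corner, "save_shortlist" a long-tail feature.
features = {
    "open_info_window": (0.80, 15),
    "save_shortlist":   (0.30, 2),
}

for name, (adoption, frequency) in features.items():
    print(name, "->", audit_quadrant(adoption, frequency))
```

The top-right quadrant (high adoption, high frequency) is where the core value lives; the bottom-left is the natural candidate list for removal.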
Top right corner, the core value: (13) Open info window!
Fittingly, the core value of Housing is providing genuine property details.
What do you do with your feature audit?
Kaizen is the philosophy of continuous improvement. Shipping code doesn’t mean that you’re improving anything. It all comes down to the type of improvements you’re making.
The two most popular ways to improve a product are to add new features, or to improve existing ones.
You can improve an existing feature in three different ways:
- You can make it better (deliberate improvement).
- You can change it so customers use it more often (frequency improvement).
- Or you can change it so more people can use it (adoption improvement).
A deliberate improvement applies when you know why customers use an existing feature and what they appreciate about it.
It seeks only to make the feature better in ways that will be appreciated by its current users: for example, making it faster or easier to use, or improving the design.
Use deliberate improvements when: there is a feature that all your customers use and like, and you see opportunity to add significant value to it.
One such important feature we identified: the map markers!
Basically, people who land on the page can see the details (info window) of a property in two ways: by clicking on a map marker, or by clicking on the property in the list.
58% of people click on markers, but they do it 17 times in a session, while 75% of people click on list elements, and only 10 times.
What does this mean? It suggests that fewer people (58%) prefer to navigate by clicking on map markers, and that those who do find it hard to reach the desired result, since they have to click the markers many more times (17).
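For clarity on how figures like these are derived, here is a minimal sketch. It assumes adoption means the share of sessions with at least one click on the feature, and frequency means the mean number of clicks among those sessions; the session data below is invented, the real numbers came from MixPanel.

```python
# Sketch: derive adoption and frequency from per-session click counts.
# Assumed definitions: adoption = share of sessions with >= 1 click;
# frequency = mean clicks among sessions that clicked at all.

def adoption_and_frequency(clicks_per_session):
    clicked = [c for c in clicks_per_session if c > 0]
    adoption = len(clicked) / len(clicks_per_session)
    frequency = sum(clicked) / len(clicked)
    return adoption, frequency

# Five hypothetical sessions of marker clicks.
marker_clicks = [17, 0, 20, 14, 0]
adoption, frequency = adoption_and_frequency(marker_clicks)
print(f"adoption {adoption:.0%}, frequency {frequency:.1f}")
# -> adoption 60%, frequency 17.0
```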
Housing’s USP is that it’s a map-based real estate platform. We wanted to double down on that proposition and make it more valuable: more people navigating via the map markers (the map already covered three-fourths of the screen), and people reaching the houses they wanted in as few clicks as possible.
The challenge: increase adoption and decrease frequency for the map markers!
It felt amazing to identify this simple problem, but what should the solution be? How should we go about it?
It was crucial to understand why more people (75%) were using the list view, and why they found it easier to reach results (only 10 clicks).
Because the list showed one very important piece of information upfront that was missing from the markers: the price!
We changed the markers to show the price on them. After going through different iterations we arrived at a design like the one on the right below.
When designing for the web, you can analyze usage data for your product and compare different interfaces in A/B tests. This is sometimes called “data-driven design”, but I prefer to think of it as data-informed design — the designer is still driving, not the data.
We did A/B testing with both these versions for a period of two weeks.
Old on Left and New on Right.
The one with BHK markers (the older version) had almost the same metrics as before
Adoption : 59.2% Frequency : 18.25
One with the new Price Markers had surprising results
Adoption : 82.1% Frequency : 8.5
An increase in adoption of almost 23 percentage points, and a decrease in frequency of nearly 10 clicks.
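The lift follows directly from the reported metrics; a quick check:

```python
# Lift computed from the A/B test metrics reported above.
old_adoption, new_adoption = 59.2, 82.1     # percent of sessions
old_frequency, new_frequency = 18.25, 8.5   # clicks per session

adoption_lift = new_adoption - old_adoption      # in percentage points
frequency_drop = old_frequency - new_frequency   # in clicks

print(f"adoption +{adoption_lift:.1f} pts, frequency -{frequency_drop:.2f} clicks")
# -> adoption +22.9 pts, frequency -9.75 clicks
```

Before calling a result like this, it is worth confirming the sample sizes were large enough for the difference to be statistically significant; the underlying session counts were not part of this write-up.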