Insights Driven Design

Gaurav Joshi
Published in Housing Design
5 min read · Mar 19, 2018

In the world of design, we come up with fancy new terms every day. I am not a big fan of terminology, but sometimes it makes a difference in the thought process.

As an amateur designer, the philosophy of good visuals steered my design direction… over time I started looking at more data from analytics, and we began defining parameters to track while designing. A few videos from the Etsy design team were enough to make me realize the importance of testing, and we started advocating tests of our own designs. While all of this was coming in, we were already doing user research and testing.

So what drives good design?

There isn’t one particular trait that makes a design good. More importantly, every good design has an expiry date.

The more contextually aware, emotionally intelligent, data-validated, user-tested and delightful a design is, the closer it is to achieving its objective.
The overwhelmingly wide array of factors I looked at all seemed to give me some insight, and I realized:

That’s what is important eventually, the insights.

It is these insights that make design usable, enjoyable and successful. If it were not for the insights, there wouldn’t be such a thing as UX Design.

Insights come from all sorts of sources, qualitative and quantitative.
1. Feedback
2. User Testing
3. User Session Videos
4. User Interviews
5. Analytics Data
6. Experiment Data

I don’t want to go into the details of all of these; a lot of resources are available on each subject. What is interesting, however, is to realize which kind of insight source matters for which part of an experience. The underlying rationale is that at least one qualitative and one quantitative insight source is needed.

As an example of how going with data alone as the source can mislead us, I want to point out an instance. At Housing.com we once had a very long home page on desktop. We had increased the number of sections due to ad inventory, and most of these sections looked the same.

Even after adding the ad sections, the data suggested that the number of people reaching the bottom of the page was still high.

Surprisingly, we thought this was working as a design pattern. Then we took a closer look. A quick user test established that people were finding the long scroll annoying. That led us to watch videos of user sessions (using a tool called Inspectlet), where we realized that most of these users would flick the vertical scroll straight to the bottom of the page out of curiosity, feel overwhelmed, and come back up without interacting with the page.

The event fired at the last block was catching these people, and our fault lay in assuming the data was showing the complete picture!

Let’s now move from what… to when.

When is it that we should look for insights?
There are some defining moments when we should definitely look at different sources. When we:
1. Receive a problem statement.
2. Execute a problem statement.
3. Release an MVP.
4. Conduct an experiment (both before and after).
5. Are a few units of time into the experience being live.
6. Are a relatively longer time into the experience being live.

I will detail some of these in upcoming posts, but for the sake of this post, I want to look at the case where we first receive a problem statement. As the design function, we often overlook the need to scrutinize the problem itself! Is it really a problem worth solving? What is its source, and how is it defined?

To do that, I have a small template for you.

The Problem Scrutiny Template

Before picking up a problem statement, the following aspects need to be thoroughly examined.

Thread: Which product thread does it belong to?
Eg. User Acquisition, Returning User Experience Optimization, Supply Product.

Audience: Profiling of the type of audience we are targeting.
Audience Intent: What is the nature of the intent of the users we want to build this for? Eg. In the case of Housing users it was: Buy / Rent / List.
Audience Visit: First Time / Returning User.
Audience Funnel: Top, Mid or Bottom of the funnel.
Audience Pref: What is the preference of the target user segment, if any?
Eg. for buyers on Housing: Resale Homes, New Projects, Budget, Locality, etc.

Kind: Specification of what the solution is being built on top of: a small change to an existing experience, a progressive change, an MVP, etc.
Eg. New Problem Statement / Progression / Incremental Change / Experiment.

Reason: Using the double-why concept to find the context of the problem. Why are we solving this problem? = A. Why are we solving A? = B.
Eg. If the statement was around adding preferences on the CRF (Contact Request Form), the reason would be that we need to send more qualified leads to developers. We need to do that because developers are not convinced about our lead quality and accuse us of sending fake leads.

Assumptions: Things / behaviour of the people that we are already assuming.
Eg. If we redesign the search takeover experience, we are already assuming that a lot of people will trigger it.

Goal / Expectation: Specifying what we expect from the solution to the problem.
Eg. We expect everyone experiencing the homepage to clearly understand that they can sell properties too (qualitative goal), and an increase in conversion from property listings (quantitative goal).

Metric: Mention the metrics that will be relevant for this solution. There may be multiple metrics the solution affects, even outside the desired ones.
Eg. A bigger event banner can increase the CTR of the banner but decrease clicks on its immediate surroundings.

Insight Sources to consider:
1. Feedback
2. User Testing
3. User Session Videos
4. User Interviews
5. Analytics Data
6. Experiment Data

Solution(s): There could be one or more solutions to a problem statement and sometimes they could all be executed for testing.

Release: How is the solution being released? Rolling out to everyone, an A/B test, or testing the waters.

Platforms: Desktop / Mobile / App.

Dependency: Whether executing this problem statement could affect any other experience.

Impact: The difference between pre-release and post-release impact (quantitative / qualitative).
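If you want to make the template operational rather than leaving it as a document, it can be captured as a small structured record. The sketch below is a hypothetical illustration, not a real Housing.com tool: the field names mirror the template above, and the `is_scrutinized` check encodes the earlier rule of thumb that at least one qualitative and one quantitative insight source should be consulted.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the Problem Scrutiny Template as a structured record.
# Field names follow the template above; the class itself is illustrative only.
@dataclass
class ProblemStatement:
    thread: str            # Eg. "User Acquisition", "Supply Product"
    audience_intent: str   # Eg. "Buy" / "Rent" / "List"
    audience_visit: str    # "First Time" / "Returning User"
    audience_funnel: str   # "Top" / "Mid" / "Bottom"
    kind: str              # "New" / "Progression" / "Incremental Change" / "Experiment"
    reason: str            # double-why: why this problem, and why that reason
    assumptions: list = field(default_factory=list)
    goals: list = field(default_factory=list)
    metrics: list = field(default_factory=list)
    insight_sources: list = field(default_factory=list)

    def is_scrutinized(self) -> bool:
        # Rule of thumb from earlier in the post: at least one qualitative
        # and one quantitative insight source should be considered.
        qualitative = {"Feedback", "User Testing", "User Session Videos", "User Interviews"}
        quantitative = {"Analytics Data", "Experiment Data"}
        sources = set(self.insight_sources)
        return bool(sources & qualitative) and bool(sources & quantitative)

# The CRF preferences example from the "Reason" field above.
crf = ProblemStatement(
    thread="Supply Product",
    audience_intent="Buy",
    audience_visit="Returning User",
    audience_funnel="Bottom",
    kind="Incremental Change",
    reason="Send more qualified leads, because developers doubt our lead quality",
    insight_sources=["User Testing", "Analytics Data"],
)
print(crf.is_scrutinized())  # → True: one qualitative + one quantitative source
```

A checklist like this forces the conversation up front: a problem statement with an empty `reason` or only one kind of insight source is a signal to scrutinize before designing.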

That’s all for now, folks. Let me know if this is helpful or if you have another approach to doing things. I’d love to hear from you!
