Design with Data: Six Dilemmas You’re Likely to Face (Part 2)

Elisa P
6 min read · Feb 6, 2018


In the previous post, the focus was on getting started with using data to inform your design work. In this post I’m stepping up the game: if you’re interested in setting up a more established, ongoing practice around design-relevant data, this discussion is for you.

You might be wondering, “What do you mean by design-relevant data?” Let’s make do with a short explanation for now: design-relevant data is anything that helps you understand what you’re designing for. You might get the data from digital analytics, statistics, sales numbers, surveys, text analysis tools, or sensor logs from a device. You might look for contextual factors, like distance or time of day; behaviors, like activity levels; or customers’ experiences, via ratings. In particular, you’d be looking for patterns that only become apparent when studied over time and in larger sample sizes.

There are two other overlapping terms I’ll use that might benefit from a brief clarification: metrics and key performance indicators (KPIs). KPIs are just what the name says, important indicators of performance. Like any metric, they should be measurable, enabling you to keep checking whether the values change over time. In the statement “15% of our visitors were extremely satisfied”, for example, customer satisfaction would be the metric of interest. If it is considered important enough, it might be established as a KPI.
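To make that concrete, here’s a minimal sketch in Python of how a statement like the one above falls out of raw survey responses. The ratings and the 5-point scale are made-up examples, not real data.

```python
# Made-up 5-point satisfaction ratings (1 = very dissatisfied,
# 5 = extremely satisfied); in practice these would come from your survey tool.
ratings = [5, 4, 3, 5, 2, 4, 5, 1, 4, 3]

extremely_satisfied = sum(1 for r in ratings if r == 5)
share = extremely_satisfied / len(ratings) * 100

print(f"{share:.0f}% of respondents were extremely satisfied")
# If satisfaction is treated as a KPI, you would track this share over time
# rather than reading it as a one-off number.
```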

And with that, let’s dive right in.

4. Dealing with conflicting data and outliers
Sometimes, you’ll find yourself staring at the data, comparing different sets or remembering ones you’ve encountered before, all the while wondering “Why is it like this? It doesn’t make any sense!”.

Data never lies, right?
  • If you’re using any tool for your analysis, check your filters first… Overlooked changes in filters or settings are bound to sidetrack you from time to time. If the issue persists, you can put on your Sherlock cap and proceed to…
  • Consider what (or who) your data really represents. E.g. do your customer effort survey or net promoter scores reflect all customers who would be interested, or only those who have just completed a task successfully? Are you looking at a snapshot in time, or a trend that is visible over a longer stretch? Do the numbers stay in roughly the same range if you start comparing them across user types, devices, time of day, store types, etc.? (See the sketch after this list.)
  • Consider (un)related events. E.g. was there a media event that led to a surge of visitors? Or a holiday season, leading to changes in people’s everyday habits? An update (or several) that ended up breaking a flow for a share of customers?
  • Check your biases. The results aren’t what you expected… or hoped for? Can’t agree on the interpretation? Gather more data, cross-reference it, dig deeper with observation and interviews if need be, but remember: the point is not to confirm existing beliefs and solutions, but to learn and adjust as needed.
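As a rough illustration of the segment and trend checks above, here’s a small pandas sketch. The columns, values, and the idea of a ‘task completed’ flag are hypothetical examples of my own, not data from any real product.

```python
import pandas as pd

# Hypothetical visit-level data; in practice this would come from an
# analytics export rather than being typed in by hand.
visits = pd.DataFrame({
    "date": pd.to_datetime(["2018-01-01", "2018-01-02", "2018-01-08",
                            "2018-01-09", "2018-01-15", "2018-01-16"]),
    "device": ["mobile", "desktop", "mobile", "desktop", "mobile", "desktop"],
    "user_type": ["new", "returning", "new", "returning", "new", "returning"],
    "task_completed": [1, 1, 0, 1, 0, 1],
})

# Do the numbers stay in roughly the same range across segments?
print(visits.groupby("device")["task_completed"].mean())
print(visits.groupby("user_type")["task_completed"].mean())

# Snapshot vs. trend: roll the same metric up per week and eyeball the pattern.
print(visits.set_index("date").resample("W")["task_completed"].mean())
```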

5. Going beyond simple data
Digital analytics tools like Google Analytics, especially without further set-up, are great at providing averages and best/worst scores. Yet the old rule “there is no average customer” still applies. While improved access to data means it’s easier for us to reference it, we still need to work hard to build a deeper, more nuanced understanding.

Keeping it simple is good, until it isn’t
  • Know thy customers and business logic. How homogeneous is your customer base? Which differences truly matter for your design and business? Can you identify and separate these groups to have both better data to work on and aligned principles to base your design decisions on? (Designing for customer archetypes that have conflicting tendencies is more challenging but also fun, for who wouldn’t love a good challenge?)
  • Put your data into different frames. How big a change are you seeing? Is that number as small or large as it seems? Without more extensive experience and comparison points, it can be difficult to tell. You might set your finding against the same set from a year ago, or compare it to the total number of customers, or to the market potential (a small sketch follows this list). And even if you rely on ready-made visualisations only, learn about the ways our perception of data changes with how it’s presented.
  • Involve others in the interpretation and decision making. Be it your customer, manager, team, those other teams, etc. This will bring in a more diverse set of viewpoints, leading to alternative and more nuanced interpretations. In addition, it will spread the insights around: after all, the point is not having data, it’s people learning and taking action based on the insights.
  • Build your own tools and templates if needed. The systems and sources you can access might not provide the kinds of reports you have in mind. And especially early on, it wouldn’t necessarily make sense to make a big investment in a new tool or integration. Building your own simple cross-referencing and data-refining tools, be it a sheet or something else, allows you to experiment with the data. And not only that: once you can make your insights visible, you’ll have a much stronger case for making the process of creating them even better.
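To show what I mean by framing, here’s a tiny, hypothetical calculation: the very same number can look impressive or modest depending on what you set it against. All figures are invented for illustration.

```python
# Invented figures for illustration only.
signups_this_month = 1200
signups_same_month_last_year = 950
total_customers = 80000

yoy_change = (signups_this_month - signups_same_month_last_year) / signups_same_month_last_year
share_of_base = signups_this_month / total_customers

print(f"Year-over-year change: {yoy_change:+.1%}")     # +26.3%, which sounds like a lot...
print(f"Share of customer base: {share_of_base:.1%}")  # ...until you see it is 1.5% of the base
```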

6. Getting the right metrics and the metrics right
Congratulations, you’ve identified the winning formula for your product or service. You know your focus areas for now and there’s a great buzz on growing and evolving. But how do you follow up on your progress?

Time to negotiate

The less an organization has invested in data processing and continuous learning, the more effort it takes each time data has to be dug up. Yet the more interested the organization is in following up on impact, the more competition there might be over ‘the right set’ of KPIs. In either case…

  • Differentiate between long- and short-term objectives. Metrics driving long-term objectives should be few enough to be memorable. This means that when in doubt, you can always go back to them to check if you’re on track, regardless of your design task. Short-term metrics can be more varied as you optimize various areas of your product or service.
  • Experiment with the process. This means not only being able to contribute to what should be followed up on, but also knowing the organisation well enough to identify gaps: Are new capabilities, roles, or other investments needed? Who should own it, and who needs to know about it?
  • Automate what you can. If a practice is not only seen as valuable but is also easy to keep up, you’ve got a winning proposal at hand (sketched after this list).
  • You might not always need a new metric or KPI, but rather regular UX and usability tests. If you or your designers have spent any time considering what is being produced, there should be questions and hypotheses waiting to be tested. Having regular sessions helps to decrease the risk of ‘being proven wrong’ about your assumptions every now and then.
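And here’s a minimal sketch of what ‘automate what you can’ could look like in practice: a small script, run on a schedule, that compares a few KPIs against target levels and flags drift. The KPI names, values, and thresholds are all hypothetical.

```python
# Hypothetical KPI names, current values, and target thresholds.
kpis = {
    "task_completion_rate": {"value": 0.82, "target_min": 0.80},
    "extremely_satisfied_share": {"value": 0.13, "target_min": 0.15},
}

for name, kpi in kpis.items():
    status = "OK" if kpi["value"] >= kpi["target_min"] else "below target"
    print(f"{name}: {kpi['value']:.0%} (target at least {kpi['target_min']:.0%}) -> {status}")

# In practice you would pull the values from your analytics tool on a schedule
# and send the summary to wherever the team already looks, e.g. chat or email.
```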

To conclude, I have one more suggestion to share: start experimenting and studying to become a better data whiz. While design with data is a rising star, we still have much to learn. On the other hand, while data and business analysis are well-established fields, design can bring in a forward-looking perspective. As designers, we’re well versed in taking an observation and exploring it, providing tangible simulations of possible futures.

I don’t think ‘soft’, qualitative insights are going anywhere. We’ll still need to fight our proverbial battles to gain those, just as we’re starting to make more use of ‘hard’, quantitative insights. You can think of it as expanding your toolbox, or vocabulary as a designer.

I hope this overview has been useful to you. I’m also wondering: which points stood out most to you?

Hi. I’m a designer and consultant working at an insights-driven agency. I have a background in qualitative user research. This post is part of my own journey toward making better use of digital analytics, statistics, metrics, and the like. The six dilemmas are based on my experiences, as well as on discussions in a ‘Design with Data’ workshop I recently co-facilitated.
