Are You a Good Driver? How Designers Use Data to Get to the Truth

Matt Cooper-Wright
Published in Design x Data
8 min read · Aug 17, 2015

Sensors, cameras and open data sets mean we’re learning more about people’s behaviour than ever before. That’s reshaping the design and innovation process.

Ask car-owning Londoners to rate their driving, and nine out of ten will tell you they’re either ‘good’, or ‘very good’. Record people’s driving behaviour, and a different picture emerges. It will come as no surprise to anyone who’s driven in a city recently that ‘good’ and ‘very good’ drivers are much less common than people’s self-reporting would suggest.

So what’s happening here? Why do people consistently overestimate their driving ability? As a designer, I want to know: should I design products and services for how they think they drive, or how they actually drive?

Revealing the Full Picture
As human-centred designers we’ve always used data to get inspired. In the past it’s been qualitative data gathered through ethnography — home visits, participating in online forums, and interviews, for example.

But what people say, think, feel and do can all be different and contradictory. It’s not that people are being dishonest — in fact quite the opposite, they’re giving as honest an answer as they can.

Ethnographic design research hasn’t traditionally brought quantitative and qualitative methods together. Now technology is changing that: cheap sensors, smaller computers and open-source datasets make it possible to capture an objective picture of the world and compare it with people’s subjective views.

This behavioural data shows us what people really do,
and how it differs from what they say.

As a designer, I’ve found that combining quantitative and qualitative research data is the key to understanding the full picture: the approach is sometimes called ‘hybrid’ research, and it’s increasingly the standard for design research.

It’s human nature to say one thing, and do another. A people-centred approach means designing for that. (My colleague Arianna McClain, a designer at IDEO in San Francisco, explores this in depth in her article about data and design.)

Mind the gap
We spotted this gap between car owners’ perception and reality while designing a new type of data-driven service to reduce stress for city drivers. As part of our research, we recorded the driving behaviour of 15 Londoners for three months — things like steering wheel angle, or how heavily people braked. Data, in many forms, became an integral part of our entire design process.

Here are three ways these new data points around human intention and behaviour can impact design at every stage: inspiring the research phase, refining designs with real, nuanced behavioural data, and proving the value of a working prototype to build a business case for it.

1. Discovering new opportunities

Data is a new source of inspiration. Inspiration fuels great design.

At the beginning of a design process, publicly accessible datasets draw attention to new opportunity areas that were previously unobservable.

Individuals, organisations and governments are gradually realising how useful open data can be, and more interfaces and visualisation tools are bubbling up to make sense of it. For example, in relation to our new service, recent figures showed that average mileage for cars in the UK is falling — a sign of changing behaviour around car travel. Transport for London made the smart move of opening up its transport data: rather than developing apps itself, it made it much easier for more agile companies, like CityMapper, to build better tools for people.

As designers we should be both making use of the new tools and learning how to manipulate data directly, rather than relying on data scientists to process the data on our behalf.

If you’re able to look at raw data directly you might spot inspiration
that a data scientist would discard as an outlier.

When we started tracking our city drivers, we quickly realised we’d need to build visualisation tools to let the whole design team see the data. This began with simple bar and pie charts showing braking and acceleration over a journey, and moved toward geographically mapped routes with overlaid weather and traffic conditions. It was very important that the whole team could access and understand the data, as different team members spotted different insights in it.
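To make that concrete, here’s a toy, text-only stand-in for those first charts (our real tools were graphical, and every number below is invented): harsh-braking events per journey, rendered so anyone on the team can scan them at a glance.

```python
# A toy, text-only stand-in for our first charts (the real ones were graphical).
# All numbers are invented: harsh-braking events per journey for one driver.
harsh_brakes = {"Mon": 3, "Tue": 5, "Wed": 11, "Thu": 9, "Fri": 4}

def text_bar_chart(counts: dict[str, int]) -> str:
    """Render counts as simple terminal bars so the whole team can scan them."""
    width = max(len(label) for label in counts)
    return "\n".join(
        f"{label:<{width}} | {'#' * n} {n}" for label, n in counts.items()
    )

print(text_bar_chart(harsh_brakes))
```

Even something this crude makes the Wednesday spike visible to a non-specialist, which is the whole point of putting the data in front of the team.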

Unearth hidden patterns
Aggregating behavioural data can unlock hidden patterns and new insights. Grouping together the data from our city drivers, for example, showed that Wednesdays and Thursdays are the most dangerous days to drive in London.
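The aggregation itself can be very simple. As a hedged sketch (the timestamps below are invented, not our study data — the real dataset had millions of points): bucket incident events by day of the week and count them.

```python
# Invented incident timestamps, standing in for a much larger dataset.
from collections import Counter
from datetime import datetime

incidents = [
    "2015-06-01 08:10", "2015-06-03 17:45", "2015-06-03 18:02",
    "2015-06-04 09:15", "2015-06-04 17:50", "2015-06-05 08:30",
]

# Bucket each incident by weekday name and count occurrences.
by_weekday = Counter(
    datetime.strptime(ts, "%Y-%m-%d %H:%M").strftime("%A") for ts in incidents
)
print(by_weekday.most_common(2))  # → [('Wednesday', 2), ('Thursday', 2)]
```

In this toy data, as in our study, the midweek days float to the top once you aggregate.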

If you drive in cities on a regular basis this might be something you’ve noticed. In fact, during an interview, one of our drivers told us: “I don’t know why, but people just drive weirdly on Wednesdays.” He had sensed what we were able to prove with data.

As these hidden patterns emerge, we can quantify the size of the opportunity they represent. Our city drivers who rated their confidence highest were often among the most badly behaved. Our data showed a significant enough portion of drivers fell into this over-confident group for us to adjust our design of the service to account for them.
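The sizing step can be sketched like this (the ratings, rates and threshold below are all hypothetical, not our study data): pair each driver’s self-rating with a measured harsh-event rate, then count how many high self-raters exceed a “badly behaved” cut-off.

```python
# Hypothetical drivers: self-rating vs measured harsh events per 100 km.
drivers = [
    {"self_rating": "very good", "harsh_per_100km": 14.0},
    {"self_rating": "very good", "harsh_per_100km": 3.1},
    {"self_rating": "good",      "harsh_per_100km": 9.8},
    {"self_rating": "good",      "harsh_per_100km": 2.4},
    {"self_rating": "average",   "harsh_per_100km": 6.0},
]

THRESHOLD = 8.0  # invented cut-off for "badly behaved"

# High self-raters whose measured behaviour crosses the threshold.
over_confident = [
    d for d in drivers
    if d["self_rating"] in ("good", "very good")
    and d["harsh_per_100km"] > THRESHOLD
]
share = len(over_confident) / len(drivers)
print(f"{share:.0%} of drivers are over-confident")
```

If that share is big enough, as it was for us, the service has to be designed with the over-confident group in mind.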

Macro and micro observation
Just as data starts to reveal macro behavioural trends, it also reveals micro behaviours. Amazon has found that an extra 100 milliseconds of latency on its website translated into a 1% drop in sales.

There are interesting questions for designers when user behaviour can be measured in milliseconds: how does it feel when the quality of your
design can be assessed at speeds faster than the blink of an eye? Google
and Amazon have shown the importance of this when refining an existing product, but what if you knew about this while designing something?

Alongside our over-confident drivers, research revealed another
group of nervous drivers. The micro behaviours that related to specific nervousness around driving at night were a key indicator for broader nervousness. Simply put, if you’re nervous driving at night you’re probably nervous in a range of other situations. If your design research has identified two very different groups in a population, how should you design to
suit both?

2. Measuring the nuance of human behaviour

Triangulating what people say, think, feel and do with data.

Where ethnography has developed a great suite of tools to uncover what people really think and feel, complementary data can help us understand the nuance
of human behaviour. The combination of the two is potent.

Beyond simply tracking how our drivers used our prototype, we also took inspiration from online quantitative surveys we’d run with a bigger group
of 500 city drivers. This additional research method put our tracked drivers in a broader context.

Joining the research dots
Correlating the detailed behaviour of the 15 with the 500 gave the design team and the client confidence that our small subset of drivers were representative of a much bigger group of city drivers: insight from one stream of research directly informed another.
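A rough sketch of that sanity check, with entirely fabricated figures: compare the tracked group’s distribution of some measure both groups share against the survey’s summary statistics.

```python
# Fabricated figures: self-reported weekly driving hours.
import statistics as st

small_group = [6, 8, 5, 9, 7, 6, 8, 10, 5, 7, 6, 9, 8, 7, 6]  # 15 tracked drivers
survey_mean, survey_sd = 7.1, 1.6  # hypothetical summary from the survey group

m = st.mean(small_group)
# If the tracked group's mean sits within one survey standard deviation of the
# survey mean, that's a first hint the subset is broadly representative.
representative = abs(m - survey_mean) < survey_sd
print(round(m, 2), representative)
```

A real check would compare whole distributions, not just means, but even this simple test tells you quickly whether your 15 look anything like your 500.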

For a designer, knowing your future customer’s reaction to
a new service or product before it’s built is something new.

Longitudinal learnings
The challenge when designing something new is to build a body of evidence to show the potential for impact.

By the end of our driving project we had 75 hours of ethnographic interviews, 20 million driving data points and analytics from our prototypes, and the survey responses of 650 drivers. All of our evidence pointed towards our design solution. This is one of the areas I’m most interested in — the ability to record behaviour over time, to see how people change while you’re designing something new. Collecting data suddenly makes that possible.

Our city drivers were given a prototype app showing them the data we’d captured, for example. Seeing them open the app every day for a fortnight was proof for us and our client we’d found something they really valued.

Better still, recording data can show us real behaviour immediately,
and therefore fit into the rapid iteration of the design process.

As we developed prototypes for our drivers one week, we refined them based on the data we’d collected the following week. This will be familiar to anyone working on digital products, but it’s a new opportunity for designers.

It can be disheartening to see low user engagement in a prototype you’ve spent time developing. But it would be far worse to see low engagement in a product launched into the real world.

3. Building the business case

Data can prove an idea’s value as well as revealing user behaviour.

For clients a good idea is not enough: desirability from users needs to be matched with commercial viability. If you’ve collected longitudinal evidence and used it to shape design, the next logical step is to take this data directly into an emerging business model.

The original behavioural data a prototype generates is useful
to the interaction designer and business designer alike.

It can build a business case for the thing you’re designing while you’re still designing it, and quantify the potential impact of a new design. It’s hard to argue against real users’ reactions and responses.

Toward the end of our city driving project, for example, business designers were able to design a variable pricing model based on the real driving behaviour we’d gathered.
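Purely as an illustration (the coefficients and behaviour measures below are invented, not the model the team built), a variable price can be a base premium scaled by a risk multiplier derived from recorded behaviour:

```python
# Entirely hypothetical pricing sketch: base premium times a behaviour-derived
# risk multiplier. All coefficients are invented for illustration.
def monthly_price(base: float, harsh_per_100km: float, night_share: float) -> float:
    """Scale a base premium by harsh-event rate and share of night driving."""
    risk = 1.0 + 0.02 * harsh_per_100km + 0.3 * night_share
    return round(base * risk, 2)

print(monthly_price(40.0, 5.0, 0.1))   # calmer driver → 45.2
print(monthly_price(40.0, 14.0, 0.4))  # over-confident driver → 56.0
```

The point is not the formula but its inputs: the multiplier can be grounded in behaviour you have actually recorded, rather than in proxies like age or postcode.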

Human-centred data

Ultimately, data’s new role in the design process is better understanding people: it’s just a new perspective. It’s hard to imagine going back to a process that doesn’t draw such a rich picture of the life of the people I’m designing for.

But data’s use goes beyond new opportunities and better observation.
Measuring behaviour over time gives designers new power to
improve their work’s potential for impact.

Take one example: towards the end of our project, we retested our concepts with the drivers who’d said they were good, but the data showed were in fact badly behaved. They were initially shocked at the gap between their subjective view, and the objective reality, and demanded to see the data evidence. Seeing their driving played back to them was a moment of levity, but each resolved to become a better driver.

Soon afterwards, the tracking data showed their behaviour did improve.
We knew then that designing with data had helped us get to the heart of encouraging better driving.

We’re continuing to experiment with tools and techniques; I’d love to hear how others are using data to reshape their process.
