What is data-driven design?

Published in SDM Service Design Melbourne: Design and Ethics
Aug 6, 2019

July 2019 Design & Ethics

The Design & Ethics #6 event, facilitated by Jess Bird and Kate McEntee, aimed to create an opportunity to explore practices and questions in an environment with a growing demand for ‘data’ to lead our design work: what is data, how do we use it, how are qualitative and quantitative data considered, and how capable are we of collecting and interpreting data? It was prompted by the organisers’ concerns about rising claims around data in our design community, and by a wish to provide ways for us all to learn from one another.

The evening was split into a mini-workshop followed by small group discussions. Participants came from varied backgrounds in data expertise: some were deeply immersed in using data to directly drive decisions around design work, some bemoaned the lack of data expertise in practice, and others feared quantitative data and loved qualitative data, or the opposite.

We began working in large groups looking at sets of quantitative data from the Victorian Transport Insights Hub, including “Travel by time of day in Victoria”, “PTV Customer Satisfaction” and “Walk access to frequent (< 10 mins) PT in the AM peak (7–9am).” Teams were asked to review the data in front of them and discuss what interesting insights or possible design directions the information might indicate, some of the gaps in the information, and what other methods could be considered to augment it. Each group shared the insights of their discussion with the room. Here are some notes taken by the facilitators, curated by topic or issue:

Issues of communication

  • Lack of clarity around terms, e.g. ‘Active Transport’: does that cover bikes, Segways, scooters?
  • What is ‘other’, and does this label lead us to ignore this input as miscellaneous?
  • Unintelligibility of data visualisation. Why are line graphs the most common visualisation used? What visual form might enable us to ‘read’ this data more intelligently and productively?
  • Expertise-ism vs lay understanding. Expert knowledge is needed for data gathering and analysis, and lay knowledge is needed to communicate this to everyday people.
  • How can we share data and research as a sector (e.g. design), following similar moves in the public sector?

Issues of validity

  • Over-emphasis on metropolitan (city) transport and not much on regional? Does this mean more respondents for metropolitan areas (and thereby larger numbers for quantitative verification) than regional, while the ‘results’ are collapsed as if they came from the same number of inputs?
  • Danger of confirmation bias: we look for, and more easily accept, information that fits with our experience and worldview. E.g. the data visualisation shows Western suburbs with no PT coverage. This correlated with participants’ lived experiences and observations, and was accepted without questioning. In turn, there is a danger of accepting data we can easily explain, rather than data we are unable to understand.

Issues of categorisation

  • Quantitative data is black and white. This means that if you drive to the station, then catch a train, then walk to your work, the journey is ‘split’ into three categories (travelling by car, by train and walking), counted as one input each, skewing the data. Quantitative data is rarely useful for layered and combined experiences.
  • Respondents may have been parents or guardians, rarely the children, even if they were in the same car on the drop-off to school before heading off to shop or work. The same critique as above applies, and it steps further into the ethics of the ‘silence’ of minors and minorities.
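The multi-leg counting problem above can be sketched in a few lines of Python (the journeys and mode names are invented for illustration, not drawn from the PTV data): counting each leg as a separate input makes a single car–train–walk commute look like three independent travellers, while counting per journey keeps the combination intact.

```python
from collections import Counter

# Hypothetical journeys, each a list of travel legs (invented for illustration).
journeys = [
    ["car", "train", "walk"],   # drive to station, catch a train, walk to work
    ["car"],                    # single-leg car trip
    ["walk"],                   # single-leg walking trip
]

# Per-leg counting: every leg becomes one input, so the multimodal
# journey is split into three categories, inflating the totals.
per_leg = Counter(leg for journey in journeys for leg in journey)

# Per-journey counting: each journey is one input, so the layered
# experience survives as its own category.
per_journey = Counter(" + ".join(journey) for journey in journeys)

print(per_leg)      # 3 journeys produce 5 inputs when split by leg
print(per_journey)  # the multimodal trip remains one category of three
```

Three journeys become five data points under per-leg counting, which is exactly the skew the group observed.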

A discussion document, drawn from readings circulated through Eventbrite, was put together as a guide (see the last section) to seed smaller group discussions. We share some notes:

  • Power: Data can wield significant power. It is often the basis for important decisions, and how it is collected, analysed and translated matters. It relies on the messengers (translators) to tell the ‘right’ story with the data.
  • Lens: on its own, data allows for human interpretation into as many meanings or insights as you want. The direction the data takes you in is shaped by the lens or goal used to look at it. What information do you want your data to tell you? Being cognizant of the lens or goal you bring also lets you be cognizant of what you are ignoring.
  • Time and resources: when time and/or resources are limited, quantitative data can be the more viable option to get usable information, but this does not mean it is necessarily more informative or valid.
  • Entry point: quantitative data allows for entry points into an issue, area or system, but there are missing pieces without stories or deeper research to demonstrate meaning.
  • The importance of translation: often the role of the designer is in the translation of data or specific expertise.
  • Silver bullet issue: when it is assumed that data (or anything) will lead to a ‘silver bullet’ insight or solution, that is a red flag.
  • Assumptions: quantitative data carries a ‘credibility burden’ in that it is often thought of as valid and unbiased, but how might we qualify the bias and human error also inherent in quantitative data?
  • Methods: are often chosen not because they are the best for the project, but based on: 1. what we know and are comfortable with; 2. what we have the money and time for; 3. what the client wants.

The event ended with a summary discussion on ethics:

For any ‘data’, either qualitative or quantitative, these questions are useful to ask:

  • ‘Data’ is never ‘neutral’ or ‘objective’. If you think it is, what makes you think ‘neutrality’ and ‘objectivity’ are more truthful or have more integrity?
  • Who is interpreting the data? What lens or framework is used for the analysis? Are you aware of your own lens, framework and positioning, and what steps have you taken to be conscious of them? (This lens/framework is also called bias.)
  • How are respondents or participants invited to have input into the logic and structure of the questions? Think of the time you ticked a certain box, not because it captured how you felt, but because it was the only choice available.
  • How was the ‘data’ collected? When and by whom? What is their purpose or ‘agenda’? What happens to that data?
  • How strong is your mastery of methods and their appropriate enactment? Think of methods as a craft (like playing a musical instrument, e.g. a violin) and of the time (they say 10,000+ hours on average) required for mastery. Don’t think simplistically of ‘using’ methods like a functional object. Method is a verb. Honour the fields the methods come from and avoid the magpie-like way design has pinched them to make them look ‘quick and easy’; this is rarely the case (or if it is, it’s a good sign of poor quality).

The discussion guide was created from four sources:

What is Data-Driven Service Design? (Individual)

Data-Driven Design report (Industry)

Digital Methods for service design: Experimenting with data-driven frameworks (Academic)

Simultaneous Triangulation: Mixing User Research & Data Science Methods (Industry)

Please feel free to use this guide and these readings to create your own discussion groups about this topic!



We foster and support knowledge sharing on human-centred approaches and outcomes of design through invited speakers, workshops and informal conversations.