Thinking About the People Behind the Data
Humanize the stories behind the numbers
This is part of a series to provoke dialogue and provide concrete ways to help teams ethically build and design intelligent systems. Read our introduction.
Sometimes qualitative research helps us understand what kind of data we might want to collect. Other times, looking across data sets first will help us focus our ethnographic research. Regardless of the starting point, most design challenges benefit from using a combination of data (both big and small) and human stories to generate robust insights and to help build confidence in your designs.
It’s tempting to default to statistics and charts when you have them. In many business settings, leaders may even request “just the numbers”. But if you really want people to understand and have empathy for the rationale behind your concepts, you should tell a human story alongside your data.
Activities to try
- Richly characterize one or more of the specific people whose story is being captured and represented in the data or analysis. Discuss what could explain the patterns, both typical and extreme, that you are seeing. What are the underlying behaviors, events, or mitigating circumstances? When in a generative, divergent mode, this probing can help you determine what questions to ask next. When sharing your concepts with leadership, adding such depth is a powerful way to connect them to real life.
- “Translate” a few rows or columns of data into human stories — especially when those stories can highlight a wide range of experiences. Even if this is simulated or proxy data, create a face and give each story a few humanizing touches taken from actual stories you heard in the field. What new issues or considerations does this highlight or surface? Remember, you’re not designing for a number, you’re creating for a person.
An industrial client wanted to explore a new offer around usage-based machine repair rather than time-based machine repair. The client had been replacing parts at regular intervals based on the average part failure statistics; if a part needed to be replaced every six months on average, then they replaced it every six months — regardless of the actual wear and tear on that specific part in that specific machine. To understand the implications of a usage-based approach, we simulated the variance of the degradation of parts over time and examined what happened at the extremes. Rather than presenting only a dense chart full of wiggly lines, we highlighted single lines and used them to tell a human story. In one, “Gretel” overspent on a part because it was replaced far too early; in another, “Hansel” would have to send his entire team home for the day because a part failure shut down the manufacturing line. Adding these simple details helped everyone in the room understand the human implications of a data-driven solution.
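The extremes exercise from that engagement can be sketched in a few lines of simulation. The distribution parameters and counts below are hypothetical stand-ins, not the client’s figures: the sketch simply draws varied part lifetimes around a six-month average and surfaces the two kinds of extreme stories — parts replaced far too early (“Gretel”) and parts that fail before the scheduled swap (“Hansel”).

```python
import random

random.seed(42)

# Hypothetical parameters for illustration only
MEAN_LIFE_MONTHS = 6.0        # average part life, as in the example
STDEV_MONTHS = 1.5            # assumed spread in wear and tear
REPLACEMENT_INTERVAL = 6.0    # the time-based replacement schedule

def simulate_lifetimes(n_parts):
    """Draw a simulated lifetime (in months) for each part."""
    return [max(0.5, random.gauss(MEAN_LIFE_MONTHS, STDEV_MONTHS))
            for _ in range(n_parts)]

lifetimes = simulate_lifetimes(1000)

# Under time-based replacement, every part is swapped at 6 months:
#  - a long-lived part is discarded early ("Gretel" overspends)
#  - a short-lived part fails before the swap ("Hansel" loses a shift)
wasted_life = [t - REPLACEMENT_INTERVAL for t in lifetimes
               if t > REPLACEMENT_INTERVAL]
early_failures = [t for t in lifetimes if t < REPLACEMENT_INTERVAL]

print(f"Replaced too early: {len(wasted_life)} parts, "
      f"avg {sum(wasted_life) / len(wasted_life):.1f} months of life discarded")
print(f"Failed before the scheduled swap: {len(early_failures)} parts")
```

Each simulated line in such a plot is a candidate for a human story; picking one from each tail and naming it is what turned the wiggly chart into “Gretel” and “Hansel” in the room.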
Explore the other posts in this series: