Empathising with Data: Why Designers Are Crucial in the Age of AI
Overview
In this first part of our two-part series, we’ll explore the evolving landscape of Human-Centred Design (HCD) in the age of data and AI. We’ll discuss:
- The origins and evolution of HCD
- The current data and AI revolution
- The crucial role of designers in this new landscape
- Real-world questions of data representation and design challenges in policy
- The importance of empathising with data
Let’s dive in and discover why your skills as a designer are more important than ever in this data-driven world.
The Evolution of Human-Centered Design
Human-Centred Design (HCD) has become our bread and butter as designers. But have you ever wondered about its origins?
HCD emerged as a response to the challenges posed by the industrial revolution and subsequent rapid technological advancements. As products and systems became more complex, designers realised the need to focus on the people who would ultimately use these creations.
The roots of HCD can be traced back to ergonomics and human factors research in the mid-20th century. However, it was Donald Norman’s work in the 1980s that really crystallised the concept. Norman, a cognitive scientist and usability engineer, emphasised the importance of understanding user needs, capabilities, and limitations in design.
HCD was saying, “Hey, wait a minute! What about the human experience? What about the people closest to the problem, or experiencing what we think is THE solution?” It was a way of orienting us back to the humans at the heart of it all, using qualitative methods to dig into their deeper motivations and needs.
This approach revolutionised design practice, shifting focus from purely aesthetic or functional considerations to a deeper understanding of user experience. It encouraged designers to empathise with users, to observe them in their natural environments, and to involve them in the design process.
Now, however, we’re in a new and far less natural environment: the digital one, where we are the ones being observed. And I’m going to ask you not just to empathise with humans, but with the data that represents us.
The Data and AI Revolution: A New Frontier for HCD
It’s 2024. We’re in the midst of a data and AI revolution. And just like HCD was a response to the industrial revolution, we need a new response to this data and AI-driven age. This revolution is fundamentally changing how humans are represented and how decisions about them are made.
In this new landscape, data has become a primary medium through which human experiences are captured, analysed, and represented. Every click, every purchase, every form filled — every interaction with a digital system generates data. In a very real sense, we are increasingly being represented by data — often without our full awareness or consent.
This shift goes beyond third parties simply having more information about us. It’s about how that information is used. Increasingly, decisions about what counts, what matters, and what should be done are informed by these data representations of humans. These data points serve as proxies for real people, their behaviours, preferences, and needs. Yet often, individuals don’t know how they’re being represented as data in third-party systems. And how many of us would agree that these data representations truly capture what’s important about us, and what we’d want others to know when they design for us?
The AI revolution has added another layer of complexity. Now, it’s not just about collecting and storing data — it’s about interpreting it at scale. Increasingly, decisions informing policy, product and service design, and user experiences are being made using AI to interpret this data. AI systems are analysing patterns, making predictions, and even making decisions that directly impact people’s lives through products, services, and platforms that are increasingly AI-enabled.
The Designer’s Crucial Role
This new landscape presents both unprecedented opportunities and significant challenges for human-centred designers. We have access to vast amounts of user data, but we risk losing sight of the human stories behind the numbers. We can create more personalised experiences, but we must grapple with issues of privacy and consent. We can leverage AI for powerful insights, but we need to be aware of potential biases and ensure transparency.
Our role as human-centred designers in this new landscape is more crucial than ever. We need to be the ones asking the hard questions: Who is represented in this data and who isn’t? How might these AI-driven decisions impact different user groups? How can we ensure that the human context isn’t lost in the sea of data points?
Real-World Challenges: Measuring Impact in Bushfire Recovery
Let me give you an example from my work with Bushfire Recovery Victoria, now known as Emergency Recovery Victoria (ERV). This State Government agency was established to coordinate Victoria’s recovery efforts and support recovery in the areas affected by the devastating 2019–20 Eastern Victorian bushfires. These fires burned over many months, affecting numerous communities along the eastern seaboard and in other parts of Australia. Lives were lost, thousands were displaced, many communities were temporarily isolated, and the recovery journey has been long.
In this context, we were asked to report on impact and make data about affected communities “actionable,” tracking how those communities were being supported in recovery. This was anything but straightforward. How do we collect, measure, and report on data like this? How do we benchmark pre-bushfire and post-bushfire measures of wellbeing? When dealing with such a complex and emotionally charged situation, I found myself grappling with questions that went far beyond simple data analysis.
For instance, how do we define ‘community’ in a disaster recovery context? Are we talking about the geographic communities that existed before the fires? The new, temporary communities formed in evacuation centres? The scattered diaspora of those who had to relocate? Communities of common interest, or Indigenous communities? Each definition includes some people and excludes others. These definitions matter because they determine what gets counted, what gets prioritised, and what stories get told.
What does recovery really mean when some communities have been entirely displaced? How do we measure if recovery efforts are appropriate for the diverse communities, who have different connections and meaning from the land, livelihoods, and homes that were burned? Moreover, how do we quantify the impact of a disaster that goes beyond physical damage? How do we measure the trauma experienced by those who lost homes, livelihoods, or loved ones? How do we account for the long-term environmental and psychological effects on communities?
When we’re trying to create data reporting on bushfire recovery, the way we define these terms has real consequences, potentially directing resources and support in very different ways.
This inclusion or exclusion of people and experiences into data categories that are aggregated for reporting and analysis is something we need to pay close attention to. It’s not just about having accurate data — it’s about ensuring that our data truly represents the diversity and depth of human experiences we’re trying to support.
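This categorisation problem can be made concrete with a small sketch. The records, field names, and community labels below are entirely hypothetical, invented for illustration; the point is only that the same people aggregate into different groups, and different counts, depending on which definition of “community” we encode:

```python
# Hypothetical records of people affected by a disaster.
# All names, places, and fields are invented for illustration.
people = [
    {"name": "A", "pre_fire_town": "Mallacoota", "identifies_with": "Mallacoota"},
    {"name": "B", "pre_fire_town": "Mallacoota", "identifies_with": "Mallacoota"},
    {"name": "C", "pre_fire_town": "Corryong", "identifies_with": "farming community"},
    # Person D moved to the area after the fires, so has no pre-fire town.
    {"name": "D", "pre_fire_town": None, "identifies_with": "Mallacoota"},
]

def count_by(records, key):
    """Aggregate records under one definition of 'community'."""
    counts = {}
    for r in records:
        group = r[key]
        counts[group] = counts.get(group, 0) + 1
    return counts

# Definition 1: community = pre-fire geographic town.
# Person D falls into a None bucket and risks being left out of reporting.
print(count_by(people, "pre_fire_town"))
# -> {'Mallacoota': 2, 'Corryong': 1, None: 1}

# Definition 2: community = where people say they belong.
print(count_by(people, "identifies_with"))
# -> {'Mallacoota': 3, 'farming community': 1}
```

Neither count is “wrong”; each encodes a different answer to the design question of who belongs, which is exactly why the definitional choice deserves deliberate attention.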
In the context of bushfire recovery, this challenge becomes even more critical. The data we collect and how we interpret it directly influences recovery efforts, resource allocation, and ultimately, the healing and rebuilding of affected communities. As designers working with such sensitive data, we have a responsibility to ensure that our data practices are as empathetic and inclusive as possible, truly serving the needs of all affected individuals and communities.
Empathising with Data
So, as designers, we need to start thinking about data differently. We need to see it not as something cold and mathematical, but as something warm and human. We need to empathise with data.
What do I mean by that? I mean understanding where data comes from, who it represents, whose stories it tells — and whose it doesn’t. I mean considering the entire lifecycle of data — from how it’s collected, to how it’s analysed, to how it’s presented and used to make decisions.
It means asking questions like: Who gets to decide what data is collected and how it’s used? Whose experiences are being captured, and whose are being left out? How might the way we’re structuring this data influence the conclusions drawn from it? How can we present this data in a way that’s accessible and meaningful to the people it affects?
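Questions like “whose experiences are being left out?” can even be partially operationalised. Here is a minimal, hypothetical sketch, with all figures, group names, and the flagging threshold invented for illustration, of checking who appears in a dataset against a reference population:

```python
# Hypothetical: compare who appears in a dataset against a
# reference population, to surface under-represented groups.
# All figures and group names below are invented for illustration.
survey_responses = {"urban": 820, "regional": 150, "remote": 30}
population_share = {"urban": 0.70, "regional": 0.22, "remote": 0.08}

total = sum(survey_responses.values())  # 1000 responses in this sketch
for group, expected in population_share.items():
    observed = survey_responses[group] / total
    gap = observed - expected
    # Arbitrary 2-percentage-point threshold, purely for illustration.
    flag = "under-represented" if gap < -0.02 else "ok"
    print(f"{group}: observed {observed:.0%}, expected {expected:.0%} -> {flag}")
# Prints:
# urban: observed 82%, expected 70% -> ok
# regional: observed 15%, expected 22% -> under-represented
# remote: observed 3%, expected 8% -> under-represented
```

A check like this doesn’t answer the design questions above, but it turns “who is missing from this data?” from a vague worry into something a team can inspect and discuss.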
These are design questions. They’re about understanding users, considering contexts, and shaping experiences. They’re about empathy and ethics and making intentional choices that put humans at the centre.
Designers, assemble around the humans in the Data Age
As designers, we’re uniquely positioned to tackle these deep questions and challenges. We have the skills to bridge the gap between the technical world of data and the messy, complex world of human experiences. We know how to ask the right questions, how to consider multiple perspectives, how to make the complex more accessible.
In this age of data and AI, we’re not just designing interfaces or products anymore. We’re designing the very systems that capture, analyse, represent, and make decisions about human experiences. We’re designing the interfaces between humanity and AI. And that’s a responsibility we need to take seriously.
By bringing our human-centred approach to the world of data and AI, we can help ensure that these powerful technologies truly serve human needs and values. We can be the advocates for the humans behind the data points, the guardians of empathy in a world of algorithms.
This is our new frontier. And it’s time for us to step up to the challenge. In the next part of this series on HCD, Data and AI, I’ll introduce some common terms in data science and AI, and show how understanding them can help us design better AI-enabled systems. Spoiler alert — it’s going to be very much focused on AI augmenting, not replacing, humans. I’ll also introduce a framework I’ve developed to help designers engage meaningfully with data science projects and collaborate with data scientists and researchers.
But for now, I want you to start seeing yourselves not just as designers of interfaces, but as designers of data experiences and even algorithms. Get curious about the data underlying your projects. Ask to be involved in data-related discussions. Bring your human-centred perspective to every stage of data-driven projects.
Because in this age of AI and big data, we need designers more than ever. We need people who can humanise data, who can ensure that in our rush to be data-driven and AI-enabled, we don’t lose sight of the humans at the heart of it all.