Design research today often struggles to balance qualitative and quantitative techniques when informing design decisions. This article argues for a mixed approach that leverages “Chunky Data” — bringing together the best of both worlds — sequenced differently depending on the type of problem faced.
In my work, I have seen great tension as organizations and innovators try to reconcile the creative, need-based world of design thinking with rigorous quantitative approaches to research. Opinions on the topic vary widely, with camps often retreating to what is comfortable rather than what is right. In reality, there is no winning side: design thinking needs both approaches to data, brought together.
What is Chunky Data?
In the design world today, two camps of research dominate. On one side, you encounter qualitative purists, who discount the idea of statistical significance in favor of a smaller ‘emotional significance.’ They retreat behind well-worn adages like “you cannot ask users what they want.” They are strong believers in what is termed “Thick Data”: going deep with a few users and really understanding their needs through immersive techniques like ethnographies. There is no validation that they are covering all the dimensions or user groups in a market; coverage is left to chance, coupled with whatever biases are embedded in their recruiting techniques.
You also encounter quantitative purists who obsess over things like behavioral data and A/B testing, often losing the power of needfinding and of understanding the “why” behind the consumer behavior the numbers point to. From surveys to the data logs of an app, they focus on “Big Data”: what is quantifiable, what algorithms can run over, what can reveal patterns of consumer behavior at scale. Their focus is to build a powerful understanding of user characteristics and behavior at a statistically significant level.
In reality, the future of design in my view lies in what some term “quantified” or “hybrid” approaches: marrying both camps together in a way relevant to the question at hand. Designers need fluency in both the statistical and the creative to survive in this future world, and design leaders need a keener sense of the right tools to attack a given problem. What we need is neither Big Data alone nor Thick Data alone; what we need is what I will call “Chunky Data” — data that is big in sample size but intentionally thick with representative members of major user groups.
However, not all Chunky Data is the same. In fact, there are two flavors of Chunky Data — one led by Thick, the other led by Big — that have two specific use cases depending on the type of problem design leaders face.
Greenfield vs. Brownfield Chunky Data
If you are in a blue ocean, new market, whitespace scenario — what I will call a ‘greenfield’ problem — it is usually best to start with qualitative research that generates Thick Data. From deep ethnographies to shop-alongs to diary studies, the only way you will uncover the latent, unmet needs that can drive growth and differentiation is by talking to consumers and trying to see what they may not be able to articulate. Here, there are no numbers or existing statistics to guide what a product should be, so you need qualitative understanding as a starting point.
However, this does not mean that a greenfield approach should rely solely on the qualitative. Rather, the qualitative should be a foundation, a point of departure, for new ideas. Once a space is understood, the user needs and attitudes unearthed through qualitative research can be used to build complex quantitative models, or even more robust personas and journey maps. Essentially, Big Data needs to be born out of the Thick Data, since there is no existing user base or product to tackle. With this sequencing, you uncover the needs and build deep understanding, and then can mathematically prove that your research represents a quantifiable portion of the potential user population with shared characteristics (and oftentimes real business-case potential).
To illustrate, I think about a recent project I worked on at Stanford understanding how users think about the barbecue as both a cooking tool and an event. Through rigorous qualitative research, we were able to unpack the term ‘barbecue’ and all sorts of hidden meanings it holds for people. Once we built an understanding of the space and potential launchpads for needs, we then built a quantitative model that captured the attitudes and stated needs of ~500 users who looked like our initial qualitative participants. Using clustering techniques, we identified four need-based groups in the market, chose the one that was our target user group based on their willingness to grow and try new things, and then typed every user with whom we prototyped to ensure they represented that group, de-risking the project. We thus found a cluster of individuals who wanted to improve their cooking skills and show them off through barbecue events — a cluster with a meaningful market size and addressable needs to build concepts for.
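To make the quantitative step concrete, here is a minimal sketch of the clustering idea. Everything below is hypothetical: the attitude statements, the ratings, and the hand-rolled k-means (a real project would likely use standard statistical tooling); it simply shows how survey respondents can be grouped into need-based clusters.

```python
import numpy as np

def kmeans(X, k, iters=100):
    """Minimal k-means: cluster survey respondents (rows of X) into k need-based groups."""
    X = np.asarray(X, dtype=float)
    # farthest-point initialisation: deterministic, spreads centroids apart
    centroids = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centroids], axis=0)
        centroids.append(X[d.argmax()])
    centroids = np.array(centroids)
    for _ in range(iters):
        # assign each respondent to the nearest centroid (Euclidean distance)
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # move each centroid to the mean of its assigned respondents
        for j in range(k):
            if (labels == j).any():
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

# Hypothetical data: each row is one respondent's 1-5 ratings on three
# attitude statements (e.g. "I want to improve my cooking skills").
ratings = np.array([
    [5, 5, 1], [4, 5, 2], [5, 4, 1],   # skill-builders
    [1, 2, 5], [2, 1, 4], [1, 1, 5],   # convenience-seekers
])
centroids, labels = kmeans(ratings, k=2)
```

“Typing” a new prototype participant then amounts to scoring them on the same statements and assigning them to the nearest centroid, which is how you check they belong to your chosen group.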
This cursory example is only the tip of the iceberg. I have seen and personally applied this process in sectors like healthcare and financial services, where you can marry rich behavioral and attitudinal datasets to build a deeper view of the customer groups for whom you are designing, taking much of the guesswork out of design. You are not merely guessing whether you picked up on a strong prevailing wind in your user research; you are quantifying what the clusters of prevailing winds are and designing for one of them deliberately. This approach also allows design to connect better into other business disciplines — especially in startup environments — as design clusters can then translate into customer segments across business silos. Chunky Data thus helps move design to be more central to the business and allows it to become part of the larger organization more easily.
If your goal is to improve an existing offering incrementally — a ‘brownfield’ problem such as tweaking a current product, streamlining an app’s flow, or bettering an existing service, especially on a constrained budget — you want to start with Big Data. If you are not going to launch a whole new platform, why not focus on where the current one is broken? Essentially, for brownfield work, companies need to master what the tech companies already do so well: relentlessly applying creativity, in a targeted way, to the funnel.
Is a certain feature of a new product malfunctioning, requiring targeted rework so that the rest of the offering can be salvaged? Start with the area of malfunction, where the impact can be quantified. If looking at a sales or conversion funnel, where are the highest points of leakage along the flow? Focus your redesign efforts there once they are understood. Brownfield redesign is not about reimagining from scratch; it is about reimagining specific pieces to drive meaningful impact for companies through the power of design.
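The funnel-leakage idea fits in a few lines. The step names and user counts below are hypothetical; the point is simply that ranking step-to-step drop-off rates tells you where to aim the redesign.

```python
# Hypothetical funnel: how many users reach each step of a checkout flow.
funnel = [
    ("landing", 10000),
    ("product_page", 6200),
    ("cart", 3100),
    ("checkout", 2800),
    ("payment", 1100),
]

def leakage(funnel):
    """Rank step-to-step transitions by the share of users lost at each."""
    drops = []
    for (prev, n_prev), (step, n) in zip(funnel, funnel[1:]):
        drops.append((f"{prev} -> {step}", 1 - n / n_prev))
    return sorted(drops, key=lambda d: d[1], reverse=True)

# The transition losing the largest share of its users is the first
# candidate for targeted redesign work.
worst_step, worst_rate = leakage(funnel)[0]
```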
Once the problem areas are pinpointed, they may then require targeted user research that looks more qualitative. The question moves from what is broken to understanding why those things are broken. Thus, the quantified approach is necessary to bound the solution space towards impact, and then the generative research work you would find in ‘greenfield’ problems gets applied to specific needs, leading to Thick Data.
Think about an e-commerce company. Through data analytics, it may find three specific steps of onboarding where users drop out. On the one hand, it can take a relentless guess-and-test approach, A/B testing until something, perhaps, sticks. Instead, it can employ Thick Data, going into the field to ask potential customers about these screens and what needs or issues they trigger. This Thick Data — focused in a bounded solution space — can then be used as the foundation for ideation, producing a fix more expediently and without the guesswork. Chunky Data, here, saves time and energy.
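When a candidate fix does get A/B tested, its effect can be checked with a standard two-proportion z-test rather than eyeballed. The conversion counts below are hypothetical; the sketch shows how to decide whether a redesigned onboarding screen genuinely converts better.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference between two conversion rates (pooled)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)                # pooled conversion rate
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # standard error of the difference
    return (p_b - p_a) / se

# Hypothetical experiment: 200 of 2,000 users finish onboarding on the old
# screen vs 260 of 2,000 on the redesigned one.
z = two_proportion_z(200, 2000, 260, 2000)
significant = abs(z) > 1.96  # two-sided test at the 5% level
```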
The Future — A Chunky World
In light of the above, I have one clear message for designers: embrace growth and get out of your purist silos. All types of research, and all disciplines, bring value to the table. Machine learning algorithms iterating over a dataset can find insights that 1,000,000 interviews will not; one great ethnography, producing rich Thick Data, can unearth hidden meaning and the mental models behind decision-making that a survey of 10,000 consumers will never point to. It is in synergy, as Chunky Data sequenced for the type of problem, greenfield or brownfield, that research becomes more robust. Through this, design can keep growing as a discipline and drive the meaningful, de-risked impact we all crave. A Chunky World is a better-designed world, and I for one am excited for it.
Alex is a Master’s Candidate in Stanford’s Design Impact Program. He focuses on the development of business strategy, human-centered design and research, and data-driven insights to unlock breakthrough growth and innovation opportunities. Prior to his graduate studies, he was an Engagement Manager and Design Fellow at McKinsey, where he served as a global leader of McKinsey Design, a platform that incorporates LUNAR Design, Veryday, and Digital McKinsey Experience Design. Learn more at wolkomir.me.