3 Myths About Human-Centered Design and How to Embrace a Human-Centered Mindset
Are you familiar with human-centered design (HCD)? You may have marveled at the idea, tried to uncover its mystery, or even questioned whether the practice has a place in your organization. HCD, distilled to its essence, is a process or framework for solving problems. While not specific to technology, HCD has been recognized as a core technical competency of a 21st-century workforce. Viewed through the lens of digital problem solving, HCD means empathizing with the people who use a solution and putting their needs at the center of the development process.
Today, we’ll uncover some myths about HCD as it applies to the digital world. Alongside each myth we’ll explore the reality, showing how each one actually makes a case for HCD. By the end of this article, you will have some perspective and techniques to help you adopt a human-centered mindset.
Myth #1:
You can fix a digital product by administering training to users.
Reality:
The idea that you can overcome barriers to using a product by delivering training may be partially true. Take, for example, enterprise products and applications that are intended to satisfy the needs of complex organizations or agencies. These types of products are meant to support users with specialized domain knowledge. It’s completely fair to say that some level of training is essential for onboarding team members as they learn to support the business through enterprise tools.
However, I’d like you to consider the cost of creating and maintaining a training program that extends beyond onboarding. When you add team growth and turnover into the mix, you have a recipe that requires you to deliver that training continuously. The strain on your training team is likely to show in the quality of training they can provide. Frankly, it all sounds like a hamster wheel that I would not want to get stuck on.
Training is not the silver bullet to solve our tech woes. So, how can HCD help?
Start by considering where your training initiative might be giving users instruction on workarounds to efficiently use a system:
- Are you guiding users to complete tasks outside of the product or application itself?
- Are you spending time explaining the quirks of unconventional interaction patterns?
- Are you avoiding features in your application altogether due to their complexity?
- Are you supplying tips to use the system in ways it is not intended? (Hey, we’ve all been there when we take makeshift notes to ourselves in email.)
The items listed above are just a few indicators of poor support, and they are exactly what HCD methods are designed to tease out. A core tenet of HCD is understanding how work gets done in a user’s context rather than in a lab or training environment. When you can identify indicators of poor support in the day-to-day environment, you open the door for design to do what it does best: explore solutions to overcome challenges in collaboration with the people experiencing them. By creating solutions with real user perspectives guiding the process, you may ultimately realize a reduced need for training.
Myth #2:
Human-centered design is displaced by analytics.
Reality:
Let’s start by defining analytics. Simply put, analytics is the process of discovering and interpreting patterns in data about user interactions with an interface.
Analytics data is best at answering questions such as these (see the sketch after this list for how you might pull the answers from raw event data):
- How many times was a page or feature visited?
- What UI elements have users interacted with?
- What paths are users taking between activities or tasks?
- How do different groups of users behave and how do they compare to one another?
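To make the questions above concrete, here is a minimal sketch of how you might answer two of them from a raw clickstream export. It assumes a hypothetical pandas DataFrame with made-up column names and values; real analytics tools expose this kind of data in their own formats.

```python
# A minimal sketch, assuming a hypothetical clickstream export.
# Column names and values are illustrative, not from any specific tool.
import pandas as pd

events = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2, 3, 3],
    "page":    ["home", "search", "results", "home", "results", "home", "search"],
})

# "How many times was a page or feature visited?"
visits = events["page"].value_counts()

# "What paths are users taking?" Count transitions from one page to the
# next within each user's stream of events.
events["next_page"] = events.groupby("user_id")["page"].shift(-1)
paths = events.dropna(subset=["next_page"]).value_counts(["page", "next_page"])

print(visits, paths, sep="\n\n")
```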
Analytics are undeniably powerful at revealing quantitative patterns about what is taking place in the digital world. What the practice lacks is a qualitative understanding of a user’s context in their day-to-day environment, the kind of understanding that reveals why certain patterns are prevalent in the data.
If analytics can tell us what is occurring, then HCD can help reveal why. Let’s examine what pairing analytics with HCD methods can do for you in practice.
Imagine that your digital product allows job seekers to submit applications for jobs. Your analytics reveal that a sizable portion of applicants are dropping out of the process before submission. You might dig deeper into the analytics to reveal that applicants are dropping out of the process during a step where they are asked to provide details about their previous work experience.
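As a rough illustration, here is how you might quantify that drop-off from the same kind of event data. Again, the funnel steps, column names, and figures are assumptions made up for this sketch, not a real product’s analytics.

```python
# A minimal sketch of a funnel drop-off analysis, using made-up data.
import pandas as pd

# Hypothetical export: one row per applicant per step they reached.
events = pd.DataFrame({
    "applicant_id": [1, 1, 1, 2, 2, 3, 3, 3, 3],
    "step": ["start", "contact_info", "work_history",
             "start", "contact_info",
             "start", "contact_info", "work_history", "submit"],
})

# Unique applicants who reached each step, in funnel order.
funnel = ["start", "contact_info", "work_history", "submit"]
reached = (events.groupby("step")["applicant_id"]
                 .nunique()
                 .reindex(funnel, fill_value=0))

# Share of applicants who reached a step but never reached the next one.
drop_off = 1 - reached.shift(-1) / reached
print(pd.DataFrame({"reached": reached, "drop_off_to_next": drop_off}))
```

A table like this points you at the step worth investigating (here, the work-history step), but it still can’t tell you why people abandon it.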
You ask, “why is this happening?”
To find out why, you employ contextual research, a foundational HCD research method, to observe and ask questions of job seekers as they work through the application process. During this study, you hear job seekers describe the process of providing details about their work history as cumbersome; many are confused about how to enter multiple instances of previous experience. When job seekers are unable to complete that task, they become frustrated and look to apply for jobs elsewhere.
Analytics revealed what was happening: job seekers are dropping out of the process before submission.
HCD research methods revealed why: job seekers find the process of providing multiple instances of work history too tedious.
Analytics become exponentially more powerful when you can relate them back to something taking place in the real world. When you use multiple methods (analytics + contextual research) to collect data about the same phenomenon, you can quickly get to the bottom of where your website or product has challenges worth addressing.
Myth #3:
You don’t need design roles when everyone is a designer.
Reality:
Don’t get me wrong: simply because someone has design experience doesn’t mean they have a monopoly on good ideas. In fact, those with the title of “Designer” cannot reasonably expect to be successful at producing promising ideas without seeking collaboration with clients, stakeholders, peers, and users alike. Those contributions we, as designers, look for from our collaborators are essential to the design process. Yet the notion that everyone is a designer continues to dilute the impact that design can have in an organization.
Hear me out with this illustrative comparison:
You wouldn’t say chefs aren’t necessary in a restaurant because the entire staff has access to the kitchen and ingredients. Technique, execution, even creativity set apart what a dedicated chef will prepare in the kitchen. Like the approach of a designer, a chef actively seeks feedback to perfect the final result — feedback from suppliers on the best ingredients, feedback from peers on new techniques, or feedback from guests on the dishes served.
Knowledge of best practices and of how to apply process sets designers in dedicated roles apart, too. Whether in the kitchen or on a development team, the symphony of collaborators and feedback ultimately results in a stronger dish or digital solution.
The idea that everyone is a designer becomes problematic when other business areas use it as leverage to override design recommendations that are often the result of an evidence-based process. When design is everyone’s responsibility, it is no one’s responsibility. So, I’d like to leave you with this:
Design decisions are not made on a whim.
Designs improve by following a process.
Designers have the means to facilitate the process.
I’d love to hear from you in the comments!
How are you embracing a human-centered mindset in your role?
What myths about human-centered design would you like me to explore next?