Inclusive Tech? It Starts With Design

McKinsey Digital
McKinsey Digital Insights
May 14, 2021 · 8 min read

by Mehdi Bilgrami — Product Management Digital Expert, Shenglan Qiao — Associate, Sara Cinnamon — Design Director, Joe Zachariah — Expert Associate Partner, McKinsey Digital

Technology, while often created with the best of intentions, can amplify the biases found across wider society. From automatic soap dispensers that fail to recognize certain skin tones, to AI-based mortgage lending systems charging Black and Hispanic borrowers higher rates than white borrowers for the same loans, to voice recognition software with significant gender biases, there are plenty of examples of tech creating or sustaining bias against potential users.

There are many ways technology can perpetuate these biases: the tool’s build or test phases may not include enough diverse customer profiles, the team building the product may not be diverse, or the algorithm could be trained on data marked by unintended bias. As a result, the technology reflects a blind spot that might have been avoided if someone with relevant lived experience had been involved in its creation.

There are unfortunately many examples of systemic inequity in the production process, and the industry is beginning to take note of the harm that these issues can cause. There is a growing movement among tech practitioners to take steps to prevent this inequity and produce products, spaces and digital experiences that welcome users who have been historically marginalized or underprivileged.

Inclusive technologies create societal benefits through greater participation, but they also make economic sense. As populations become increasingly diverse, products and services that sustain bias risk alienating a sizable market of potential customers.

So how can designers and developers actively instill inclusivity in their products and drive greater equity? Identifying areas at risk and diagnosing the biases at play is the first — and most difficult — task. We’ve developed a four-step approach to highlight potential sources of bias in the design process and guide development of products and services that minimize unintentional discrimination and intentionally create equity. These steps are applied throughout the build cycle, from pre-development through to operation.

Step 1 — Discern

This step takes place before production and focuses on building a greater awareness of diversity and systemic bias across the team and the wider organization.

Research has long shown that diverse organizations, particularly those with diverse leadership teams, tend to outperform homogeneous ones: a group containing a range of perspectives is more likely to generate a range of varied ideas. From designers and developers to leadership, teams need to create and cultivate their own diversity and become mindful of potential blind spots.

One practical example of how these dimensions can be explored is shown below. The exercise asks the team to reflect on how their identity in each of these areas affects how they perceive themselves in the world, how they are perceived by others, and how these identities may shape the products they create.

In a psychologically safe environment, these facilitated conversations help teams recognize privilege, identify how unconscious and systemic bias show up every day, and build empathy for users outside their own demographic. Who precisely facilitates these sessions — whether that’s design colleagues, specialist ethnographic researchers or someone else entirely — is less important than the organization deciding to have this conversation in the first place and selecting the right framework to govern the process. Establishing this understanding around how products or services could create bias is crucial in proactively identifying blind spots and avoiding potential pitfalls later.

Step 2 — Diagnose

Building on Step 1, this step uses systematic processes to identify potential sources of bias and to guide ideation and mitigation strategies later in the development process.

One method that we’ve used with success is to deploy the user journey as a frame for asking difficult questions. For instance, at each step in the journey we can ask:

  • Who might be excluded from this feature/function/experience?
  • How might this feature/function/experience support or create a biased system?
  • What metrics are being used to define “success”?
  • What message does this product send to non-users?
  • What if users were to misuse this product/service?
  • Who is making decisions on behalf of others?
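
The question set above lends itself to a simple worksheet: cross every step of the user journey with every question and record findings for later review. A minimal sketch in Python (the step names and the abbreviated question wording below are illustrative, not part of any formal framework):

```python
# Bias-probing questions, abbreviated from the list above for brevity.
BIAS_QUESTIONS = [
    "Who might be excluded from this feature/function/experience?",
    "How might it support or create a biased system?",
    "What metrics define 'success'?",
    "What message does it send to non-users?",
    "What if users were to misuse it?",
    "Who is making decisions on behalf of others?",
]

def review_journey(journey_steps):
    """Cross every journey step with every question to build a worksheet.

    Each row starts with an empty finding; the team fills findings in
    during the facilitated discussion and flags any risks uncovered.
    """
    return [
        {"step": step, "question": q, "finding": None}
        for step in journey_steps
        for q in BIAS_QUESTIONS
    ]

worksheet = review_journey(["discover", "sign up", "verify identity"])
```

Producing the full cross-product makes the exercise exhaustive by construction: no step of the journey can silently skip a question.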

We can be methodical about addressing different dimensions of our user base to ensure nobody is overlooked and that all user needs are addressed.

These frameworks offer a useful way to bridge gaps in design thinking that could be the source of inclusivity issues. The questions can be raised in smaller group discussions or written on a whiteboard for the wider team to consider; either way, this is a multidisciplinary exercise that helps the group uncover risks of bias, which can then be flagged.

Others have developed similar tools: Airbnb’s Another Lens and the Ethics Litmus Test both provide exercises that help teams step back and identify unintended consequences.

Most product management toolkits highlight the value created by each feature built in the product and how that functionality should be prioritized in the build process. However, value-driven prioritization does not consider a function’s potential negative effect on the wider world or on the reputation of the organization. A feature that drives commercial success and increased revenue could also exclude a particular group of potential users. When weighing the cost of building the product correctly, we should account for both potential upsides and downsides in evaluating business value.
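
One hypothetical way to make that trade-off explicit is to extend a simple value score with an estimated downside for excluded users. The function below is a sketch, not a standard prioritization formula; every input, including the reputational penalty factor, is an invented assumption:

```python
def net_feature_value(revenue_upside, excluded_share, market_size,
                      reputational_penalty=1.0):
    """Net a feature's expected upside against its exclusion downside.

    revenue_upside      : expected revenue gain from shipping the feature
    excluded_share      : fraction of the addressable market the feature
                          excludes (0.0 to 1.0)
    market_size         : revenue value of the full addressable market
    reputational_penalty: multiplier (>= 1.0) for harm beyond lost sales

    All parameters are illustrative assumptions for this sketch.
    """
    downside = excluded_share * market_size * reputational_penalty
    return revenue_upside - downside
```

A feature that looks attractive on upside alone can score negatively once the excluded share of the market and reputational harm are priced in, which is exactly the reprioritization the text argues for.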

Step 3 — Deep Dive

After uncovering biases, Root Cause Analysis (RCA) can be used to identify factors that enabled them.

For example, consider the simplified risk decision tree below that illustrates an algorithm for deciding whether to approve a user for a loan. The algorithm presumes that previous access to credit, income and collateral are good predictors of someone’s risk of default. What the algorithm has not considered is the extenuating circumstances behind some of the answers, or that other factors could be just as good at predicting someone’s credit-worthiness.

To understand the implications of using these variables, we must dig deeper to understand the factors underlying the data. Alongside this, consider examining additional variables to create a more holistic picture of the person’s current situation and potential future situation. Can you create opportunities for them to succeed?
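
To make the risk concrete, here is a toy rule-based version of the kind of model such a decision tree might encode. The field names and thresholds are invented for illustration; the point is that anyone outside the credit system fails at the first branch, whatever their other signals:

```python
def approve_loan(applicant):
    """Toy loan rule: credit history, then income, then collateral.

    An applicant with a thin credit file is rejected at the first
    branch regardless of income or savings; the savings field is
    never even consulted. Thresholds are invented for illustration.
    """
    if applicant.get("credit_history_years", 0) < 2:
        return False
    if applicant.get("income", 0) < 40_000:
        return False
    return applicant.get("has_collateral", False)

# An applicant like James: no credit history, but healthy income and savings.
james = {"credit_history_years": 0, "income": 90_000, "savings": 25_000}
```

Under these rules James is declined outright, even though his income and savings suggest low risk; no later branch ever gets the chance to correct the first one.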

As seen in the example application below, how might this model make assumptions that inaccurately calculate James’ risk valuation?

Developed for diagnosing complex mechanical system errors, Root Cause Analysis uses proven methodologies to ask questions and dig deeper. Taking James’ case as an example, the root cause of his unknown credit history could be due to a lack of trust in financial institutions and worries about the security of credit card transactions. He may not need a car or own a home but may have a sizable balance in a savings account or increasing direct deposits from recently securing a higher-paying job. Digging deeper into the root causes behind these often faceless user profiles can help designers expand their thinking beyond the conventional decision-making and start ideating, building and testing more equitable offerings through an inclusion-driven lens.

RCA invites us to reframe the problem at hand: Are we asking the right questions? Are we asking the questions the right way? Why is this variable most likely to give the answer we are looking for? What factors contribute to this variable? By always asking why, reframing the problem leads us to challenge assumptions and the status quo.
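
The "always asking why" loop can be sketched as walking a chain of hypothesised causes until no deeper cause is recorded. The chain below, for James’s thin credit file, is illustrative and assumed to be acyclic:

```python
def root_cause(observation, causes):
    """Follow cause-of links until reaching a factor with no recorded cause.

    `causes` maps each observation to its hypothesised cause; the walk
    stops at the first factor with no entry (assumes no cycles).
    """
    current = observation
    while current in causes:
        current = causes[current]
    return current

# Hypothesised cause chain for James's case, per the discussion above.
causes = {
    "no credit history": "avoids credit cards",
    "avoids credit cards": "distrust of financial institutions",
}
```

Each link in the dictionary is one answer to a "why?" question; reaching the end of the chain is what surfaces the root cause the team should actually design for.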

Step 4 — Debias

By now we have identified bias in our design thinking and understand why it has occurred. Addressing this discrimination is not a quick process but rather a sustained effort across the product’s lifecycle, from the design phase to research and testing, then on to development and to operation. This ultimately requires a multidisciplinary approach:

  • As designers, broaden the range of personas used, dive into their unmet needs, frame the problem statement with empathy, and get creative to make equitable products.
  • As product owners, build equity into the product vision, goals and outcomes. Consider if prioritization frameworks disadvantage the needs of certain types of users and build in mechanisms to continuously plan and prioritize inclusive features.
  • As a development team, build ethical AI/ML models, define clear KPIs/metrics to track equity-driven outcomes, and test with a diverse group of potential users to reduce potential bias.
  • As leadership, ensure we recruit and build diverse teams and bring different perspectives into problem-solving discussions, so that everyone can bring their own lived experiences into the products and services we design and develop.
  • As risk managers and front-line and operational team members, confirm the soundness of the approach, validate insights and results against business experience, and work alongside Legal colleagues to understand protected classes and regulatory compliance. Products in use should also be monitored and tested on an ongoing basis.
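
As one example of the equity-driven KPIs the development-team point calls for, decision outcomes can be compared across user groups. The demographic-parity gap below is one common fairness metric; which groups to compare, and what gap counts as acceptable, are context-specific assumptions:

```python
def approval_rate(outcomes):
    """Share of positive (1) decisions in a list of 0/1 outcomes."""
    return sum(outcomes) / len(outcomes)

def parity_gap(group_a_outcomes, group_b_outcomes):
    """Absolute difference in approval rates between two groups.

    Values near 0 indicate similar treatment on this metric; a large
    gap is a signal to dig into root causes, not proof of intent.
    """
    return abs(approval_rate(group_a_outcomes) - approval_rate(group_b_outcomes))
```

Tracked on an ongoing basis, a metric like this turns "monitor products in use" into a concrete dashboard number that can trigger the Deep Dive step when it drifts.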

We hope you found this approach useful and that it assists you in making inclusivity an inherent part of your design process. Inclusive design is a growing movement across the tech industry and should result in fairer, more accessible products that support the end user’s agency.

More widely, the onus is now on tech teams to hire diverse talent that can bring their lived experiences into a product and create tools that work for everyone. Societal biases will need to be proactively designed out of technology, and this will require designers who know the damage that bias can wreak all too well.
