5 Ways You Can Practice Inclusive Technology Design Right Now

Jennifer McCormick
9 min read · Dec 22, 2016

Over the past few years, progress toward increasing diversity in the tech industry has been painfully slow, especially for women of color, despite increased awareness, corporate commitments, and accountability.

Gender demographics for all US workers. Visualization: Brittany Schell/Gizmodo. Sources: Airbnb, Facebook, Microsoft, Google, Apple, Yahoo

This situation reflects more than an egregious lack of equal opportunity; the entire future of human-centered technology and access is at stake. The lack of diverse representation in the industry impacts our ability to create technologies that serve and are usable by all people regardless of age, ethnicity, socioeconomic background, gender, physical ability, or any combination thereof (also known as intersectionality).

The effects of this inequality on technology design are painfully obvious. Just look at the racial discrimination guests have experienced on Airbnb, Uber’s de-prioritization of service for disabled riders, and Twitter’s failure to crack down on misogyny and harassment.

It’s frustrating to work in tech and watch this situation continue. So lately I’ve been focusing on impactful ways I can better understand and serve the needs of folks who are underrepresented in tech, especially when designing and building products. Here are five practices I’ve found that help me work toward more inclusive design in our current environment of disparity.

1. First and foremost, let people speak for themselves

Sometimes all it takes is to recognize when you are attempting to speak for other people and simply stop and move out of the way.

Voices are powerful. So pass the mic, as they say. As individuals and product designers we can create platforms and opportunities for our users to represent themselves first hand, make their individual voices heard, create community, and increase visibility.

Remember that each and every lived experience is unique, and it is neither fair nor appropriate to expect one member of an underrepresented group to ‘speak for’ or represent an entire group of people.

If you have a question about the needs of an underserved group of people, do your research and find a way to ask them directly. Always seek ways to learn first-hand from individuals about their experiences and why they feel the way they do.

2. Support the development of a diverse new wave of technologists

There is no form of data or proxy that can (or should) replace an actual person in the room, having agency, taking part in and contributing equally to the future of innovation.

So while the industry’s massive gap still exists, each of us can choose to help build a stronger, more diverse pipeline for the incoming tech workforce. You can support efforts underway across the country to educate and train new coders, such as Black Girls Code and Girls Who Code. You can contribute by simply attending events, spreading the word, volunteering and mentoring, donating equipment, participating in hackathon activities, inviting job shadowing, or offering press, partnership, or financial support.

A happy camper at GirlPowered! Game Developer Camp at Indiana University Bloomington. Image: The Media School at Indiana University

And anyone can share their passion for design thinking and coding by creating free, accessible tutorials online using platforms like Udemy or YouTube. You can volunteer an hour or two at your local library, especially if you are located in a more rural area. Or you can help and encourage others to try out free coding programs like Hour of Code on Code.org or Swift Playgrounds.

Apple’s Swift Playgrounds, a free learn-to-code app. It gets hard but it’s also fun so you’ll want to stick with it!

3. Recognize unconscious bias

Educational efforts underway now may take another 5–10 years to yield a new, well-prepared, and diverse tech workforce. In the meantime, we can take a look at ourselves. Each of us can examine how our own personal perspectives and experiences shape the technologies that we design and build.

The world-views and values that people draw on when making decisions, including when we design and build technologies, are biases. And bias isn’t always a bad thing; on its own it simply refers to a personally held attitude or belief toward a thing, person, or people.

But then you add context. We all have unique beliefs, needs, hopes, dreams, and fears that grow out of our lived experiences. We manifest these feelings in the form of biases, and we use those biases to make decisions, fulfill our passions, feel safe, and navigate the world.

This is where our biases can also be destructive. Implicit or unconscious bias occurs when a person is unaware that his or her perspective and values exclude, or have a negative effect on, other people or groups.

An example of unconscious bias in action showed up in some recent viral postings of people ‘redesigning’ TV remotes for grandmas. Sadly, this kind of insult-in-disguise occurs rather regularly when it comes to consumer electronics.

Anyone working in technology can fight the negative effects of unconscious bias. Start by making an effort to be conscious of your biases. Make a checklist and reflect upon what you believe to be your truth. What do you value most? What are your biggest concerns, or fears? When do you feel empowered, or excluded? What makes you feel safe? How do you feel about technology? And most importantly, how do you feel about various groups of people who are not like you? What do you believe is their relationship to technology? Reflect upon why other groups may or may not feel or see the world the same way you do. Then try to recognize when your biases enter your work and the decisions you make when building products.

4. Do the extra work to design & test products with everyone, not just your ‘core market’

All people, independent of age, ethnicity, gender, and physical ability, depend on the folks in the meeting rooms building technology to represent and defend their needs and interests so that they aren’t excluded from the tools that drive our technology-centered world.

We can seek to serve the needs of all technology users by practicing inclusive design and by including people of various ethnicities, ages, genders, and abilities in design activities like:

· Ethnographic research

· Participatory design exercises

· Concept, usability, and beta testing

Let’s say your team asks you to focus on a narrower, market-driven customer base, or on a specific group of early adopters. That shouldn’t conflict with your ability to practice inclusive design. You can take one additional step and be bold: dedicate a little extra time and effort to expanding your user research to include a few folks from the otherwise non-represented consumer groups, to make sure you are not excluding their needs. I promise you: your team will be intrigued by the additional results, including the possibility of addressing even MORE markets and customers.

Virtual Reality Aimed At The Elderly Finds New Fans, Kara Platoni/KQED. Images: Kara Platoni/KQED, One Caring Team

Let’s hope Pew starts to include intersectionality in their polls! Image: Pew Research Center

You’ll need to reach a wide audience in order to include a highly diverse set of people in your research. Luckily, there are many ways to find prospective research participants. Helpful online methods include paneling tools such as Ethn.io and Usertesting.com, and posting screener surveys on social media and web message boards. Offline, try to reach an even broader range of people through church and veterans’ groups and through library, grocery store, community college, and coffee shop posting boards.

You’ll also want to closely examine any screening tools you use. Because the job of a screener is to screen people out, it is by nature exclusive, not inclusive. So it’s important to question hard why and how you screen out certain users from testing, and to look carefully at your biases and assumptions.

An easy update is to ensure that people can report about themselves using non-binary gender, multi-race, and intersectional demographic selections (for example, a ‘Select All’ option and an ‘I prefer’ free-text field).

Instead of leaning on stereotypes to guide the development of your screener (such as ‘tech bro’, ‘midwesterner’, or ‘soccer mom’), focus on combinations of demographics, current tools/technologies in use, and actual behaviors to capture a truly diverse set of people and needs, as in the sketch below.
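To make this concrete, here is a minimal, hypothetical screener sketch in Python. The options, field names, and qualification criteria are all illustrative assumptions, not any real product’s screener: demographics are multi-select with a free-text option, and qualification hinges on reported behavior rather than demographic stereotypes.

```python
# Hypothetical screener sketch: demographic answers are multi-select with a
# free-text option, and qualification is based on behavior, not demographics.

GENDER_OPTIONS = ["Woman", "Man", "Non-binary", "I prefer (free text)"]
RACE_OPTIONS = ["Asian", "Black", "Latino/a", "Native American", "White",
                "I prefer (free text)"]  # respondents may select all that apply

def qualifies(response: dict) -> bool:
    """Screen on what people actually do, never on who we assume they are."""
    uses_relevant_tools = bool(set(response["tools_used"]) & {"smart TV", "streaming app"})
    watches_video = response["hours_of_video_per_week"] >= 1
    return uses_relevant_tools and watches_video

# Example respondent: demographics are recorded for diversity tracking only,
# never used as a pass/fail gate.
respondent = {
    "genders": ["Non-binary"],            # multi-select
    "races": ["Asian", "White"],          # multi-select captures multi-race identity
    "tools_used": ["streaming app", "tablet"],
    "hours_of_video_per_week": 4,
}
print(qualifies(respondent))  # True
```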

Example of a brief intercept research participation tool from Ethn.io. Image: Ethn.io

Including remote research activities (research via webcam or web conference) can also help you connect with even more people, such as caregivers, people with limited mobility, and people who live in more rural areas.

Once you recruit these hard-to-find folks, be sure to keep a list of participants so you can refer back to them for ongoing product testing and feedback.

Lastly, if you identify as one of the folks who are underrepresented in the tech industry, you can provide input on your needs by volunteering to participate in user research yourself. Or, if you have friends or family who identify with any of these groups, you can tell them about the importance of their contribution to the tech industry through product testing. There are many ways to volunteer to give input. Reach out to friends who work in technology and tell them you’d like to help test their product (they’re probably looking for you and will be happy to have your help). Go to the websites or social media pages of your favorite tech companies and sign up for usability testing. You can also reach out to a marketing research firm and tell them you’d like to participate in technology product research, or respond to reputable calls for usability participation on Facebook or Craigslist.

5. Sample Diverse Data Sets

Technologists who work with sets of user data know that these data are often used to make critical decisions about product features and experiences. Good examples of this are when product teams do A/B testing to compare two versions of a website and determine which is more effective or more frequently used, and when Netflix uses large sets of user data to predict whether you will like a certain movie.
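To make the A/B example concrete, here is a minimal sketch in Python with entirely made-up numbers: two site variants compared by conversion rate using a standard two-proportion z-test (one common way such comparisons are made, not necessarily any particular team’s method).

```python
# Compare two site variants by conversion rate (illustrative numbers only).
from math import sqrt, erf

def two_proportion_z(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-test: is variant B's conversion rate really different?"""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-tailed
    return p_a, p_b, z, p_value

# Variant A: 120 signups from 2,400 visitors; variant B: 156 from 2,380.
p_a, p_b, z, p = two_proportion_z(120, 2400, 156, 2380)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}  p = {p:.3f}")
```

A result like this only tells you which variant ‘won’ among the people you sampled, which is exactly why the sampling itself matters.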

But note that humans decide from whom data are paneled and collected (known as sampling), which data sets are used for analysis, and which algorithms are used to draw conclusions, and each of these is an opportunity for introducing bias. Scientists and researchers construct the systems for data collection and modeling, so the outputs and inferences those systems produce will ultimately reflect these people’s values, decisions, and priorities; in a word, these data are also biased. And machine learning can amplify the effect of bias, in both good and bad ways.
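Here is a toy simulation of that effect, with assumed numbers: suppose 30% of your users depend on captions, but you recruit your data panel through a channel where caption-dependent users show up at only a third of their true frequency. The data will badly understate their needs.

```python
# Toy demonstration of sampling bias (all numbers are assumptions).
import random
random.seed(0)

# True population: 30% of users depend on captions (deaf or hard-of-hearing
# users, people in noisy environments, language learners, ...).
population = [{"needs_captions": random.random() < 0.30} for _ in range(100_000)]

# Convenience sample: our recruiting channel reaches caption-dependent users
# at only one-third the rate it reaches everyone else.
def channel_weight(user):
    return 0.33 if user["needs_captions"] else 1.0

sample = [u for u in population if random.random() < 0.05 * channel_weight(u)]

true_rate = sum(u["needs_captions"] for u in population) / len(population)
observed = sum(u["needs_captions"] for u in sample) / len(sample)
print(f"true rate: {true_rate:.1%}   rate observed in sample: {observed:.1%}")
# Any feature decision or model trained on this sample inherits the skew.
```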

Image: Wikipedia, ‘Sample (statistics)’

Cases that exemplify the more troubling possibilities of biased sampling and algorithms include racial bias in Google Images associations and Microsoft’s experiment with its conversational bot ‘Tay’.

Just as designers make decisions about which product features get edited out (or in), data scientists can develop systems that reflect the activities of a diverse population of users by seeking to understand and use innovative measures of diversity to guide their data sampling and algorithms. One such approach is sketched below.
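One well-known way to do this (a sketch under assumed data, not a prescription) is stratified sampling: draw from each demographic stratum in proportion to its share of the population, so smaller groups are never silently dropped from the data.

```python
# Stratified sampling sketch: every stratum keeps its population share.
import random
from collections import defaultdict
random.seed(0)

def stratified_sample(users, key, n):
    """Sample n users while preserving each stratum's population proportion."""
    strata = defaultdict(list)
    for u in users:
        strata[key(u)].append(u)
    sample = []
    for members in strata.values():
        k = max(1, round(n * len(members) / len(users)))  # at least one per group
        sample.extend(random.sample(members, min(k, len(members))))
    return sample

# Hypothetical user records with an age-band attribute.
users = ([{"age_band": "18-34"}] * 6_000 +
         [{"age_band": "35-64"}] * 3_500 +
         [{"age_band": "65+"}] * 500)

sample = stratified_sample(users, key=lambda u: u["age_band"], n=200)
counts = {b: sum(u["age_band"] == b for u in sample) for b in ("18-34", "35-64", "65+")}
print(counts)  # {'18-34': 120, '35-64': 70, '65+': 10}
```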

Each of us can take action

It has been helpful for me to identify these activities I can do right now, on both an individual and a collaborative level, to help the industry design products that are more inclusive of the incredible variety of people we serve. I hope you’ll share your ideas and experiences with the design community too.


Jennifer McCormick

Researcher and Founder at User Lens, a user experience consulting firm based in San Francisco.