A Chat about Tech’s Hidden Biases: Technically Wrong by Sara Wachter-Boettcher

Shweta Singh
Published in Bootcamp
4 min read · Sep 22, 2023

“Technically Wrong” opens a candid discussion about inequality and discrimination within the tech industry. It is a thought-provoking exploration of the various ways in which technology and digital products can perpetuate biases, discrimination, and other harmful practices.

As UX professionals, our mission goes beyond creating aesthetically pleasing interfaces; it encompasses the ethical responsibility of crafting digital experiences that empower, respect, and include every user.

Mitigating Biases and Ethical Dilemmas

UX design is the bridge between technology and human behavior. The book takes us on a revealing journey through the biases and ethical dilemmas that can infiltrate our design processes, compelling us to examine every design decision and recognize that each choice carries ethical implications for users' experiences and perceptions.

Technology isn’t inherently biased, but people are. And when we’re designing technology, our biases can become part of it.

Ethical Imperative of Digital Well-being

The author argues that Wi-Fi has effectively earned a place in Maslow's hierarchy of needs, emphasizing technology's pervasive role in our lives. She also underscores how personal data is traded and shared in the digital age: data has become a coveted asset for platform owners, and users' habits are meticulously tracked through cookies and other trackers to drive recommendations based on past preferences.

Yet as technology advances, the pursuit of delight and disruption often overshadows the question of who benefits and who is left behind. The tech industry, often depicted as a world of privileged white males, tends to prioritize the needs and interests of that demographic. It's a narrative tech loves to propagate: genius programmers shaping the future. But, as we've seen time and again, the reality is more complex.

Diversity in Design: Building for Real People

The lack of diversity in tech isn’t merely a problem of underrepresentation; it’s a problem of shortsightedness. When design teams fail to reflect the real-world diversity of users, they inadvertently exclude a plethora of perspectives and experiences from the design process. This exclusion leads to products and services that don’t resonate with users who don’t fit the “average” mold.

To truly understand users, design teams must embrace diversity in all its forms. Stress cases — scenarios where users encounter technology during moments of stress, anxiety, or urgency — provide invaluable insights. By considering stress cases, designers can create products that cater to real-life situations, not just idealized ones.

Neglecting the Unseen User

Are our user personas truly representative of the diverse user base we aim to serve? Are we accounting for the rich tapestry of user demographics and experiences, or are we inadvertently designing for a narrow segment?

Tech companies, in their relentless pursuit of innovation, can fail to consider the diverse and imperfect ways in which people actually engage with their products. The "average user" persona becomes a mythical ideal that obscures real-world diversity. In truth, no one fits this ideal perfectly, and that is where stress cases come in.

Stress cases are scenarios in which users face technology during stressful or urgent moments. Designing for them helps surface flaws in products and ensures they can accommodate a broader range of users.

Data and Discrimination: The Unintended Consequences of Algorithms

Algorithms are the backbone of many tech products, and their decisions are not inherently “correct.” They reflect the values and biases embedded in their training data. When historical data is biased, algorithms perpetuate and amplify these biases, often with harmful consequences.

Take, for example, the COMPAS algorithm used in the criminal justice system to predict recidivism. Despite claims of objectivity, it relies on historical data shaped by systemic bias, producing unfair outcomes that disproportionately affect communities of color. Tech companies need to address these biases actively and work toward fair and ethical algorithms.
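The mechanism is easy to see in miniature. The toy sketch below (not the real COMPAS model; the group names, rates, and "enforcement" parameter are all invented for illustration) gives two groups identical underlying behavior but records outcomes for one group more often, the way biased policing skews arrest records. A naive model that simply learns rates from that history then scores the over-policed group as higher risk, even though the behavior never differed.

```python
import random

random.seed(0)

# Toy illustration only: two groups, A and B, with IDENTICAL true
# reoffense rates, but a historical record that captures group B's
# reoffenses more often due to biased enforcement (assumed numbers).
TRUE_RATE = 0.3                      # same underlying behavior in both groups
RECORD_RATE = {"A": 0.5, "B": 0.9}   # biased observation rates (hypothetical)

def make_history(n=10_000):
    """Generate (group, recorded_reoffense) rows with biased labels."""
    rows = []
    for _ in range(n):
        group = random.choice(["A", "B"])
        reoffended = random.random() < TRUE_RATE
        # The bias lives in the LABELS: the same behavior is recorded
        # far more often for group B than for group A.
        recorded = reoffended and random.random() < RECORD_RATE[group]
        rows.append((group, recorded))
    return rows

def learn_rates(rows):
    """The simplest possible 'model': learn recorded rates per group."""
    counts = {"A": 0, "B": 0}
    hits = {"A": 0, "B": 0}
    for group, recorded in rows:
        counts[group] += 1
        hits[group] += recorded
    return {g: hits[g] / counts[g] for g in counts}

rates = learn_rates(make_history())
print(rates)  # group B scores roughly 1.8x higher despite identical behavior
```

The model is not "wrong" about the data; it faithfully reproduces the data's bias. That is the book's point: an algorithm trained on a discriminatory record will launder that discrimination into a number that looks objective.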

Towards a Future of Ethical UX Design

In a digital era characterized by rapid technological advancements, UX professionals play a pivotal role in shaping the ethical contours of technology. “Technically Wrong” reminds us that UX design isn’t just about crafting delightful interfaces; it’s about steering technology toward a future where ethical considerations are at the forefront.

As technology becomes increasingly integrated into our lives, it must serve the needs of all members of society. This can only happen when the tech industry embraces diversity and inclusivity, recognizing the unique value that diverse perspectives bring. It’s time to break free from the cycle of perpetuating biases and build a tech ecosystem that truly benefits everyone.

Cheers!

👏🏻Clap if you enjoyed the article

See you in the next article 👋🏻


UX Researcher Intern at Amazon | MS HCI | Ex-ADP