Image sourced from Pexels.

Why the testimony of Frances Haugen is important to designers

Kim Holmes
Published in IBM Design
3 min read · Nov 2, 2021


The algorithm

Many of us were not surprised to hear the testimony of Facebook’s Frances Haugen last month. Red flags have been waving around Facebook since the outcome of the 2016 election. Faulty algorithms have become newsworthy since the release of Coded Bias on Netflix, but to what degree have they really become a threat to our democracy?

Hate speech and polarizing content spark engagement. They move us to share, they move us to comment, and they move us to tap that “like” button, far more than your aunt’s pumpkin spice recipe or the latest dance move on TikTok. They get our adrenaline pumping. The more we react, the more these posts appear in our feeds. That’s how the algorithm is designed.
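For readers who want to see that feedback loop in miniature, here is a hypothetical Python sketch of engagement-weighted ranking. It is not Facebook’s actual system; the reaction weights and sample posts are invented purely to illustrate how the posts that provoke the most reactions rise to the top of the feed, where they collect even more reactions.

# Toy illustration of engagement-based ranking (not Facebook's actual system):
# posts that provoke more reactions earn a higher score and surface earlier
# in the feed, which in turn earns them still more reactions.

from dataclasses import dataclass, field

# Hypothetical weights; a real ranking system uses far more signals.
WEIGHTS = {"like": 1.0, "comment": 2.0, "share": 3.0}

@dataclass
class Post:
    text: str
    reactions: dict = field(default_factory=lambda: {"like": 0, "comment": 0, "share": 0})

    def engagement_score(self) -> float:
        return sum(WEIGHTS[kind] * count for kind, count in self.reactions.items())

def rank_feed(posts: list[Post]) -> list[Post]:
    # Higher-scoring posts appear first, so they attract even more engagement.
    return sorted(posts, key=lambda p: p.engagement_score(), reverse=True)

feed = [
    Post("Aunt's pumpkin spice recipe", {"like": 12, "comment": 1, "share": 0}),
    Post("Outrage-bait headline", {"like": 40, "comment": 55, "share": 30}),
]
for post in rank_feed(feed):
    print(f"{post.engagement_score():6.1f}  {post.text}")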

If you are a communications director for a political campaign, in the U.S. or anywhere else, you are in charge of engagement metrics for your cause. Although people in this position have expressed their disdain for an algorithm that promotes conflict and dissension, at the end of the day they still need the clicks. Their jobs depend on it. In response, political organizations have tweaked their messaging to match the shock value it takes to earn even basic levels of engagement on Facebook. It’s a vicious and predatory cycle.

The ranking algorithm in question at Facebook is built with artificial intelligence (AI), using machine learning. Many of us who heard Haugen’s testimony in Congress may have known that, but we are, by far, the minority. The bulk of Facebook’s 2.89 billion users, in fact, do not. That’s why it’s important for Congress to act now.

Legislation and policies that enforce data privacy, like the EU’s GDPR, are not meant to quash First Amendment rights, but simply to make it transparent to users that AI is at work in the applications they interact with. Notifications in tech have become a fundamental aspect of our lives. Why aren’t we being notified of how our own data is used?

The designer’s influence

How is this important to us as designers? Data scientists and developers have already started the urgent work of building transparency into AI models. The job cannot be done without them. But how will your aunt ingest these notifications while she’s looking up that Thanksgiving recipe? What about your teenage daughter? Your elderly neighbor? These are the users of that ranking algorithm, and this is where design comes in. We are the experts in empathy — understanding the needs of the end user. Ethical AI is only attainable if it’s presented in a way the user can consume. No matter who that user is.

Black box model: a system that can be viewed in terms of its inputs and outputs, without any knowledge of its internal workings. Image assets sourced from IBM Digital Asset Management. Composition by Kim Holmes.

Most AI models in existence today are black box models, so it is critical for the business to find ways to make them transparent. Start conversations among your design, data science, and development teams. Truly ethical models must address these five cornerstones: fairness, explainability, robustness, transparency, and privacy.
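To make “transparency” a little less abstract, here is a minimal sketch of one model-agnostic technique, permutation importance, using scikit-learn. It is an illustration under assumptions (a stand-in classifier trained on synthetic data), not IBM’s or Facebook’s tooling: it treats the trained model as a black box and reports which inputs its predictions actually depend on.

# A minimal sketch of model-agnostic transparency, assuming scikit-learn.
# We treat a trained classifier as a black box (inputs and outputs only)
# and use permutation importance to see which inputs drive its decisions.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

# Synthetic stand-in data; a real audit would use production features.
X, y = make_classification(n_samples=500, n_features=5, random_state=0)
black_box = RandomForestClassifier(random_state=0).fit(X, y)

# Shuffle each feature and measure how much accuracy drops:
# large drops reveal which inputs the opaque model actually relies on.
result = permutation_importance(black_box, X, y, n_repeats=10, random_state=0)
for i, importance in enumerate(result.importances_mean):
    print(f"feature_{i}: {importance:.3f}")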

Begin to build guidelines that can be ingested by every discipline in your business. Create a safe place where concerns can be addressed. Build a common repository to spread the word throughout your business on the importance of AI ethics.

The ideal time for laying this groundwork has passed. “The train has already left the station.” There is, however, a silver lining. AI, in and of itself, creates a new genre of work for designers, and with that comes great opportunity.

Kim Holmes is a Senior User Experience Designer for CIO Design at IBM based in Raleigh, NC, USA. The above article is personal and does not necessarily represent IBM’s positions, strategies or opinions.
