Navigating AI’s Moral Labyrinth at MozFest

Published in Mozilla Festival · Jun 10, 2019

Would you trust a robot that was trained on your behaviors?

Would you rather be a bug or a baby?

Do you trust the calculator in your phone?

Participants discovered these sticky questions and more as they wandered through the Moral Labyrinth, a gallery session and art experience in the Digital Inclusion Space at MozFest 2018. Artist and researcher Sarah Newman, who created the 5×5-meter walkable installation, explains: “The lines of the Labyrinth are comprised of philosophical and whimsical questions about human relationships to technology and the world.” Tucked away in a quiet spot, the Labyrinth offered a moment of reflection in the midst of the Festival’s creative chaos.

Newman created the Labyrinth to explore the notion of “value alignment” in emerging artificial intelligence systems. “‘Value alignment’ means that the goals that are programmed into intelligent or powerful systems are consistent with the values of the people they serve,” says Newman. Embedding values into AI systems is challenging, Newman explains, because many of our values are so deeply internalized that we don’t know how to articulate them, much less program them into AI.

A walk through the Moral Labyrinth becomes an exploration of each participant’s core values. Moving through this maze of words and ideas, visitors at MozFest noticed that even a slight movement might cause a letter or word to morph and smudge. Unlike other iterations of the work, this installation was special: “The entire labyrinth was made of baking soda, so that it was very ephemeral, and participants’ behavior changed in response to the ‘technology’ of the piece,” says Newman. “People visited the labyrinth all throughout the weekend, walked it, reflected on it, and many returned for additional visits. The labyrinth slowly changed throughout the weekend. By the end there were some words that had human footprints in them, which was great.”

The delicate nature of the MozFest installation points to another set of problems in the practice of value alignment: values may evolve over time, and sometimes we may claim a certain value but then act in a different way. Values can also contradict one another, Newman explains. A value can be deeply held and at the same time be in conflict with another core personal, social, or cultural value. And values are subjective, differing across individuals and groups. “The diverse audience was a wonderful aspect of showing the work at MozFest,” says Newman. “Some values are largely cultural — so having a mix of people from different places interacting with the labyrinth was really ideal.”

“A small subset of people are building AI systems now… but their effects are, and will continue to be, widespread. The values of the programmers or, more likely, the companies that employ them, are integrated into these systems, without the consent of the users,” says Newman. These systems may determine the news we see on social media, the routes given by GPS, and whether we’re considered for a job or mortgage. “We should be asking what these programs are optimized for, and whose interests they serve.” Being explicit about embedded values is particularly important right now, Newman says, as many AI systems are difficult or even impossible for everyday users (and sometimes even programmers) to fully understand. “We won’t always have traditional forms of checks on these systems to make sure they are performing consistently with the values of the people they claim to serve.”

To approach this complex topic, Newman experimented with a number of forms before settling on the labyrinth. “I wanted to make an interactive work, using the form of Socratic dialogue — asking rhetorical and often playful questions that have more serious corollaries in the technological world — while still being accessible and open… The labyrinth was ideal because you can follow a single path to the center and then back out, leaving where you began but having traversed an important space, philosophically.”

Since MozFest, Newman has installed outdoor versions of the labyrinth in Boston and Miami, and she is offering a collaborative Moral Labyrinth workshop at RightsCon in Tunisia, June 11–14, 2019 (with designer Mindy Seu and artist Jie Qi). Newman says, “The collaborative workshop will be an opportunity for participants to each generate a unique question for the labyrinth, and then we’ll assemble these questions — which will be in multiple languages — into a single large labyrinth that further examines the cultural diversity of value systems and morality.”

Sarah Newman is a Senior Researcher and Principal at metaLAB at Harvard and a Fellow at the Berkman Klein Center for Internet & Society. She uses installation art to understand and engage with technology’s role in human experience and self-understanding. Learn more.

MozFest 2019 is 21–27 October, in London, England.
