Listening for Unintended Consequences
Mitigate potential negative outcomes
This is part of a series to provoke dialogue and provide concrete ways to help teams ethically build and design intelligent systems. Read our introduction.
Intelligent systems are dynamic, constantly learning from information that is being generated and touched by humans. They don’t live in isolation, but connect to and reflect the ever-changing contexts around them. This means that our designs are never static, and that over time they will veer off the course we originally set. Good intentions aren’t enough. Data scientists and other designers need to explore, identify, and properly communicate the dependencies within a system so they can monitor the system’s progress as it learns.
It’s impossible to anticipate exactly what a system will learn (if we could do that, the ability to learn would be superfluous) or how the players outside your control will evolve. Instead, we can consider how people and our biases may influence what the system learns, and how, in unexpected ways. Imagine ways that the system might learn unexpected or unintended things, plan for how you will listen for signs that your model has gone awry, and build in failsafes to shut problematic behaviors down.
Activities to try
_Write out what problem your design is solving, and then list three human behaviors or cultural values that you believe your design will evoke or encourage. Push your concept and each of these behaviors or values to an extreme case and discuss the potential consequences. As a team, explore how you can build in safeguards, redundancies, or alerts to signal that your design is no longer acting as intended.
_Describe two future scenarios that could impact your system by changing its input signals and/or the humans participating in it: one optimistic (generosity, peace, and abundance) and one pessimistic (division, discord, and destruction). For example, how might each scenario affect a family vacation? Dating? Homelessness? Don’t spend a lot of time on the accuracy or likelihood of these outcomes. The intention is to define extreme contexts that let you explore different possible evolutionary paths for the systems we set in motion today.
_It’s often said that “what’s measured is what matters.” What metrics could help you recognize that your design is having unintended consequences? Select a key set of metrics to monitor early and often.
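The monitoring activity above can be sketched in code. This is a minimal, hypothetical example, not from the article: the metric names, baseline values, and tolerance threshold are all illustrative assumptions a team would replace with the signals they actually chose to watch.

```python
# Illustrative sketch: flag proxy metrics that drift from an agreed baseline.
# All metric names, values, and the tolerance are hypothetical placeholders.

def check_metrics(current, baseline, tolerance=0.10):
    """Return names of metrics that drifted more than `tolerance`
    (as a fraction of the baseline value) from their baseline."""
    drifted = []
    for name, base in baseline.items():
        value = current.get(name)
        if value is None:
            # A signal that stops arriving is itself a warning sign.
            drifted.append(name)
            continue
        if base and abs(value - base) / abs(base) > tolerance:
            drifted.append(name)
    return drifted

# Example: a team decided complaint rate and average order value were
# early indicators that their recommender had gone awry.
baseline = {"complaint_rate": 0.02, "avg_order_value": 48.0}
current = {"complaint_rate": 0.05, "avg_order_value": 49.0}
alerts = check_metrics(current, baseline)  # complaint_rate has drifted
```

The point is not the arithmetic but the discipline: agree on the metrics and tolerances before launch, so an alert is a planned conversation rather than a surprise.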
A client in the service industry wanted to leverage the millions of data records it had on customers to improve the overall purchasing experience. After weeks of interviews with customers and sales agents, the design team defined the different categories of information that these agents used and wished to have at their fingertips. The design team then conceptualized a new software interface that could surface multiple customer data points at once, including past spending totals. After much discussion with the client core team, we decided that showing such financial data would put customers at a disadvantage if they were talking to an aggressive salesperson. A sales agent could use a customer’s average spending amount as a proxy for wealth, and so be tempted to sell a more expensive service that didn’t align with the customer’s wants or needs. The client team didn’t want the sales agents to have inside information they could potentially use against their customers. In the end, financial data was removed from the new interface.
Explore the other posts in this series: