Applying systems thinking to technology’s ethical challenges
My friend lives on the bottom floor of a two-story apartment complex. Above him lived a person who ran a bath one evening and forgot about it. Sometime later that night, my friend’s spouse noticed water streaming from the ceiling, down the wall and onto their carpet.
The result? My friend got a hassle and a remodel. Yet, not all instances of inattention are so benign.
Systems thinkers use the term “overshoot” to describe what happens when a system goes beyond its limits.
In their well-known book Limits to Growth, authors Donella Meadows, Jorgen Randers and Dennis Meadows explain that, regardless of the scale of the system, there are always three causes of overshoot. “First, there is growth, acceleration, rapid change. Second, there is some form of limit or barrier, beyond which the moving system may not safely go. Third, there is a delay or mistake in the perceptions and the responses that strive to keep the system within limits.”
Returning to my friend’s flooded apartment, the growth of the water flowing into the bathtub constituted the change. Once the water completely filled the tub, a barrier was crossed. Finally, a delay took place — the water had to seep through the top-story apartment’s floor, into the ceiling and down the walls before my friend’s spouse discovered the problem and they both rushed upstairs shouting, “Overshoot!” Or, maybe they were yelling something else.
Sometimes the consequences of going beyond the limits are profound. Those consequences can be aggravated by the third cause of overshoot, delay.
The longer it takes for us to notice a limit has been crossed, the more profound the impact of the overshoot.
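The three causes of overshoot — growth, a limit, and a delay in noticing — can be captured in a toy stock-and-flow sketch. This is a minimal illustration with made-up numbers, not a model from Limits to Growth: water flows into a tub at a steady rate, the tub has a fixed capacity, and the problem is only noticed some time after the limit is crossed.

```python
def simulate_overshoot(inflow_per_min=10.0, capacity=200.0, detection_delay_min=15.0):
    """Toy model of overshoot with illustrative numbers.

    Growth: water enters at a constant rate (liters per minute).
    Limit:  the tub holds only `capacity` liters.
    Delay:  the overflow is noticed `detection_delay_min` minutes
            after the limit is crossed.

    Returns the time at which the limit is crossed and the amount
    spilled before anyone notices.
    """
    time_to_fill = capacity / inflow_per_min        # when the limit is crossed
    spilled = inflow_per_min * detection_delay_min  # overshoot accumulated during the delay
    return time_to_fill, spilled

# The longer the delay, the larger the overshoot:
for delay in (5, 15, 60):
    t, spilled = simulate_overshoot(detection_delay_min=delay)
    print(f"limit crossed at t={t:.0f} min; {delay} min delay -> {spilled:.0f} L spilled")
```

The point of the sketch is the last line of the loop: the spillage scales directly with the detection delay, which is exactly why the third cause of overshoot is so consequential.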
I want to begin this three-part series on ethics and technology by asking if we are experiencing a kind of ethical overshoot in the use of our technologies. I’m keenly aware as I begin this discussion that this is not trivial stuff. Many people are experiencing genuine pain because ethical boundaries have been crossed. Some people have lost their lives, victims of “collateral damage” created by the ethical choices we program autonomous technologies to make. And yet, I believe it’s realistic to hold a hopeful view of ethics and technology. Dystopia is not the only possible outcome. Surveying the opportunities before us, it is reasonable to conclude that we are capable of more empathy and better collaboration.
This fall Brian Chesky, CEO of Airbnb, wrote an apology letter addressed to his employees and the communities Airbnb’s technology platform serves. Chesky said, “Bias and discrimination have no place on Airbnb, and we have zero tolerance for them. Unfortunately, we have been slow to address these problems, and for this I am sorry.”
He was responding to allegations that the Airbnb technology platform created an environment conducive to discrimination. The Twitter hashtag #AirbnbWhileBlack trended in the first half of 2016, revealing the stories of people of color who were discriminated against by Airbnb hosts. Studies have repeatedly confirmed that discrimination exists within online marketplaces like Airbnb. Admirably, Chesky and his company are working to solve this problem.
Back to the question of overshoot. Are some technology platforms in overshoot? By definition, yes. Airbnb’s technology platform grew to the point where its services spilled over cultural boundaries of tolerance and equality. A delay occurred: those discriminated against complained but Airbnb did not initially hear them. During this time the problem of discrimination on the platform grew more acute. Airbnb was in overshoot.
Here’s where things start showing signs of hope. While Airbnb didn’t catch the problem before overshoot occurred, once the company learned about discrimination on its technology platform, it recognized the problem and took action to address it. Moreover, in this particular case technology played an important role in revealing discrimination. While Airbnb’s technology platform couldn’t catch the discrimination it harbored, Twitter’s technology platform provided an outlet for the community to share the problem widely.
The most encouraging aspect of the Airbnb case is that researchers then used the very technology platform that was aiding discrimination to conduct studies that confirmed that discrimination was, in fact, happening.
Researchers didn’t need to recreate Airbnb in a lab to test their hypotheses. They simply used Airbnb itself.
This is a remarkably promising situation. While technology platforms can play host to unethical users, they can also function as research labs and solution engines for the problems unethical users create. Not all technology is adjustable like this.
If I build an airplane out of steel in the shape of a huge cube with an engine sticking out the back, it will — obviously — not make it into the air. There is nothing the steel cube can do to fix the problem. It was designed, built and it does not work. Period.
By contrast, modern technology platforms are designed and built to be responsive. If they do not work, they can adjust. If they work — but not in the way users find helpful — they can adjust.
The creators of technology platforms can build a watchful eye into their processes that scans for ethical overshoot. When they spot a problem, they can iterate the very technologies they created to help solve the problem. There’s not necessarily a need to create a new technology to solve an ethical problem created by an existing technology.
In my next post I’ll explore the ethical implications of the iterative nature of technology for creators and users of technology platforms. But for now I’d like to hear your thoughts: what responsibility do you think we have when it comes to our technology platforms and the way they are used?