Bowing Down to Big Data
In a Nation and World That Are Increasingly Secular and Agnostic, Data and Algorithms Emerge as Deities of the 21st Century
We are entering an age that, it is becoming abundantly clear to me, is ruled by new gods. Nearly every conceptualization of a religious god is a mystical “black box” — one whose inner secret few, if any, individuals know — one that sends its rules and edicts down from on high. And this is what data and algorithms are becoming (or may have already become).
In her recent book, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, Cathy O’Neil describes how and why these data-driven algorithms (what she calls WMDs) come to behave this way, like gods:
Verdicts from WMDs land like dictates from the algorithmic gods. The model itself is a black box, its contents a fiercely guarded corporate secret. This allows consultants…to charge more, but it serves another purpose as well: if the people being evaluated are kept in the dark, the thinking goes, they’ll be less likely to attempt to game the system. Instead, they’ll simply have to work harder, follow the rules, and pray that the model registers and appreciates their efforts. But if the details are hidden, it’s also harder to question the score or to protest against it.
You don’t question a god, and even if you do, you are told that it’s too complicated or it’s too mysterious or you wouldn’t understand. As with the god of most major world religions, most people aren’t equipped to be critical of these verdicts. Further, few may even allow themselves to be. In fact, if you aren’t a statistician or a mathematician or a computer scientist, you truly might not be able to understand these algorithms. At least, not by yourself, even with all the documentation and all the relevant resources in front of you.
“They’re opaque, unquestioned, and unaccountable, and they operate at a scale to sort, target, or ‘optimize’ millions of people. By confusing their findings with on-the-ground reality, most of them create pernicious WMD feedback loops.” — Cathy O’Neil, Weapons of Math Destruction
If these algorithms will always be opaque to the common person, then data is a god with the power to create new sociopolitical and economic strata. Those with the knowledge to understand these algorithms (or the money to pay someone else to understand them) could form a new class. Differences in the amounts and kinds of currency held by individuals, families, and communities begin to grow into gaps that may one day be utterly insurmountable. And those with the power may have little incentive to care, or to modify the algorithms that are creating this reality that benefits them.
Algorithmic models that don’t “learn” or don’t incorporate feedback “define their own reality and use it to justify their results” (O’Neil). I can already imagine ways this might cause a conflict, a push-and-pull between the value of “objective” data and “subjective” feelings about that data. It’s hard enough to fight the realities created by the feelings of politicians and political parties. What will the world look like as we increasingly have to fight the realities falsely created by data?
One of the very real issues with big data is that it is so often correct. So often, in fact, that we begin to believe its errors are infrequent enough to be harmless (or, even more dangerously, that it is never wrong at all, just seeing something the rest of us, with our subjective eyes, can’t). But if these algorithms are wrong even once, that is one time too many, especially when lives are on the line, as they so frequently are. Data is a god that believes in its reality and makes you believe in it, too:
“But you cannot appeal to a WMD. That’s part of their fearsome power. They do not listen. Nor do they bend. They’re deaf not only to charm, threats, and cajoling but also to logic — even when there is good reason to question the data that feeds their conclusions.” — Cathy O’Neil, Weapons of Math Destruction
In Alice Marwick’s piece, “How Your Data Are Being Deeply Mined,” she explains:
“Using social media allows us to connect with friends; to learn more about ourselves; even to improve our lives. The Quantified Self movement, which builds on techniques used by women for decades, such as counting calories, promotes the use of personal data for self-knowledge. Measuring your sleep cycles over time, for instance, can help you learn to avoid caffeine after 4:00 pm, or realize that if you want to fall asleep you can’t use the Internet for an hour before bedtime.”
These pieces of information, meant to help us, manifest like commandments from our data god. We believe these rules can help us, so we make this exchange of value. In that sense, we are complicit. We obey the big data gods without question. We tolerate them because we see some value in our lives, while perhaps remaining unaware of the potential negative or dystopian consequences.
In his TED Talk on filter bubbles, Eli Pariser ends with a call to action to those making the decisions about these algorithms and the way they are implemented:
- He urges them to take into account our public lives, the fact that we are all real humans in a real world who have civic responsibilities.
- He states the importance of transparency in these algorithms, so that we may understand what exactly they are doing.
- He requests that they offer ways we, as users, can have a greater sense of control over these algorithms.
I think his recommendations are valuable, not only for those writing algorithms, but for all designers. These are some of the core nuggets of wisdom behind human-centered design: a holistic understanding of people’s lives. What I think it boils down to is trust. What can we do to make what we design worth trusting? How can we be genuinely trustworthy, not just create an illusion of trustworthiness? A god that we can all understand and trust is no longer a god. When the mystery and the illusion of omnipotence are stripped away, big data could look a little more like a wise friend.