Who is to Blame for Algorithmic Outrage?

Will Rinehart
Sep 21, 2017 · 3 min read

Last week, Julia Angwin reported on an unfortunate part of the Facebook ecosystem:

Want to market Nazi memorabilia, or recruit marchers for a far-right rally? Facebook’s self-service ad-buying platform had the right audience for you.

Until this week, when we asked Facebook about it, the world’s largest social network enabled advertisers to direct their pitches to the news feeds of almost 2,300 people who expressed interest in the topics of “Jew hater,” “How to burn jews,” or, “History of ‘why jews ruin the world.’”

Her work caused a cascade of reporting and mea culpas at Facebook, Google, and Twitter as they all pulled derogatory search terms from their ad platforms.

Devin Coldewey at TechCrunch isn’t convinced by the apologies: “I can’t be the only one who found this affected concern, monocle-popping shock, and confident deflection unconvincing.”

But for those who have worked on these expansive platforms, the perception is different. There is a certain kind of blindness to the platform’s operation, which I have dubbed the banality of scale.

Ben Thompson recently recalled a story that, I think, makes this more concrete:

In August 2011, just a day or two into my career at Microsoft, I sat in on a monthly review meeting for Hotmail (now known as Outlook.com); the product manager running the meeting was going through the various geographies and their relevant metrics — new users, churn, revenue, etc. — and it was, well, pretty boring. It was only later that I realized just how astounding “boring” was; a small group of people in a conference room going over numbers that represented hundreds of millions of people and dollars in revenue, and most of us cared far more about what was on the menu for lunch.

As Thompson has explained elsewhere, it is difficult to understand the scale of Facebook if you have never worked on a product at scale. But a lot of us have experienced the banality of scale. Apple’s market cap is just shy of $800 billion. The federal government is slated to spend $3.65 trillion this year. The population of the US is over 326,000,000. Each in their own way, these numbers are hard to grasp.

While I have never worked at one of the large tech companies, I have worked with large sets of population data. By most accounts, these sets are small, merely 60 million observations or so. But about two weeks ago, it hit me: every single line represents a person. Knowing what the number means isn’t the same as feeling its empathetic weight.

Interestingly enough, art offers up a term for those moments of realization: defamiliarization. So much of what we do as individuals gets filtered out. We take a lot for granted, use heuristics to perceive the world, and fail to notice minute changes. Art’s purpose, according to one view, is to “make objects ‘unfamiliar,’ to make forms difficult to increase the difficulty and length of perception because the process of perception is an aesthetic end in itself and must be prolonged.”

Thompson still remembers that moment when the veil was pulled back. Yet in the tech world, making objects familiar is the goal. Mark Weiser, chief scientist at Xerox PARC, once quipped that “The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.” The banality of scale is pervasive within Silicon Valley. Scale is an explicit goal for startups, but a blind spot for large platforms. It might not excuse the companies of culpability, but I think it helps to explain their blindness.



Will Rinehart

Senior Research Fellow | Center for Growth and Opportunity | @WillRinehart