The Data Panopticon and Surveillance Capitalism
How does Foucault’s fabled reformatory illustrate the problems with the internet’s business model?
The Panopticon is an architectural design for a prison, created by Jeremy Bentham in 1791. It consists of a central tower surrounded by a circular (or octagonal, or hexagonal) structure divided into cells, each open only toward the tower. The idea is that from the central tower, a small number of guards could watch over a large number of prisoners. In 1975, the French philosopher Michel Foucault published Discipline and Punish: The Birth of the Prison; in chapter 3, he uses the Panopticon model to examine the literal and proverbial application of power by those in the tower over those in the cells.
Foucault’s point of interest boils down to a psychological phenomenon among the prisoners. Because they cannot see into the tower, they cannot tell when the guards are watching them. This produces a feeling of constant surveillance, which Foucault argues “assures the automatic functioning of power.” (230) Foucault applies the conceptual design of the Panopticon to various institutions, showing how those in the ‘cell’ (a prisoner, a student, a hospital patient, a factory worker) govern themselves out of fear that their every move is being observed. While the chapter is difficult reading at best, one can fairly easily picture the prison version of the Panopticon.
The 2016 election illuminated the degree to which social media has become an integral facet of our political voice. A recent study by the Pew Research Center found that roughly two-thirds of Americans get at least some of their news from social media sites, and roughly two-thirds of those consumers rely on only one site.
The important piece of information not stated in this study is how these websites operate. Every major social media site offers free membership, which means that in order to maintain servers and turn a profit, companies must sell advertising space. Facebook is paid each time an ad is viewed or clicked, and has therefore programmed its site to ‘feed’ users content chosen by a series of complex algorithms.
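The incentive behind that feed can be sketched as a toy model. Everything below is hypothetical (the titles, probabilities, and scoring function are invented for illustration, not Facebook's actual system); the point is simply that when revenue comes from views and clicks, the feed that maximizes expected revenue is the one that surfaces the most clickable content first.

```python
# Toy sketch of an engagement-driven feed (hypothetical data and scoring,
# not any real platform's code). Posts are ranked by expected ad revenue:
# predicted click probability times revenue per click.

posts = [
    {"title": "Local news roundup",        "p_click": 0.05, "revenue_per_click": 0.10},
    {"title": "Outrage-bait headline",     "p_click": 0.40, "revenue_per_click": 0.10},
    {"title": "Friend's vacation photos",  "p_click": 0.20, "revenue_per_click": 0.10},
]

def score(post):
    # Expected revenue from showing this post: the platform profits from
    # clicks, so the most clickable content rises to the top of the feed.
    return post["p_click"] * post["revenue_per_click"]

feed = sorted(posts, key=score, reverse=True)
print([p["title"] for p in feed])
```

Nothing in this ranking asks whether the content is true or healthy; it only asks what will be clicked, which is the crux of the business model described above.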
In a recent episode of the Waking Up podcast with Sam Harris, Zeynep Tufekci, a contributing opinion writer at the New York Times, associate professor at the University of North Carolina School of Information and Library Science, and faculty associate at the Harvard Berkman Center for Internet and Society, speaks extensively about how Facebook and YouTube have developed algorithms designed to keep us using their sites longer and with greater frequency, a phenomenon she calls surveillance capitalism.
The business model is powered by machine learning: the data gathered by companies and search engines are used not only to show us more of what we have disclosed we are interested in, but also what the system predicts we will be interested in. The machine learning itself is powered by the data we hand over. An integral part of this process involves the logical evolution of ideas, by which I mean that if somebody has an interest in a topic, they can take that topic online and learn everything about it.
Tufekci uses the YouTube ‘rabbit hole’ as an example. If I search for videos about vegetarianism, I will likely be recommended (and shown via autoplay) videos about veganism. If I let the algorithm keep playing, I will be shown videos on fringe ideas or debates within vegetarianism, such as the ethics of honey harvesting. Similarly, if you search for and start watching a Donald Trump campaign rally, you will go from the rally to the pundit coverage of the rally to videos by Trump supporters to fringe hate videos. (Tufekci, 46:00–48:30)
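The escalation Tufekci describes can be caricatured in a few lines of code. This is a toy model under an invented assumption: each video has an ‘intensity’ score, and the recommender assumes a slightly more intense video on the same topic will hold attention better, so autoplay always steps one rung up. The titles and scores are made up for illustration; YouTube's actual system is vastly more complex.

```python
# Toy model of the YouTube 'rabbit hole' (hypothetical videos and scores).
# The recommender always serves the next-most-intense video on the topic,
# so autoplay drifts step by step from the mainstream to the fringe.

videos = {
    "vegetarian recipes": 1,
    "why go vegan": 2,
    "the ethics of honey harvesting": 3,
    "fringe debate: is all agriculture immoral?": 4,
}

def next_video(current_intensity):
    # Recommend the least intense video that is still a step up
    # from what was just watched; None once the fringe is reached.
    candidates = [(title, i) for title, i in videos.items() if i > current_intensity]
    return min(candidates, key=lambda pair: pair[1]) if candidates else None

watched, current = [], 0
while (rec := next_video(current)) is not None:
    watched.append(rec[0])
    current = rec[1]

print(watched)
```

Left alone, the loop never de-escalates: each recommendation is strictly more extreme than the last, which is exactly the one-way drift of the rabbit hole.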
But the consumer can halt this process.
Simply exit YouTube and go outside.
Facebook and other sites that focus more on the ‘social’ part of social media pose a more immediate problem. Tufekci reports that Donald Trump’s media team (formerly Ted Cruz’s media team) attempted to demobilize certain constituencies. For example, they found black men in Philadelphia whose data met certain parameters and sent them messages intended to decrease the likelihood that they would make it to the voting booth. One notable difference between the 2016 election and the previous two was a marked decrease in black voter turnout. Surely the absence of the first black president from the ballot has something to do with that, but it’s hard not to take into consideration targeted media campaigns intended to depress the black vote. (27:00–28:50) This is, by the way, to say nothing of gerrymandering and voter ID laws with similar intentions.
How does this tie into some prison designed in the late 18th century?
To quote Tufekci once more: “The business model of capturing your attention, profiling you and persuading you to buy that one extra pair of shoes is very compatible with a manipulative public sphere, where you don’t get to see what’s contested because it’s so segmented, and then buttons are pressed person by person.” (32:00–33:00)
To tie all this together, I posit an altered version of the Panopticon, applied to this surveillance-capitalist business model: one that provides dopamine hits alongside advertisements to the masses while creating a platform for authoritarians to target the segments of the population who will respond to fear-mongering and “fake news”; targets they can predict with striking accuracy using personality science.
The Data Panopticon retains the large central tower. Instead of many cells, it offers the illusion of one open cell spanning the surrounding circular structure (as in the picture at the head of this article, just without the cells). Rather than segmenting us, the prison allows the inmates to interact with one another, but surgically manipulates us, bait-and-switch style, to act on our primal tribal nature and drift toward opposite sides of the tower. It provides us with contacts who share our views and allows us to magnify our own tendencies. It erodes the political center and draws invisible walls around the in-groups it manipulates us into forming.