The Digital Architecture is the Market. The Market is the Digital Architecture.

Juddson R Taube
Feb 28, 2019 · 15 min read


“The line between private and public is to be drawn on the basis of the extent and scope of the consequences of acts which are so important as to need control.” — John Dewey, The Public and Its Problems, 1954

Introduction

Ethan Zuckerman didn’t mean to break the internet. His employer — an early-internet company that allowed users to easily build their own websites — was just trying to monetize its rapidly expanding, user-generated content. They set a problem before Ethan: How can we sell advertising space without making it look like an advertiser is expressly endorsing whatever that content might be? This problem dawned on them when one of their sales executives displayed an ad mockup over a user page during a demo with Ford executives. The content of the page was entirely explicit, homosexual pornography.

Ethan’s company promised up and down that there was no way a Ford ad would ever appear on that page again. The problem was that, at the time, there was just no way to ensure that. Creating separate ads for every single user page was totally unfeasible. But Ethan had an idea:

Javascript has just been invented. One of the new functions in Javascript allows you to open a new browser window and in that window, I put a small ad, a little 200 pixel by 200 pixel [window], that pops up beside the user homepage (as cited in Vogt and Goldman, 2014).

A tiny new window would appear, apart from whatever possibly objectionable material was beneath it. It created distance. They could monetize every website on their platform with any ad, and Ethan became a hero to his sales team. But he was soon a villain to the rest of the world: Ethan Zuckerman had just created the pop-up ad.
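The mechanism Zuckerman describes can be sketched in a few lines of modern JavaScript. This is an illustrative reconstruction, not his original code — the function names and feature string are assumptions — but the core trick is the one he names: `window.open()` spawns a small, separate window for the ad, visually detached from the user page beneath it.

```javascript
// Illustrative reconstruction of the pop-up mechanism; not Zuckerman's actual code.

// Build the feature string that window.open() accepts, e.g. "width=200,height=200".
function popupFeatures(width, height) {
  return `width=${width},height=${height}`;
}

// Open the ad in its own small window, detached from whatever
// user-generated content sits on the page beneath it.
function showAd(adUrl) {
  // window.open() exists only in browsers; guard so the sketch
  // is inert outside one (e.g. in Node).
  if (typeof window !== "undefined" && typeof window.open === "function") {
    return window.open(adUrl, "ad", popupFeatures(200, 200));
  }
  return null;
}
```

The separation is the whole point: the ad lives in its own window object, so nothing on the underlying page appears to frame or endorse it.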

Jeff Foxworthy voice: You might not be a millennial if this image gives you anxiety. (Image Credit: Wikipedia)

We have long held on to the idea of a free and open internet, and because of that, we have tolerated the intrusiveness of advertising. Much like newspapers before it — which were initially funded by political parties — ad-supported media has been heralded as a paragon of neutrality. In a way, Ethan saved this independent, “neutral” internet with his Javascript solution. This did not stop Ethan, of course, from receiving death threats for his creation, which became both a ubiquitous and hated part of the free and open world wide web.

Ethan, who is now the director of the MIT Center for Civic Media, is apologetic. He insists his intentions were good. Of note is that he’s not sorry for the pop-up ads themselves. What Ethan regrets is having paved the way for the advertising-driven model of the internet to survive. The internet could have failed then, and a different model not driven by ad revenue could have risen from the ashes.

And now that pop-up ads have been relegated to fringe parts of the web, something far more sinister has crept in to take their place without us even really noticing: surveillance and data collection.

This piece builds off the one I posted to Medium earlier this month about counterculturalism and social media companies. Buried in that post was an argument about how neoliberal (market first, regulation last) philosophy undergirds the financial architecture of the internet. That architecture, however, not only sits beneath but also refracts and transcodes (Manovich, 2001) the ideology both around the web and back upon societies as we conduct an increasing amount of our lives within that ecosystem. I pointed out, as have many researchers before me, that this has rendered our public sphere toxic through our natural tastes for the extreme. This comes in part from engagement algorithms but, far more critically, from tech giants willing to look the other way for a buck. This transcoded, classically liberal ideology then diffuses into society, becoming as natural a part of the democratic backdrop as the water fish swim in.

[T]hese kinds of interactions… are also reflective of and influenced by other platform cultures. Toxic technocultures propagate precisely because of the liminal and fluid connectedness of Internet platforms (Massanari, 2017, p. 341).

I used James Damore’s Google manifesto gaffe and the public discussion that followed as an example: Instead of a discussion emerging about Damore’s poor understanding of gender, the scientific literature he incorrectly cites and ignores, or his creation of a “textbook hostile workplace environment” (Zunger, 2017), the public debate held by most large media outlets instead centered on the censorship he didn’t suffer and the free speech that wasn’t infringed upon. Here, I’m going to elaborate on this argument by pointing to exactly how neoliberal markets are built into the architecture of internet media, which in turn disassembles the architectures of democracy, education, and the First Amendment.

Surveillance Capitalism

Ethan, again, isn’t sorry for the pop-up. He’s sorry about unwittingly enabling its successor, what Shoshana Zuboff refers to as surveillance capitalism. Zuboff (2015) explains how capitalism’s inherent logic of accumulation fundamentally changes in the wake of “big data” to focus on the acquisition of information.

“This new form of information capitalism aims to predict and modify human behavior as a means to produce revenue and market control” (p. 75). In the case of social media, it’s our own behavioral data, mined from our computers, our phones, and our assorted personal devices. This behavioral surplus, as Zuboff coins it, is harvested without our consent, let alone our knowledge, processed with ever more sophisticated machine-learning algorithms, and then sold as patterns to companies who want to influence us. This is qualitatively different, Zuboff argues, from the older logic of capitalism in the industrial era.

Google and the ‘big data’ project represent a break with this past. Its populations are no longer necessary as the source of customers or employees. Advertisers are its customers along with other intermediaries who purchase its data analyses… [T]he top three Silicon Valley companies in 2014 had revenues of $247 billion, only 137,000 employees, and a combined market capitalization of $1.09 trillion. In contrast, even as late as 1990, the three top Detroit automakers produced revenues of $250 billion with 1.2 million employees and a combined market capitalization of $36 billion (as cited in Zuboff, 2015, p. 80).

This distinct break from the interests of the populations served by and relied upon by industrial, Chandlerian forms of capitalism has incredible implications for the project of democracy. The interests of the few are held apart, without oversight, from those from whom their profits are accumulated. This may be a foreign and jarring notion, but the pragmatist theory of democracy insists that “widely distributed consequences… when they are perceived, create a common interest and the need of special agencies to take care of it” (Dewey, 1954, p. 54). But neoliberal values of limited government and market reliance regard regulation as interference with personal freedoms. We then eschew any effort at creating shared solutions to imminent problems, viewing them as coming from a hulking, overreaching, and incompetent bureaucracy incapable of steering us through a storm too impossibly complex for us to understand. Problems with pressing and immediate consequences are swept under the rug under the guise of preventing a hypothetical autocratic state. The irony is that this allows private (and arguably state) actors to perpetuate autocracy.

Previewing the Nuclear Age of the Internet — Private and Public Autocracies

In the early 2010s, much of the world was awestruck by the democratizing power of social media, which seemed to be toppling authoritarian regimes left and right. Much of that hype (like almost anything to do with tech) has since dissipated. The opaque smoke of those once hot-burning fires of revolution has cleared. We can see now that oppressive governments are not only still standing, but in many instances have co-opted the very social media tools we thought would be their undoing. With the benefit of hindsight, we can be more understanding and critical of what exactly our social media companies were exporting to the far corners of the globe if it wasn’t democracy: what we were exporting was our own laissez-faire ideals.

Whether you see tech as ushering us into utopia, a panacea for the world’s ills, or as an overhyped and dangerous pairing with American hubris likely depends on your ideology of capital. Larry Diamond’s aptly titled 2010 piece, “Liberation Technology,” is emblematic of neoliberal attitudes towards social media as a megaphone for the downtrodden in oppressive public spheres. In it, he is unequivocal in his support for tech and its ubiquity in democratic uprisings: “Since [2001], liberation technology has been instrumental in virtually all of the instances where people have turned out en masse for democracy or political reform” (p. 78). He cites the Orange Revolution in Ukraine and the Cedar Revolution in Lebanon, but ironically points to the Green Movement in Iran as the most “dramatic recent instance” (p. 79).

To date, the Green Movement illustrates both the potential and limits of liberation technology. So far, the Islamic Republic’s reactionary establishment has clung to power through its control over the instruments of coercion and its willingness to wield them with murderous resolve. Digital technology could not stop bullets and clubs in 2009, and it has not prevented the rape, torture, and execution of many protestors. But it has vividly documented these abuses, alienating key pillars of the regime’s support base (p. 80).

What Diamond has probably come to see along with the rest of us is that the Iranian government has since assembled a cybercrime team to arrest individuals involved in the Green Movement, used mass text messaging to threaten those who were influenced by “destabilizing” propaganda, and portrayed to its citizens how the harmonious efforts of American commerce and American diplomacy destabilized their government. The connection between these companies and the US government was also the message broadcast around the world to other governments, autocratic or not, with regard to how to create social media policies of their own. “Whereas a social movement has to persuade people to act, a government or a powerful group defending the status quo only has to create enough confusion to paralyze people into inaction” (Tufekci, 2017). Before long, Diamond’s exemplar of “liberation technology” became an oppressive one.


The neoliberal pipe dream of showering autocratic public spheres with social-media freedom-juice in the interest of democracy ignores the ironies of free speech outlined by Fiss (1997). He cautioned that when speech itself inhibits speech, “the classic remedy of more speech rings hollow” (p. 16). These social “platforms,” the term itself loaded with legal exculpatory frameworks, both profit from and pride themselves on their neutrality with regard to their user-generated content and the value gained from the surveillance of their users. Google, Facebook, Twitter, et al. are still widely viewed as democratizing expression in an unprecedented way, and that is certainly how they wish to be viewed. Twitter, the purported anti-censorship vessel of the Arab Spring, had its UK general manager Tony Wang proudly tout that they “are the free speech wing of the free speech party” (as cited in Halliday, 2012). It’s easy to forget that the other, more central interests these companies serve are those of profit and their shareholders, interests made clear by both Twitter’s and Facebook’s refusal to join the Global Network Initiative (GNI).

[The GNI is] an industry-wide pledge by other technology companies — including Google, Yahoo, and Microsoft — to behave in accordance with the laws and standards covering the right to freedom of expression and privacy embedded in internationally recognized documents like the Universal Declaration of Human Rights (Morozov, 2012, p. 23).

Facebook didn’t join the GNI until about three weeks before CEO Mark Zuckerberg testified before Congress in response to the Cambridge Analytica scandal. Twitter has yet to join.

Evgeny Morozov was one of the earliest voices to cry foul at social media’s billing as a democratic panacea. In the introduction to his 2012 book, The Net Delusion, he echoed Langdon Winner, who once quipped, “[A]lthough virtually limitless in their power, our technologies are tools without handles” (as cited in Morozov, 2012). To wit: whatever control our policymakers or our CEOs (the line between them growing increasingly unclear) believe they exercise when unleashing the technology upon the world, they have far less control and far more influence than they understand. And while Morozov flirts with implicating capitalism (the word appears only once in his introduction), he misses a lovely opportunity to pun hubristic Americana: What better place for a tool without an apparent handle than a world governed by an invisible hand?

Weapons of Math Destruction (WMDs)

Algorithms, the vehicle of these dangers and therefore rightly the subject of these debates, have recently become salient because of the foibles of one particular company and the strategies of its CEO and COO. Facebook and Mark Zuckerberg are the centerpiece of the larger conversation about surveillance capitalism because of the Cambridge Analytica (CA) scandal. In short, a former computer engineer at a right-wing political consulting company blew the whistle, suggesting that CA’s leveraging of captured Facebook data (to the tune of 50 million personal profiles) may have influenced the 2016 US presidential election. The source of the harvested data was, appropriately, a personality quiz that took advantage of Facebook’s lackadaisical restrictions on third-party apps accessing its core data resource: its users. For one of the first times in the corporate history of the internet era and surveillance capitalism, there is an emerging public understanding that the actions of these tech giants and their data projects can cause serious harm.

Social media weighing in reflexively on the Facebook/Cambridge Analytica scandal

This is not to say that there haven’t been a wide array of places outside of Facebook in which someone has been caught up in the wake of a sweeping, albeit clumsy, algorithm. It has just been hard to notice. Researchers have shown that when searching black-identifying names, embedded advertisements predict that the searcher wants to see arrest records (Sweeney, 2013). Google search results for “black girls” returned sexually explicit content as the first result (Noble, 2018). Gay men, looking up perfectly legal ways to meet other gay men, encounter problematic associations with sex offenders (Ananny, 2011). Governments, through the police and the courts, sometimes rely on algorithms for criminal-justice decisions that are empirically flawed and racist (Angwin et al., 2016; O’Neil, 2016; Christin, 2017).

These human and computer “mistakes” are all products of what Cathy O’Neil (2016) coined Weapons of Math Destruction: poorly designed and implemented machine-learning shortcuts that have immediate returns and often invisible or ignored externalities.

These WMDs have many of the same characteristics… They’re opaque, unquestioned, and unaccountable, and they operate at a scale to sort, target, or “optimize” millions of people… For many of the businesses running these rogue algorithms, the money pouring in seems to prove their models are working… The trouble is that the profits end up serving as a stand-in, or proxy for truth (p. 12).

Walking her readers through a number of examples of these clumsy algorithms that pervade public and private life, she concludes that “[w]e cannot count on the free market itself to right these wrongs,” because doing so “will mean putting fairness ahead of profit” in a world where selfishness has become an exculpatory moral framework. That remains a non-starter.

Part of the reason these algorithms do such a terrible job is that, despite being fueled by unprecedented rates of data collection and availability, they are executed with mathematics that predates most computers: logistic regression, first developed in the 1950s. In fact, as of 2015, almost all ad targeting on the internet was still using this kind of modeling. This is “why so many of the ads you see online are desperately irrelevant” (Chollet, 2018). There is almost no sophisticated intelligence operating in any of them, just linear algebra that takes a handful of X’s matched to a Y. But what Facebook and Cambridge Analytica showed many of us for the first time is the kind of social engineering that becomes possible when massive computing power and masses of personal data are applied to a specific objective with nefarious methods: possibly affecting an election through manipulation. An employee at a Bay Area tech firm told me that Facebook’s feature engineering is no longer logistic regression but engagement experimental-design mechanisms that “can run real time perfectly, characteristically matched cohort a/b testing of every engagement metric. Constantly” (“Anonymous”, 2019). The Weapons of Math Destruction are hurtling, with increasing sophistication, towards nuclear capability within a market framework that lacks any moral influence or ethical guidelines. And as long as we have the neoliberal veil over our eyes, our individualistic perspectives will make it difficult, if not impossible, for us to accept that our decisions are not our own.
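The “handful of X’s matched to a Y” can be made concrete with a toy sketch. Everything here is invented for illustration — the feature names and weights are assumptions, not any company’s actual model — but the shape is exactly what logistic regression is: a weighted sum of user signals squashed through a sigmoid into a predicted click probability.

```javascript
// Toy logistic-regression ad scorer. Weights and feature names are
// invented for illustration; real systems learn them from data.
function sigmoid(z) {
  return 1 / (1 + Math.exp(-z));
}

// Hypothetical learned weights: how much each user signal (an "X")
// nudges the predicted click probability (the "Y").
const WEIGHTS = { visitedCarSite: 1.2, age25to34: 0.4, onMobile: -0.3 };
const BIAS = -2.0;

function clickProbability(features) {
  // Weighted sum of binary features, squashed to (0, 1) by the sigmoid.
  let z = BIAS;
  for (const [name, weight] of Object.entries(WEIGHTS)) {
    z += weight * (features[name] ? 1 : 0);
  }
  return sigmoid(z);
}
```

A user flagged as having visited a car site scores higher than one who hasn’t, and that one-line weighted sum really is most of the “intelligence” behind the desperately irrelevant ads Chollet describes.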

Zunger (2018), a physicist by training and a former Google engineer, argues that one reason real reform hasn’t yet taken place is that the profession of computer science has yet to experience a true reckoning like those of other scientific fields. Physics has the atom bomb. Biology has eugenics. Civil engineering has bridge and dam failures. Computer engineering, however, has yet to be held responsible for any of the ills it perpetrates, in part because the veil of secrecy within which its “black boxes” operate occludes justice, and regulation interferes with profit. Meanwhile, Facebook operates as one of the vanguards of an industry willing to engage only in after-the-fact restitution for its algorithmic malfeasance.

Facebook’s original motto

They never seem to ask themselves “Should we?” before they find answers to the question “Can we?” When it comes to a computer scientist faced with a wall of code, the concern is always making the code work, not considering what the code might do to real people once it is enacted (Ullman, 2012). When the consequences become public, they follow up with lawsuits that overwhelm victims (Zuboff, 2015) or promises to fix things without addressing the actual issue: their core business model. It is hard to imagine that we are not headed for a reckoning to which Cambridge Analytica will prove only a small prelude.

The invisible hands operating our digital public sphere are private and few; they value exculpatory logics of platform architecture and free “marketplaces of ideas” with no constraints that might make them responsible to their democracy. And while our social media overlords want to hurl more algorithms at us to fix what they broke by moving too fast, no amount of revamped digital architecture will ever be enough to overwhelm the incentives of the market, because the market created the very architecture that gives it an ideological safety net.

Works Cited

Ananny, M. (2011). “The Curious Connection Between Apps for Gay Men and Sex Offenders.” The Atlantic.

“Anonymous”. (2019). Personal Communication.

Angwin, J., Larson, J., Mattu, S., and Kirchner, L. (2016). Machine bias. ProPublica, May 23, 2016.

Chandler Jr, A. D. (1993). The visible hand. Harvard University Press.

Chollet, F. (2018) What worries me about AI. Medium. Retrieved from: https://medium.com/@francois.chollet/what-worries-me-about-ai-ed9df072b704

Christin, A. (2017). The mistrials of algorithmic sentencing. Logic, 3. Retrieved from: https://logicmag.io/03-the-mistrials-of-algorithmic-sentencing/

Dewey, J. (1916). Democracy and education: An introduction to the philosophy of education. Macmillan.

Dewey, J. (1954). The Public and Its Problems.

Diamond, L. (2010). Liberation technology. Journal of Democracy, 21(3), 69–83.

Fiss, O. M. (1997). The Irony of Free Speech. Cambridge, Mass.: Harvard University Press.

Halliday, J. (2012). Twitter’s Tony Wang: “We are the free speech wing of the free speech party.” The Guardian.

Manovich, L. (2001). Principles of new media. In The Language of New Media (pp. 27–48). Cambridge, Mass.: MIT Press.

Massanari, A. (2017). #Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329–346.

Morozov, E. (2012). The Net Delusion: The Dark Side of Internet Freedom. New York: PublicAffairs.

Noble, S. U., (2018). Algorithms of Oppression: How search engines reinforce racism. New York University Press.

O’Neil, C. (2016). Weapons of Math Destruction. Crown.

Sweeney, L. (2013). Discrimination in Online Ad Delivery. ACM Queue 11(3): 1–19.

Taube, J. (2019). Are social media countercultural technologies? Lol, no. Medium. Retrieved from: https://medium.com/@McGrudis/are-social-media-countercultural-technologies-lol-no-7a8b3ec88678

Tufekci, Z. (2017). Twitter and Tear Gas: The Power and Fragility of Networked Protest. New Haven: Yale University Press.

Turner, F. (2019). Machine politics: The rise of the internet and a new age of authoritarianism. Harper’s Magazine.

Ullman, E. (2012). Close to the Machine: Technophilia and its Discontents. Picador.

Vogt, P. and Goldman, A. (2014). Reply All. [podcast] We Know What You Did. Available at: https://www.gimletmedia.com/reply-all/3-i-didnt-mean-to-break-the-internet#episode-player

Zuboff, S. (2015). “Big other: surveillance capitalism and the prospects of an information civilization.” Journal of Information Technology 30, no. 1: 75–89.

Zunger, Y. (2017). So, about this Googler’s manifesto. Medium.

Zunger, Y. (2018). Computer science faces an ethics crisis. The Cambridge Analytica scandal proves it. Boston Globe. Retrieved from: https://www.bostonglobe.com/ideas/2018/03/22/computer-science-faces-ethics-crisis-the-cambridge-analytica-scandal-proves/IzaXxl2BsYBtwM4nxezgcP/story.html


Juddson R Taube

Recent PhD graduate of Stanford's Graduate School of Education. I'm on the job market. Hire me!