Living in the Age of Surveillance Capitalism

In this post, Stevie Docherty and Justine Gangneux reflect on the seminar ‘Living in the Age of Surveillance Capitalism’, which took place at the University of Glasgow in June 2019.

Shoshana Zuboff’s book The Age of Surveillance Capitalism (London: Profile) quickly drew widespread popular and academic attention when it was published earlier this year. In exhaustive detail, Zuboff explores the scale and consequences of the tracking and nudging of everyday life, via online and offline technologies, in the pursuit of profit. A review by Sam Biddle for The Intercept called the book “a masterwork of horror…Even those who’ve made an effort to track the technology that tracks us over the last decade or so will be chilled to their core”.

Zuboff previously introduced her concept of surveillance capitalism in a 2015 article in the Journal of Information Technology, where she used the term to describe an “emergent logic of accumulation in the networked sphere” (2015: 75). This was followed by a 2016 piece in the Frankfurter Allgemeine Zeitung. There, she singled out Google and its parent company Alphabet as “ground zero for a wholly new species of capitalism in which profits derive from the unilateral surveillance and modification of human behaviour”. At the same time, she argued, the surveillance capabilities of companies like Google and Facebook were surpassing those of traditional surveillance operators and state agencies like the NSA.

The Age of Surveillance Capitalism builds extensively on this earlier work, exploring what Zuboff sees as the key aspects of surveillance capitalism. These include “behavioural surplus” — the raw material derived from the transformation of our everyday lives and experiences into data. These data might be pressed into service improving products and services, but they might also be refined and processed further into what she calls “prediction products” that can be used to anticipate our future behaviours. Such products are also a form of capital that can be bought and sold by players in the expanding “marketplace of behavioural futures”. Greater gains lie not just in knowing what we do and are likely to do next, but in actively shaping our actions to various ends. In this way, she argues, the “means of behavioural modification” become the locus of a new kind of power (“instrumentarian power”).

Linked with this is something that Zuboff calls the “problem of two texts”, fostered by the mechanisms of surveillance capitalism.

“When it comes to the first text, we are its authors and readers. This public-facing text is familiar and celebrated for the universe of information and connection it brings to our fingertips […] Much of this public-facing text is composed of what we inscribe on its pages: our posts, blogs, videos, photos, conversations, music, stories, observations, likes, tweets…” (2019: 186).

Behind, through and under the first text, surveillance capitalism prompts the creation of a second, “shadow” text:

“The first text…actually functions as the supply operation for the second text, the shadow text. Everything that we contribute to the first text, no matter how trivial or fleeting, becomes a target for surplus extraction. That surplus fills the pages of the second text. This one is hidden from our view: ‘read only’ for surveillance capitalists” (ibid).

Zuboff writes that the shadow text is capable of saying “more about us than we can know about ourselves”. Similar territory has been explored by others including Deborah Lupton (2018), John Cheney-Lippold (2017), and Rob Horning (2017). The idea of the self as “knowable” through data offers its own consolations, as Horning points out — and it’s important not to discount their appeal. He goes on to argue that binding identity to consumer choices “makes us more knowable to others in this datafied form than we are to ourselves. But being scored through our data also feeds the fantasy that we are essentially knowable, that we can know ourselves completely and totally…Algorithms promise a simple solution to the riddle of the self, should we want one”.

It’s interesting to consider this in light of the credit reporting company Experian’s recent “data self” advertising campaign. In a series of adverts (see this one, for example), actor Marcus Brigstocke plays the twin roles of Dan and his “data self”. Dan’s data self is the version of him that lenders see, made up of his past credit transactions. His data self shadows him at work, on dates, at home — an occasionally annoying but helpful presence (as when he assists Dan in buying a new car). The adverts depict the data self as a cuddly physical manifestation — a knowable entity. Experian also released a YouTube video inviting viewers to use an interactive guide to find out more about their own data selves. The stated aim was to help people make better financial decisions. But to return to Zuboff’s perspective, the shadow text is only ever incidentally helpful to us as consumers. Rather, it rewrites us as consumables.

Chilling perhaps, but does any of this really come as much of a shock? Evgeny Morozov (2019) notes that Zuboff’s characterisation of surveillance capitalism as “our new invisible Leviathan” fails to recognise the ways in which “power, under capitalism, has been operating for several centuries: the invisible Leviathan has been with us for quite some time.” Elsewhere, Kirstie Ball has pointed out that much of the ground covered in the book has already been broken by surveillance studies scholars over the past twenty years. She argues that it “deploys the term surveillance in its popular form as a sensitising device…It has been written by someone who has spent their working life at an elite business school, and it reflects both the US business context and the form of critique that arises in the vernacular of the US business school”. As such, it is best read as a “wake-up call to the educated business reader”, rather than as a novel contribution to the fields of surveillance, security and/or media studies.

A nexus of events, including the Trump election, the EU referendum and particularly the Facebook/Cambridge Analytica revelations in March 2018, has drawn renewed public attention to the challenges posed to society and democracy by the “datafication” (van Dijck 2014) of everyday life. A rich variety of recent work has sought to articulate and understand the present condition in differing terms: from “platform capitalism” (Srnicek 2017) and “data capitalism” (West 2019), to “platform surveillance” (Wood and Monahan 2019) and the “platform society” (van Dijck, Poell and de Waal 2018).

Zuboff’s book is just one part of this wave. At the same time, its juggernaut status and its publication just under a year after the Cambridge Analytica story broke seemed to us to offer a useful prompt for discussion. What other meanings and manifestations of surveillance capitalism might there be? If we are living in an age of surveillance capitalism, what are some of the particular challenges we face as a result? And what possible tactics or solutions can we imagine in response?

With the support of the Department of Sociology at the University of Glasgow, and the newly-formed Glasgow Social and Digital Change Group, we were fortunate to be able to welcome a panel of four scholars working at the cutting edge of their fields to explore these questions: Dr Jennifer Pybus (King’s College London), Dr Morgan Currie (University of Edinburgh), Professor Sally Wyatt (Maastricht University) and Dr Zoetanya Sujon (London College of Communication, University of the Arts London).

The four seminar presentations are outlined briefly below, together with the seminar audio and the slides that accompanied each presentation, in the hope of encouraging ongoing engagement with these ideas beyond the seminar room. We thank the speakers for joining us in Glasgow in June 2019, and for their generosity in permitting us to share these resources.

Dr Jennifer Pybus

iData: Politics of Personalisation on our Mobile Devices

Slides / Audio

Drawing on her existing Arts and Humanities Research Council project, “Zones of Data Translation”, Dr Pybus addressed how we can better understand the technological objects that enable the capture of our personal data via the applications we routinely download on our mobile devices. From Zuboff’s perspective, technological objects centred on data extraction are often aimed at making sense of our personal data and carry with them an inherent promise of behaviour prediction. In turn, this is predicated on an unprecedented level of data granulation or personalisation: “the productive accumulation and actionability of our digital traces”.

The new logic of accumulation thrives on the subsumption of sociality into multivalent data points via the rise of ‘adtech’ or ‘martech’: umbrella terms for the technologies and third parties that enable increasingly personalised analytics and other tools for micro-targeting consumers. But what are the material building blocks of datafication on our mobile devices? Or, put another way, what are the technical components — from hidden permissions to user-friendly interfaces — that allow our personal data to flow from the applications that we use? By expanding our capacity to better understand these technological objects on our mobile devices, can we foster a more agentic and engaged citizen, one who is able to evaluate her apps in a different way?

Dr Morgan Currie

Before the Bullet Hits the Body: Organising Against Predictive Policing in Los Angeles

Slides / Audio

The Stop LAPD Spying Coalition is a diverse collective of anti-surveillance activists in Los Angeles which began campaigning against predictive policing around 2015. Their aim is “the dismantling of government-sanctioned spying and intelligence gathering, in all its multiple forms.” The goal of anticipating crime and stopping it before it happens isn’t new; what is new are the techniques of algorithmic risk assessment, data-driven modelling and the like that police and other law enforcement bodies have begun adopting in recent years (Ferguson 2017). Like Zuboff’s “prediction products”, predictive policing is driven by a need for ever-greater amounts of data at ever-finer degrees of granularity. Yet algorithms do not “predict”, in that they cannot “tell” the future — rather, they deal in statistical probabilities. Such models frequently penalise poorer and more marginalised social groups, even as our ability to control our own data representations declines.

Dr Currie’s paper explored the background and activities of the Stop LAPD Spying Coalition along with their 2018 report, Before the Bullet Hits the Body — a withering critique of predictive policing that exposes the sociotechnical dimensions of predictive algorithms and the biases found in police data. These anti-surveillance activists use investigative techniques such as open records requests, alongside old-school community organising, to contest the rise of mass data gathering by law enforcement via technologies such as facial recognition software, body cameras, and drones. What can citizens do to evade (or eradicate) these techniques?

Professor Sally Wyatt

Health and Healthcare in a System of Surveillance Capitalism

Slides / Audio

Digital technologies have long promised to revolutionise healthcare in all sorts of ways: from enabling people to become better informed about their health problems and possible treatments via the information freely available online, to more recent claims that access to data through (self-)monitoring apps and devices will promote health and well-being. Yet these claims are often not empirically well founded.

Moreover, the promises often mask corporate or other institutional needs to reduce costs, to sell devices or to exercise control over individuals. Professor Wyatt’s presentation examined recent developments around the quantified and biopolitical self and how these contribute to surveillance capitalism. The issues addressed included the shift away from top-down, panoptical surveillance towards intimate surveillance and self-monitoring (like that enabled by the fertility tracking app FEMM); the implications of popular direct-to-consumer genetic testing ventures like 23andMe and their web-based research agendas; and how we might think of the design of these technologies in terms of “hostility”, as well as or in contrast to their user-friendliness or accessibility.

Dr Zoetanya Sujon

The Rise of Platform Empires: Sociality as Mass Deception

Slides

The 2018 Cambridge Analytica ‘scandal’ revealed the mass collection of data not only from willing Facebook participants, but also from their non-consenting Facebook friends. In her presentation, Dr Sujon highlighted that the language of “scandals” or “data breaches” in connection with such events is misplaced: mass, covert data collection from willing and unwilling users (and non-users) is not an aberration, but an industry standard driving the working business model for social media and digital platforms. The Cambridge Analytica case provides deep insight not only into this business model but also into Facebook’s role in the rise of platform empires — platforms as “extensive ecosystems of activities”. These empires are playing a critical role in shaping social interaction and global economics, surveillance capitalism and data colonialism — the topic of recent work by Nick Couldry and Ulises Mejias.

While GAFA (Google, Amazon, Facebook, Apple) and BAT (Baidu, Alibaba, Tencent) make up these platform empires, Facebook has led the way in presenting social connection as its primary aim, rather than the increasingly sophisticated collection of personal data in exchange for highly profitable targeted advertising. This mass deception is significant for a number of reasons: it has historic precedents in the culture industries (cf. Adorno and Horkheimer 1944); it presents digital sociality as an experience of connection and visibility while also transforming sociality into a process for invisibly producing data; and it obscures the protection of privacy-as-a-right through a complex language of copyrights and data ownership. It is this kind of deceptive sociality which promotes the rise of platform empires (and platform imperialism), eroding social privacy and transforming ordinary people into data subjects.

Biographies

Dr Jennifer Pybus (@jrpybus) is a Lecturer in Digital Culture and Society in Digital Humanities at King’s College London. Her research focuses on the diverse ways in which our digital lives are being datafied, turned into social big data that fuels our increasingly personalised, data-intensive economy. More specifically, she is interested in questions around youth and privacy, and in how the third-party ecosystems found on social media platforms are transforming the advertising industry via the rise of data analytics and algorithmic processes. Her current research looks at the politics of datafication and everyday life, specifically in relation to those critical points of tension that lie at the intersections between digital culture, data and emerging advertising and marketing practices.

Dr Morgan Currie is Lecturer in Data and Society in the Department of Science, Technology and Innovation Studies at the University of Edinburgh. She engages with the relationship between data and power and asks how data infrastructures condition the possibilities for democratic governance, civic behaviour, and political struggle. Her research explores how civil society can use data as a tool to shape and contest political issues and also how new information technologies might open, or foreclose, democratic decision-making within government institutions.

Sally Wyatt (@wyatt_sally) is Professor of Digital Cultures at Maastricht University. She has many decades of teaching and research experience concerning digital technologies, including topics such as digital divides, open research data, digital humanities and how people find and create health information online. In 2016, together with Anna Harris and Susan Kelly, she published CyberGenetics: Health Genetics and New Media (Routledge), which was awarded the Foundation for the Sociology of Health and Illness book prize in 2017. In September 2019, Maastricht University launched a new BA Digital Society, of which Wyatt is the programme director.

Dr Zoetanya Sujon (@jetsumgerl) is a Senior Lecturer and Programme Director for Communications and Media at London College of Communication, University of the Arts London (UAL). Her research interests broadly address the relationship between “new” technologies and digital culture. Currently, these interests are based around four themes: social technologies and platform politics; the intersections between privacy and sharing culture; innovation and virtual technologies; and the impact of digital media on changing skill sets and digital literacies. Her book The Social Media Age is forthcoming with Sage.

Acknowledgements

Living in the Age of Surveillance Capitalism was made possible by funding from the Department of Sociology at the University of Glasgow and from the Glasgow Social and Digital Change Group.
