Why Corporate Surveillance is Unjust and Must be Challenged

Surveillance & Society
Dec 11, 2017

The following is a blog post from Jonathan Cinnamon, whose new article, “Social Injustice in Surveillance Capitalism,” appears in the new open issue of Surveillance & Society.

///

Five decades ago, privacy scholar Alan Westin warned of serious consequences from a new form of surveillance based not on monitoring people directly, but on monitoring them via their personal data, which had been accumulating in government databases since the 1960s. Fast-forward fifty years to the present ‘big data’ era, in which personal behavioural data produced online and through interaction with digital systems is the fuel of a new mode of capitalism. In this era of ‘surveillance capitalism’, private corporations such as Google, Facebook, Target, Amazon, and Equifax are the key players, collecting and analyzing personal data to profile individuals and groups according to their spending habits, lifestyles, and behaviours. In the last decade these practices have accelerated beyond our ability to fully understand their consequences, but what is becoming clear is that they produce a variety of social harms that are both inadequately legislated against, owing to outdated laws and regulations, and insufficiently captured by the concept of ‘privacy’.

Some critics of corporate surveillance now suggest these practices should be considered threats to fairness, equality, autonomy, and freedom. In my recent article ‘Social Injustice in Surveillance Capitalism’, published in Surveillance & Society, I aim to make these emerging claims explicit. To do so, I draw on the theory of justice developed by the political philosopher Nancy Fraser, which provides a useful set of conceptual tools for thinking about how the architecture of corporate personal data accumulation and control threatens parity of participation in social life, and why it must therefore be a target of social justice contestation and reparation.

Specifically, the article describes how unfair web and digital platform use agreements enable the first form of injustice, maldistribution, in which companies accumulate and profit from highly valuable personal data while restricting users from accessing, disputing, or benefiting from it themselves. This paves the way for the second form of injustice in Fraser’s framework, misrecognition, which occurs when the data are subjected to data mining and automated analysis by computer algorithms. These opaque and secretive processes sort people into categories for profiling and targeting, which can then influence their future ability to obtain things like jobs and mortgages. A third form of injustice, misrepresentation, occurs when transnational companies transfer and store personal data in foreign jurisdictions, effectively preventing people from making legal claims against uses of their data that might otherwise be illegal in their home country. Although their consequences diverge, each type of injustice can be understood as limiting a person’s ability to participate in social life, a normative value common to all democratic societies.

Here I draw attention to the need for new concepts and frameworks to understand the social harms of surveillance capitalism, towards the development of new legislation and data use policies. Beyond describing these data injustices, however, a further question is intriguing and should be the focus of future attention: how could the personal data landscape be reorganized as a force not for injustice and oppression, but for social equality and advancement?

///

Read “Social Injustice in Surveillance Capitalism” here.
