Is privacy essentially contested?

Victor Zhenyi Wang
4 min read · Mar 21, 2024


The difficulty of arriving at a precise and universal definition of privacy suggests that privacy itself may be under-theorised. Mulligan et al (2016) suggest that the concept is "essentially contested." Richards (2019) surveys existing definitions but ultimately settles on what he calls a "working" definition of privacy. Cohen (2000) proposes a definition of privacy as "breathing room" linked to the formation of individual autonomy; this echoes Warren and Brandeis (1890), who defined privacy as a right to be "let alone". The breadth of approaches in academia suggests that something about the conceptualisation of privacy is contested.

Yet in practice, the most salient contestations of informational privacy today are largely fought over sites of privacy as applied, rather than over its conceptual core. For essentially contested concepts, a "common core" representing the concept is typically agreed upon, while disputes lie in specific applications of that concept, its "conceptions" (Rawls, 1971). In Mulligan et al (2016), this common core is presented as a collection of historical legal precedents, but even these are examples of the concept as applied. What exactly, then, is this common core?

In part, the difficulty of answering this question has to do with the politics of privacy today. Contemporary battles over privacy reveal an inextricable connection to power. Whose claims about where the boundaries of privacy ought to lie in a given context prevail is largely a question of who has the power to influence those boundaries. If dominant tech giants can shift norms around privacy, then the kinds of contestation that remain possible are correspondingly limited.

Consider the modern social media platform. These firms typically mediate two-sided markets, leveraging information about users on one side to create products of prediction for advertisers on the other. Platform operators can also use these products of prediction to broadly influence user behaviour, e.g. to maximise engagement or sway real-world behaviour. Platforms therefore wield considerable knowledge-power over their users; users know very little about the platforms, whose internal processes and mechanisms of influence are usually secret and represent what some have referred to as a redistribution of privacy rights.

Individual users may conceptualise privacy in particular ways in relation to other users. For instance, teenagers may expect their communications to be implicitly private within groups of friends, to the exclusion, especially, of their parents and other adults (boyd, 2014). Parents, of course, may feel that this expectation of privacy is unreasonable.

In this regard, the essential contestability of privacy plays out as a tussle between stakeholders drawing and redrawing boundaries on a map of privacy. Lawmakers, designers, and engineers all play a role in reifying these boundaries as law, policy, or code. For instance, parental controls on various social media platforms are an example of how code takes a position in drawing the limits of privacy expectations for minors, as the sketch below illustrates.
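As a minimal sketch of that claim, consider the following hypothetical visibility check. Nothing here corresponds to any real platform's API; the names User and post_visible_to_parent, and the age threshold, are invented for illustration. The point is that once such a function ships, the boundary it encodes is settled for every user it governs.

```python
from dataclasses import dataclass

@dataclass
class User:
    age: int
    parental_controls_enabled: bool

def post_visible_to_parent(author: User) -> bool:
    """Hypothetical platform policy: a minor's posts are exposed to a
    linked parent account whenever parental controls are switched on.

    The boundary of the teenager's privacy is fixed here, in code,
    regardless of what the teenager herself would have negotiated."""
    return author.age < 18 and author.parental_controls_enabled

# A 15-year-old with controls enabled has no say in the outcome:
print(post_visible_to_parent(User(age=15, parental_controls_enabled=True)))  # True
```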

Yet analysing the privacy landscape in this context requires an understanding of the existing power relations between the platform and its users. The vast difference in power between them means that the platform is far better placed to police the boundaries of privacy, across the areas that concern it, in its own interest.

One existing area of tussle in this domain lies in drawing the boundaries of "personal data". Platforms typically count personally identifiable information about their users as personal data. Privacy work is then situated in the secure storage of this data and in mechanisms that prevent leaks which could identify individuals, while the platform still lays claim to the right to use the data for analytics, trading, or other interests. Users, meanwhile, have pushed back against this framing, attempting to expand what privacy ought to cover both by redefining the personal (should behavioural data or metadata be included?) and by contesting how firms may apply the products of prediction (for example, discrimination when algorithms are deployed in areas such as credit or hiring). Yet legislative and regulatory bodies have struggled to defend a rapidly shrinking territory.
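The boundary-drawing itself often happens in something as mundane as a data-classification table. The sketch below is purely illustrative, with invented field names and category assignments; it shows how a single lookup can settle, for every downstream system, exactly the question that users and platforms are still contesting.

```python
# Illustrative only: an invented classification of user data fields.
# Which side of the line behavioural data and metadata fall on is
# precisely what users and platforms contest.
PERSONAL_DATA_FIELDS = {
    "email": True,                # uncontroversially personal
    "real_name": True,
    "ip_address": True,           # personal under the GDPR's broad reading
    "click_stream": False,        # behavioural data: the platform says no
    "message_timestamps": False,  # metadata: likewise excluded here
}

def is_personal(field: str) -> bool:
    # This return value, not public debate, decides how every pipeline
    # that calls it will treat the field.
    return PERSONAL_DATA_FIELDS.get(field, False)
```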

Legislation like the GDPR has not dramatically redrawn the contours governing the distribution of privacy rights; tech incumbents have merely reshaped the flow of information to comply with the new legal contours on paper, not in spirit. Similarly, emerging statistical privacy tools, while an important guarantee of privacy for specific permissible use cases, may help negotiate where to draw the boundaries on contested territories, but they do not renegotiate what ought to be permissible in the first place.
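Differential privacy is the canonical example of such a tool (my example; the text names none). A minimal sketch of its Laplace mechanism for a counting query makes the division of labour plain: the parameter epsilon tunes where the boundary sits for a query everyone has already agreed to allow, while the mechanism is silent on whether the query should be allowed at all.

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two i.i.d. exponential draws is Laplace-distributed.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_count(true_count: int, epsilon: float = 1.0) -> float:
    """Laplace mechanism for a counting query (sensitivity 1).

    Epsilon is the negotiable boundary: smaller values leak less and
    distort more. Whether counting this population was a legitimate
    query at all is decided entirely outside this function."""
    return true_count + laplace_noise(1.0 / epsilon)

print(private_count(1042, epsilon=0.5))  # e.g. 1037.3: noisy, boundedly private
```

Lowering epsilon moves the boundary, but no value of epsilon removes the query from the permissible set, which is exactly the limitation described above.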

If we leave privacy merely contestable, we cede far too much territory to the parties with the power to enforce boundaries favourable to their own interests. For privacy rights to function and represent the interests of the broad public, a real possibility of contest must first exist. The importance of a true common core for contested concepts lies in keeping that possibility open.
