What does “banal surveillance” look like?

Azadeh Akbari
surveillance and society
7 min read · Sep 26, 2022

In the post below, Gabriel Pereira and Christoph Raetzsch reflect on their article, “From Banal Surveillance to Function Creep: Automated License Plate Recognition (ALPR) in Denmark,” which appeared in the 20(3) issue of Surveillance & Society.

As you walk the streets of your city, you may see signs that you are being surveilled. These signs often show just a stylised drawing of a camera. Others may have cryptic phrases such as “POLICE ANPR in use”. But what does being surveilled really mean? And what is actually happening behind the surveillance camera that we see?

In our article for Surveillance & Society, we study a rather simple technology of what we call banal surveillance: Automated License Plate Recognition (ALPR) systems. These systems may appear banal because they only surveil car license plates and are often used for seemingly worthwhile purposes. However, ALPR systems are much more than just a camera in a public space. The camera itself is only the visible sign of a whole infrastructure that automatically captures, stores, and analyses car license plates through databases and algorithms.

Since writing this article, we have become much more sensitive to how ALPR appears in our everyday life. We wanted to use this blog post to show what “banal surveillance” looks like, discussing our main findings and arguments through pictures of ALPR infrastructures we have found since starting this research. Our research takes inspiration from, among other projects, Lisa Parks’ work on the politics of infrastructural visibility, which discusses how cell towers are disguised as trees, and coveillance.org’s surveillance walking tour of Pittsburgh, which seeks to expose information networks hidden behind streetlights, maintenance hole covers, and utility boxes.

In Figure 1, for example, you see a sign with the symbol of a camera. It’s a common sight in the streets of London (UK). But what kind of camera is it? Does the sign indicate a CCTV or an ALPR system, and for what purpose is it used? Figure 2 shows a similar but less common sign indicating the use of ALPR (ANPR in UK spelling) by the Metropolitan Police. A link to the police website is included, but the Met offers no maps or further information on what each surveillance camera does.

Figure 1
Figure 2

As we explain in the article, which focuses on Denmark specifically, the use cases of ALPR vary widely. In the past six years, the number of ALPR cameras used by the Danish Police has jumped from 24 to 160 stationary cameras (mounted on poles) and from 48 to 171 mobile cameras (mounted on police cars). The data generated by these cameras is stored and used for different purposes, such as issuing alerts for cars on a “watch list” and investigating crimes.

Beyond policing, we also discuss parking and environmental zoning as use cases of ALPR. For parking operators, the goal is to create a “barrier-free” customer experience: drivers don’t need to pick up a parking ticket and can quickly pay through an app. The same technological setup for contactless payment can also be used for public parking. In Figure 3, you can see a vehicle enforcing the “Zona Azul Digital” in São Paulo (Brazil), where fines can be issued to infringing car owners through automated capture and analysis of license plates. Through such surveillance technologies, the use and accessibility of public space are being transformed. When data is collected automatically, and automated processing of that data leads to tangible effects for citizens, like having to pay fines or being excluded from specific spaces, such seemingly banal technologies raise fundamental questions about the oversight of “dragnet” data collection and automated decision-making (ADM). This is especially crucial when and where public data protection regulation and governance are weak.

A third use case of ALPR discussed in our article is environmental zoning, which aims to automate the enforcement of “Low Emission Zones” by recording the plates of highly polluting cars entering restricted zones. The system can be configured in different ways, including how captured license plates are stored and processed. As we argue in this case, the decision to issue a fine or to delete the data can be automated or deliberately left to human staff. In sum, technologies do not dictate only one mode of operation. Figure 4 shows a sign indicating the presence of Ultra Low Emission Zone (ULEZ) cameras in London (UK). The use of ALPR for enforcing environmental zones in the UK is recent and has received criticism, particularly from right-wing parties. Unlike the Danish case we analyse in the article, London’s ULEZ system already shares the data it gathers with the police, a clear case of “function creep” (the broadening of a technology’s use beyond its original goals).
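Our point that the same hardware supports very different configurations can be sketched as a pair of policy switches. The flag names and outcomes below are our own illustrative assumptions, not taken from any deployed system.

```python
from dataclasses import dataclass

@dataclass
class ZoneConfig:
    # Illustrative policy switches: the same cameras support all combinations.
    auto_fine: bool           # issue fines automatically, or queue for human review
    retain_non_matches: bool  # keep reads of compliant cars, or delete them at once

def process_read(plate: str, is_polluting: bool, cfg: ZoneConfig) -> str:
    """Decide what happens to one license-plate read under a given policy."""
    if not is_polluting:
        return "stored" if cfg.retain_non_matches else "deleted"
    return "fine issued automatically" if cfg.auto_fine else "queued for human review"

# A data-minimising configuration vs. a maximal-collection one:
strict = ZoneConfig(auto_fine=False, retain_non_matches=False)
broad = ZoneConfig(auto_fine=True, retain_non_matches=True)
print(process_read("AB12345", False, strict))  # deleted
print(process_read("AB12345", True, broad))    # fine issued automatically
```

The design choice encoded in `ZoneConfig` is precisely what is invisible on the street: two identical-looking cameras can sit at opposite ends of the privacy spectrum.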

Figure 3
Figure 4

Other uses of ALPR exist, some of which we don’t cover in the article. Figure 5 shows ALPR at a gas station in Aarhus, Denmark. As in many places worldwide, you can pay automatically when driving out of the station by registering your license plate and credit card in an app called “Easy Fuel”. But how seamless and convenient is it when this information could be combined with other data, e.g. to create a personal profile of your movements and energy consumption?

Figure 5

In the article, we argue through different use cases that ALPR enacts a form of “banal surveillance”: a system that is widely implemented but operates inconspicuously, under the radar of public attention and comprehension. Take Figure 6, for example. We believe it shows a ULEZ ALPR camera in London. However, it’s hard to see (and photograph) these cameras, as they are small and placed very high up on the pole. Even more critical for analysing such surveillance infrastructures is that the camera in the street reveals nothing of the algorithmic analyses and automated decision-making routines that can happen far from the site of data collection. Understanding the banality of ALPR, in our cases, underlines that surveillance becomes socially and politically acceptable precisely by presenting itself as a way to be more efficient and, at the same time, as too banal to criticise or even bother with.

Figure 6
Figure 7

Because ALPR does not capture very personal data such as your face or health records, its effects are felt to be less intrusive. But the same critique of surveillance that we find in discussions of social media platforms analysing their users and steering their behaviour through automated decision-making algorithms applies to ALPR infrastructure. Our core argument is that these seemingly banal infrastructures allow new functions of surveillance to become embedded in the long term. Once installed for a rather delimited purpose, they generate new objectives and goals for enhanced data collection, analysis, and action.

This gradual expansion of the initial goals and design of a surveillance technology is often discussed as function creep, defined by the Dutch law scholar Bert-Jaap Koops as “an imperceptibly transformative and therewith contestable change in a data-processing system’s proper activity.” While the initial installation of a technology may be limited and legitimate for a narrow set of purposes, such technologies tend to pass a “tipping point” beyond which their initial functions are diluted and expanded. Data can be collected for environmental zoning, for example, to improve air quality in cities. But what are the consequences once such data is shared with the police and analysed for crime detection, targeted operations, or group profiling?

As we argue, function creep can be curtailed and partly avoided when explicit legal constraints are imposed and public governance of such technologies is made possible. One key issue is the visibility and public awareness of these systems. What purposes a camera serves and how it records, stores, analyses, or deletes data is not visible from the outside, and it is often difficult to find out, as each system allows for a high degree of customisation. Figure 7 shows an automated toll collection system in Tokyo (Japan), and Figure 8 shows a similar ALPR surveillance system in Bologna (Italy). But how can we be sure what kinds of data are collected here and how they are analysed, stored, or fed into other systems? Addressing this invisibility of banal surveillance technologies, Danish activists have created a crowdsourced map of ALPR cameras in Denmark (anpg.dk). Only through this crowdsourced resource can we know, for example, that Figure 9 shows an ALPR camera placed between two highway roads in Copenhagen (Denmark), seemingly used for policing purposes.

Figure 8
Figure 9

How do you see “banal surveillance” in your everyday life? Think about the streets in your neighbourhood. Are there speed cameras, CCTV, “smart lamp posts”, or other forms of surveillance that you’ve grown accustomed to ignoring? How could you determine what these cameras do, who installed them, and on what legal basis? Can you find out who owns the data and how and where it is processed? We promise: find out, and you will be surprised.

(All pictures are copyrighted ©Gabriel Pereira and Christoph Raetzsch)

Azadeh Akbari

Assistant Professor in Public Administration & Digital Transformation, University of Twente, and Digital Editor at the journal Surveillance & Society