Photo by Tony Webster licensed under CC BY-SA 2.0 | flickr


What People Think About Automated Enforcement and Why It Might Matter for Police Reform

Susan Miller · Published in 3Streams · 4 min read · Aug 21, 2020

Written by Lael R. Keiser and Susan M. Miller

Evaluations of police performance are at the lowest point in years, particularly among Black Americans. The killings of George Floyd and Breonna Taylor, among others, have once again highlighted concerns about systemic racism in the criminal justice system. Massive protests have called for reforming the police. Among the suggested police reforms is a call to shift some of the responsibilities of the police to other agencies and workers, including social workers, mental health professionals, and non-police traffic patrollers. Some have advocated a move toward more automated enforcement, particularly for traffic, as part of this reform.

Automated decision systems in government have potential downsides, of course. They can introduce bias, and they can expand government surveillance by making it cheaper and less noticeable. Yet certain forms of automated decision-making may offer benefits over the human alternative, such as reducing physical interactions with government agents. For policing, this may be particularly important. As we have witnessed, interactions with the police, such as traffic stops, can escalate into violence. Research indicates that Black drivers are more likely to be stopped than white drivers, and that the evidentiary threshold for searching Black and Hispanic drivers is lower than that for searching white drivers (also see this study). In this environment, automated decision systems that offer a way to reduce citizens' interactions with the police could be viewed positively by citizens, particularly minority citizens.

In our recent article, we examine attitudes toward a type of automated decision-making used in law enforcement: red light cameras. We were particularly interested in whether citizens were more likely to rate red light cameras as fairer than police officers when the police officers do not racially represent the citizens than when they do. This question matters given the underrepresentation of minorities in many police agencies.

Research suggests that a lack of racial representation reduces the perceived fairness, trustworthiness, and performance of a police agency and decreases the likelihood that police stops are viewed as legitimate. This leads to our question: Does a lack of racial representation among the police make citizens more likely to view automated decisions as fairer than police decisions?

To answer this question, we conducted a survey experiment examining attitudes toward red light cameras and the police. We find that Black respondents are more likely to rate a red light camera as fairer than a police officer when presented with a picture of local police officers who are not racially representative. We find no differences in fairness ratings among white respondents.

To gain insight into perceptions of fairness, we also asked respondents to explain their fairness ratings. Black respondents (both the full sample and the subset who rated the red light camera as fairer) who were presented with a lack of racial representation among the local police were more likely to indicate that concerns about drivers being searched were important to their ratings. This result highlights a notable characteristic of certain types of automation: the potential to limit physical interaction between the citizen and the police officer.

Our findings suggest that using automated decision-making systems, like red light cameras, may improve how Black citizens view the fairness of law enforcement decisions when there is a lack of racial representation among the police. Moreover, the potential to reduce physical interactions with the police seems to be an important consideration in this evaluation.

The flip side of our results also has implications for another possible reform — increased recruitment of minority officers. In our experiment, Black respondents were less likely to view the red light camera as fairer than the police officer when they were racially represented among the local police in the picture compared to when they were not. Given the possible pitfalls of automated decision-making, this is also an important finding, pointing to potential positive consequences associated with the recruitment of more minority officers.

We are not suggesting that either automated decision-making or increased recruitment of minority officers is a panacea for criminal justice policy. Both have challenges. Automated decision systems, in particular, should be carefully evaluated, as different systems have different drawbacks and benefits; some may increase the potential for physical interactions and enhance police power. However, our research demonstrates that automating certain tasks, like traffic enforcement, particularly in locations where police do not racially represent the communities they serve, may improve how members of the underrepresented community think about government decisions. This may be particularly important in areas, such as policing, where race is highly salient and limiting physical interactions can be desirable. And although biases may exist in automated decision-making, as noted recently by Kevin Drum, “With a computer algorithm, however, careful study can often identify biases — and once those biases are uncovered, they can be fixed. [. . .] Compare that to the years and years it would take to fight human racism with bias training and diversity programs and so forth, with no guarantee even then of success.”

Moreover, like many reforms, the use of red light cameras or other automated decision systems needs to be considered holistically and likely coupled with other reforms (e.g., traffic enforcement primarily handled by automated systems and non-police traffic patrollers). In the case of red light cameras, it is important to ensure that the placement of cameras does not introduce biases and to consider what police officers are doing instead of traffic duty. However, red light cameras and other forms of automated decision-making could be part of the reforms to shift responsibilities away from the police, which may enhance the perceived fairness of government decisions among Black Americans, particularly in areas in which they are underrepresented within the public service.


Susan Miller, writer for 3Streams, is Associate Professor of Political Science at the University of South Carolina.