The ‘Minority Report’ of the digital age

Technology has penetrated almost every system in the modern world. We rely on technology for anything and everything, from ordering a pizza to paying our taxes. Governments at all levels are adopting technology to help improve citizens' quality of life. While most of these technologies have undoubtedly improved quality of life, sometimes technology can cause more harm than good, as in the case of predictive policing.

Predictive policing is an analytical technique used by law enforcement agencies to predict potential criminal activity. Since its initial use by the LAPD in 2008, predictive policing has spread to various states and is increasingly being adopted by police agencies across the United States. While law enforcement agencies claim that predictive policing has helped reduce criminal activity where it has been implemented, various minority groups have raised concerns about racial discrimination resulting from the outputs of these black-box systems.

Predictive policing algorithms comb through past crime data to tell officers which people and places are most at risk for future crimes. Unfortunately, the fundamental problem with such a system is precisely that these algorithms rely on past crime data.

“Crime data is notoriously bad, easily manipulated, glaringly incomplete, and too often undermined by racial bias.” — Ezekiel Edwards, Director of the ACLU Criminal Law Reform Project

When such polluted data is fed into a machine, the predictions will reflect the same biases present in the data. Unfortunately, police departments nationwide have been willing to implement these practices despite scant evidence of reliability, and with little public debate or transparency.
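The feedback loop described above can be illustrated with a toy simulation (not PredPol's actual algorithm; the neighborhoods, rates, and patrol counts here are hypothetical). Two neighborhoods have the same true crime rate, but one has historically received more patrols, so more of its crimes are recorded. A model trained only on the recorded counts then "learns" that the over-policed neighborhood is more crime-prone:

```python
import random

random.seed(0)

# Two neighborhoods with the SAME underlying crime rate (hypothetical values).
true_crime_rate = {"A": 0.10, "B": 0.10}

# Historical policing sent twice as many patrols to neighborhood A,
# so crimes there were twice as likely to be observed and recorded.
patrols = {"A": 2, "B": 1}

recorded = {"A": 0, "B": 0}
for day in range(1000):
    for hood, rate in true_crime_rate.items():
        # Each patrol is an independent chance to observe (and record) a crime.
        for _ in range(patrols[hood]):
            if random.random() < rate:
                recorded[hood] += 1

# A naive "predictive" model trained on recorded counts alone concludes
# that A is roughly twice as crime-prone, and directs even more patrols
# there, which in turn inflates A's recorded counts further.
ratio = recorded["A"] / recorded["B"]
print(recorded, round(ratio, 2))
```

Even though both neighborhoods are identical by construction, the recorded data makes neighborhood A look about twice as dangerous. That is the core of HRDAG's critique: the data measures where police looked, not where crime happened.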

A recent study published this month in the academic journal Significance shows that the predictive policing software created by PredPol increases racial bias.

Another study, by the Human Rights Data Analysis Group (HRDAG), came to similar conclusions. When HRDAG applied the tool to crime data in Oakland, the algorithm recommended that police deploy officers to neighborhoods with mostly black residents.

Figure: Results of a simulation of the predictive policing algorithm in Oakland.

PredPol CEO Brian MacDonald has clarified that PredPol isn’t used for drug crime, but instead looks at reports for crimes like assault, robbery and auto theft, precisely to avoid the kind of bias the HRDAG discovered.

As a student in an Urban Data Science course, I understand the benefits of applying technology in society. But unlike private-sector applications, the ethical impact of such radical technology should be rigorously discussed, debated, and analyzed before deployment, and I believe this is missing in the case of predictive policing. Predictive policing has only widened the rift between the police and the communities they protect. As President Barack Obama aptly said, all that communities want to see is a better relationship between the police and the community, so they can feel that the system is serving them; which, in essence, is a matter of trust in the functioning of that system. This is not to suggest that police should cease to adopt technology that increases their effectiveness and indirectly promotes community health and safety. But until police departments address the concerns of the invested communities (citizens at large), they should hit pause on predictive policing.