The Real Minority Report?

--

I attended the excellent INTERPOL Digital Forensics Experts Group (DFEG) 2024 meeting last week, and the use of AI was top of the agenda. Its use in gathering evidence and reporting could open up a whole new world of digital investigation. In the same week, Police Scotland announced that it would explore the use of AI.

I appreciate that most of this relates to the recovery of data connected to a crime, but our society must worry about a Big Brother world in which citizens are observed in every aspect of their lives.

Minority Report, released in 2002, predicted the future fairly well, with its use of self-driving cars, touch analytics, personalised ads, video-controlled homes, facial/retina recognition, gesture-based actions, … and “predictive policing”.

In Minority Report, law enforcement used “precogs” to predict a crime before it happened and then intervene to stop it. This pre-crime approach uses the past to predict future events and thus identify risks. The machine assesses risk in much the way a human would: from one major red flag, such as someone who has just bought a firearm from an online site, or from an accumulation of smaller ones, such as someone continually posting angry messages about another person. As humans, we continually make these judgments about others and might often say that “we knew he or she was going to do that from the way they had been acting”.

Recently, an Israeli/US company, BriefCam, has developed software which analyses video footage and extracts key events, allowing police to take hundreds of CCTV video feeds and distil them down into keyframes.

The increasing use of predictive software in law enforcement worries many people, especially as it can result in false positives. An example used by the Washington Times reports that:

“…officers raced to a recent 911 call about a man threatening his ex-girlfriend, a police operator in headquarters consulted software that scored the suspect’s potential for violence the way a bank might run a credit report.

The program scoured billions of data points, including arrest reports, property records, commercial databases, deep Web searches and the man’s social-media postings. It calculated his threat level as the highest of three color-coded scores: a bright red warning.

The man had a firearm conviction and gang associations, so out of caution police called a negotiator. The suspect surrendered, and police said the intelligence helped them make the right call — it turned out he had a gun.”
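As a rough sketch of how such a scoring system might work (the data sources, weights, and band thresholds below are purely illustrative assumptions, and not the actual product’s logic), a three-band, colour-coded threat score could be computed along these lines:

```python
# Illustrative sketch only: aggregating records from different data sources
# into a colour-coded threat score, loosely in the spirit of the system
# described above. The weights and thresholds are invented for illustration.

def threat_score(record: dict) -> int:
    """Sum weighted 'red flags' drawn from different data sources."""
    weights = {
        "firearm_conviction": 40,
        "violent_arrests": 15,     # per prior arrest for violence
        "gang_association": 25,
        "threatening_posts": 5,    # per flagged social-media post
    }
    score = 0
    score += weights["firearm_conviction"] if record.get("firearm_conviction") else 0
    score += weights["violent_arrests"] * record.get("violent_arrests", 0)
    score += weights["gang_association"] if record.get("gang_association") else 0
    score += weights["threatening_posts"] * record.get("threatening_posts", 0)
    return score

def colour_band(score: int) -> str:
    """Map a numeric score onto three colour-coded bands."""
    if score >= 60:
        return "red"
    if score >= 30:
        return "amber"
    return "green"

if __name__ == "__main__":
    suspect = {"firearm_conviction": True, "gang_association": True,
               "violent_arrests": 1, "threatening_posts": 3}
    s = threat_score(suspect)
    print(f"Score {s} -> {colour_band(s)}")   # e.g. Score 95 -> red
```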

The HunchLab software is used in some US states and uses machine learning and predictive analytics to identify high-risk areas within cities. Its data includes crime incidents, arrests, and weather (hot conditions can often lead to an increase in crime), and this is then combined with real-time data from video footage and sensors placed around the city.
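As a minimal sketch of what this kind of hotspot scoring might look like (the features, weights, and bias below are invented for illustration and are not HunchLab’s actual model), each grid cell of a city could be mapped to a risk score like this:

```python
# Illustrative sketch only: a toy "hotspot" score for city grid cells.
# The features and weights are invented and do not reflect any real product.

from dataclasses import dataclass
from math import exp

@dataclass
class CellFeatures:
    recent_incidents: int   # crimes reported in the cell over the last 30 days
    recent_arrests: int     # arrests made in the cell over the last 30 days
    temperature_c: float    # forecast temperature (hot weather correlates with more crime)
    sensor_alerts: int      # real-time alerts from cameras/sensors in the cell

def hotspot_probability(f: CellFeatures) -> float:
    """Map a cell's features to a 0..1 risk score via a logistic function."""
    # Invented weights; a real system would learn these from historical data.
    score = (0.30 * f.recent_incidents
             + 0.20 * f.recent_arrests
             + 0.05 * max(f.temperature_c - 20, 0)
             + 0.40 * f.sensor_alerts
             - 3.0)                      # bias term
    return 1 / (1 + exp(-score))

if __name__ == "__main__":
    cell = CellFeatures(recent_incidents=6, recent_arrests=2,
                        temperature_c=29.0, sensor_alerts=1)
    print(f"Predicted hotspot probability: {hotspot_probability(cell):.2f}")
```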

A major criticism of predictive crime software is that police officers will often be deployed into high-risk areas and end up arresting more people there, which means that the software will keep predicting that the area has a high chance of crime, thus locking the system into a “feedback loop”. For many, the solution is that an area’s crime rate should be measured against other areas, and deployment should only be triggered when the rate is higher than would otherwise be expected (a simple version of this check is sketched below).
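A minimal sketch of that kind of baseline check might look as follows (the adjustment for patrol intensity and the flagging threshold are my own illustrative assumptions):

```python
# Illustrative sketch: flag an area only when its incident count is clearly
# above the city-wide expectation, to reduce the patrol-driven feedback loop.
# The patrol-intensity adjustment and threshold are invented for illustration.

from math import sqrt

def expected_incidents(population: int, citywide_rate_per_1000: float,
                       patrol_intensity: float) -> float:
    """Expected incidents if the area behaved like the city average.

    patrol_intensity > 1.0 means the area is already patrolled more heavily
    than average; heavier patrols record more incidents, so the expectation is
    scaled up rather than treating the extra recorded crime as a real excess.
    """
    return population / 1000 * citywide_rate_per_1000 * patrol_intensity

def should_flag(observed: int, expected: float, z_threshold: float = 2.0) -> bool:
    """Flag only if observed exceeds expected by more than ~2 standard
    deviations of a Poisson count (a crude significance check)."""
    return observed > expected + z_threshold * sqrt(expected)

if __name__ == "__main__":
    exp_n = expected_incidents(population=12000, citywide_rate_per_1000=4.0,
                               patrol_intensity=1.5)   # heavily patrolled area
    print(f"Expected: {exp_n:.1f}, flag at 60 observed: {should_flag(60, exp_n)}")
```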

Another major criticism of predictive crime analysis is that poor-quality data often leads to the amplification of racial biases. For example, it is well known that black men are more likely to be stopped than white men, and thus have more data recorded on them. Studies have shown that the Oakland PredPol system was twice as likely to target mainly black communities as mainly white ones for illicit drug use, even though medical data showed that drug problems were spread roughly evenly across the communities.

Predicting risk

Social care is also increasingly using machine learning to predict risks to individuals. In the US, state and national governments have been using methods which aim to predict whether a mother and father are fit to be parents before the child is even born. The good news is that New York City aims to stop this type of profiling.

The worst part of this is that the rate of false positives, within some studies, is as high as 95% (with only a 5% success rate).
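To make that figure concrete (the numbers below are invented purely to illustrate what a 95% false-positive rate among flagged cases would mean):

```python
# Illustrative arithmetic: what a 95% false-positive rate among flagged cases
# means in practice. The counts are made up to match the quoted figure.
flagged_families = 1000          # families flagged as "at risk" by the model
precision = 0.05                 # only 5% of flags turn out to be correct

true_positives = int(flagged_families * precision)        # 50 families
false_positives = flagged_families - true_positives       # 950 families

print(f"Of {flagged_families} flagged families, {false_positives} "
      f"were wrongly flagged and only {true_positives} were correct.")
```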

Many overpromise on machine learning, and allowing computers to profile people’s fitness to be a parent completely dehumanises the whole process. If we ever move to this kind of world, we might as well leave our planet to governments and machines. It will be a world without compassion, where computers predict our every move. In fact, we would just become another computer profile.

Trust in building a new world will evaporate.

Have a read of the article, and make up your own mind [here].

Conclusions

Digital methods can bring new ways to support citizens, and data analytics and machine learning will play a key part in this. The state does need to recognise when it oversteps the mark, and when our world becomes the nightmare envisioned by George Orwell.

Sometimes things start with the best of intent but end up with poor deliverables. I believe the Named Person’s Act in Scotland is an example of a system which started off with great intent, GIRFEC (Getting It Right For Every Child), but someone, somewhere, forgot that children and their families have the right to know how the risk assessment works. I’ve yet to meet one parent who actually knows how the Named Person’s Act will work in Scotland. An opportunity lost, with the state thinking that it knows best and operating within a closed environment.

Society needs to debate the use of predictive analytics; otherwise, you could start to see your profile matched to a crime and be arrested before you even commit it.

--


Prof Bill Buchanan OBE FRSE
ASecuritySite: When Bob Met Alice

Professor of Cryptography. Serial innovator. Believer in fairness, justice & freedom. Based in Edinburgh. Old World Breaker. New World Creator. Building trust.