My sons were profiled by a racist predictive policing system — the AI Act must prohibit these systems

Fair Trials
5 min read · Sep 28, 2022


By Diana Sardjoe

When I found out my sons were placed on lists called the ‘Top 600’ and the ‘Top 400’ by Amsterdam’s local council, I thought I was finally getting help.

The council says the purpose of these lists, created by predictive and profiling systems, is to identify young people who have been in contact with the police and give them “extra attention from the council and organisations such as the police, local public health service and youth protection”, to prevent them from coming into contact with police again. [1]

This could not have been further from the truth.

These predictive systems assess and profile young people in Amsterdam to determine who is likely to commit crimes in the future; the ‘Top 600’ profiles children over the age of 16, and the ‘Top 400’ profiles children under 16.

As a result of my sons being profiled by these systems, they were continually monitored and harassed by police. One was repeatedly accused of criminal activity along with other young boys he hung out with. Another time, I had to deal with police coming into my home. My son was arrested and I had to go and pick him up from the detention centre late at night. Other mothers I know were threatened with having their children taken away.

We had to deal with a system based on prejudice, with the weight of the authorities behind it. I was completely alone in this struggle, and there was nothing I could do. During this time, my sons withdrew further and further. I felt like a prisoner, watched and monitored at every turn, and I broke down mentally and physically, ending up on cardiac monitoring.

These events were an eye-opener for me, and they made me look for answers. How did my sons end up on these lists? What information do they use to decide this? How is it that it is always young, minoritised ethnic boys on these lists? How do the authorities have so much information on me and my family, and how is that justified? When does the monitoring start — and when does it end? I started a group for other mothers called The Mother is the Key, to share our stories, and start questioning and challenging the authorities.

So let me tell you the problems with these profiling systems.

First of all, the information they had about me and my family was inaccurate and prejudiced. The authorities’ data collection on young people often starts in school, and the Top400 system draws on this information. [2] Maybe the child missed some days of school. Maybe they had to go to a different school. All of this is registered, and the Top400 system uses it to profile a child as a future criminal.

Once these authorities make a negative report about you, and it’s filed away in their system, you’ll never get rid of it. This is the consequence of our database society: you no longer have control over your life. Every time you interact with an authority or agency, you notice that they are suspicious of you, that they hold some prejudice against you. Your data and these profiles and predictions are shared amongst different public authorities, and held against you forever.

Another piece of information the police use to profile young boys on these lists is ‘contact’ they have had with the police. [3] This contact is often purely the result of ethnic profiling. I once saw my son, aged thirteen, being stopped by police and asked for ID while he was walking home down the street. That was enough for him to be registered on the police database as having had ‘contact’ with them. In this way, racist policing becomes hard-wired in the police data, and young men from racialised backgrounds are of course unfairly over-represented within that data. The information that goes into these systems is the same as the information that comes out. If prejudice goes in, prejudice will come out.

These predictive and profiling systems used by police and criminal justice authorities always use this same racist data and produce the same outputs. They are built on an idea of what a ‘criminal’ is and what their background and profile look like, an idea that in western countries is often tied to ethnic minorities. Anyone who fits this mould is considered a ‘risk’ and criminalised. The majority of young people profiled by the Top400 and Top600 are from minoritised ethnic groups, including people with Moroccan or Surinamese heritage, or from neighbourhoods and districts where many residents have migrant backgrounds.

I do not agree with the use of these predictive and profiling systems, and I think they should be banned. They are inaccurate and racist, and they label people as criminal, including children, before they have done anything wrong, often just based on their background. This is completely unjust, and we must act before it is too late.

Many groups in Europe have investigated and challenged the use of these harmful data-driven technologies. In the US, where many of these technologies were first implemented, community groups have advocated for, and won, legislation prohibiting predictive policing systems and other invasive surveillance technologies, like facial recognition. We must prohibit them in Europe too.

The AI Act, a law which aims to regulate these types of systems, is currently being considered by the European Parliament. But this proposed law does not go far enough — yet. It should ban law enforcement and criminal justice authorities from using these predictive and profiling systems completely, to protect people like my sons from racism and injustice.

Tell your MEP to ban predictive systems in policing and criminal justice here. It takes less than one minute.


Fair Trials

Fair Trials is an international NGO that campaigns for fair and equal criminal justice systems / fairtrials.org