Predictive policing has exposed a new group of would-be criminals: MEPs.
A new testing system has put five EU politicians in the spotlight as “at risk” of committing future crimes. Fortunately for them, it is not a tool used by law enforcement, but one designed to highlight the dangers of such systems.
The project is the brainchild of Fair Trials, a criminal justice watchdog. The NGO is campaigning for a ban on predictive policing, which uses data analytics to predict when and where crimes are likely to occur – and who may commit them.
Proponents argue that the approach can be more accurate, objective and effective than traditional policing. But critics warn it entrenches historical biases, disproportionately targets marginalized groups, reinforces structural discrimination and infringes on civil rights.
“It may seem unbelievable that law enforcement and criminal justice authorities make predictions about crime based on people’s background, class, ethnicity and associations, but that is the reality of what is happening in the EU,” said Griff Ferris, Senior Legal and Policy Officer at Fair Trials.
In fact, the technology is becoming increasingly popular in Europe. In Italy, for example, a tool known as Dalia has analyzed ethnicity data to profile and predict future criminality. The Netherlands, meanwhile, has used a so-called Top 600 list to predict which young people will commit high-impact crimes. One in three people on the list – many of whom have reported being harassed by the police – turned out to be of Moroccan descent.
To illustrate the effects, Fair Trials developed a mock assessment of future criminal behavior.
Unlike many of the real systems used by the police, the analysis has been made completely transparent. The test uses a questionnaire to profile each user. The more ‘yes’ answers they give, the higher their risk outcome. You can try it yourself here.
Politicians from the Socialists &amp; Democrats, Renew, Greens/EFA and the Left Group were invited to test the tool. After completing the quiz, MEPs Karen Melchior, Cornelia Ernst, Tiemo Wölken, Petar Vitanov and Patrick Breyer were all identified as being at “medium risk” of committing future crimes.
The MEPs will not suffer any consequences for their predicted transgressions. In real life, however, such systems could place them in police databases and subject them to close surveillance, random questioning, or stop and search. Their risk scores could also be shared with schools, employers, immigration agencies and child protection services. Algorithms have even led to people being imprisoned with little evidence.
“I grew up in a low-income neighborhood, in a poor Eastern European country, and the algorithm profiled me as a potential criminal,” said Petar Vitanov, an MEP from the Bulgarian Socialist Party, in a statement.
“There should be no place for such systems in the EU – they are unreliable, biased and unfair.”
Fair Trials has released the test results amid growing calls to ban predictive policing.
The topic has divided lawmakers negotiating the AI Act, which is set to become the first-ever legal framework for artificial intelligence. Some legislators are pushing for a total ban on predictive policing, while others want to give law enforcement agencies more leeway.
Fair Trials has given supporters of the systems a new reason to reconsider their positions: the technology can target them too.