How the Dutch police aim to use algorithms responsibly
PhD defence of Isabelle Donatz-Fest, National Policelab AI

Increasingly, Dutch police are deploying artificial intelligence and algorithms to work more efficiently and effectively. In a field where decisions can be a matter of life and death, it is crucial that this deployment is carried out responsibly. Isabelle Donatz-Fest spent 350 hours with the police to see how algorithms are given a role in daily operations.
Have you ever been fined for using your mobile phone while driving? If so, odds are you were spotted by the MONOcam. This smart camera has been trained with AI to recognise at lightning speed whether someone has an electronic device in their hand while driving. The camera system takes a picture that is reviewed by two officers, and if you have indeed been holding your smartphone, you will receive a fine that can exceed 400 euros.
This is just one way systems in policing employ algorithms. For her PhD research at the National Policelab AI (NPAI), Isabelle Donatz-Fest spent some 350 hours at the organisation, observing the everyday context of working with algorithms. There, she paid particular attention to the role of public and ethical values: shared norms and principles about what is right, fair and responsible in how we engage with one another.
“These are values like integrity, transparency, equal treatment and efficiency”, says Donatz-Fest. “And when discussing policing, one cannot overlook security: the safety of people on the street and tackling crime, but also data security and privacy.”
Data is never self-explanatory. The physical context affects data and how it is processed
For her PhD thesis, Donatz-Fest investigated how these values play a role throughout the organisation: from the data scientists who create analyses and write code, to the front desk staff, managers, policy makers, and administrators. And, of course, the users of the system, such as traffic control officers. “All those people have an impact on how the system works and consequently on what the outcomes of such algorithms are.”
The PhD student shadowed several departments to observe how public values were implemented in daily operations. In doing so, she looked specifically at different aspects of the design and use of algorithms. “Regarding design, I looked at data and data professionals. Data never speaks for itself. Take police reports, for instance. There are standardised forms where an officer is required to tick boxes about an incident. There are close to seven hundred categories for the officer to choose from.”
To illustrate, Donatz-Fest uses the example of a drunken homeless person causing a public nuisance. The officer could then choose ‘nuisance by a vagrant or homeless person’, ‘nuisance by substance or alcohol misuse’, or, if the person is incoherent due to alcohol consumption, ‘nuisance by a confused person’. “Many times, real-world situations don't fit very neatly into a single box.”

Next, data scientists work with the data, but they are not infallible either, says Donatz-Fest. “They care deeply about those public values, but do not always know how to interpret certain data. Typically, they have little or no experience of front-line police work. Or they might have a deadline that forces them to fast-track an algorithm to the next stage of development. In a nutshell, the physical context affects data and how it is processed.”
And even if everything works correctly in their design, in practice the deployment of algorithms may turn out differently from what was intended. Back to the MONOcam. Used since July 2021, this system was largely responsible for a 35 per cent increase in motorist fines one year on. Donatz-Fest: “I was impressed by the way public values were embedded in the design of the MONOcam.” It pixelates the faces of any passengers, for example, deletes photos that show no violation, and always requires two officers to check the photo independently. “Everything neatly by the book. But then people start to use it and you find that those public and ethical values become fluid again and are renegotiated with every use.”
For example, Donatz-Fest observed how photos of offences were viewed by more than two officers, and how a detective contacted a traffic officer to request a photo that might show a suspected criminal. “The detective had spotted in the system that a vehicle involved in an incident with a firearm had been issued a fine via the MONOcam. He wanted to know who was driving the car, whether they were male or female. So, as a police officer, what do you do? You can help solve a crime, but not the crime for which the MONOcam was intended.”
Balance between fear and blind trust
To sum up, both the design of algorithmic systems and their responsible deployment remain human endeavours, explains Donatz-Fest. “I don't think it is ever possible to be error-free in innovation, nor indeed should that be the goal. Innovation is messy and as an organisation you sometimes have to face up to reality. It's an ongoing process, which involves continuously checking whether the algorithms are making a positive contribution to society.”
As an organisation, you have to strike a balance between fear and blind trust, and continuously consider how you make public values part of the organisational culture, says Donatz-Fest. “I have found that the police take these matters extremely seriously. For my research, I was allowed to get up close and personal with the organisation; I was like the critical friend offering some insights. Some of the risks I drew attention to, for example at the MONOcam, have in fact been addressed. That's really pretty cool.”
National Policelab AI (NPAI)
Isabelle Donatz-Fest received her PhD on 16 May. She conducted her research within the National Policelab AI, led by Albert Meijer (Law, Economics and Governance), José van Dijck (Humanities) and Mirko Schäfer (Science), as part of the NWO-funded research project ALGOPOL. You can find her PhD thesis here: Cop, Code, and Conduct: A practice-based understanding of responsible policing in the algorithmic age. From 1 June, Donatz-Fest will start working as an Advisor on Responsible Data Use and AI at The Green Land.