COA uses algorithm to place status holders across the country: “Instead of a win-win, everyone loses”
COA pilot with algorithm risks discrimination, researchers and journalists warn
The Central Agency for the Reception of Asylum Seekers (COA) is testing an algorithm that automatically decides where a status holder will be settled. GeoMatch promises to match refugees with places where they have the best chance of finding work. Researchers from Utrecht University and Follow the Money warn that this promise is not being met. “Instead of a win-win situation, everyone loses,” says PhD researcher Kinan Alajak. “The refugee, the municipality, and the community.”
GeoMatch algorithm causes inequality
GeoMatch does not look for the best place for a refugee to live, the researchers point out. Instead, it calculates which refugee best ‘fits’ a particular location. “If another refugee has even a slightly higher chance of earning more in a given area, that person is assigned to that region,” Alajak explains. He researches technology and policy at Utrecht University (UU) and HU University of Applied Sciences Utrecht. “This means that people with fewer opportunities are eventually placed in areas where their prospects are already limited.”
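The difference the researchers describe can be made concrete with a deliberately simplified sketch. This is not GeoMatch's actual code; the regions, names, scores, and one-slot-per-region setup below are all invented for illustration. A location-centric rule fills each region with whoever 'fits' it best, so a marginally higher score for one person can push another into a region where their own prospects are far worse.

```python
# A minimal, hypothetical sketch contrasting two assignment rules.
# The score matrix below stands in for "predicted chance of finding
# work, per (refugee, region)". All values are invented.

scores = {
    ("A", "North"): 0.70, ("A", "South"): 0.30,
    ("B", "North"): 0.75, ("B", "South"): 0.50,
}

def person_centric(scores):
    """Give each refugee the region where *they* score best (ignores capacity)."""
    assignment = {}
    for person in {p for p, _ in scores}:
        assignment[person] = max(
            (r for q, r in scores if q == person),
            key=lambda r: scores[(person, r)],
        )
    return assignment

def location_centric(scores):
    """Give each region the refugee who 'fits' it best, one slot per region."""
    assignment = {}
    taken = set()
    for region in ("North", "South"):  # regions filled in a fixed order
        candidates = [(scores[(p, region)], p)
                      for p, r in scores if r == region and p not in taken]
        _, best = max(candidates)      # highest predicted score wins the slot
        assignment[best] = region
        taken.add(best)
    return assignment

print(person_centric(scores))    # both A and B would do best in "North"
print(location_centric(scores))  # B edges out A in North, so A goes South
```

In this toy example, B's 0.05 edge in North sends A to South, where A's predicted chance falls from 0.70 to 0.30; the combined outcome is also worse than simply assigning A to North and B to South, which is the "everyone loses" dynamic the researchers point to.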
“The algorithm widens inequality, both between refugees and between municipalities,” says Koen Leurs, a specialist in migration and digitalisation at UU. Alajak adds: “Placing refugees in areas where their chances of finding work are lower affects everyone. Not just the migrant, but also the municipality and, ultimately, the whole community.”
Alajak and Leurs worked with economist Merve Burnazoglu, media scholar Gerwin van Schie (both UU), and investigative journalists David Davidson and Evaline Schot. They analysed confidential COA documents, including reports and risk assessments on data protection and artificial intelligence (AI) drawn up by independent bodies. This allowed them to uncover how the algorithm reaches its decisions and what the consequences are.
Risk of discrimination through artificial intelligence
The documents revealed serious concerns about the system. One report, for example, stated: “Some of the variables unmistakably relate to the ethnicity of those involved. This creates a high risk of discrimination.” Despite these warnings, the GeoMatch pilot went ahead.
The concerns raised at an early stage have proved well founded. “The algorithm relies on so-called ‘special category personal data’, such as ethnicity and nationality, to estimate people’s skills and employment prospects,” Leurs and Alajak explain. “Education and work experience carry little weight, while whether you are Syrian, Somali, or Kurdish does make a difference.”
“GeoMatch also takes little account of women. Within a family, it focuses mainly on the person deemed most likely to find work. Because of the biased data on which the algorithm was trained, GeoMatch usually judges that to be the man.”
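The pattern in that quote can be illustrated with a small, purely hypothetical sketch: if placement follows whichever household member a model scores as most employable, and the model was trained on historical data in which mostly men held paid work, then the learned bias, not education or experience, decides whose prospects drive the family's placement. All names, scores, and the scoring function below are invented.

```python
# Hypothetical sketch of bias inherited from training data; not GeoMatch code.
from dataclasses import dataclass

@dataclass
class Member:
    name: str
    sex: str
    years_education: int

def predicted_employability(m: Member) -> float:
    # Stand-in for a model trained on historical outcomes. If past data
    # mostly recorded men in paid work, a learned model can end up
    # scoring men higher regardless of education or experience.
    base = 0.5 + 0.02 * m.years_education
    return base + (0.2 if m.sex == "M" else 0.0)  # the learned bias

household = [Member("Amal", "F", 16), Member("Omar", "M", 9)]

primary = max(household, key=predicted_employability)
print(primary.name)  # "Omar": the inherited bias, not education, decides
```

Here Amal has seven more years of education, yet the biased score (0.82 versus 0.88) makes Omar the household's reference point for placement.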
Opaque AI system
Exactly how GeoMatch works remains unclear to everyone involved. “Refugees do not know why they are placed in a particular municipality, and COA staff are given no insight into the system’s reasoning,” say Alajak and Leurs. “Human-in-the-loop oversight is eliminated altogether. Yet GeoMatch’s decisions have far-reaching consequences: they determine where someone has to build a new life.”
COA plans to press ahead regardless
Despite all the concerns and criticism, the COA intends to roll out the system on a large scale. “GeoMatch could soon be matching thousands of people to municipalities each year,” Leurs and Alajak stress. The researchers are calling on the COA and the government to stop experimenting with algorithmic decision-making involving vulnerable groups. “At the very least until the systems are transparent, can be properly audited, and people’s rights are safeguarded.”
Read more
Kinan Alajak, Merve Burnazoglu, Koen Leurs, and Gerwin van Schie published their findings in a peer-reviewed article in the international academic journal Social Inclusion. David Davidson and Evaline Schot reported on the investigation for the investigative journalism platform Follow the Money.