Veronika Nagy: “Our digital bureaucratic systems are limited and sometimes dangerous”


Silhouette of a man against stripes of neon lighting
Photo by Chris Yang on Unsplash.

The child care tax scandal in the Netherlands came as no surprise to her. Scholars like herself have warned for ten to twenty years that categorizing people in digital systems for social policy is potentially harmful and criminalizing. Veronika Nagy studied digital data collection on Roma migrants in the UK and on refugees in other parts of Europe to better understand irregular online information sharing and how we deal with people on the move through digital systems.

Is there too much digital optimism in governance?

We should always take into consideration that information in data systems never fully represents a person, and that all administrative systems are built around specific interests. Though many of the current administrative technologies are implemented with good intentions, they should not only serve bureaucratic needs but also safeguard rights and social justice. With the development of databases, and in particular with the increasing use of algorithmic decision-making, there are several issues that need to be considered. If you use attributes to categorize people, for example nationality, gender, religion, language, or social class, you are also incorporating cultural biases.

Dr. Veronika Nagy

Corporate companies promote the idea that digital implementation of social policies is more efficient, objective, fast, and transparent, as opposed to humans being slow, subjective, and inefficient. But recent scandals, like the Dutch child care tax one, show us that technology poses new ethical and practical challenges. Profiling software can create a bad feedback loop, as we saw on a mass scale, where false or outdated data results in false accusations, discrimination, or even criminalisation. These systems often lead to more distrust and increased surveillance, affecting the already vulnerable groups in society and leaving them even more deprived and vulnerable than they were before, due to governmental mistakes no one seems to be accountable for.

Two topics that you’ve studied recently are surveillance and self-censorship. What did you discover?

Surveillance, for example in the field of border control, is no longer carried out strictly by local governments or the police. It is often done through state-corporate projects, including private security companies who screen cars, phones, and homes. There is a commercialisation of security services, but the different parties represent different values and interests. Border technologies are not problematic because these stakeholders are evil, but because of their conflicting objectives regarding the rights of people.

Artwork by Banksy - One Nation Under CCTV, CC BY-SA 2.0. Source: Wikimedia.

The problem isn't only that governments can't keep up with current expectations of mobility control; they are also forced to digitally track and trace all the tools that people use. Moreover, the big tech lobby is very strong: there is a trade-off. Governments want good internet networks, 5G, and working phones, and therefore sometimes look the other way when technology raises ethical problems about privacy, data use, or surveillance. We should all pay attention that both companies and governments keep acting in a moral, ethical, and social way, and hold them accountable for mistakes. We all know that our mobile data can be tracked, and like us, migrants who suspect they're being screened increasingly practice self-censorship. Not only to protect themselves, but also to protect their family or relatives in the country they come from.

How are people being screened and how do they react? 

It goes far beyond simply silencing themselves in a call or not texting about their whereabouts or other sensitive information. They also consciously choose hardware, software, SIM cards, and providers, or use others' accounts. Authorities that want to access and assess their identity do not shy away from voice recognition and dialect recognition software; they track browser histories and GPS data, run content analyses of social media accounts, and map contact lists. In response, people on the move who fear these assessments get creative and learn from each other how to manage their digital footprints. They use coping strategies to prevent legal expulsion by censoring data traces: paying to be removed from digital places, manipulating different accounts, and removing or recrafting online or phone content.

Migrants get very creative, using Telegram for an informal taxi ride or the dating app Tinder to find shelter.

Criminologist and anthropologist, Willem Pompe Institute for Criminal Law and Criminology

Some even switch their communication between several platforms in a single conversation. They use platforms for goals other than those they were designed for, like Telegram for an informal taxi ride or TikTok for documenting their journey. The dating app Tinder is being used to find addresses of people who are willing to take in refugees. I've conducted a research project called Virtual Asylum – Hiding Refugees from the All-Seeing Eye of Europe, in which I explore how stigmatized refugees adapt to rapidly changing circumstances through inventive virtual data sharing methods in the bureaucratic labyrinth of host societies. Hopefully we can share the report with the Gerda Henkel Foundation in a few weeks.

How do you conduct research?

Next to my theoretical research, as an anthropologist I conduct fieldwork and often engage with empirical methods: I go to the places where the social problem I try to explore occurs, and I stay there as much as possible with the people I'm writing about. My goal is to learn about their decision-making and to experience their reality. Through this collection of different perspectives I try to reconstruct their concerns, views, reflections, and experiences, and to show the fringes in social justice created by digitised bureaucracies. I've done fieldwork in Turkey, Greece, Hungary, Germany, and the Netherlands. For instance, there is an NGO in the Calais region of France that provides a phone service from a bus, repairing phones for refugees.

People seldom move just pragmatically from A to B. They are affected by death, love, disease, coincidences. Migration follows a whimsical path.

Criminologist and anthropologist, Willem Pompe Institute for Criminal Law and Criminology

Some policy makers think that migrants make very calculated choices to move somewhere else. However, my studies suggest that people seldom travel just pragmatically from A to B. They are affected by changes in accommodation or jobs, the decisions of friends or family who may be migrants too, death, love, disease, coincidences, social services, and new opportunities. Migration follows a whimsical path.

What do you think about the influence of the War in Ukraine on how Europe deals with migration?

As a Hungarian who is now based in the Netherlands, I miss a nuanced West European representation of post-communist countries like Ukraine. It seems we still have not recovered from the post-Cold War Iron Curtain narrative, in which dichotomous geopolitical representations still dominate the main discourse. I see the region, as well as Ukraine itself, as a mixed, multicultural place where people with many different identities live. Throughout the years, the context people live in has changed, and so has the dominance of certain identities. It is easy to speak out against the war in Ukraine; it is harder to pinpoint who is being exploited, who is deserving of asylum, who is a dissident. Migration is never just financially fuelled; there is far more behind it. In our approach, we should try to understand the reasons behind certain actions and the political discourse in Central Europe.

Dr. Veronika Nagy is an Assistant Professor in Criminology at the Law Department of Utrecht University and an International Research Fellow at the University of Milan. Her research interests include surveillance and digital inequality, with a focus on the connection between mobility and technology, criminalisation, and digital self-censorship. She has conducted research on specific forms of securitisation, financial surveillance, ethnic mobility, human trafficking, and digital profiling (exploitation of workers, forced criminal activities and forced labour, trafficking of children). Her latest empirical research, Virtual Asylum, is part of her Gerda Henkel Fellowship, exploring self-censorship practices of refugees on mobile applications at different stages of their mobilities.