Utrecht leads research on misinformation and deepfakes

Phone with news headlines
Photo: Unsplash

With the arrival of Federica Russo at Utrecht University, the coordination of the European SOLARIS project has also moved to Utrecht. In this project, the professor of Philosophy and Ethics of Techno-Science investigates both the political risks of fake news and deepfakes and the opportunities that AI-generated content offers for strengthening democratic processes.

According to Federica Russo, who moved from the University of Amsterdam to Utrecht University in August 2023, the spread of fake news is a threat to democracies worldwide. Citizens who get their news primarily online are “bombarded” with an enormous amount of unfiltered information. This makes it easier for conspiracy theories to spread. In addition, people are increasingly retreating into online bubbles where they only come into contact with like-minded people, which reinforces the polarisation between social groups.

“It is becoming increasingly difficult for people without specific technical knowledge to distinguish between ‘true’ facts and false or biased information,” Russo said at the start of the project early last year. “That is why the problem of the reliability of information is more urgent than ever before.”

Distinguishing between real and fake

In its research, the SOLARIS project group focuses on deepfakes: highly realistic images and videos, created with AI models, that are almost indistinguishable from real content. The quality of such deepfakes is constantly improving, making it increasingly difficult to tell real from fake. Russo and her colleagues want to know which factors contribute to the believability of deepfakes. Next year, they plan to run a simulation with targeted stakeholders, such as representatives of institutions and news agencies, to investigate how they react to deepfakes.

Initial results from the SOLARIS research show that video quality is not the only element that makes us believe in a deepfake. “Much more important are the spread of the fake videos and the social networks of the people who see them. If a video is watched frequently, it becomes part of the story. Especially if it is shared by your friends, people you trust, this contributes to the believability of deepfakes. You could say: it takes a village to trust a deepfake.”

If fake news is shared by people you trust, it contributes to the believability of deepfakes. You could say: it takes a village to trust a deepfake

Federica Russo, Professor of Philosophy and Ethics of Techno-Science

The results of the research should help formulate recommendations for politicians and policymakers, such as ways to regulate the use of AI, as is done with medicines. Russo calls the European AI Act “a start”. In addition, the researchers are developing a statistical model that predicts the spread of fake news by analysing real-time data from social media.

Russo emphasises that the media and individual citizens also bear responsibility. “The media must fact-check and promote reliable information sources. People should be careful about what they share and critical about what they believe.”

Positive deepfakes

At the same time, Russo wants to be careful not to talk only about the “apocalyptic side” of AI. “I am convinced that deepfakes can be dangerous. But you can also use them to strengthen democratic citizen engagement. For example, the AI models used for deepfakes can also be used to develop videos that raise citizens’ awareness of important global issues such as climate change, gender and migration.” Later in the project, SOLARIS will organise co-creation tables with citizens to produce AI-generated content for good purposes.

About SOLARIS

The research project SOLARIS (Strengthening democratic engagement through value-based generative adversarial networks) started in February 2023 and is funded by the Horizon Europe programme with a grant of 2.6 million euros. Thirteen partners from eight countries are involved.