Andrea Longobucco on using AI to fight corruption and pushing the boundaries of knowledge

What kind of mind-set is needed to carry out ground-breaking research as we do at UUCePP? UUCePP researchers introduce themselves in brief interviews conducted by Elisabetta Manunza and Fredo Schotanus.
'Who' and 'what' are you?
My name is Andrea Longobucco, and I am a PhD candidate in the Applied Economics section at the Utrecht University School of Economics (USE). I am also currently working as a researcher in the EU-funded research project BridgeGap, which focuses on understanding and combating corruption through interdisciplinary methods and frameworks.
I was born in southern Italy, in Rossano Calabro, province of Cosenza, where I lived until my final year of secondary school. I completed my Bachelor's and Master's degrees in Statistical Sciences at the University of Milano-Bicocca in 2024. From Calabria, the southernmost region of the Italian mainland, I chose to move to Milan for my studies because of the city's strong academic environment and its vibrant, innovation-driven atmosphere. Compared to Calabria, which is known for its natural beauty and slower lifestyle, Milan offered me broader educational opportunities and exposure to a more dynamic and international environment. During my years in Milan, I had the opportunity to build a solid background not only as a statistician but also as a data scientist, through both coursework and practical work experience.
My main research interests focus on the use of machine learning, statistical learning, and AI tools in the fight against corruption and economic and financial crimes conducted by public and private institutions. I've always been interested in crime analysis because, coming from a region marked by social and economic challenges – such as high unemployment, limited public services, and the presence of organized crime – I've sometimes witnessed injustices linked to corrupt power systems. For this reason, I see the field of crime detection as an opportunity to make a tangible contribution to society.
What are you working on, and why?
More recently, my research has focused on the detection of financial crime by public and private institutions – for example, in the market for Trust and Corporate Services Providers (TCSPs) in the Netherlands. Trust companies, which provide administrative, fiduciary, and domiciliary services, exist worldwide under various names and structures; in the Netherlands, TCSPs are known as "Trustkantoren". In recognition of the ability of TCSPs to facilitate financial crime, they must be licensed and regulated by De Nederlandsche Bank (DNB). The sector is governed by the Act on the Supervision of Trust Offices 2018 ("Wet toezicht trustkantoren", Wtt 2018), which stipulates that the provision of certain corporate services, such as acting as a nominee director for a company, requires a licence and entails an obligation to know and monitor the client. Dutch Trustkantoren were exposed in the Panama Papers leak as key facilitators in the creation of offshore structures, sometimes helping clients set up shell companies in tax havens.
In this field, much of my effort has been devoted to researching and developing artificial intelligence (AI) tools that can identify suspicious patterns and relationships between actors involved in potentially illicit activities. These models do not rely on direct evidence of financial flows – such as money transfers – but rather on the social network structures linking different actors: directors, companies, ownership ties, and addresses.
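The network idea described above can be illustrated with a deliberately simple sketch: counting how many companies share a single registration address, a classic red flag for shell-company formation. All entity names here are hypothetical, and real models in this line of work exploit far richer structure (directors, ownership chains) than this toy example.

```python
from collections import defaultdict

# Toy relational data linking companies to their registration addresses.
# Names are invented for illustration only.
company_address = {
    "Co_A": "Addr_X",
    "Co_B": "Addr_X",
    "Co_C": "Addr_X",
    "Co_D": "Addr_Y",
}

def hub_addresses(links, min_companies=3):
    """Flag addresses that host unusually many companies."""
    counts = defaultdict(int)
    for address in links.values():
        counts[address] += 1
    return {addr: n for addr, n in counts.items() if n >= min_companies}

print(hub_addresses(company_address))  # {'Addr_X': 3}
```

In practice such counts would be one feature among many in a network model, not a standalone detector.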
I am currently working on corruption detection as part of the EU-funded BridgeGap research project. The project aims to achieve a comprehensive and deeper understanding of corruption from an interdisciplinary perspective, to improve knowledge and data on corruption, and to support and further stimulate the use of modern technologies to detect, prevent and fight it. Corruption is conceptualised here, following Alina Mungiu-Pippidi, as a systemic deviation from ethical universalism, where access to public resources is governed by particularistic rather than universalistic norms. This approach supports the development of micro-level, transaction-based indicators that proxy corruption through patterns of actor behaviour – such as ownership links, procurement outcomes or personal networks – rather than relying on perception-based indices.
Within this project, my goal as a researcher is to develop targeted artificial intelligence models characterized by three main features: explainability, effectiveness, and fairness. By explainability, I refer to the ability of the model to make its decision-making process transparent and understandable to human users, especially non-technical stakeholders. Effectiveness relates to the model’s capacity to accurately detect relevant patterns and flag potential cases of wrongdoing. Fairness means ensuring that the model’s predictions are not biased against specific groups or entities, particularly when sensitive attributes (such as geographic origin or company size) are involved.
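One of the three criteria above, fairness, can be made concrete with a minimal demographic-parity check: comparing the model's flag rate across a sensitive attribute such as company size. This is only one of several fairness notions, and the data below is invented for illustration, not drawn from the project.

```python
# Hypothetical (group, flagged) records produced by a detection model,
# where the group is a sensitive attribute such as company size.
flags = [
    ("small", True), ("small", False), ("small", True), ("small", False),
    ("large", False), ("large", True), ("large", False), ("large", False),
]

def flag_rate(records, group):
    """Fraction of entities in `group` flagged by the model."""
    outcomes = [flagged for g, flagged in records if g == group]
    return sum(outcomes) / len(outcomes)

# Demographic-parity gap: a large gap suggests the model treats the
# two groups differently and warrants further investigation.
gap = flag_rate(flags, "small") - flag_rate(flags, "large")
print(round(gap, 2))  # 0.25
```

A gap near zero does not prove fairness on its own, but a large gap is a useful warning signal before deployment.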
Public procurement is one of the most important areas of government spending, but it is also highly vulnerable to corruption, fraud and the inefficient or improper use of public resources – for example, when contracts are awarded without fair competition, contrary to legislation, or when funds are allocated in ways that may not serve the public interest effectively. This makes it a strategic field for the application of advanced analytical methods, as improving transparency and accountability in this area can have a significant impact on the overall integrity and efficiency of public sector operations.
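A standard transaction-based indicator of the "awarded without fair competition" risk mentioned above is the single-bidder rate: the share of a buyer's contracts that received exactly one bid. The sketch below computes it over invented award records; the buyer names and figures are hypothetical, and real analyses combine many such red-flag indicators.

```python
from collections import defaultdict

# Toy award records: (buyer, number_of_bids_received). Data is invented.
awards = [
    ("Municipality_A", 1), ("Municipality_A", 1), ("Municipality_A", 4),
    ("Municipality_B", 3), ("Municipality_B", 5),
]

def single_bidder_rate(records):
    """Share of each buyer's contracts awarded with exactly one bid."""
    totals, singles = defaultdict(int), defaultdict(int)
    for buyer, n_bids in records:
        totals[buyer] += 1
        if n_bids == 1:
            singles[buyer] += 1
    return {buyer: singles[buyer] / totals[buyer] for buyer in totals}

rates = single_bidder_rate(awards)
print(rates)  # Municipality_A awards 2 of 3 contracts with a single bid
```

A persistently high single-bidder rate does not prove corruption, but it flags buyers whose tenders may deserve closer scrutiny.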

Our world seems to be in a continuous state of various crises: can you indicate for one (or possibly several) of these crises how this affects your research?
As a data scientist, the biggest challenge I face on a daily basis relates to data collection. On a practical level, recent geopolitical shifts – such as rising protectionism and the trend towards a less globalised world – have a significant impact on the availability and accessibility of data. My research mainly focuses on public procurement data from EU member states and accession countries. However, sometimes policy decisions outside the EU can still have indirect but meaningful consequences. For instance, the weakening of international data sharing commitments and open government initiatives during the Trump presidency contributed to a broader climate of mistrust and data fragmentation. This had an impact on global standards, including those promoted by the EU, and pushed European institutions to take a more self-reliant and defensive approach to data governance – such as strengthening open data frameworks.
Which person inspires you and what would you like to ask her/him?
My family and closest friends are undoubtedly a great source of inspiration for me; I truly believe that I wouldn't be the person I am today without their constant support. My parents taught me that personal growth comes from connecting with others, and I try to live that lesson every day of my life by talking to people. They also taught me that sometimes a conversation with a stranger can teach you more about yourself than you might expect; engaging only with those closest to you can be more useful for comfort or emotional fulfilment than for truly challenging your perspective.
Name the book/movie/thinker that impressed you the most, shaped you, or that you would like to read or see 100 more times, and why?
A work that has been a great source of inspiration for me is “The Master Algorithm” by Pedro Domingos. This book is a popular science essay that explores the world of machine learning with a touch of irony, setting out the ambitious goal of identifying the "master algorithm" – a system capable of learning any kind of universal knowledge from data. The book opens with a provocation: machine learning has split into five major "tribes" – the Symbolists, the Connectionists, the Evolutionaries, the Bayesians, and the Analogizers. Criticising this highly fragmented state of knowledge within the scientific community, the author provocatively suggests that the path to the master algorithm lies in integrating the insights and methods of all five tribes.
I found this perspective particularly relevant because it challenges rigid disciplinary boundaries. Innovation is often held back not by a lack of ideas, but by a reluctance to engage with methods or perspectives outside one's own niche. This book reminded me of the value – and necessity – of thinking across disciplines if we are to truly push the boundaries of knowledge. It reinforced my belief that an interdisciplinary perspective is not only enriching, but essential for solving complex real-world problems. As a data scientist and statistician, working with data is part of my daily routine. However, I have come to realise that mastering methods and making accurate predictions is not always enough. In my research on economic and financial crimes, it is crucial to understand not only the data, but also the legal frameworks that define illegal behaviour, the criminological theories that explain why it occurs, and the economic incentives that drive it. Without this broader understanding, there is a risk of misinterpreting patterns, overlooking context, or proposing solutions that are technically sound but practically ineffective.