Realities of Algorithmic Warfare

As AI and other emerging disruptive technologies are increasingly integrated into all aspects of human life, advanced militaries worldwide have found themselves in what some call an AI arms race, feeding into the third revolution in warfare.

In this context, advanced militaries such as those of the US, Russia, China, Turkey, and several EU countries including the Netherlands have been developing, experimenting with, and deploying technologies with varying levels of autonomy across battlefields in Libya, Syria, Mali, Ukraine, and beyond. While developers and armed forces promise more effective military engagement through increased speed and precision, academic and policy debates tend to focus on the threat of fully autonomous weapons making life-and-death decisions.

These debates are fed by the expanding number of weapon systems capable of autonomously identifying targets through machine learning algorithms, including the Agile Condor pods on MQ-9 UAVs, loitering munitions, Hivemind AI on VBATs, uncrewed ships and drone swarms. To have an informed debate amongst all stakeholders involved in developing, analysing, and regulating autonomous weapon systems, it is crucial to reflect on questions such as: 

By whom and how are these weapon systems being developed? What problems are these technologies seen as a solution to? How are operators trained to use them? How autonomous are they in reality? How and where are they being experimented with and deployed, and to what effect? What impact do they have on the legitimacy, transparency, and accountability of military operations? How and why should they be regulated? And how are they changing the character of warfare?

This inter- and transdisciplinary research project engages with the realities of increasing autonomy in warfare through artificial intelligence. Drawing on Conflict Studies, Media and Cultural Studies, and Law, we explore how integrating algorithms into existing military technology paves the way for greater ludification, remoteness, and autonomy in war, bringing opportunities as well as serious risks both to the battlefield and to fundamental building blocks of democratic societies such as transparency, accountability, and the rule of law.

The Realities of Algorithmic Warfare project will pursue in-depth research and debate amongst developers, investigators, and regulators to reflect on how these developments affect security, human rights, and democratic principles. By engaging with a variety of stakeholders across disciplinary boundaries, we will collectively advance the academic and policy debates on the lived realities of increasing autonomy in war.

Project Members

  • Jessica Dorsey LLM
    Assistant Professor
    Expertise: International Humanitarian Law, Remote Warfare, Human Rights, Artificial Intelligence and Law, Public International Law, Academic and Legal Skills

  • dr. Lauren Gould
    Assistant Professor
    Expertise: Conflict Studies, Remote Warfare, Hybrid Warfare, Civilian-Military Relations, Autonomous Weapon Systems, Civilian Harm, Framing Violence

  • Expertise: Game and Play Studies, Digital Media, Critical Theory, Conflict Studies, Remote Warfare, Civilian-Military Relations