Dutch House of Representatives endorses mandatory use of Human Rights and Algorithms Impact Assessment
The Utrecht Data School and Professor Janneke Gerards developed the Human Rights and Algorithms Impact Assessment (IAMA), an instrument that supports well-informed, responsible decision-making about the use of algorithms. The motion by MPs Kauthar Bouchallikh (GroenLinks) and Hind Dekker-Abdulaziz (D66) to make the use of the IAMA mandatory was approved by the House of Representatives (Tweede Kamer) on Tuesday.
Algorithms may seem unbiased, but they are not. If they are to support governments and businesses in carrying out legal obligations in the future, negligence, ineffectiveness or, worse still, violations of human rights must be ruled out. A repeat of the scandal surrounding ethnic profiling by the Dutch Tax Administration, the 'toeslagenaffaire' (childcare benefits affair), must be prevented at all costs.
To this end, Mirko Schäfer, Arthur Vankan and Iris Muis of the Utrecht Data School and Professor of Fundamental Rights Janneke Gerards developed the IAMA at the request of the Ministry of the Interior and Kingdom Relations. The adopted motion clears the way for making such impact assessments mandatory before algorithms may be used to evaluate and make decisions about people.
Update 4 May
An English version of the IAMA is now available under the title Fundamental Rights and Algorithms Impact Assessment (FRAIA).