Study by ITAS with the support of the Anti-Discrimination Agency
Substantial risks of discrimination due to algorithms / Providing evidence should be made easier for those affected
16 September 2019

Algorithms and large data sets involve significant potential for discrimination in working life, in the allocation of housing, in the credit industry, in the calculation of insurance rates, in medical care and in police work. This is indicated by a recent study conducted by the Institute for Technology Assessment and Systems Analysis (ITAS) at the Karlsruhe Institute of Technology with funding from the Federal Anti-Discrimination Agency. Algorithms draw on a wide variety of data to differentiate between groups of people, for instance when allocating information, products, services, fees or positions. Such differentiation can be particularly problematic if it is directly or indirectly linked to grounds of discrimination prohibited by law, such as age, disability, ethnic origin, gender, religion, belief or sexual identity.

Using numerous examples, the study not only illustrates technical and societal causes of potential discrimination but also – and above all – the risks resulting from them. In this context, the author Dr Carsten Orwat also discusses the need to reform anti-discrimination and data protection law and the necessity of deciding, as a society, which algorithm- and data-based differentiations are acceptable.

"As the study shows clearly, there are substantial risks of discrimination that already exist today and cover numerous areas of life", states Bernhard Franke, acting head of the Anti-Discrimination Agency. "The good news is that with its bans on discrimination, which apply to the analogue as well as the digital realm, the General Equal Treatment Act (AGG) already gives us the right answers. So, we do not need new laws, but we do need to make our existing laws future-proof."

The study makes concrete recommendations for regulation, such as a right for anti-discrimination agencies to inspect algorithms and the establishment of ‘algorithm audits’, in order to facilitate the detection of algorithm-based discrimination and to strengthen the rights of those affected. This also includes creating preventive services, such as training for HR managers or IT administrators. Furthermore, companies and administrations that use algorithms in legally sensitive areas should be subject to specific documentation obligations in order to prevent discrimination.

The study was presented by the Anti-Discrimination Agency in an expert discussion on Monday, 16 September 2019, in Berlin.

The study is available for download here.

The Federal Anti-Discrimination Agency (German abbreviation: ADS) was established when the General Equal Treatment Act (German abbreviation: AGG) entered into force in August 2006. The Act aims to prevent or eliminate any discrimination on grounds of racism or ethnic origin, gender, religion or belief, disability, age or sexual identity.