
Working towards more inclusive artificial intelligence

Published: 19th September 2023 | Last updated: 19th September 2023

Gender bias in Machine Translation is not just a technical challenge but also a social, ethical, and legal one that stems from everyday communication. Through an interdisciplinary analysis, Eva Vanmassenhove and her team want to contribute to more inclusive artificial intelligence.

“Many of us are using language technology on a day-to-day basis, but the models we are currently using perpetuate and exacerbate all kinds of biases.” – Dr. Eva Vanmassenhove 

Besides Dr. Eva Vanmassenhove (TSHD, Cognitive Science and Artificial Intelligence Dept.), the team includes Dr. Seunghyun Song (TSHD, Philosophy Dept.), Dr. Hanna Lukkari (TLS, Public Law and Governance Dept.) and student assistant Sonja Siebeneicher. Also involved are Dr. Jasmijn Bastings on behalf of Google DeepMind, and Santi van den Toorn on behalf of Transgender Netwerk Nederland (TNN). 

Their project, funded by Tilburg University’s Digital Sciences for Society program, explores why and under which circumstances current Machine Translation systems perpetuate gender bias, how an ethical framework can evaluate and tackle that bias, and whether existing legal frameworks recognize and redress gendered linguistic injustice in this context. 

A brief video introduction to the project accompanies this article.

Get ready for the digital future

The Digital Sciences for Society program invests in impactful research, education and collaboration aimed at seizing the opportunities and dealing with the challenges of digitalization for science and society.
