Computer says no: Artificial Injustice
If the system can’t do it, it can’t be done. This ‘computer says no’ mentality, where people blame the “intelligent” information systems they use when something goes wrong, can have grave consequences. As we increasingly deploy AI technology in our democratic and legal institutions, which crucial steps must we not forget to take? (English / SG-Certificate*)
Time: 16:45-17:45 hrs.
Admission is free, registration required (limited number of seats available)
Can AI be just?
The now infamous Dutch benefits scandal shows that mistakes made by our “intelligent” information systems can have devastating consequences for real human lives. In AI research, new questions have therefore come into play about how automation affects both our rights and our everyday activities. Instead of immediately blaming the information systems we use, perhaps we should first look at our own responsibilities. Are our laws and rules still equal to the challenges that AI technology now poses? Or are other protective measures needed to ensure sustainable and ethical computing?
Preventing artificial injustice
During this keynote lecture by Dr Linnet Taylor (Professor of International Data Governance), we will look into the question of how to conceptualise our rights (and those of others far away) in relation to AI technologies. Linnet will explain how categorisation, prediction and the scale of big tech infrastructures influence our political and personal autonomy. She will argue that we should place certain demands on big tech firms, as they increasingly provide the environment in which our civil and political decisions and values are negotiated. Lastly, she will address the issue of having to deal with fundamental disagreements about what is right and wrong in the digital sphere.
Linnet Taylor is Professor of International Data Governance at the Tilburg Institute for Law, Technology, and Society (TILT), where she leads the ERC-funded Global Data Justice project, which seeks to understand differing perspectives worldwide on what constitutes just treatment through data technologies.
Her research focuses on the use of new sources of digital data in governance and research around issues of human and economic development. More recently, she has also been working on the prestigious Gravitation programme “The Algorithmic Society”, funded by the Dutch Research Council (NWO).