
As big tech companies have started to fire their ethics teams and scholars have argued that ethical commitments amount to little more than ‘ethics-washing’, this seminar revisits questions about AI ethics. Can AI ethics still be relevant, and what would its relevance entail? Which mechanisms, if any, are capable of regulating AI ethically, who should govern them, and could they be trusted to do so? Are public institutions up to the task? Is it ethical to design for a post-human world?

About the speakers

Elke Schwarz, Homo technologicus and the challenge of moral agency, Queen Mary University of London

Claudia Aradau, Ethics and the ‘limit case’ of AI, King’s College London

Chair: Themis Tzimas, Aristotle University of Thessaloniki

About the series

Most recently, the rise of generative Artificial Intelligence (AI) has intensified public anxieties around algorithmic governance, machine learning and big data by reconfiguring social and cultural relations. AI no longer only finds patterns in masses of data; it can also generate text and images. At the same time, concerns about AI and its uses across social and political fields – from warfare and border controls to health governance and identification practices – have not abated. Questions about the ethical and legal implications of recent developments in AI have been supplemented by concerns about their political and social effects. Our interactions with each other, with state actors, and with private and public institutions are mediated through AI in ways that often remain opaque and unaccountable.

AI: disruptive technologies, disruptive politics? is a seminar series that seeks to understand disruption through four interrelated dimensions: ethical, legal, political and social. As AI is widely understood to be disruptive, we ask what it disrupts and how. The aim of the seminars is to address the challenges of AI through modes of analysis drawn from different geographical locations and interdisciplinary perspectives. The seminars will be held online.