
Please note: this event has passed


Machine learning relies on data, but that data is not always fit for purpose. Historical biases and discrimination can affect how our systems learn, and our old biases can become deeply embedded within our algorithms. We must actively work to combat these biases. Tools like MLighter can help identify and correct them, but doing so still requires time and effort.

To see for yourself how biases can manifest in computer vision models like YOLO, check how accurately the system recognizes you.
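If you would like to try something similar at home before the event, the sketch below runs an off-the-shelf detector from the open-source Ultralytics YOLO package on a photo of your own and prints what it detects and how confident it is. This is only a minimal illustration, not the setup used in the exhibit; the model weights ("yolov8n.pt") and the image path ("me.jpg") are placeholders you would swap for your own.

# Minimal sketch (not the exhibit's setup): run a pretrained YOLO detector
# on a photo of yourself and inspect the predictions it reports.
# Assumes the open-source `ultralytics` package; "me.jpg" is a placeholder path.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")      # small pretrained object-detection model
results = model("me.jpg")       # run inference on your own photo

for result in results:
    for box in result.boxes:
        label = result.names[int(box.cls)]   # predicted class name
        confidence = float(box.conf)         # model's confidence score
        print(f"{label}: {confidence:.2f}")

Comparing the labels and confidence scores you get for photos of different people, lighting conditions, or backgrounds is one simple way to see how unevenly such models can perform.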

This event is part of Bringing the Human to the Artificial, an exhibition presented by the King’s Institute for Artificial Intelligence that showcases cutting-edge research from across the university exploring the effects of artificial intelligence (AI) on different aspects of our lives. The exhibition is open between 2 May and 30 June 2023. Find out more here.

At this event

Héctor Menendez

Lecturer in Computer Science (Programming and Software Engineering)

Event details


Bush House Arcade
Arcade at Bush House, South Wing, Strand WC2B 4PJ