
Job id: 093987. Salary: £43,205 - £50,585 per annum, including London Weighting Allowance.

Posted: 08 August 2024. Closing date: 08 September 2024.

Business unit: The Dickson Poon School of Law. Department: Law School.

Contact details: Dan Hunter. dan.hunter@kcl.ac.uk

Location: Strand Campus. Category: Research.

About us

The Dickson Poon School of Law, King's College London, is one of the oldest law schools in England and is recognised globally as one of the best law schools in the world. The School was established in 1831 and has played an integral role in the life of King's since the university was formed almost 200 years ago.

King’s has been in service to society since its foundation and we’re proud to continue that tradition to this day. Our research and teaching address some of the most pressing questions of our time relating to equality and human rights, the legal implications of climate change, globalisation, international relations, trade, competition and global finance, to name but a few. Members of The Dickson Poon School of Law advise governments, serve on commissions and public bodies and are seconded to national and international organisations, helping to shape policy and practice nationally and internationally.

About the role

This role supports Hunter’s contributions to two work packages in the Participatory Harm Auditing Workbenches and Methodologies (PHAWM) project, focused on interviewing subjects, developing methodologies, assessing investigator efforts against legal, regulatory and ethical frameworks, writing up academic articles, dissemination, public engagement, and other auxiliary activities for the project.

The project

A significant barrier to reaping the benefits of predictive and generative AI is their unassessed potential for harms. Hence, AI auditing has become imperative, in line with existing and impending regulatory frameworks. Yet, AI auditing has been haphazard, unsystematic, and left solely in the hands of experts. This project investigates how to enable individual and collective participatory auditing of current and future AI technologies so that diverse stakeholders beyond AI experts can be involved in auditing their harms. Our research will systematically investigate stakeholders’ needs for and notions of fairness and harms to create auditing workbenches comprising novel user interfaces, algorithms and privacy-preserving mechanisms that help stakeholders to perform audits whilst guarding against unintended negative effects or abuse by malicious actors.

We will create participatory auditing methodologies which reflect, anticipate and inform regulatory frameworks, specifying how to embed participatory auditing in the AI development lifecycle using the workbenches we have developed. We will develop and implement training for stakeholders in participatory auditing to embed our project outputs in practice. We will work towards a certification framework for AI solutions, thus ensuring that AI is safe and trustworthy.

This is a full-time post (35 hours per week), and you will be offered a fixed-term contract until 30 April 2025, depending on start date (this is a six-month contract in total, with a proposed start date of 1 November 2024).

About you

To be successful in this role, candidates should have the following skills and experience:

Essential criteria

1. PhD in a relevant subject area such as Law, Ethics, Computer Science, Artificial Intelligence, or a related field (or pending results)*

2. Data analysis skills

3. Track record or pipeline of publications in the area of artificial intelligence and law, regulation of artificial intelligence, responsible AI, or cognate areas

4. Knowledge of the theory and literature around predictive or generative AI and/or the regulation of AI

5. Excellent interpersonal, presentation and communication skills

Desirable criteria

1. Experience or demonstrated capacity in developing AI workbenches and/or methods for workbench assessment

2. Experience or demonstrated capacity in undertaking AI auditing

3. Experience or demonstrated capacity in training laypeople in AI, AI auditing, or AI regulation

Downloading a copy of our Job Description

Full details of the role and the skills, knowledge and experience required can be found in the Job Description document, provided at the bottom of the next page after you click “Apply Now”. This document sets out the criteria that will be assessed at each stage of the recruitment process.

* Please note that this is a PhD-level role, but candidates who have submitted their thesis and are awaiting the award of their PhD will be considered. In these circumstances the appointment will be made at Grade 5, spine point 30, with the title of Research Assistant. Upon confirmation of the award of the PhD, the job title will become Research Associate and the salary will increase to Grade 6.

Further Information

We pride ourselves on being inclusive and welcoming. We embrace diversity and want everyone to feel that they belong and are connected to others in our community.

We are committed to working with our staff and unions on these and other issues, to continue to support our people and to develop a diverse and inclusive culture at King's.

We ask all candidates to submit a copy of their CV, and a supporting statement, detailing how they meet the essential criteria listed in the advert. If we receive a strong field of candidates, we may use the desirable criteria to choose our final shortlist, so please include your evidence against these where possible.

To find out how our managers will review your application, please take a look at our ‘How we Recruit’ pages.

Interviews are likely to be held in late September.

We are not able to offer sponsorship for candidates who do not currently possess the right to work in the UK.