Subject areas:
Computer science.
Engineering.
Funding type:
Tuition fee.
Stipend.
Research Training & Support Grant.
4-year fully funded studentship for a Home-fee student to start in June 2025.
Award details
Project start date: 1 June 2025
Supervisors: Professor Elena Simperl and Professor Elizabeth Black
The successful candidate will contribute to the UKRI research project PHAWM (Participatory Harm Auditing Workbenches and Methodologies), a major project (with £3.5M of investment) involving 7 UK universities and 23 partner organisations. PHAWM will produce workbenches that enable diverse AI stakeholders to audit AI technologies, and will develop participatory audit methodologies that guide how, when, and by whom these audits are carried out. The PHAWM project will train stakeholders in carrying out participatory audits and work towards a certification framework for AI solutions. The research is grounded in four use cases: Health, Media Content, Cultural Heritage, and Collaborative Content Generation.
The PHAWM team from the Department of Informatics at King’s will be working on the Collaborative Content Generation use case, supported by the Wikimedia Foundation, Wikidata, Full Fact and the Open Data Institute. We will undertake research into participatory auditing of LLM-based AI writing assistants for article generation, focusing on under-resourced languages. There are more than 300 language editions of Wikipedia, but their numbers of articles and editors vary greatly. For example, although Arabic is the fifth most spoken language by number of speakers, Arabic Wikipedia has only 10% of the articles of the English version, created and maintained by fewer than 4,000 editors. Given Wikipedia’s role as a trusted information source, this can entrench existing inequalities in accessing and sharing knowledge, hamper cultural diversity and heritage efforts, lead to the unfair erasure of marginalised voices and representation, and contribute to the spread of misinformation. More perniciously, foundational AI models also use Wikimedia articles as training data, creating a potential feedback loop of harms. Stakeholders will include article editors, as well as users of articles and information written by AI, including fact checkers, journalists and researchers.
PhD project description
The PhD project will focus specifically on the development of a participatory Responsible AI methodology and tools that support stakeholders in agreeing on the features and values against which a particular AI technology should be audited – including priorities over these and the trade-offs that are acceptable – and then in auditing the tool accordingly. The project will use techniques from computational argumentation to support collaborative participatory decisions about which features and values to prioritise in the audit, and will produce a protocol that specifies the auditing process. The tools and protocol that are developed will be evaluated with different language communities of Wikipedia editors, considering the use of AI writing assistants for article generation.
Deliverables
- Literature review on: user-centric aspects of AI writing assistants; participatory data and AI, with a focus on Wikipedia; computational argumentation for participatory decision making
- User-centric design of argumentation mechanism for collaborative decisions on audit criteria
- User-centric design of audit protocol
- Audits of prototype article generation models
Alignment with UKRI Centre for Doctoral Training in Safe & Trusted AI
The PhD student working on this project will be aligned with the UKRI Centre for Doctoral Training in Safe and Trusted AI – known as the STAI CDT. Established in 2019, the STAI CDT is led by King's College London in partnership with Imperial College London. It currently supports more than 60 PhD students across five cohorts, working in areas related to the responsible development of safe and trustworthy AI. Students within the STAI CDT engage closely with their peers and are part of a collaborative community. Students regularly come together for training activities, shared lab space is provided, and there is a range of cohort-building activities.
Award value
A four-year studentship that covers the following.
Stipend: set at the UKRI rate plus £2,000 per annum London-weighting (for 2024-25, this is expected to be £21,237).
Tuition fees: covered at the appropriate rate, whether home or international.
Other: a generous Research Training and Support Grant (RTSG) allowance for expenses such as research costs, additional training, and conference attendance.
Eligibility criteria
We can only consider candidates who qualify for home fee status.
Applicants will normally be expected to have a First Class Honours degree at BSc level (or equivalent) in computer science or another discipline related to the project. However, in exceptional cases (e.g., where extenuating circumstances apply, or where the candidate has compensating relevant experience) we may consider other qualifications.
Applicants must meet Band D of the King’s College London English Language Requirements.
Applications from individuals with non-standard backgrounds (e.g. those from industry or returning from a career break) and from underprivileged backgrounds are encouraged, as are applications from women, candidates with disabilities, and candidates from ethnic minorities, who are currently under-represented in the sector.
Application process
Make your application via the application portal at: King's Apply
Programme name: “UKRI CDT in Safe and Trusted Artificial Intelligence (MPhil/PhD)”
Under “Employment Details”, select “Yes” to “Do you have relevant work experience you would like to add?” and upload a CV detailing any relevant work or academic experience.
Under “Supporting Statement”, under “Research Proposal”:
- In the “Project Title/Reference” section, enter “STAI-CDT-2024”.
- In the “Brief synopsis of your research proposal” section, enter “STAI-CDT-2024-PHAWM”.
- Upload a 3–4 page Research Proposal, covering:
- your ideas on the specific challenges you would want to address within the project,
- a brief review of the relevant state of the art, identifying any limitations or open questions, and
- your initial plan of the research you would carry out.
Under the “Funding” section of the application:
- Select: “5. I am applying for a funding award or scholarship administered by King’s College London.”
- In the “Award Scheme Code or Name” box that appears when you select the option above, enter “STAI-CDT-2024-PHAWM”.
Optionally, you can also upload an Extenuating Circumstance Statement (up to one page). If you have faced significant personal or medical challenges that have impacted your academic performance or relevant experience, you can explain these here. For example, these might include significant caring responsibilities, chronic illness or disability, experience of the care system, or coming from a deprived background. You should explain clearly your circumstances and how these have impacted you. The admissions panel will consider these circumstances in assessing your application. Note that you may be asked to provide evidence of your circumstances (e.g., medical evidence, evidence you grew up in an area of socio-economic disadvantage according to the ACORN methodology or an area with a low proportion of students participating in higher education as measured by POLAR4, or evidence you were eligible for Free School Meals or have experience of the care system).
After applying via the King’s application portal, you must also complete a STAI-CDT Application Information Form.
Contact Details
Professor Elena Simperl elena.simperl@kcl.ac.uk
Professor Elizabeth Black elizabeth.black@kcl.ac.uk