Project start date: 1 February 2025. For an exceptional candidate, we may consider a 1 June 2025 start date.
The successful candidate will contribute to the UKRI research project PHAWM (Participatory Harm Auditing Workbenches and Methodologies), which started in May 2024. This is a major project (with £3.5M of investment) involving 7 UK universities and 23 partner organisations. To enable AI stakeholders to carry out audits, the project will produce workbenches that support assessment of the quality and potential harms of AI. These participatory audits will be embedded in methodologies which guide how, when, and by whom audits are carried out. The project will train stakeholders in carrying out participatory audits and work towards a certification framework for AI solutions. The research is grounded in four use cases: Health, Media Content, Cultural Heritage, and Collaborative Content Generation.
The PHAWM team from the Department of Informatics at King’s will be working on the Collaborative Content Generation use case, supported by the Wikimedia Foundation, Wikidata, Full Fact and the Open Data Institute. We will undertake research into participatory auditing of AI writing assistants for article generation, using LLMs, and focusing on under-resourced languages. There are more than 300 language editions of Wikipedia, but the number of articles and editors varies greatly across them. For example, while Arabic is the fifth most spoken language in the world, Arabic Wikipedia has only 10% of the articles of the English version, created and maintained by fewer than 4,000 editors. Given Wikipedia’s role as a trusted information source, this can entrench existing inequalities in accessing and sharing knowledge, hamper cultural diversity and heritage efforts, lead to unfair erasure of marginalised voices and representation, and contribute to the spread of misinformation. More perniciously, foundational AI models also use Wikimedia articles as training data, leading to a potential feedback loop of harms. Stakeholders will include article editors, as well as users of articles and information written by AI, including fact checkers, journalists and researchers.
PhD project description
The project will define a participatory framework and tools to allow editors in the language communities to use article generation assistants effectively (e.g. considering mechanisms like explanations or checklists, but also collaborative features that support team editing). Furthermore, the framework should support editors in agreeing on the responsible AI features to be considered in the evaluation, and in auditing the data and models against those features.
Deliverables
Alignment with UKRI Centre for Doctoral Training in Safe & Trusted AI
The PhD student working on this project will be aligned with the UKRI Centre for Doctoral Training in Safe and Trusted AI – known as the STAI CDT. Established in 2019, the STAI CDT is led by King's College London in partnership with Imperial College London. It supports more than 70 PhD students across six cohorts, working in areas related to the responsible development of safe and trustworthy AI. Students within the STAI CDT engage closely with their peers and are part of a collaborative community. Students regularly come together for training activities, shared lab space is provided, and there is a range of cohort-building activities.
Entry requirements
Applicants will normally be expected to hold a First Class Honours degree at BSc level (or equivalent) in computer science or another discipline related to the project. However, in exceptional cases (e.g. where extenuating circumstances apply, or where the candidate has compensating relevant experience), we may consider other qualifications.
Applicants must meet Band D of the King’s College London English Language Requirements.
Applications from individuals with non-standard backgrounds (e.g. coming from industry or a career break) and from underprivileged backgrounds are encouraged, as are applications from women, candidates with disabilities, and candidates from ethnic minorities, who are currently under-represented in the sector.
Applicants who require an ATAS certificate cannot be considered.
How to apply
Apply via the application portal at:
Programme name: “UKRI CDT in Safe and Trusted Artificial Intelligence (MPhil/PhD)”
(Programme expected to open for applications by 4 October.)
Under “Employment Details”, select “Yes” to “Do you have relevant work experience you would like to add?” and upload a CV detailing any relevant work or academic experience.
Under “Supporting Statement”, under “Research Proposal”:
Under the “Funding” section of the application:
Optionally, you can also upload an Extenuating Circumstance Statement (up to one page). If you've faced significant personal or medical challenges that have impacted your academic performance or relevant experience, you can explain these here. For example, these might include significant caring responsibilities, chronic illness or disability, experience of the care system, or coming from a deprived background. You should explain your circumstances clearly and how they have impacted you. The admissions panel will consider these circumstances in assessing your application. Note that you may be asked to provide evidence of your circumstances (e.g. medical evidence, evidence that you grew up in an area of socio-economic disadvantage according to the ACORN methodology or in an area with a low proportion of students participating in higher education as measured by POLAR4, or evidence that you were eligible for Free School Meals or have experience of the care system).
After applying via the King’s application portal, you must also complete a STAI-CDT Application Information Form.
Send any questions to stai-cdt-admissions@kcl.ac.uk.
This PhD project is funded by a studentship that provides funding for 4 years and covers the following.
This studentship is available for either a home or international student. However, due to timing constraints, we cannot consider applicants who require an ATAS certificate.