Participatory auditing of AI assistants for article generation


   UKRI Centre for Doctoral Training in Safe and Trusted Artificial Intelligence

Supervisors: Prof Elena Simperl, Prof Elizabeth Black | Posted: Sunday, October 20, 2024 | Funded PhD Project (Students Worldwide)

About the Project

Project start date: 1 February 2025. For an exceptional candidate, we may consider a start date of 1 June 2025.

The successful candidate will contribute to the UKRI research project PHAWM (Participatory Harm Auditing Workbenches and Methodologies), which started in May 2024. This is a major project (with £3.5M of investment) involving 7 UK universities and 23 partner organisations. To enable AI stakeholders to carry out audits, the project will produce workbenches that support assessment of the quality and potential harms of AI. The participatory audits will be embedded in methodologies which guide how, when, and by whom these audits are carried out. The project will train stakeholders in carrying out participatory audits and work towards a certification framework for AI solutions. The research is grounded in four use cases: Health, Media Content, Cultural Heritage, and Collaborative Content Generation.

The PHAWM team from the Department of Informatics at King’s will be working on the Collaborative Content Generation use case, supported by the Wikimedia Foundation, Wikidata, Full Fact and the Open Data Institute. We will undertake research into participatory auditing of AI writing assistants for article generation, using LLMs, and focusing on under-resourced languages. There are more than 300 language editions of Wikipedia, but their numbers of articles and editors vary greatly. For example, while Arabic is the fifth most spoken language in the world, Arabic Wikipedia has only 10% as many articles as the English edition, created and maintained by fewer than 4,000 editors. Given Wikipedia’s role as a trusted information source, this can entrench existing inequalities in accessing and sharing knowledge, hamper cultural diversity and heritage efforts, create unfair erasure of marginalised voices and representation, and contribute to the spread of misinformation. More perniciously, foundational AI models also use Wikimedia articles as training data, leading to a potential feedback loop of harms. Stakeholders will include article editors, as well as users of articles and information written by AI, including fact checkers, journalists and researchers.

PhD project description

The project will define a participatory framework and tools to allow editors in the language communities to use article generation assistants effectively (e.g. considering mechanisms like explanations or checklists, but also collaborative features that support team editing). Furthermore, the framework should support editors in agreeing on the responsible AI features to be considered in the evaluation, and in auditing the data and models used against those features.

Deliverables

  • Literature review on user-centric aspects of AI writing assistants
  • Literature review on participatory data and AI, with a focus on Wikipedia
  • User-centric design of audit protocol and criteria
  • Evaluation of mechanisms such as checklists and explanations to improve user trust
  • Audits of prototype article generation models

Alignment with UKRI Centre for Doctoral Training in Safe & Trusted AI

The PhD student working on this project will be aligned with the UKRI Centre for Doctoral Training in Safe and Trusted AI – known as the STAI CDT. Established in 2019, the STAI CDT is led by King's College London in partnership with Imperial College London. It supports more than 70 PhD students across six cohorts, working in areas related to the responsible development of safe and trustworthy AI. Students within the STAI CDT engage closely with their peers and are part of a collaborative community: students regularly come together for training activities, shared lab space is provided, and there are a range of cohort-building activities.

Entry requirements

Applicants will normally be expected to have First Class Honours at BSc level (or equivalent) in computer science or another discipline related to the project. However, in exceptional cases (e.g. where extenuating circumstances apply, or where the candidate has compensating relevant experience) we may consider other qualifications.

Applicants must meet Band D of the King’s College London English Language Requirements.

Applications from individuals with non-standard backgrounds (e.g. coming from industry or a career break) and from underprivileged backgrounds are encouraged, as are applications from women, candidates with disabilities, and candidates from ethnic minorities, who are currently under-represented in the sector. 

Applicants who require an ATAS certificate cannot be considered.

How to apply

Apply via the application portal at:

https://apply.kcl.ac.uk/

Programme name: “UKRI CDT in Safe and Trusted Artificial Intelligence (MPhil/PhD)”

(Programme expected to open for applications by 4 October.)

Under “Employment Details”, select “Yes” to “Do you have relevant work experience you would like to add?” and upload a CV detailing any relevant work or academic experience.

Under “Supporting Statement”, under “Research Proposal”:

  • In “Project Title/Reference” section, enter “STAI-CDT-2024”.
  • In “Brief synopsis of your research proposal” section, enter “STAI-CDT-2024-PHAWM”.
  • Upload a 3–4 page Research Proposal, including:
      ◦ your ideas on the specific challenges you would want to address within the project,
      ◦ a brief review of the relevant state of the art, identifying any limitations or open questions, and
      ◦ an initial plan of the research you would carry out.

Under the “Funding” section of the application:

  • Select: “5. I am applying for a funding award or scholarship administered by King’s College London.”
  • In the “Award Scheme Code or Name” box that appears when you select the option above, enter “STAI-CDT-2024-PHAWM”.

Optionally, you can also upload an Extenuating Circumstance Statement (up to one page). If you've faced significant personal or medical challenges that have impacted your academic performance or relevant experience, you can explain these here. For example, these might include significant caring responsibilities, chronic illness or disability, experience of the care system, or coming from a deprived background. You should explain your circumstances clearly and how they have impacted you. The admissions panel will consider these circumstances in assessing your application. Note that you may be asked to provide evidence of your circumstances (e.g. medical evidence, evidence that you grew up in an area of socio-economic disadvantage according to the ACORN methodology or in an area with a low proportion of students participating in higher education as measured by POLAR4, or evidence that you were eligible for Free School Meals or have experience of the care system).

After applying via the King’s application portal, you must also complete a STAI-CDT Application Information Form.

Send any questions to .


Funding Notes

This PhD project is funded by a studentship that provides funding for 4 years and covers the following.

  • Tuition fees at the appropriate rate, whether home or international.
  • A tax-free stipend set at the UKRI rate plus a £2,000 per annum London weighting (for 2024–25, this is expected to be £21,237).
  • A generous RTSG (Research Training and Support Grant) allowance for research costs, additional training, and attending conferences.

This studentship is available for either a home or international student. However, due to timing constraints, we cannot consider applicants who require an ATAS certificate.

