
Deepfakes: re-balancing the governance of persona appropriation created with AI. PhD Law Studentship (Funded by the QUEX Institute)

College of Social Sciences and International Studies

Application deadline: Monday, 31 August 2020
Competition Funded PhD Project (Students Worldwide)

About the Project

Join a world-leading, cross-continental research team

The University of Exeter and the University of Queensland are seeking exceptional students to join a world-leading, cross-continental research team tackling major challenges facing the world’s population in global sustainability and wellbeing as part of the QUEX Institute. The joint PhD programme provides a fantastic opportunity for the most talented doctoral students to work closely with world-class research groups and benefit from the combined expertise and facilities offered at the two institutions, with a lead supervisor within each university. This prestigious programme provides full tuition fees, stipend, travel funds and research training support grants to the successful applicants. The studentship provides funding for up to 42 months (3.5 years).

This interdisciplinary PhD research project proposes new ways of governing the practice of persona appropriation created with artificial intelligence (AI), with the aim of rebalancing public and expert debates on this topic. To achieve this ambitious aim, the studentship will be dedicated to developing an innovative critical framework that will serve as a tool for assessing positive and negative persona appropriation along a spectrum of harms and benefits. Persona appropriation refers to the act of manipulating, modifying, adding or erasing aspects of an individual’s persona; persona can be broadly defined as the representation of a person’s identity through the imitation of their image, voice or likeness. The latest and most challenging form of AI-made persona appropriation is colloquially known as ‘Deepfakes’: synthetic content produced using artificial intelligence. Deepfakes most commonly take the form of fabricated audio-visual footage of a person, created by editing existing (authentic) footage with an AI algorithm to produce more realistic, higher-quality results. This project focuses on synthetic audio-visual content as the most cutting-edge application of the technology in the context of persona appropriation.

Deepfakes have revived public and expert interest in tackling harmful forms of persona appropriation, to the extent that countries such as Australia, the United States and France are implementing or debating new legislation. Industry is also investing in detection programs and developing best-practice principles. Yet these conversations have been overly focused on abusive forms of persona appropriation, at the expense of a more balanced and informed approach that could account for both negative and positive applications of AI in this context. The research project fills this gap in knowledge to produce sustainable governance principles for AI-made persona appropriation.

For more information about this studentship, including how to apply, please follow the instructions on the programme webpage.

Funding Notes

Full tuition fees, a stipend of £15,000 p.a., travel funds of up to £15,000, and a research training support grant (RTSG) of £15,000 are available over the 3.5-year studentship.


FindAPhD. Copyright 2005-2020
All rights reserved.