
  Should we treat embodied artificial intelligence politely?


   Centre for Accountable, Responsible and Transparent AI

Dr N Gjersoe
No more applications being accepted
Competition Funded PhD Project (European/UK Students Only)

About the Project

The prevalence of embodied artificially intelligent agents in intimate spaces such as homes, schools and hospitals is set to grow exponentially. Devices such as Alexa are already a common feature in family homes. We already know that the default psychological mechanism by which naive users make sense of autonomous agents is anthropomorphism – the tendency to treat an object as if it has human-like thoughts and feelings. Anthropomorphism of objects has been shown to increase liking, care and trust in most contexts. It is readily triggered by a range of objects such as cars and computers, but is especially strongly elicited by apparently autonomous and socially interactive devices such as embodied artificial intelligence.

Debates about whether embodied AI should be granted moral rights typically rest not on expense or the agent’s capacity to feel discomfort, but on the Kantian concern that if we act inhumanely towards those that we think of as having minds, we risk becoming more inhumane individuals and societies. The interaction between anthropomorphism and moral concern can be seen in examples such as the outpouring of public fury when hitchBOT, an autonomous robot designed to hitchhike around Canada, was found vandalised and left beside the highway. Outraged commentary consistently referred to the perpetrators as “cruel” and “inhumane”.

Alongside this concern, designers must consider how best to build mechanisms into their products that protect them from deliberate damage. For example, the Robovie 2 robot has now been programmed with an additional algorithm that causes it to avoid groups of humans under 1.4 metres in height and to move towards groups containing at least one human above that height. This follows observational research in shopping malls showing unsupervised groups of children repeatedly obstructing, punching and kicking the robot.
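For illustration only, the height-based rule described above might be sketched as follows. This is a minimal sketch, not the actual Robovie 2 software: the function name, the handling of an empty group and the representation of detected heights are all assumptions.

```python
# Illustrative sketch of the height-based avoidance rule described above:
# avoid groups in which every detected person is under 1.4 m tall,
# approach groups containing at least one taller person.

HEIGHT_THRESHOLD_M = 1.4  # threshold reported in the project description


def choose_action(group_heights_m):
    """Return 'avoid' or 'approach' for a list of detected person heights (metres)."""
    if not group_heights_m:
        return "approach"  # no one detected; assumed behaviour for the empty case
    if any(height > HEIGHT_THRESHOLD_M for height in group_heights_m):
        return "approach"  # at least one person above the threshold
    return "avoid"         # group consists only of small (likely young) humans


# Example usage
print(choose_action([1.1, 1.2, 1.3]))    # -> 'avoid'
print(choose_action([1.2, 1.3, 1.75]))   # -> 'approach'
```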

The current generation of children will be the first for whom embodied AI is a day-to-day phenomenon. What little research has been conducted suggests that young children do not intuitively respond to such devices with the same moral concern as adults. This is an especially fertile age at which to explore the implications of moral concern towards embodied AI, as children’s own moral thinking and feeling go through a spurt of development at this time and they are especially sensitive to the social cues they receive and observe. The proposed research will focus on Alexa (or another household AI) to explore a) what design cues elicit or limit anthropomorphism for children and adults, b) the relationship between anthropomorphism and moral concern, and c) whether children’s and adults’ behaviour towards the embodied AI has measurable implications for subsequent moral concern towards living agents. If the student is interested, there are opportunities for co-supervision through the Bristol Robotics Laboratory and a close working relationship with the Bristol Science Museum, We The Curious.

This project is associated with the UKRI CDT in Accountable, Responsible and Transparent AI (ART-AI), which is looking for its second cohort of at least 10 students to start in September 2020. Further details can be found at: http://www.bath.ac.uk/centres-for-doctoral-training/ukri-centre-for-doctoral-training-in-accountable-responsible-and-transparent-ai/.

Desirable qualities in candidates include intellectual curiosity, a strong background in maths, and programming experience.

Applicants should hold, or expect to receive, a First or Upper Second Class Honours degree. A master’s level qualification would also be advantageous.

Informal enquiries about the research should be directed to Dr Thalia Gjersoe: [Email Address Removed].

Enquiries about the application process should be sent to [Email Address Removed].

Formal applications should be made via the University of Bath’s online application form: https://samis.bath.ac.uk/urd/sits.urd/run/siw_ipp_lgn.login?process=siw_ipp_app&code1=RDUCM-FP02&code2=0002

Start date: 28 September 2020


Funding Notes

ART-AI CDT studentships are available on a competition basis for UK and EU students for up to 4 years. Funding will cover UK/EU tuition fees as well as providing maintenance at the UKRI doctoral stipend rate (£15,009 per annum in 2019/20, increased annually in line with the GDP deflator) and a training support fee of £1,000 per annum.

We also welcome applications all year round from self-funded candidates and from candidates who can source their own funding.
