
This project is no longer listed and may not be available.

Supervisors: Dr Tom Fincham Haines, Prof Edmund Keogh, Prof Christopher Eccleston
No more applications being accepted. Competition Funded PhD Project (Students Worldwide).
Location: Bath, United Kingdom
Subjects: Data Analysis, Gender Studies, Health Informatics, Software Engineering, Other

About the Project

Pain signals threat and motivates people to engage in protective behaviours. Pain also has a wider social function: it elicits helping responses from others. However, the subjective nature of pain requires observers to infer a person’s needs from the verbal and nonverbal signals used to communicate pain (e.g. facial expressions, body movements/posture, vocalisations). How well observers recognise, understand and respond to these nonverbal signals depends on individual and social-contextual factors, as well as the type of information available. One potential source of variation is gender, especially in the way signals of pain in men and women are recognised and understood.

There has also been increased use of technology in health care, allowing remote care and treatment. This reduces in-person contact and changes the nature of interpersonal interactions. Computer-based expression recognition systems can automatically detect different emotions, with varying degrees of success, and such technologies might lend themselves well to healthcare settings. To be effective, these systems need to detect expressions accurately across a range of individuals, and to be built in such a way that they do not perpetuate, or even create, health inequality.

The aim of this PhD project is to combine psychological and computer vision/machine learning approaches to explore whether an automatic pain detection system can be developed from nonverbal signals. It will build on facial expression detection and body pose estimation, as well as expression encoding for pain. Both big data (traditional emotion datasets) and small data (people experiencing pain) will be used. Validation, to determine whether performance reaches a level suitable for clinical use, is essential. The project will focus on variation that can occur in the detection of pain expressions in men and women. It will explore whether machine learning can be applied in a way that avoids gender-related biases in pain detection, and whether this approach informs our understanding of how such biases might arise.
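To make the bias-checking idea above concrete: one common starting point is to disaggregate a detector's accuracy by group and inspect the gap. The sketch below is not part of the advert and uses entirely synthetic labels with a hypothetical `group_accuracy` helper; a real study would use validated pain-expression datasets and more careful fairness metrics.

```python
# Hypothetical sketch: checking a binary pain-detection model for
# gender-related performance gaps. All data here is synthetic.
import numpy as np

def group_accuracy(y_true, y_pred, group):
    """Prediction accuracy computed separately within each group label."""
    y_true, y_pred, group = map(np.asarray, (y_true, y_pred, group))
    return {g: float((y_pred[group == g] == y_true[group == g]).mean())
            for g in np.unique(group)}

# Synthetic labels: 1 = pain expression detected, 0 = neutral.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 0, 0]
group  = ["m", "m", "m", "m", "f", "f", "f", "f"]

acc = group_accuracy(y_true, y_pred, group)
gap = abs(acc["m"] - acc["f"])  # accuracy gap between the two groups
print(acc, gap)
```

A large gap on held-out data would suggest the detector recognises pain expressions less reliably for one group, which is exactly the kind of inequality the project aims to avoid.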

Informal enquiries about the project should be directed to Prof Ed Keogh.

Candidates should have a good first degree or a Master’s degree in computer science, psychology, or a related discipline. A strong mathematical background is essential. Good programming skills and previous machine learning experience are highly desirable, as is experience of working in multidisciplinary and/or health settings.

This project is associated with the UKRI Centre for Doctoral Training (CDT) in Accountable, Responsible and Transparent AI (ART-AI).

Enquiries about the application process should be sent to [Email Address Removed].

Formal applications should be made via the University of Bath’s online application form.

Start date: 4 October 2021.

Funding Notes

ART-AI CDT studentships are available on a competition basis for up to 4 years. Funding will cover tuition fees and maintenance at the UKRI doctoral stipend rate (£15,285 per annum in 2020/21, increased annually in line with the GDP deflator). We offer at least ten studentships each year, up to three of which can be awarded to international students.

We also welcome applications from candidates who can source their own funding.


References

Keogh E. Gender differences in the nonverbal communication of pain: a new direction for sex, gender, and pain research? Pain. 2014;155(10):1927-31.
Keogh E, et al. Exploring attentional biases towards facial expressions of pain in men and women. Eur J Pain. 2018;22(9):1617-1627.
Walsh J, et al. Pain communication through body posture: the development and validation of a stimulus set. Pain. 2014;155(11):2282-90.
Güler RA, et al. DensePose: Dense human pose estimation in the wild. 2018.
Samadiani N, et al. A review on automatic facial expression recognition systems assisted by multimodal sensor data. Sensors. 2019;19(8):E1863.
Shan C, et al. Robust facial expression recognition using local binary patterns. Proceedings of the IEEE International Conference on Image Processing (ICIP); Genoa, Italy; 11–14 September 2005; pp. 370–373.

FindAPhD. Copyright 2005-2021
All rights reserved.