
About the Project

Pain signals threat, and motivates people to engage in protective behaviours. Pain also has a wider social function, in that it elicits helping responses from others. However, the subjective nature of pain requires observers to make inferences about a person’s needs from the verbal and nonverbal signals used to communicate pain (e.g. facial expressions, body movements/posture, vocalisations). How well observers recognise, understand and respond to these nonverbal signals depends on individual and social-contextual factors, as well as the type of information available. One potential source of variation is gender, especially in the way signals of pain in men and women are recognised and understood.

There has also been increased use of technology in healthcare, which allows for remote care and treatment. This reduces ‘in-person’ contact, and changes the nature of interpersonal interactions. Advances in computer-based expression recognition mean that systems can automatically detect different emotions, with varying degrees of success. Such technologies might lend themselves well to healthcare settings. To be effective, such systems need to detect expressions accurately across a range of individuals, and to be built in such a way that they do not perpetuate, or even create, health inequality.

The aim of this PhD project will be to combine psychological and computer vision/machine learning approaches to explore whether it is possible to develop an automatic pain detection system based on nonverbal signals. It will build on facial expression detection and body pose estimation, as well as expression encoding for pain. Both big data (traditional datasets of emotion) and small data (people experiencing pain) will be used. Validation, to determine whether performance reaches a level suitable for clinical use, is essential. Here the project will focus on potential variation that can occur in the detection of pain expressions in men and women. It will explore whether machine learning can be used in a way that avoids gender-related biases in pain detection, and consider whether this approach can inform our understanding of how such biases arise.
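One simple way to quantify the gender-related detection biases described above is to compare a classifier's performance across demographic groups on a labelled evaluation set. The following is a minimal illustrative sketch, not part of the project itself; the function, labels and data are all hypothetical.

```python
# Minimal sketch (illustrative only): measuring whether a pain-detection
# classifier performs differently across groups by comparing per-group
# accuracy. Labels: 1 = pain expressed, 0 = no pain.

def group_accuracies(y_true, y_pred, groups):
    """Return accuracy per demographic group and the largest pairwise gap."""
    by_group = {}
    for t, p, g in zip(y_true, y_pred, groups):
        correct, total = by_group.get(g, (0, 0))
        by_group[g] = (correct + (t == p), total + 1)
    accs = {g: correct / total for g, (correct, total) in by_group.items()}
    gap = max(accs.values()) - min(accs.values())
    return accs, gap

# Hypothetical evaluation data for two groups, "m" and "f".
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 1, 0, 1, 1, 0]
groups = ["m", "m", "m", "m", "f", "f", "f", "f"]

accs, gap = group_accuracies(y_true, y_pred, groups)
print(accs, gap)  # per-group accuracy and the gap between groups
```

A large gap would indicate that the system works better for one group than the other, which is exactly the kind of inequality a clinically usable system must avoid.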

Informal enquiries about the project should be directed to Prof Ed Keogh.

Candidates should have a good first degree or a Master’s degree in computer science, psychology, or a related discipline. A strong mathematical background is essential. Good programming skills and previous machine learning experience are highly desirable, as is experience of working in multidisciplinary and/or health settings.

This project is associated with the UKRI Centre for Doctoral Training (CDT) in Accountable, Responsible and Transparent AI (ART-AI). We value people from different life experiences with a passion for research. The CDT's mission is to graduate diverse specialists who can go out into the world and make a difference.

Formal applications should be accompanied by a research proposal and made via the University of Bath’s online application form. Enquiries about the application process should be sent to .

Start date: 2 October 2023.

Funding Notes

ART-AI CDT studentships are available on a competition basis and applicants are advised to apply early as offers are made from January onwards. Funding will cover tuition fees and maintenance at the UKRI doctoral stipend rate (£17,668 per annum in 2022/23, increased annually in line with the GDP deflator) for up to 4 years.
We also welcome applications from candidates who can source their own funding.


References

Keogh E. Gender differences in the nonverbal communication of pain: a new direction for sex, gender, and pain research? Pain. 2014;155(10):1927-31.
Keogh E, et al. Exploring attentional biases towards facial expressions of pain in men and women. Eur J Pain. 2018;22(9):1617-27.
Walsh et al. Pain communication through body posture: the development and validation of a stimulus set. Pain. 2014;155(11):2282-90.
Güler et al. DensePose: dense human pose estimation in the wild. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR); 2018.
Samadiani et al. A review on automatic facial expression recognition systems assisted by multimodal sensor data. Sensors. 2019;19(8):E1863.
Shan et al. Robust facial expression recognition using local binary patterns. Proceedings of the IEEE International Conference on Image Processing (ICIP); Genoa, Italy; 11-14 September 2005. pp. 370-3.
