
  Detecting pain in human expressions

   Centre for Accountable, Responsible and Transparent AI

This project is no longer listed and may not be available.

Supervisors: Dr Tom Fincham Haines, Prof Edmund Keogh, Prof Christopher Eccleston

No more applications are being accepted. Self-funded PhD students only.

About the Project

Pain signals threat, and motivates people to engage in protective behaviours. Pain also has a wider social function, in that it elicits helping responses from others. However, the subjective nature of pain requires observers to make inferences about a person’s needs from the verbal and non-verbal signals used to communicate pain (e.g. facial expressions, body movements/posture, vocalisations). How well observers recognise, understand and respond to these nonverbal signals depends on individual and social-contextual factors, as well as the type of information available. One potential source of variation is gender, especially in the way signals of pain in men and women are recognised and understood.

There has also been increasing use of technology in healthcare, which allows for remote care and treatment. This reduces in-person contact and changes the nature of interpersonal interactions. Technological advances in computer-based expression recognition systems can automatically detect different emotions, with varying degrees of success. Such technologies might lend themselves well to healthcare settings. To be effective, these systems need to detect expressions accurately across a range of individuals, and to be built in such a way that they do not perpetuate, or even create, health inequality.

The aim of this PhD project is to combine psychological and computer vision/machine learning approaches to explore whether an automatic pain-detection system based on nonverbal signals can be developed. It will build on facial expression detection and body pose estimation, as well as expression encoding for pain. Both big data (traditional emotion datasets) and small data (people experiencing pain) will be used. Validation, to determine whether performance reaches a level suitable for clinical use, is essential. Here the project will focus on potential variation in how pain expressions are detected in men and women. It will explore whether machine learning can be applied in a way that avoids gender-related detection biases, and consider whether this approach informs our understanding of how such biases might arise.
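As a rough illustration of the kind of bias audit the project envisages, the sketch below computes a pain-detection classifier's recall separately for each gender group and the gap between groups. This is a minimal, hypothetical example: the detector outputs, labels, and group codes are invented placeholders, not data or methods from the project.

```python
# Illustrative sketch only: auditing a (hypothetical) pain detector for
# gender-related differences in how reliably it recognises pain expressions.

from collections import defaultdict


def per_group_recall(y_true, y_pred, groups):
    """Recall (true-positive rate) on the 'in pain' class, split by group.

    A large gap between groups would suggest the detector misses pain
    expressions more often for one group than another.
    """
    tp = defaultdict(int)  # correctly detected pain, per group
    fn = defaultdict(int)  # missed pain, per group
    for truth, pred, group in zip(y_true, y_pred, groups):
        if truth == 1:  # person is actually expressing pain
            if pred == 1:
                tp[group] += 1
            else:
                fn[group] += 1
    return {g: tp[g] / (tp[g] + fn[g]) for g in tp}


# Toy data: a detector that misses pain more often for group "F".
y_true = [1, 1, 1, 1, 1, 1, 0, 0]
y_pred = [1, 1, 1, 1, 0, 0, 0, 1]
groups = ["M", "M", "M", "F", "F", "F", "M", "F"]

recalls = per_group_recall(y_true, y_pred, groups)
gap = max(recalls.values()) - min(recalls.values())
# Here recalls == {"M": 1.0, "F": 1/3}, so gap == 2/3: a marked bias.
```

In practice such a per-group comparison would be one of many validation checks, alongside calibration and error analysis across individuals and recording conditions.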

Informal enquiries about the project should be directed to Prof Ed Keogh.

Candidates should have a good first degree or a Master's degree in computer science, psychology, or a related discipline. A strong mathematical background is essential. Good programming skills and previous machine learning experience are highly desirable, as is experience of working in multidisciplinary and/or health settings.

Formal applications should be accompanied by a research proposal and made via the University of Bath’s online application form. Further information about the application process can be found here.

Start date: Between 8 January and 30 September 2024.

Subject areas: Computer Science, Mathematics, Nursing & Health, Psychology, Sociology

Funding Notes

We welcome applications from candidates who can source their own funding. Tuition fees for the 2023/4 academic year are £4,700 (full-time) for Home students and £26,600 (full-time) for International students. Information about eligibility for Home fee status is available from the University.

