
Detecting pain in human expressions

This project is no longer listed on FindAPhD.com and may not be available.

  • Supervisors: Dr Tom Fincham Haines, Prof Edmund Keogh, Prof Christopher Eccleston
  • Full or part time
  • Application deadline: No more applications being accepted
  • Competition Funded PhD Project (European/UK Students Only)

Project Description

Pain signals threat, and motivates people to engage in protective behaviours. Pain also has a wider social function, in that it elicits helping responses from others. However, the subjective nature of pain requires observers to make inferences about a person’s needs from the verbal and non-verbal signals used to communicate pain (e.g. facial expressions, body movements/posture, vocalisations). How well observers recognise, understand and respond to these nonverbal signals depends on individual and social-contextual factors, as well as the type of information available. One potential source of variation is gender, especially in the way signals of pain in men and women are recognised and understood.

There has also been increased use of technology in healthcare, allowing for remote care and treatment. This reduces ‘in-person’ contact and changes the nature of interpersonal interactions. Advances in computer-based expression recognition mean that systems can now automatically detect different emotions, with varying degrees of success. Such technologies might lend themselves well to healthcare settings. To be effective, however, such systems need to detect expressions accurately across a range of individuals, and to be built in such a way that they do not perpetuate, or even create, health inequality.

The aim of this PhD project will be to combine psychological and computer vision/machine learning approaches to explore whether it is possible to develop an automatic pain detection system based on nonverbal signals. It will build on facial expression detection and body pose estimation, as well as expression encoding for pain. Both big data (traditional emotion datasets) and small data (people experiencing pain) will be used. Validation, to determine whether a level of performance suitable for clinical use can be achieved, is essential. The project will focus in particular on variation that can occur in the detection of pain expressions in men and women. It will explore whether machine learning can be applied in a way that avoids gender-related biases in pain detection, and consider whether this approach informs our understanding of how such biases might arise.
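
To illustrate the kind of bias-aware evaluation this involves, the sketch below trains a simple pain-expression classifier and reports recall separately for men and women. It is a minimal Python sketch using scikit-learn; the data, feature representation and group labels are hypothetical placeholders for illustration only, not part of the project specification.

# Illustrative sketch only: checking for gender-related detection bias in a
# pain-expression classifier. Features, labels and group attribute are
# randomly generated placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import recall_score

rng = np.random.default_rng(0)

# Hypothetical data: facial/pose features, binary pain labels, and a gender
# attribute used only for disaggregated evaluation (never as a model input).
X = rng.normal(size=(1000, 32))          # e.g. action-unit or pose features
y = rng.integers(0, 2, size=1000)        # 1 = pain expression present
gender = rng.integers(0, 2, size=1000)   # 0 = women, 1 = men (evaluation only)

X_tr, X_te, y_tr, y_te, g_tr, g_te = train_test_split(
    X, y, gender, test_size=0.3, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)

# Report recall per group: a large gap would mean pain is detected less
# reliably for one group, i.e. a gender-related detection bias.
for label, g in (("women", 0), ("men", 1)):
    mask = g_te == g
    print(f"recall ({label}): {recall_score(y_te[mask], pred[mask]):.2f}")

A marked gap between the two recall values would indicate that pain expressions are detected less reliably for one group, which is the kind of gender-related detection bias the project aims to avoid and to explain.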

Informal enquiries about the project should be directed to Prof Ed Keogh: https://researchportal.bath.ac.uk/en/persons/edmund-keogh.

Candidates should have a good first degree or a Master’s degree in computer science, psychology, or a related discipline. A strong mathematical background is essential. Good programming skills and previous machine learning experience are highly desirable, as is experience of working in multidisciplinary and/or health settings.

This project is associated with the UKRI CDT in Accountable, Responsible and Transparent AI (ART-AI), which is looking for its second cohort of at least 10 students to start in September 2020. Further details can be found at: http://www.bath.ac.uk/centres-for-doctoral-training/ukri-centre-for-doctoral-training-in-accountable-responsible-and-transparent-ai/.

Formal applications should be made via the University of Bath’s online application form: https://samis.bath.ac.uk/urd/sits.urd/run/siw_ipp_lgn.login?process=siw_ipp_app&code1=RDUCM-FP02&code2=0002. Enquiries about the application process should be sent to [Email Address Removed].

Start date: 28 September 2020.

Funding Notes

ART-AI CDT studentships are available on a competition basis for UK and EU students for up to 4 years. Funding will cover UK/EU tuition fees as well as providing maintenance at the UKRI doctoral stipend rate (£15,009 per annum in 2019/20, increased annually in line with the GDP deflator) and a training support fee of £1,000 per annum.

We also welcome applications all year round from self-funded candidates and candidates who can source their own funding.

References

Keogh E. Gender differences in the nonverbal communication of pain: a new direction for sex, gender, and pain research? Pain. 2014;155(10):1927-31.
Keogh E, et al. Exploring attentional biases towards facial expressions of pain in men and women. Eur J Pain. 2018;22(9):1617-27.
Walsh et al. Pain communication through body posture: the development and validation of a stimulus set. Pain. 2014;155(11):2282-90.
Güler et al. DensePose: dense human pose estimation in the wild. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR); 2018.
Samadiani et al. A review on automatic facial expression recognition systems assisted by multimodal sensor data. Sensors. 2019;19(8):E1863.
Shan et al. Robust facial expression recognition using local binary patterns. Proceedings of the IEEE International Conference on Image Processing (ICIP); Genoa, Italy; 11-14 September 2005; pp. 370-373.


