
Thoughtful gestures: Designing a formal framework to code and decode hand-over-face gestures


   UKRI Centre for Doctoral Training in Socially Intelligent Artificial Agents

  Contact: Mr Jared de Bruin · No more applications being accepted · Funded PhD Project (Students Worldwide)

About the Project

For instructions on how to apply, please see: PhD Studentships: UKRI Centre for Doctoral Training in Socially Intelligent Artificial Agents

Supervisors:

  • Marwa Mahmoud: School of Computing Science
  • Rachael Jack: School of Psychology

Human faces provide a wealth of social information for non-verbal communication. By observing the complex dynamic patterns of facial expressions and associated gestures, such as hand movements, both human and non-human agents can make myriad inferences about a person's emotions, mental states, personality traits, cultural background, or even certain medical conditions. Specifically, people often place their hands on their face as a gesture during social interaction and conversation, which can provide information over and above what is conveyed by their facial expressions alone. Indeed, some specific hand-over-face gestures serve as salient cues for the recognition of cognitive mental states, such as thinking, confusion, curiosity, frustration, and boredom (Mahmoud et al., 2016). Such hand gestures therefore provide a valuable additional channel for multi-modal inference (Mahmoud & Robinson, 2011).
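As a concrete illustration of how a hand-over-face cue might be picked up automatically from video, the sketch below flags frames where a detected hand overlaps the detected face region. It assumes the MediaPipe Python "solutions" API and a crude bounding-box overlap heuristic; neither is prescribed by the project, and the overlap threshold is an arbitrary illustrative choice.

```python
# Hypothetical sketch: flag frames where a hand overlaps the face region.
# Assumes the MediaPipe "solutions" API; the project does not prescribe
# any particular toolkit, and the 0.1 threshold is illustrative only.
import cv2
import mediapipe as mp

def bbox(landmarks):
    """Axis-aligned bounding box (normalised coords) of a landmark list."""
    xs = [lm.x for lm in landmarks]
    ys = [lm.y for lm in landmarks]
    return min(xs), min(ys), max(xs), max(ys)

def overlap_fraction(a, b):
    """Fraction of box a's area covered by box b."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    iw = max(0.0, min(ax1, bx1) - max(ax0, bx0))
    ih = max(0.0, min(ay1, by1) - max(ay0, by0))
    area_a = max(1e-9, (ax1 - ax0) * (ay1 - ay0))
    return (iw * ih) / area_a

def hand_over_face(image_bgr, threshold=0.1):
    """True if any detected hand covers more than `threshold` of the face box."""
    rgb = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB)
    with mp.solutions.face_mesh.FaceMesh(static_image_mode=True) as fm, \
         mp.solutions.hands.Hands(static_image_mode=True) as hands:
        face = fm.process(rgb)
        hand = hands.process(rgb)
    if not face.multi_face_landmarks or not hand.multi_hand_landmarks:
        return False
    face_box = bbox(face.multi_face_landmarks[0].landmark)
    return any(
        overlap_fraction(face_box, bbox(h.landmark)) > threshold
        for h in hand.multi_hand_landmarks
    )

# Usage: hand_over_face(cv2.imread("frame.png"))
```

Such a binary occlusion flag is, of course, far coarser than the gesture coding this project targets; it merely shows where an automatic pipeline could begin.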

Knowledge gap/novelty.

However, the systematic study of hand-over-face gestures (i.e., the complex combination of face and hand gestures) remains limited due to two main empirical challenges: 1. the lack of large, objectively labelled datasets, and 2. the demands of coding signals with a high degree of freedom. Thus, while early studies have provided initial quantitative analyses and interpretations of hand-over-face gestures (Mahmoud et al., 2016; Nojavanasghari et al., 2017), these challenges have hindered the development of coding models and interpretation frameworks.

Aims/objectives.

This project aims to address this gap by designing and implementing the first formal, objective model of hand-over-face gestures, organised around three main aims:

  1. Build a formal naturalistic synthetic dataset of hand-over-face gestures by extending generative models of dynamic social face signals (Jack & Schyns, 2017) and modelling dynamic hand-over-face gestures as an additional social cue in combination with facial expressions and eye and head movements (Year 1)
  2. Use state-of-the-art methodologies from human perception science to systematically model the specific face and hand gesture cues that communicate social and emotion signals within and across cultures (e.g., Western European vs. East Asian populations) (Year 2)
  3. Produce the first formal objective model for coding hand-over-face gestures (Year 3); a toy sketch of what one coding entry in such a model might look like follows this list
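To make the third aim concrete, here is a toy sketch of what one machine-readable entry in such a coding model might look like. All field names and categories are illustrative assumptions; discovering the actual taxonomy is precisely what the project sets out to do.

```python
# Toy, hypothetical coding entry for one hand-over-face gesture event.
# Every field and category below is an illustrative assumption.
from dataclasses import dataclass, field
from enum import Enum

class FaceRegion(Enum):
    CHIN = "chin"
    CHEEK = "cheek"
    MOUTH = "mouth"
    FOREHEAD = "forehead"

class HandAction(Enum):
    REST = "rest"      # hand passively supports the face
    STROKE = "stroke"  # repetitive movement, e.g. chin stroking
    COVER = "cover"    # hand occludes the region

@dataclass
class HandOverFaceEvent:
    region: FaceRegion   # where the hand touches the face
    action: HandAction   # what the hand is doing there
    onset_s: float       # event start within the clip, in seconds
    offset_s: float      # event end within the clip, in seconds
    action_units: list[int] = field(default_factory=list)  # co-occurring FACS AUs

# e.g. chin-resting while AU4 (brow lowerer) is active -- a classic "thinking" cue
event = HandOverFaceEvent(FaceRegion.CHIN, HandAction.REST, 1.2, 3.8, [4])
```

Recording the co-occurring facial action units alongside the hand event reflects the multi-modal character of the signal that the dataset in Aim 1 is meant to capture.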

Methods. The project will build on and extend state-of-the-art 3D modelling of dynamic hand gestures and facial expressions to produce an exhaustive dataset of hand-over-face gestures based on natural expressions. It will also draw on theories and experimental methods from emotion science to run human perception experiments that identify taxonomies and coding schemes for these gestures and validate their interpretations within and across cultures.
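The perception-science methodology referenced here (cf. Jack & Schyns, 2017) rests on reverse correlation: observers categorise randomly sampled combinations of candidate cues, and each cue is then scored by how strongly its presence predicts a given percept. The short simulation below sketches that scoring logic; the cue names and the simulated observer are illustrative assumptions, not the project's experimental design.

```python
# Minimal simulation of reverse-correlation scoring: which candidate cues
# drive a "thinking" percept? Cue names and observer model are illustrative.
import numpy as np

rng = np.random.default_rng(0)
cues = ["chin_rest", "cheek_touch", "mouth_cover", "AU4_brow_lowerer"]
n_trials = 2000

# Each trial randomly switches each candidate cue on (1) or off (0).
stimuli = rng.integers(0, 2, size=(n_trials, len(cues)))

# Simulated observer: responds "thinking" mostly when chin_rest and AU4 co-occur.
p_yes = 0.1 + 0.7 * (stimuli[:, 0] & stimuli[:, 3])
responses = rng.random(n_trials) < p_yes

# Diagnostic value of each cue: presence rate on "yes" minus "no" trials.
diagnostic = stimuli[responses].mean(axis=0) - stimuli[~responses].mean(axis=0)
for name, d in zip(cues, diagnostic):
    print(f"{name:>18}: {d:+.2f}")
```

Running this recovers high positive scores for chin_rest and AU4_brow_lowerer, i.e. exactly the cues the simulated observer used; applied to real observers across cultures, the same logic yields the culture-specific cue models the project is after.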

Outputs/impact/industrial interests.

The formal objective framework produced from this project will serve as a vital building block for vision-based facial expression and gesture inference systems and applications in many domains, including emotion recognition, mental health applications, online education platforms, robotics, and marketing research.


References

1. Mahmoud, M., & Robinson, P. (2011). Interpreting hand-over-face gestures. International Conference on Affective Computing and Intelligent Interaction (ACII).
2. Mahmoud, M., Baltrušaitis, T., & Robinson, P. (2016). Automatic analysis of naturalistic hand-over-face gestures. ACM Transactions on Interactive Intelligent Systems, 6(2).
3. Nojavanasghari, B., Hughes, C. E., Baltrušaitis, T., & Morency, L.-P. (2017). Hand2Face: Automatic synthesis and recognition of hand over face occlusions. International Conference on Affective Computing and Intelligent Interaction (ACII).
4. Jack, R. E., & Schyns, P. G. (2017). Toward a social psychophysics of face communication. Annual Review of Psychology, 68, 269-297.
5. Jack, R. E., Garrod, O. G. B., Yu, H., Caldara, R., & Schyns, P. G. (2012). Facial expressions of emotion are not culturally universal. Proceedings of the National Academy of Sciences, 109(19), 7241-7244.
6. Jack, R. E., & Schyns, P. G. (2015). The human face as a dynamic tool for social communication. Current Biology, 25(14), R621-R634.