
Building rapport with avatars in virtual reality


Faculty of Engineering, Computing and the Environment

Dr Gregory Mills | Applications accepted all year round | Self-Funded PhD Students Only

About the Project

One of the central questions in the study of human communication is how strangers create a connection with each other and establish shared understanding and trust. Research suggests that mimicry of non-verbal communication plays an essential role: if we like someone, we tend to subconsciously copy their gestures, posture, facial expressions and head movements. Similarly, if people mimic our behaviours, this increases our positive feelings towards them (Louwerse et al., 2012). These findings have even made their way into popular culture – business people, politicians and other professionals are often instructed to copy the body language of their conversational partner in order to subconsciously influence them.

More recently, researchers have started using virtual reality to investigate how people build rapport (Jun and Bailenson, 2021; Seymour et al., 2021). Virtual reality is extremely powerful here, as it allows researchers to automatically manipulate participants’ non-verbal behaviour in real time and test its effect on the interaction. In one experimental setup, researchers use virtual reality to track and record participants’ head movements and then use those recordings to animate their partner’s head movements (Zhang and Healey, 2018). This gives participants the impression that their partner is mimicking them. Importantly, results from these experiments cast doubt on the original findings: simple mimicry makes people trust their partner less, not more. Instead, rapport depends on people mimicking specific non-verbal behaviours at specific moments of the conversation. It is currently unclear, however, which non-verbal behaviours these are and which moments of the conversation matter.
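To give a flavour of this manipulation, the sketch below shows how such a delayed-playback “chameleon” might be structured. This is a minimal, platform-agnostic Python sketch, not code from the studies cited above; the class name, the 4-second delay and the (timestamp, pitch, yaw, roll) pose format are all illustrative assumptions.

```python
from collections import deque

class MimicryBuffer:
    """Replays one participant's recorded head movements onto their
    partner's avatar after a fixed delay, giving the impression that
    the partner is mimicking them."""

    def __init__(self, delay_s: float = 4.0):
        self.delay_s = delay_s        # echo delay; 4 s is an illustrative choice
        self.buffer: deque = deque()  # stores (timestamp, pitch, yaw, roll)

    def record(self, t: float, pitch: float, yaw: float, roll: float) -> None:
        """Called every frame with the tracked participant's head pose."""
        self.buffer.append((t, pitch, yaw, roll))

    def playback(self, now: float):
        """Return the head pose recorded `delay_s` seconds ago (to apply
        to the avatar this frame), or None if nothing has aged past the
        delay yet."""
        pose = None
        # Keep the newest sample that has aged past the delay window;
        # older samples have already been played back and are discarded.
        while self.buffer and self.buffer[0][0] <= now - self.delay_s:
            pose = self.buffer.popleft()
        return None if pose is None else pose[1:]
```

In an actual Unity or Unreal implementation, record() would be fed from the headset tracker each frame and the pose returned by playback() would drive the partner avatar’s head joint.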

In this project you will program and then conduct a set of experimental studies that detect and transform participants’ non-verbal behaviour in real time, and then test the effect of these transformations on rapport and trust.

The first study focuses on the role of simultaneous nodding in conversation. You will write a program (in Unity, Unreal, or OpenXR) that automatically analyses both participants’ head movements. If the system detects one participant nodding their head at the same time as the other participant, it will either (1) amplify the apparent nodding, so that both participants observe each other nodding more enthusiastically, or (2) attenuate the apparent nodding, so that both participants observe each other nodding less enthusiastically. If simultaneous nodding is one of the mechanisms of rapport, then people who see amplified nodding should build more rapport with each other, whereas attenuating the nodding should break rapport.
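As a rough illustration of what this first study’s pipeline might involve, here is a short Python sketch of a nod detector and a gain-based transformation. Everything here is an assumption for illustration only: the nod frequency band, the power threshold and the gain values are placeholders that the project would need to calibrate empirically.

```python
import numpy as np

# Illustrative constants; the real values would be calibrated empirically.
NOD_BAND_HZ = (1.0, 5.0)   # assumed frequency range of head-nod oscillations
GAINS = {"amplify": 1.5, "attenuate": 0.6, "control": 1.0}

def is_nodding(pitch: np.ndarray, fs: float) -> bool:
    """Crude nod detector over a short window of head-pitch samples:
    nodding shows up as oscillatory pitch velocity whose spectral power
    is concentrated in the nod frequency band."""
    vel = np.diff(pitch) * fs                     # pitch velocity (deg/s)
    power = np.abs(np.fft.rfft(vel)) ** 2
    freqs = np.fft.rfftfreq(len(vel), d=1.0 / fs)
    in_band = power[(freqs >= NOD_BAND_HZ[0]) & (freqs <= NOD_BAND_HZ[1])].sum()
    total = power[freqs > 0].sum()
    return total > 0 and in_band / total > 0.5    # most movement power is nodding

def transform_pitch(pitch: np.ndarray, both_nodding: bool, condition: str) -> np.ndarray:
    """Rescale pitch excursions around the window's mean head orientation,
    but only while both participants nod simultaneously; otherwise the
    movement is passed through unchanged."""
    gain = GAINS[condition] if both_nodding else 1.0
    baseline = pitch.mean()
    return baseline + gain * (pitch - baseline)
```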

The second study focuses on facial expressions. You will write a program that automatically analyses both participants’ facial expressions and then transforms them, so that each participant sees a partner whose facial expression is more similar to their own. If mimicry of facial expressions is one of the mechanisms of rapport, then people who see these transformed expressions should build more rapport with each other.
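If facial expressions are represented as vectors of blendshape (morph-target) weights, as is common in avatar pipelines, the transformation could be as simple as a linear blend. The sketch below rests on that assumption; the representation and the default blending weight are illustrative, not specified by the project.

```python
import numpy as np

def morph_towards_viewer(partner_weights: np.ndarray,
                         viewer_weights: np.ndarray,
                         alpha: float = 0.4) -> np.ndarray:
    """Blend the partner's facial blendshape weights towards the viewer's
    own, so the face the viewer sees partially mimics their expression.
    alpha = 0 renders the partner unchanged; alpha = 1 is full mimicry.
    The 0.4 default is an illustrative value, not taken from the project."""
    return (1.0 - alpha) * partner_weights + alpha * viewer_weights
```

Note that the blend is applied per viewer, so each participant can be shown a different, individually transformed version of their partner’s face.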

In addition to the studies described above, this project will also investigate other non-verbal behaviours of interest, including gestures, eye-gaze and posture.

The findings from these experiments are of central importance for the design of avatars – how can we build avatars that are more engaging and fun to interact with? Conversely, how can we avoid being manipulated by such techniques in the “metaverse” of the near future? This project would suit a candidate who is proficient in programming (Unreal, Unity) and who is also interested in psychological theories of human-human and human-machine interaction. We also encourage applications from individuals with an artistic interest in exploring the creation of virtual avatars that extend human perceptual and physical abilities (see, e.g., McVeigh-Schultz and Isbister, 2021).


Funding Notes

No funding is available for this project

References

Bailenson, J. N., & Yee, N. (2005). Digital Chameleons: Automatic Assimilation of Nonverbal Gestures in Immersive Virtual Environments. Psychological Science, 16, 814–819.
Gurion, T., Healey, P. G., & Hough, J. (2019). Comparing models of speakers’ and listeners’ head nods.
Jun, H., & Bailenson, J. (2021). Nonverbal synchrony, media, and emotion. In Routledge International Handbook of Emotions and Media (pp. 303–315). Routledge.
Louwerse, M., Dale, R., Bard, E., & Jeuniaux, P. (2012). Behavior matching in multimodal communication is synchronized. Cognitive Science, 36(8).
McVeigh-Schultz, J., & Isbister, K. (2021). The Case for “Weird Social” in VR/XR: A Vision of Social Superpowers Beyond Meatspace. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems.
Seymour, M., Yuan, L. I., Dennis, A., & Riemer, K. (2021). Have We Crossed the Uncanny Valley? Understanding Affinity, Trustworthiness, and Preference for Realistic Digital Humans in Immersive Environments. Journal of the Association for Information Systems, 22(3), 9.
Zhang, L., & Healey, P. G. (2018, October). Human, Chameleon or Nodding Dog? In Proceedings of the 20th ACM International Conference on Multimodal Interaction (pp. 428–436).