
  Facial emotion analysis to assess the engagement level and reduce video conferencing fatigue


   Faculty of Engineering & Digital Technologies

This project is no longer listed on FindAPhD.com and may not be available.

Supervisors: Dr Irfan Mehmood, Prof Hassan Ugail
Applications accepted all year round
Self-Funded PhD Students Only

About the Project

Faces are a rich source of information about individual identity, mood, and mental state, offering an accessible window into the mechanisms governing our emotions. Studies show that facial expressions are the most expressive way humans display emotion and, after words, the primary source of information about an individual's internal feelings. Everyone, lecturers and students included, uses facial expressions to form impressions of others. Lecturers' facial expressions have been observed to keep students motivated and interested during lectures, and a lecturer can in turn use students' facial expressions as a valuable source of feedback: while delivering a lecture, a lecturer can read students' expressions to decide whether to slow down, speed up, or otherwise modify the presentation.

The basic strategy for optimising classroom behaviour is that teachers must be able to sense changes in students' states of mind; they must be skilled at observing students' facial expressions, actions, and movements. This also helps lecturers to recognise and address their own weaknesses. Lecturers need to be highly skilled at reading emotions in order to gauge students' comprehension from their facial expressions; if they cannot interpret those expressions, students' understanding suffers, with a negative impact on learning.

The aim of this project is to examine whether the facial expressions of participants in videoconferencing can be used to measure engagement and infer comprehension. Our goal is to identify physical behaviours of the face that are linked to emotional states, and then to determine how those emotional states relate to a participant's engagement and comprehension. The work will focus on a case study of an online student–teacher classroom environment. First, the effectiveness of students' facial expressions as non-verbal communication in a virtual pedagogical environment will be investigated. Next, the specific elements of learner behaviour associated with different emotional states, and the corresponding facial expressions signalled by facial action units, will be interpreted. Finally, the project will assess the impact of these facial expressions on student comprehension. Participant engagement in online sessions will be classified into three categories (highly attentive, partially attentive, and not attentive) using facial expressions in the spatial and temporal domains, together with other cues such as gaze direction, head movement, and hand and upper-body movement.
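To make the three-level classification concrete, the sketch below shows one illustrative way such cues could be combined. It is not the project's method: the `FrameCues` fields, thresholds, and weights are all hypothetical assumptions, and in a real system the per-frame cues would come from a face-analysis toolkit (action unit detection, gaze and head-pose estimation) rather than being supplied by hand.

```python
from dataclasses import dataclass

@dataclass
class FrameCues:
    """Hypothetical per-frame cues; in practice these would be produced
    by a facial-analysis pipeline, not constructed manually."""
    gaze_on_screen: bool       # estimated gaze directed at the screen
    head_yaw_deg: float        # head rotation away from the camera, degrees
    expression_valence: float  # -1 (negative) .. +1 (positive) expression score

def engagement_level(frames: list[FrameCues]) -> str:
    """Classify a window of frames as 'highly attentive', 'partially
    attentive' or 'not attentive' using illustrative thresholds."""
    if not frames:
        return "not attentive"
    # Fraction of frames where the participant appears to face the screen.
    attentive = sum(
        f.gaze_on_screen and abs(f.head_yaw_deg) < 20 for f in frames
    ) / len(frames)
    # Fraction of frames with a positive expression.
    positive = sum(f.expression_valence > 0 for f in frames) / len(frames)
    # Weighted combination; weights and cut-offs are assumptions.
    score = 0.7 * attentive + 0.3 * positive
    if score >= 0.6:
        return "highly attentive"
    if score >= 0.3:
        return "partially attentive"
    return "not attentive"
```

A temporal window of frames, rather than a single frame, is scored so that momentary glances away do not dominate the classification; the project would learn such mappings from data rather than hand-tune them.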

Computer Science

Funding Notes

This is a self-funded project; applicants will be expected to pay their own fees or have access to suitable third-party funding, such as the Doctoral Loan from Student Finance.

