About the Project
Facial expression is one of the most effective means for human beings to convey emotions, feelings and intentions. The literature suggests that facial expression accounts for roughly 55% of the overall effect of emotion expression during social interactions. Accordingly, significant progress in facial emotion recognition has been made in the cognitive science, neuroscience and computational intelligence fields. The Facial Action Coding System (FACS), derived from cognitive science, brings anatomical knowledge of facial behaviour to guide emotional facial expression recognition. It provides a set of 46 facial Action Units (AUs) to describe thousands of momentary facial changes, including emotional, conversational and other facial behaviours. An effective facial representation that is robust to variations such as illumination, rotation and pose plays a vital role in the success of emotion recognition. Although many applications employ geometric-based, appearance-based or hybrid methods for automatic facial expression recognition, recognising facial expressions with high accuracy remains a difficult and challenging task because of the subtlety and variability of facial expressions, especially during natural human-robot interaction. Most popular systems are trained on posed-emotion data, which restricts their ability to recognise micro-expressions.
In general, this project aims to provide flexible, cost-effective, robust and accurate fitting, together with effective facial feature extraction, to address some of the above challenges, e.g. diverse rotations, pose variations, scaling differences and micro-expressions. This is a practical project that applies state-of-the-art machine learning and signal processing techniques to emotion detection and recognition in real-life scenarios. In addition, the project focuses on the development of an automatic facial emotion recognition system that fuses different sources of emotional information for decision making. For demonstration, a prototype system is to be implemented based on the proposed approaches for a real application, such as human-robot interaction.
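As a rough illustration of the decision-level fusion idea described above, the sketch below averages the per-emotion probability outputs of two independent classifiers (e.g. one trained on geometric features, one on appearance features) and picks the most likely label. All names, weights and numbers here are hypothetical, chosen for illustration; they are not the project's actual method.

```python
# Hypothetical sketch: decision-level fusion of two classifiers'
# probability distributions over a fixed set of basic emotions.
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def fuse_decisions(probs_a, probs_b, weight_a=0.5):
    """Weighted average of two probability distributions over EMOTIONS;
    returns the fused label and the fused distribution."""
    fused = [weight_a * a + (1.0 - weight_a) * b
             for a, b in zip(probs_a, probs_b)]
    label = EMOTIONS[max(range(len(fused)), key=fused.__getitem__)]
    return label, fused

label, fused = fuse_decisions(
    [0.1, 0.05, 0.05, 0.6, 0.1, 0.1],   # e.g. appearance-based classifier
    [0.2, 0.05, 0.05, 0.5, 0.1, 0.1],   # e.g. geometry-based classifier
)
# label == "happiness"
```

In practice the fusion weights could themselves be learned or optimised (e.g. by the evolutionary feature-selection approaches cited in the references below), rather than fixed.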
Please note the eligibility requirements:
* Academic excellence of the proposed student, i.e. a 2:1 honours degree (or equivalent GPA from non-UK universities; preference for first-class honours); or a Master's degree (preference for Merit or above); or APEL evidence of substantial practitioner achievement.
* Appropriate IELTS score, if required.
For further details of how to apply, entry requirements and the application form, see
https://www.northumbria.ac.uk/research/postgraduate-research-degrees/how-to-apply/
Please note: Applications that do not include a research proposal of approximately 1,000 words (not a copy of the advert), or that do not include the advert reference (e.g. SF18/CIS/MISTRY), will not be considered.
Start Date: 1 March 2019 or 1 June 2019 or 1 October 2019
Northumbria University takes pride in, and values, the quality and diversity of our staff. We welcome applications from all members of the community. The University holds an Athena SWAN Bronze award in recognition of our commitment to improving employment practices for the advancement of gender equality, and is a member of the Euraxess network, which delivers information and support to professional researchers.
References
• Kamlesh Mistry, Li Zhang, Siew Chin Neoh, Chee Peng Lim, Benjamin Fielding, A Micro-GA Embedded PSO Feature Selection Approach to Intelligent Facial Emotion Recognition, IEEE Transactions on Cybernetics, 2017.
• Li Zhang, Kamlesh Mistry, Siew Chin Neoh, Chee Peng Lim, Intelligent Facial Emotion Recognition Using Moth-Firefly Optimization, Knowledge-Based Systems, 2017.
• Li Zhang, Kamlesh Mistry, Ming Jiang, Siew Chin Neoh, Mohammed Alamgir Hossain, Adaptive Facial Point Detection and Emotion Recognition for a Humanoid Robot, Computer Vision and Image Understanding 140 (2015) 93-114.
• Kamlesh Mistry, Jyoti Jasekar, Biju Issac, Li Zhang, Extended LBP Based Facial Expression Recognition System for Adaptive AI Agent Behaviour, Proceedings of the 2018 International Joint Conference on Neural Networks (IJCNN), 2018.