Due to the ongoing healthcare crisis and staff shortages, alternative forms of providing assistance to an ageing population are becoming critical to support a depleted healthcare community. Social robots have the potential to support this community by completing some day-to-day tasks in domestic environments, including non-contact measurement of vital signs, assessment of mood, and keeping track of medication schedules. This would free caregivers to focus on more acute situations. The ability to communicate and interact with each other in a seamless manner is integral to the cohabitation of humans and robots. Sociable robots should be capable of proactively engaging with people within accepted social norms to enhance the interaction process. These social norms exist between humans as a combination of pre-defined ‘rules’ and behaviour reinforced through interaction with other humans. Robots currently struggle with this paradigm, specifically in recognising many of the paralinguistic (e.g. tone, nuance) and non-verbal (e.g. body language) cues of humans.
This project aims to address this shortcoming by developing sociable robots (using a Pepper robot) that utilise multimodal deep learning techniques to interact with humans in a more socially meaningful manner.
The objective is to develop a computational model for human-robot social interaction using paralinguistic and non-verbal cues. These cues will be identified from robotic sensory information by extracting the key features associated with each cue. A dataset of paralinguistic and non-verbal cues will be developed during this phase of the project and disseminated for use by the wider research community. A multimodal deep learning model will then be developed to analyse the extracted features, identifying and classifying the various paralinguistic and non-verbal social cues so that an appropriate onward action can be executed. Endowing robots with the ability to conduct human-robot social interactions will contribute to advancing the integration of robot systems in human-centric environments.
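The pipeline described above (per-modality feature extraction, fusion, then cue classification) can be illustrated with a minimal sketch. This is not the project's model: the feature names, dimensions, weights, and cue labels are all hypothetical, and the linear scorer stands in for the multimodal deep learning model only to show how feature-level (early) fusion of paralinguistic and non-verbal features might feed a classifier.

```python
# Illustrative sketch of early (feature-level) multimodal fusion, assuming
# fixed-length feature vectors have already been extracted per modality.
# All names, dimensions, and weights below are hypothetical examples.

def fuse_features(prosodic, pose):
    """Concatenate per-modality feature vectors into one joint vector."""
    return prosodic + pose

def classify(features, weights, labels):
    """Score each cue class with a linear layer and return the best label."""
    scores = [sum(w * x for w, x in zip(row, features)) for row in weights]
    best = max(range(len(labels)), key=lambda i: scores[i])
    return labels[best]

# Toy example: 2 prosodic + 2 pose features, 2 hypothetical cue classes.
prosodic = [0.9, 0.1]  # e.g. pitch variance, speech rate (illustrative)
pose = [0.2, 0.8]      # e.g. head tilt, arm openness (illustrative)
labels = ["engaged", "disengaged"]
weights = [
    [1.0, 0.0, 0.0, 1.0],  # "engaged": high pitch variance, open posture
    [0.0, 1.0, 1.0, 0.0],  # "disengaged": the opposite pattern
]

joint = fuse_features(prosodic, pose)
print(classify(joint, weights, labels))  # → engaged
```

In practice the hand-set weights would be replaced by a learned network, and late-fusion variants (classifying each modality separately and combining the decisions) are an alternative design choice the model development phase would need to evaluate.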