
Multi-modal Intelligent Sensing and Recognition for Human-Robot Interaction/Collaboration

This project is no longer listed in the FindAPhD
database and may not be available.

  • Full or part time
  • Supervisors: Dr Ju, Dr C Yang, Dr Bader-EL-Den
  • Application Deadline: No more applications being accepted
  • Self-Funded PhD Students Only

Project Description

PROJECT REF: CCTS3390217

In human-robot interaction/collaboration, the robot must be able to detect, perceive and understand the corresponding human motions in the environment in order to interact, co-operate, imitate or learn in an intelligent manner. Sensory information about both human motions and the environment is captured by various types of sensors, such as cameras, markers, accelerometers and tactile sensors [1]. Research applications of human motion analysis in human-robot interaction/collaboration include programming by demonstration, imitation, tele-operation, activity or context recognition, and humanoid design [2].

In addition, the extraction of meaningful information about the environment through perceptual systems plays a key role in scene representation and recognition, further enabling the robot to interact with humans in a more natural way [7]. The aim of scene representation for HRI is to describe how humans and robots tend to interact around a scene and to generate a representation tied to geography, indicating which types of motion might occur in which part of the scene. Such a representation enables a robot to respond efficiently to user commands that refer to spatial locations, object features or object labels, without re-performing a visual search each time. The objectives of this project are:

1. To develop a multimodal sensing platform for human-robot interaction and collaboration, using various types of sensors (e.g. depth cameras, markers, accelerometers, tactile sensors, force sensors and bio-signal sensors) to capture both human motions and the operating environment.

2. To investigate a more robust and less noisy representation of human action features, covering both local and global features, that incorporates a variety of uncertainties, e.g. image quality, individual action habits and differing environments.

3. To investigate an advanced motion analysis framework, including hierarchical data-fusion strategies and off-the-shelf probabilistic recognition algorithms, to synchronise and fuse the sensory information for real-time analysis and automatic recognition of human actions with satisfactory accuracy and reliable fusion results. Priority is given to balancing the effectiveness and efficiency of the system.

4. To investigate effective methods for scene representation using dynamic neural fields, including transient detectors and temporal variation models. The scene representation will be incorporated into the motion analysis framework to achieve a more effective and stable system.
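The objectives above can be illustrated with a minimal sketch. Everything here is hypothetical and illustrative, not the project's actual method: it combines feature-level fusion of two stand-in sensor streams (objectives 1 and 3), a toy probabilistic recogniser with one Gaussian per action class (a drastically simplified stand-in for the fuzzy Gaussian mixture models of [1]), and a coarse grid-based scene map tying motion labels to workspace regions (objective 4). All class names, features and data are invented for the example.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(0)

# Feature-level fusion of two hypothetical sensor streams: weight each
# modality, then concatenate into a single feature vector.
def fuse_features(camera_feat, accel_feat, w_cam=1.0, w_acc=1.0):
    return np.concatenate([w_cam * np.asarray(camera_feat),
                           w_acc * np.asarray(accel_feat)])

class GaussianActionModel:
    """Toy recogniser: one spherical Gaussian per action class;
    classify a fused feature vector by maximum log-likelihood."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.means_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        self.vars_ = np.array([X[y == c].var() + 1e-6 for c in self.classes_])
        return self

    def predict(self, X):
        # Squared distance of each sample to each class mean -> log-likelihood
        d2 = ((X[:, None, :] - self.means_[None]) ** 2).sum(axis=2)
        loglik = (-d2 / (2 * self.vars_[None])
                  - 0.5 * X.shape[1] * np.log(self.vars_[None]))
        return self.classes_[np.argmax(loglik, axis=1)]

class SceneMap:
    """Coarse grid over the workspace; each cell remembers which motions
    were observed there, so a spatial query needs no fresh visual search."""
    def __init__(self, cell_size=1.0):
        self.cell_size = cell_size
        self.cells = defaultdict(set)  # (i, j) -> set of motion labels

    def _cell(self, x, y):
        return (int(x // self.cell_size), int(y // self.cell_size))

    def observe(self, x, y, motion):
        self.cells[self._cell(x, y)].add(motion)

    def likely_motions(self, x, y):
        return self.cells[self._cell(x, y)]

# Synthetic demo: two well-separated actions from stand-in sensor features.
def make_sample(action):
    base = 1.0 if action == "wave" else -1.0
    cam = base + 0.1 * rng.standard_normal(8)   # stand-in camera features
    acc = -base + 0.1 * rng.standard_normal(4)  # stand-in accelerometer features
    return fuse_features(cam, acc)

actions = np.array(["wave", "push"] * 20)
X = np.array([make_sample(a) for a in actions])
model = GaussianActionModel().fit(X, actions)
accuracy = (model.predict(X) == actions).mean()

scene = SceneMap(cell_size=1.0)
scene.observe(0.2, 0.3, "grasp")     # table region
scene.observe(2.4, 1.8, "handover")  # shared workspace region
```

A real system would replace the toy classifier with hierarchical fusion and a proper probabilistic model, and the scene map with a dynamic neural field, but the interfaces sketched here (fuse, recognise, query-by-location) mirror the pipeline the objectives describe.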

Funding Notes

Please use our online application form and state the project code (CCTS3390217) and title in the personal statement section.

References

References to recent published articles:

[1] Ju Z. and Liu H., "Fuzzy Gaussian Mixture Models," Pattern Recognition, 45(3):1146-1158, 2012.

[2] Ju Z. and Liu H., "Human Hand Motion Analysis With Multisensory Information," IEEE/ASME Transactions on Mechatronics, 19(2):456-466, 2014.

Related Subjects

How good is research at University of Portsmouth in Computer Science and Informatics?

FTE Category A staff submitted: 13.00

Research output data provided by the Research Excellence Framework (REF)

