
Multimodal active sensing and perception for autonomous intelligent collaborative and assistive robots

Project Description

Robotic active perception is a key factor in developing autonomous systems capable of safely and intelligently assisting and interacting with humans and their surrounding environment. Active perception allows robots not only to make accurate decisions, but also to autonomously perform actions that improve their own performance during a specific task, such as human-robot collaboration and physical robot assistance.

Even though state-of-the-art robots are equipped with multimodal sensors (e.g., vision, touch, hearing, force, gyroscopes), perception in robotics has typically been investigated using one sensing modality at a time. This limits robots’ potential for autonomous learning, control, decision-making and action, and reduces their capability to behave safely and intelligently.

This project, which lies at the intersection of robotics and machine learning, focuses on the design and development of novel and robust computational methods for active perception using multimodal data from vision, touch, force and inertial measurement sensors simultaneously. The methods developed during this project will be tested on different robotic platforms such as the KUKA robot arm, the NAO humanoid robot, the Baxter robot, the Pioneer mobile platform and quadcopters. Potential applications for these safe and intelligent robots include human-robot interaction and collaboration, manufacturing, autonomous driving and flying, telecontrol and telepresence, assistive robots and wearable robots for physical assistance.

The research to be undertaken in this project is strongly multidisciplinary. The student is expected to collaborate with project partners from electronic and electrical engineering, mechanical engineering, computer science, psychology and physiotherapy, as well as with industrial partners. Furthermore, the student is expected to attend events such as conferences, project meetings, summer schools and workshops.

Candidates are expected to hold, or be near completion of, an undergraduate, MSc or MEng degree in Robotics, Computer Science, Electronics, Mechanics, Mathematics, Physics or a related area. Any English language requirements must be met by the application deadline.

Informal enquiries should be directed to Dr Uriel Martinez-Hernandez ()

Formal applications should be made via the University of Bath’s online application form for a PhD in Electronic & Electrical Engineering. Please ensure that you state the full project title and lead supervisor name on the application form.

More information about applying for a PhD at Bath is available on the University of Bath website.

Anticipated start date: 30 September 2019

Funding Notes

This project is eligible for inclusion in funding rounds scheduled for the end of November 2018 and for January, February, March and April 2019. A full application must have been submitted before inclusion in a funding round.

Funding will cover Home/EU tuition fees, a maintenance stipend (£14,777 pa (2018/19 rate)) and a training support fee of £1,000 per annum for 3.5 years. Early application is strongly recommended.

How good is research at University of Bath in Electrical and Electronic Engineering, Metallurgy and Materials?

FTE Category A staff submitted: 20.50

Research output data provided by the Research Excellence Framework (REF)


