Robots interacting and collaborating seamlessly with humans is a notion that has captivated society and industry for decades. In particular, manufacturing, service and assistive robotics have shown great interest in the integration of intelligent devices that can safely and efficiently interact with, collaborate with and assist humans. Examples include robot co-workers that safely collaborate with humans in training and assembly tasks, robots that provide services at home such as cleaning and cooking, and wearable assistive robots that help humans perform activities of daily living such as walking and handling objects.
The design and development of this type of robot requires sophisticated cognitive architectures composed of perception (high-level) and control (low-level) layers. The high-level layer implements Artificial Intelligence algorithms for perception and decision-making processes using data from the multiple sensing modalities available in the robot. The decisions from this layer are used by the low-level layer to execute advanced control methods that provide the actual control of the robotic device, e.g., a mobile platform, robotic arm or wearable robot. This process, known as the perception–action loop, is crucial for deploying robots capable of understanding the state of the surrounding environment while safely interacting and collaborating with humans and other robots.
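The two-layer architecture described above can be sketched in a few lines of code. This is a minimal illustrative sketch, not part of the project itself: the class names, the decision rule (pick the strongest sensed signal) and the command format are all assumptions made purely for illustration.

```python
# Minimal sketch of a perception-action loop with a high-level perception
# layer and a low-level control layer. All names and the toy decision rule
# are illustrative assumptions, not a real robotics API.

class PerceptionLayer:
    """High-level layer: fuses multimodal sensor data into a decision."""
    def decide(self, sensor_data):
        # Toy decision: attend to the modality with the strongest reading.
        return max(sensor_data, key=sensor_data.get)

class ControlLayer:
    """Low-level layer: turns a high-level decision into an actuator command."""
    def act(self, decision):
        return f"actuate:{decision}"

def perception_action_loop(sensor_stream):
    """Run the perception-action loop over a stream of sensor readings."""
    perception, control = PerceptionLayer(), ControlLayer()
    commands = []
    for sensor_data in sensor_stream:
        decision = perception.decide(sensor_data)  # high-level perception
        commands.append(control.act(decision))     # low-level control
    return commands

# Example: two time steps of touch/vision readings.
stream = [{"touch": 0.2, "vision": 0.9}, {"touch": 0.7, "vision": 0.1}]
print(perception_action_loop(stream))  # ['actuate:vision', 'actuate:touch']
```

In a real system the decision step would run learned perception models over the sensor data and the control step would issue motor commands, but the closed-loop structure is the same.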
This multidisciplinary project, which covers knowledge from Artificial Intelligence, Robotics, Electronics & Mechanical Engineering and Neuroscience, seeks to develop advanced and intelligent robotic systems capable of making fast and accurate decisions and actions. Examples of applications that could be developed in this project are: 1) cyber-physical systems that, using wearable sensors and multimodal data, allow humans and robots to safely and closely work in a shared environment; 2) robots that learn to perform actions autonomously through the observation of human actions; 3) wearable robots that recognise the intention of human movement to provide the required assistance; 4) robots that use touch, vision and audio sensors to autonomously navigate and explore the surrounding environment.
This project is associated with the UKRI CDT in Accountable, Responsible and Transparent AI (ART-AI), which is looking for its first cohort of at least 10 students to start in September 2019. Students will be fully funded for 4 years (stipend, UK/EU tuition fees and research support budget). Further details can be found at: http://www.bath.ac.uk/research-centres/ukri-centre-for-doctoral-training-in-accountable-responsible-and-transparent-ai/
The research to be undertaken in this project has a strong multidisciplinary nature. The student is expected to collaborate with partners from computer science, electronic and electrical engineering, mechanical engineering and psychology. Furthermore, the student is expected to attend multiple events such as conferences, project meetings, summer schools and workshops.
Desirable qualities in candidates include intellectual curiosity, a strong background in maths and programming experience.
Applicants should hold, or expect to receive, a First Class or good Upper Second Class Honours degree in Computer Science, Robotics, Electronics, Mechanics, Mathematics, Physics or related areas. A master’s level qualification would also be advantageous.
Informal enquiries about the project should be directed to Dr Uriel Martinez Hernandez on email address [email protected]
Enquiries about the application process should be sent to [email protected]
Formal applications should be made via the University of Bath’s online application form for a PhD in Computer Science: https://samis.bath.ac.uk/urd/sits.urd/run/siw_ipp_lgn.login?process=siw_ipp_app&code1=RDUCM-FP01&code2=0013
Start date: 23 September 2019.