
Shared Control for Brain-Actuated Robotic Arms towards Fusion of Artificial Intelligence and Human Intelligence


Centre for Accountable, Responsible and Transparent AI

Bath, United Kingdom. Subject areas: Artificial Intelligence, Biomedical Engineering, Biophysics, Electrical Engineering, Electronic Engineering, Engineering Mathematics, Human Computer Interaction, Mechatronics, Robotics, Software Engineering.

About the Project

Despite much progress, the development of a brain-actuated robotic arm that lets patients with motor impairments perform activities of daily living through a brain-computer interface (BCI) remains an ambitious target. One key problem is the poor decoding performance of BCIs, particularly non-invasive BCIs. In this project, we aim to develop a shared control strategy that realizes flexible robotic arm control for reaching and grasping multiple objects. With intelligent assistance provided by robot vision, users are only required to complete a reaching movement and target selection using a simple motor imagery-based BCI with binary output. Alongside user control, the robotic arm, which identifies and localizes potential targets within the workspace in the background, provides both trajectory correction in the reaching phase (to reduce trajectory redundancy) and autonomous grasping assistance in the grasping phase.
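To make the user's side of this division of labour concrete, here is a minimal Python sketch of one plausible mapping from the binary motor-imagery decoder output to a velocity command, with speed scaled by how decisive the decoding is. The function name, the axis convention and the confidence-proportional scaling are illustrative assumptions; the advert only specifies a binary-output BCI driving the reaching movement.

```python
import numpy as np

def mi_to_velocity(p_left: float, v_max: float = 0.05) -> np.ndarray:
    """Map binary motor-imagery decoding to a Cartesian velocity command.

    p_left is the decoder's probability of left-hand imagery (so
    1 - p_left is the probability of right-hand imagery). Scaling the
    speed by decoder confidence is an assumption of this sketch, not a
    design stated in the advert.
    """
    confidence = 2.0 * abs(p_left - 0.5)  # 0 = ambiguous, 1 = certain
    direction = (np.array([-1.0, 0.0, 0.0]) if p_left >= 0.5
                 else np.array([1.0, 0.0, 0.0]))
    return confidence * v_max * direction  # m/s along the chosen axis

# A confident left-hand imagery trial moves the end effector left:
print(mi_to_velocity(0.9))  # approximately [-0.04, 0.0, 0.0]
```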

The ambition of this project is to merge artificial intelligence and human intelligence in the control of brain-actuated robotic arms. BCIs can recognize human motion intention, so human intelligence enters the control loop through the BCI. The robotic arm, in turn, is an autonomous system with artificial or machine intelligence based on visual servo control. For people with severe neuromuscular disorders or accident injuries, a brain-controlled robotic arm is expected to provide assistance in daily life. A primary bottleneck to achieving this is that the information transfer rate of current BCIs is too low to produce multiple reliable commands during online robotic control (quantified below). In this project, machine autonomy is infused into a BCI-controlled robotic arm system, so that user and machine work together to reach and grasp multiple objects in a given task. The intelligent robotic system autonomously localizes potential targets and provides trajectory correction and grasping assistance accordingly, while the user only needs to complete a rough reaching movement and target selection with a basic binary motor imagery-based BCI. This reduces the task difficulty while retaining the user's volitional involvement.
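For context on that bottleneck: the advert does not name a metric, but the standard yardstick for BCI throughput is Wolpaw's information transfer rate. For a BCI offering N possible commands, each decoded with accuracy P, the bits conveyed per selection are:

```latex
% Wolpaw's information transfer rate per selection,
% for N commands decoded with accuracy P:
B = \log_2 N + P \log_2 P + (1 - P)\,\log_2\!\left(\frac{1 - P}{N - 1}\right)
```

For a binary motor-imagery BCI (N = 2) at a typical accuracy of P = 0.8, this gives roughly 0.28 bits per selection, far too little for direct multi-degree-of-freedom arm control; hence the division of labour with the autonomous vision system.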

The shared control system consists of three subsystems: the BCI system, the robotic arm system, and the arbitrator. A depth camera will be mounted on the gripper. Human subjects convey their intent by performing motor imagery tasks, and advanced BCI algorithms will be used to recognize that intent; the probability values obtained from BCI decoding will be used to generate the user velocity commands. The depth camera records point clouds of the scene, from which the poses of potential target blocks are estimated; deep learning methods will be used for this robotic vision. The endpoint position of the robotic arm and the estimated locations of potential target blocks are monitored during movement and used to identify the user's intent and determine the type of assistance. During an object-grasping task, two types of intelligent assistance are available: trajectory correction and grasping assistance. The arbitrator decides which kind to provide according to a set of predefined rules for user intent identification; once the type of assistance is determined, the velocity commands derived separately from the human and the robotic arm are blended into the final command sent to the controller.
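A minimal sketch of such an arbitrator is given below, assuming a nearest-target rule for intent identification and confidence-weighted linear blending. The distance thresholds, the blending law and all names here are illustrative assumptions rather than the project's actual design.

```python
import numpy as np

GRASP_RADIUS = 0.05   # m; assumed hand-off distance to autonomous grasping
ASSIST_RADIUS = 0.30  # m; assumed range where trajectory correction engages

def arbitrate(end_effector: np.ndarray,
              targets: list[np.ndarray],
              v_user: np.ndarray,
              p_decode: float) -> np.ndarray:
    """Rule-based arbitration between user control and robot assistance.

    end_effector : current Cartesian position of the gripper
    targets      : target positions estimated from the depth camera point cloud
    v_user       : velocity command generated from BCI decoding probabilities
    p_decode     : decoder confidence in (0.5, 1.0] for the binary MI task
    """
    # Intent identification: take the nearest target as the one the user
    # is reaching for (a deliberately simple predefined rule).
    dists = [np.linalg.norm(t - end_effector) for t in targets]
    target = targets[int(np.argmin(dists))]
    d = min(dists)

    if d < GRASP_RADIUS:
        # Grasping phase: stop translation and hand over to the
        # autonomous grasping controller.
        return np.zeros(3)
    if d < ASSIST_RADIUS:
        # Reaching phase near a target: blend a corrective velocity toward
        # the inferred target with the user's command; the less confident
        # the decoder, the more weight the correction gets.
        v_robot = 0.05 * (target - end_effector) / d  # 0.05 m/s toward target
        alpha = 1.0 - p_decode
        return (1.0 - alpha) * v_user + alpha * v_robot
    # Far from every target: pure user control.
    return v_user
```

In this sketch the final command is always a convex combination of the human and robot velocities, which matches the advert's description of blended commands being sent to the controller.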

This research project will be carried out as part of an interdisciplinary integrated PhD in the UKRI Centre for Doctoral Training in Accountable, Responsible and Transparent AI (ART-AI). The ART-AI CDT aims to produce interdisciplinary graduates who can act as leaders and innovators with the knowledge to make the right decisions on what is possible, what is desirable, and how AI can be ethically, safely and effectively deployed.

Candidates are expected to hold, or be close to completing, an MSc or MEng in Electrical Engineering, Control Engineering, Robotics, Mechatronics, Computer Science, Mathematics, Physics or a related area.

Informal enquiries about the project should be directed to Dr Dingguo Zhang.

Enquiries about the application process should be sent to .

Formal applications should be made via the University of Bath’s online application form: https://samis.bath.ac.uk/urd/sits.urd/run/siw_ipp_lgn.login?process=siw_ipp_app&code1=RDUCM-FP02&code2=0003

Start date: 4 October 2021.


Funding Notes

ART-AI CDT studentships are available on a competition basis for up to 4 years. Funding will cover tuition fees and maintenance at the UKRI doctoral stipend rate (£15,285 per annum in 2020/21, increased annually in line with the GDP deflator). We offer at least ten studentships each year, up to three of which can be awarded to international students.

We also welcome all-year-round applications from self-funded candidates and candidates who can source their own funding.

References

1. Y. Xu, L. Cao, X. Shu and D. Zhang, "A Shared Control Strategy for Reach and Grasp of Multiple Objects Using Robot Vision and Non-invasive Brain-Computer Interface", IEEE Transactions on Automation Science and Engineering, 2020, in press.
2. Y. Xu, C. Ding, X. Shu, K. Gui, Y. Bezsudnova, X. Sheng and D. Zhang, "Shared control of a robotic arm using non-invasive brain-computer interface and computer vision guidance", Robotics and Autonomous Systems, 115, pp. 121-129, 2019.
3. M. Wang, R. Li, R. Zhang, G. Li and D. Zhang, "A Wearable SSVEP-Based BCI System for Quadcopter Control Using Head-Mounted Device", IEEE Access, 6, pp. 26789-26798, 2018.
4. M. E. M. Mashat, G. Li and D. Zhang, "Human-to-human closed-loop control based on brain-to-brain interface and muscle-to-muscle interface", Scientific Reports, 7, 11001, 2017.
5. G. Li and D. Zhang, "Brain-Computer Interface Controlled Cyborg: Establishing a Functional Information Transfer Pathway from Human Brain to Cockroach Brain", PLoS ONE, 11(3), e0150667, 2016.






