
Brain-Computer Interface (BCI) Controlled Robotic Exoskeletons


   Centre for Accountable, Responsible and Transparent AI


Bath, United Kingdom. Subject areas: Artificial Intelligence, Biomedical Engineering, Biophysics, Electrical Engineering, Electronic Engineering, Engineering Mathematics, Human Computer Interaction, Mechatronics, Robotics, Software Engineering.

About the Project

Despite much progress, the development of a brain-actuated robotic exoskeleton that allows patients with motor impairments to perform activities of daily living through a brain-computer interface (BCI) remains an ambitious target. One key problem is the poor decoding performance of BCIs, particularly non-invasive BCIs. In this project, we aim to develop a shared control strategy that realizes flexible exoskeleton control for reaching and grasping multiple objects. With intelligent assistance provided by robot vision, users only need to complete a coarse reaching movement and target selection using a simple motor-imagery-based BCI with binary output. Alongside user control, the exoskeleton identifies and localizes potential targets within the workspace in the background, and can furnish both trajectory correction during the reaching phase, to reduce trajectory redundancy, and autonomous grasping assistance during the grasping phase.
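As an illustration of this shared control idea, the sketch below blends a user velocity derived from a binary motor-imagery class probability with an autonomous correction velocity aimed at a vision-estimated target. The function names, speeds and blending weight are illustrative assumptions, not the project's actual implementation.

```python
# Minimal sketch of shared control: a binary motor-imagery BCI supplies a
# coarse user velocity, robot vision supplies a target estimate, and the two
# are blended for trajectory correction. All names and constants are assumed.
import numpy as np

def user_velocity(p_left: float, speed: float = 0.05) -> np.ndarray:
    """Map the decoder's binary class probability to a lateral velocity."""
    # p_left is the probability of the "left" motor-imagery class; values
    # near 0.5 (an uncertain decode) produce little movement.
    return np.array([speed * (2.0 * p_left - 1.0), 0.0, 0.0])

def correction_velocity(endpoint: np.ndarray, target: np.ndarray,
                        speed: float = 0.05) -> np.ndarray:
    """Autonomous velocity pulling the end-effector toward the target."""
    error = target - endpoint
    dist = np.linalg.norm(error)
    return speed * error / dist if dist > 1e-6 else np.zeros(3)

def blend(v_user: np.ndarray, v_robot: np.ndarray, alpha: float) -> np.ndarray:
    """Linear blending; alpha is the machine-autonomy share in [0, 1]."""
    return (1.0 - alpha) * v_user + alpha * v_robot

# Toy step: endpoint at the origin, vision-estimated target 30 cm ahead-right.
endpoint = np.zeros(3)
target = np.array([0.1, 0.3, 0.0])
v = blend(user_velocity(p_left=0.8),
          correction_velocity(endpoint, target), alpha=0.5)
print(v)
```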

The ambition of this project is to merge artificial intelligence and human intelligence in the control of brain-actuated exoskeletons. The BCI recognizes human motion intention, so human intelligence enters the control loop through the BCI; the exoskeleton is an autonomous system with artificial or machine intelligence based on visual servo control. For people with severe neuromuscular disorders or injuries sustained in accidents, a brain-controlled exoskeleton is expected to provide assistance in daily life. A primary bottleneck is that the information transfer rate of current BCIs is too low to produce multiple, reliable commands during online robotic control. In this project, machine autonomy is therefore infused into a BCI-controlled exoskeleton system, so that user and machine work together to reach and grasp multiple objects in a given task. The intelligent robotic system autonomously localizes potential targets and provides trajectory correction and grasping assistance accordingly, while the user only needs to complete a coarse reaching movement and target selection with a basic binary motor-imagery BCI. This reduces the task difficulty while retaining the user's volitional involvement.
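To make the information-transfer-rate bottleneck concrete, the standard Wolpaw formula estimates how little information a binary BCI delivers per minute; the accuracy and trial length below are illustrative assumptions, not project results.

```python
# Wolpaw information transfer rate (ITR) for an n-class BCI selection.
# The 75% accuracy and 4 s decision time below are illustrative only.
from math import log2

def wolpaw_itr(n_classes: int, accuracy: float, trial_seconds: float) -> float:
    """Bits per minute for an n-class selection at the given accuracy."""
    p = accuracy
    bits = log2(n_classes)
    if 0.0 < p < 1.0:
        bits += p * log2(p) + (1 - p) * log2((1 - p) / (n_classes - 1))
    return bits * 60.0 / trial_seconds

# A 2-class motor-imagery BCI at 75% accuracy, one decision every 4 s:
print(f"{wolpaw_itr(2, 0.75, 4.0):.2f} bits/min")  # roughly 2.8 bits/min
```

At only a few bits per minute, directly commanding every degree of freedom of an exoskeleton is impractical, which is why fine control is delegated to machine autonomy.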

The shared control system consists of three subsystems: the BCI, the exoskeleton, and the arbitrator. A depth camera will be mounted on the gripper. Human subjects convey their intent by performing motor imagery tasks, and advanced BCI algorithms will be used to recognize the motion intent; the class probabilities obtained after BCI decoding will be used to generate the user velocity commands. The depth camera records the scene point clouds, from which the poses of candidate target blocks are estimated; deep learning methods will be used for this robotic vision stage. The endpoint position of the exoskeleton and the estimated locations of the candidate targets are monitored throughout the movement, and are used to identify the user's intent and determine the type of assistance. During an object-grasping task, two types of intelligent assistance are available: trajectory correction and grasping assistance. The arbitrator decides which kind of assistance to provide according to a set of predefined rules for user intent identification. Once the type of assistance is determined, the velocity commands derived separately from the human and the exoskeleton are blended, and the resulting command is sent to the controller.
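The sketch below illustrates one way such a rule-based arbitrator could work, under assumed distance thresholds: far from any target, the user's BCI commands dominate; near a candidate target, trajectory correction blends in; within grasping range, the robot takes over. The thresholds, weights and names are hypothetical, standing in for the project's predefined rules.

```python
# Sketch of a rule-based arbitrator: it watches the exoskeleton endpoint and
# the vision-estimated targets, infers the intended target, and selects the
# assistance mode. All thresholds and blending weights are assumed values.
import numpy as np

REACH_RADIUS = 0.15   # m: inside this, trajectory correction engages (assumed)
GRASP_RADIUS = 0.03   # m: inside this, autonomous grasping takes over (assumed)

def infer_target(endpoint, targets):
    """Pick the candidate target closest to the current endpoint."""
    dists = [np.linalg.norm(t - endpoint) for t in targets]
    i = int(np.argmin(dists))
    return targets[i], dists[i]

def arbitrate(endpoint, targets):
    """Return (assistance mode, machine-autonomy weight alpha)."""
    _, d = infer_target(endpoint, targets)
    if d < GRASP_RADIUS:
        return "grasping_assistance", 1.0   # robot grasps autonomously
    if d < REACH_RADIUS:
        return "trajectory_correction", 0.7 # robot dominates, user still inputs
    return "user_control", 0.0              # coarse reaching driven by the BCI

endpoint = np.array([0.05, 0.25, 0.0])
targets = [np.array([0.1, 0.3, 0.0]), np.array([-0.2, 0.4, 0.1])]
print(arbitrate(endpoint, targets))  # ('trajectory_correction', 0.7)
```

The returned alpha would feed the same linear blending shown earlier, so a single controller interface handles all three modes.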

This research project will be carried out as part of an interdisciplinary integrated PhD in the UKRI Centre for Doctoral Training in Accountable, Responsible and Transparent AI (ART-AI). The ART-AI CDT aims to produce interdisciplinary graduates who can act as leaders and innovators, with the knowledge to make the right decisions about what is possible, what is desirable, and how AI can be deployed ethically, safely and effectively.

Candidates are expected to have, or be near completion of, an MSc or MEng in Electrical Engineering, Control Engineering, Robotics, Mechatronics, Computer Science, Mathematics, Physics or a related area.

Informal enquiries about the project should be directed to Dr Dingguo Zhang.

Formal applications should include a research proposal and be made via the University of Bath’s online application form. Enquiries about the application process should be sent to .

Start date: 3 October 2022.


Funding Notes

ART-AI CDT studentships are available on a competition basis and applicants are advised to apply early as offers are made from January onwards. Funding will cover tuition fees and maintenance at the UKRI doctoral stipend rate (£15,609 per annum in 2021/22, increased annually in line with the GDP deflator) for up to 4 years.
We also welcome applications from candidates who have their own funding.

