
Assistive Robotics Experiences: Towards More Inclusive Human-robot Collaboration via High-level User Specifications and Multimodal Interactions [Self-Funded Students Only]

   Cardiff School of Computer Science & Informatics

About the Project

Human-robot collaboration (HRC) requires efficient mechanisms for exchanging information between human partners and robots, e.g., to allow a human partner to command a robot and to allow a robot to express its interpretations and intentions. Such exchange of information can be achieved through different modalities, such as voice commands, hand gestures and, more recently, virtual/augmented reality interactions. Nonetheless, most existing approaches that employ such modalities require human partners to provide a detailed specification of the intended task, i.e., a person needs to tell a robot how to complete a given task. Such detailed specifications assume human partners with full mobility, thus leading to non-inclusive HRC scenarios.

An alternative that enables more inclusive collaboration between humans and robots is to further improve and exploit a robot's decision-making capabilities. By doing so, human partners need only provide the high-level aspects of a task, i.e., a person will only need to tell the robot what to do, not how to do it, thus enabling different forms of autonomy during remote operation and avoiding reliance on a human partner's full mobility and presence.

Our previous work has focused on increasing robot autonomy while allowing users to provide high-level commands via an Augmented Reality (AR) interface [1]. Our proposed AR interface has proven useful in allowing people to command robots and to visualise a robot's planned actions by enabling interaction with virtual objects placed in the real world. However, our preliminary results were still limited to users with full mobility. We therefore now propose to combine multiple interaction modalities for providing high-level commands to a robot, in order to create more meaningful and inclusive assistive robotic experiences. For example, our previously introduced AR interface could be complemented with voice commands or gaze interaction as compelling modalities in contexts where hand input is unavailable, inconvenient, or less readily available.

This research project will follow a user-centred, robot-agnostic and object-centric approach to investigate complementary interaction modalities that can support collaboration with robots, particularly when a human partner has mobility limitations. Anticipated results will contribute a set of inclusive interaction methods for human-robot interaction, with potential impact on industrial applications and the wellbeing-driven design of social companion robots.

Please contact Dr Juan D. Hernandez Vega for further information on the project:

Academic Criteria

A 2:1 Honours undergraduate degree or a master's degree in computing or a related subject. Applicants with appropriate professional experience will also be considered. Degree-level mathematics (or equivalent) is required for research in some project areas.

Applicants for whom English is not their first language must demonstrate proficiency by obtaining an IELTS score of at least 6.5 overall, with a minimum of 6.0 in each skills component. 

How to apply

Please contact the supervisors of the project prior to submitting your application to discuss and develop an individual research proposal that builds on the information provided in this advert. Once you have developed the proposal with support from the supervisors, please submit your application following the instructions provided below.

This project is accepting applications from self-funded candidates all year round.

In order to be considered, candidates must submit the following information:

  • Supporting statement 
  • CV 
  • In the ‘Research Proposal’ section of the application, enter the name of the project you are applying to and upload your individual research proposal, as described above
  • Qualification certificates and transcripts
  • Proof of funding, for example a letter of intent from your sponsor or confirmation of self-funded status (in the funding field of your application, insert "Self-Funded")
  • Two references
  • Proof of English language (if applicable) 

Interview - If the application meets the entrance requirements, you will be invited to an interview.  

If you have any additional questions or need more information, please contact:  

Funding Notes

This project is offered for self-funded students only, or those with their own sponsorship or scholarship award.
Please note that a PhD Scholarship may also be available for this PhD project. If you are interested in applying for a PhD Scholarship, please search FindAPhD for this specific project title, supervisor or School within its Scholarships category.


[1] J. D. Hernández, S. Sobti, A. Sciola, M. Moll, and L. E. Kavraki, "Increasing robot autonomy via motion planning and an augmented reality interface," IEEE Robot. Autom. Lett., vol. 5, no. 2, pp. 1017–1023, Apr. 2020.
[2] J. Welge and M. Hassenzahl, "Better than human: About the psychological superpowers of robots," in Proc. Int. Conf. Social Robotics, Springer, Cham, 2016, pp. 993–1002.
[3] K. M. Tsui, J. M. Dalphond, D. J. Brooks, M. S. Medvedev, E. McCann, J. Allspaw, et al., "Accessible human-robot interaction for telepresence robots: A case study," Paladyn, J. Behav. Robot., vol. 6, no. 1, 2015.

How good is research at Cardiff University in Computer Science and Informatics?

Research output data provided by the Research Excellence Framework (REF)


