About the Project
Human-robot collaboration (HRC) requires efficient mechanisms for humans and robots to communicate and exchange information, e.g., allowing a human partner to command a robot, and allowing a robot to express its interpretations and intentions. Such communication can be achieved through different modalities, such as voice commands, hand gestures and, more recently, virtual/augmented reality interactions. Nonetheless, most existing approaches that employ these modalities require human partners to provide a detailed specification of the intended task, i.e., a person needs to tell a robot how to complete a given task. Such detailed specifications assume human partners with full mobility, leading to non-inclusive HRC scenarios.
An alternative that enables more inclusive collaboration between humans and robots is to further improve and exploit a robot's decision-making capabilities. Human partners would then only need to specify high-level aspects of a task, i.e., a person tells the robot what to do, but not how to do it. This enables different forms of autonomy during remote operation and avoids relying on the human partner's full mobility and presence.
Our previous work has focused on increasing robot autonomy while allowing users to provide high-level commands via an Augmented Reality (AR) interface [1]. Our proposed AR interface has proven useful in allowing people to command robots and to visualise a robot's planned actions by enabling interaction with virtual objects placed in the real world. However, our preliminary results were still limited to users with full mobility. We therefore now propose to combine multiple interaction modalities for providing high-level commands to a robot, in order to create more meaningful and inclusive assistive robotic experiences. For example, our previously introduced AR interface could be complemented with voice commands or gaze interaction in contexts where hand input is unavailable, inconvenient, or not readily usable.
This research project will follow a user-centred, robot-agnostic and object-centric approach to investigate complementary interaction modalities that can support collaboration with robots, particularly when a human partner has mobility limitations. The anticipated results will contribute a constellation of inclusive interaction methods for human-robot interaction, with potential impact on the industrial and wellbeing-driven design of social companion robots.
Please contact Dr Juan D. Hernandez Vega for further information on the project: https://www.cardiff.ac.uk/people/view/2488123-hernandez-vega-juan
Academic Criteria
A 2:1 Honours undergraduate degree or a master's degree, in computing or a related subject. Applicants with appropriate professional experience are also considered. Degree-level mathematics (or equivalent) is required for research in some project areas.
Applicants for whom English is not their first language must demonstrate proficiency by obtaining an IELTS score of at least 6.5 overall, with a minimum of 6.0 in each skills component.
How to apply
Please contact the supervisors of the project prior to submitting your application to discuss and develop an individual research proposal that builds on the information provided in this advert. Once you have developed the proposal with support from the supervisors, please submit your application following the instructions provided below.
This project is accepting applications all year round, for self-funded candidates via https://www.cardiff.ac.uk/study/postgraduate/research/programmes/programme/computer-science-and-informatics
In order to be considered, candidates must submit the following information:
- Supporting statement
- CV
- In the ‘Research Proposal’ section of the application, enter the name of the project you are applying to and upload your individual research proposal, as described above
- Qualification certificates and transcripts
- Proof of funding, for example a letter of intent from your sponsor or confirmation of self-funded status (in the funding field of your application, insert ‘Self-Funded’)
- Two references
- Proof of English language (if applicable)
Interview - If your application meets the entrance requirements, you will be invited to an interview.
If you have any additional questions or need more information, please contact: COMSC-PGR@cardiff.ac.uk
Funding Notes
Please note that a PhD Scholarship may also be available for this PhD project. If you are interested in applying for a PhD Scholarship, please search FindAPhD for this specific project title, supervisor or School within its Scholarships category.
References
[2] Welge, J., & Hassenzahl, M. (2016). Better than human: about the psychological superpowers of robots. In International Conference on Social Robotics (pp. 993-1002). Springer, Cham.
[3] Tsui, K. M., Dalphond, J. M., Brooks, D. J., Medvedev, M. S., McCann, E., Allspaw, J., ... & Yanco, H. A. (2015). Accessible human-robot interaction for telepresence robots: A case study. Paladyn, Journal of Behavioral Robotics, 6(1).
