About the Project
Soon, we anticipate that robot surrogates will become commonplace and feature heavily in our lives. For example, consider a robot surrogate that lets a remote user support a family member with common home tasks. The surrogate would control physical aspects of the environment; for instance, it might move around to fetch objects and hand them to the family member. While one could rely on the robot's own decision-making capabilities to complete such tasks, an efficient modality is still needed for communicating such requests to the robot [3]. A surrogate may also be controlled by more than one person at a time. This project involves fundamental research on human-robot interaction to better understand how a robot surrogate and its remote user(s) establish common ground and understand each other, while also developing new algorithms that grant the robot surrogate autonomy over certain actions, allowing it to cope with uncertainty in commands from the human operator. In the case of multiple operators, we must determine how concurrent actions from multiple users are mapped to a single action in the surrogate. Previous research has begun to address this problem, demonstrating how weighting the actions of individual remote operators affects their perceived sense of agency [1]. However, future research must also consider factors around embodiment in remote operation [2], [3].
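To make the multi-operator mapping concrete, here is a minimal sketch of one possible aggregation scheme: a weighted average that combines concurrent operator actions into a single surrogate action. The function name, the vector representation of actions, and the per-operator weights are all illustrative assumptions, not part of any published system.

```python
import numpy as np

def aggregate_actions(actions, weights):
    """Combine concurrent operator actions into one surrogate action
    via a normalised weighted average (one illustrative scheme)."""
    actions = np.asarray(actions, dtype=float)   # shape: (n_operators, action_dim)
    weights = np.asarray(weights, dtype=float)   # one weight per operator
    weights = weights / weights.sum()            # normalise so weights sum to 1
    return weights @ actions                     # weighted sum over operators

# Two operators command different end-effector velocities (vx, vy):
commands = [[1.0, 0.0],   # operator 1: move right
            [0.0, 1.0]]   # operator 2: move up
print(aggregate_actions(commands, [0.75, 0.25]))  # → [0.75 0.25]
```

How the weights themselves should be set (and how weighting affects each operator's sense of agency) is precisely the kind of open question the project addresses.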
Aims: This project aims to establish how robotic surrogacy affects the sense of presence in virtual and mixed reality environments, and how an understanding of human-human cooperation may inform new AI algorithms for robots. It will also develop new technologies for facilitating human-robot surrogate interaction in virtual and mixed reality.
Research Questions: How should a robot surrogate indicate uncertainty to its master? How can a robot deal with errors made by a human master? How can crowdsourced solutions to a problem be reduced to a single solution for a robot to execute?
Methods: The PhD will involve designing, conducting, and evaluating several human-centred experiments in the lab, applying a mix of quantitative and qualitative research methods. Equipment includes virtual/mixed reality head-mounted displays (e.g., Oculus Quest 2, Microsoft HoloLens), eye-tracking technology, the Care-O-bot 4, and Anki Vector robots.
Deliverables: Students are expected to complete work to a high standard and write up their results to publishable quality for submission to top journals and conferences in HCI and HRI (e.g., the CHI, HRI, IROS, and ICRA conferences; the IJHCI journal).
Keywords: hci, hri, hrc, vr, surrogacy, remote control, robots
For more information on the project, please contact Dr Finnegan, email@example.com
Academic criteria: A 1st Class Honours undergraduate degree or a master's degree, in computing or a related subject. Applicants with appropriate professional experience are also considered. Degree-level mathematics (or equivalent) is required for research in some project areas. Applicants for whom English is not their first language must demonstrate proficiency by obtaining an IELTS score of at least 6.5 overall, with a minimum of 6.0 in each skills component.
Application Information: If you would like to be considered for the School Funded Application, please submit your application before the 30th June 2021.
In the funding field of your application, insert “I am applying for 2021 PhD Scholarship in Computer Science and Informatics”, and specify the project title and supervisors of this project in the text box provided.
This project is also open to Self-Funded students.
Apply online: https://www.cardiff.ac.uk/study/postgraduate/research/programmes/programme/computer-science-and-informatics - Please read the "How to apply" instructions carefully prior to application.
Please note that a School-Funded PhD Scholarship is available for entry 2021/22.
[1]: T. Hagiwara, M. Sugimoto, M. Inami and M. Kitazaki, "Shared Body by Action Integration of Two Persons: Body Ownership, Sense of Agency and Task Performance," 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 2019, pp. 954-955, doi: https://dx.doi.org/10.1109/VR.2019.8798222
[2]: Yang Tian, Yuming Bai, Shengdong Zhao, Chi-Wing Fu, Tianpei Yang, and Pheng Ann Heng. 2020. Virtually Extended Proprioception: Providing Spatial Reference in VR through an Appended Virtual Limb. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI '20). Association for Computing Machinery, New York, NY, USA, 1-12. doi: https://dx.doi.org/10.1145/3313831.3376557
[3]: J. D. Hernández, S. Sobti, A. Sciola, M. Moll and L. E. Kavraki, "Increasing Robot Autonomy via Motion Planning and an Augmented Reality Interface," IEEE Robotics and Automation Letters, vol. 5, no. 2, pp. 1017-1023, April 2020, doi: https://dx.doi.org/10.1109/LRA.2020.2967280