About the Project
As manufacturing processes become more global, and as the current pandemic has clearly demonstrated, businesses require increasing connectivity and collaboration across their worldwide footprint to enable smart working, accelerated decision making, and reduced environmental impact. While current communication technologies, such as smartphones and voice-over-IP (e.g. Skype/Zoom), can facilitate voice and visual communication between remote partners, they cannot achieve the strong sense of “presence” in the remote environment that is required to take decisions with the highest confidence. This is particularly important for manufacturers’ many management meetings, project reviews, and other decision-making fora. Furthermore, realistic representations of attendees in a single virtual decision-making space could save manufacturers significant travel time and cost, and indeed make decisions possible even when travel is severely restricted.
High-fidelity multisensory Virtual Reality techniques offer the potential to capture complex physical activities fully in 3D and then enable users to interact directly in that environment with other users, from a location of their own choosing. However, for decisions to be made with confidence in such virtual environments, it is essential that the environments are not only believable, but also authentically recreate and deliver remotely, in real time, the reconstructed real environment, the people attending, and any items under consideration in the meeting, e.g. a new car design.
This PhD will investigate novel methods of capturing, transmitting, and delivering high-quality, multisensory, 3D data between remote locations. In particular, the research will consider and advance the state of the art in real-time connectivity and authenticity for virtual attendees. This will include accurately matching the lighting and audio of a venue, life-like actions and interactions between real and virtual participants, and accurate real-time tracking to enable natural movement and gesturing.
This project is funded via the Leete Award, generously awarded by The Worshipful Company of Engineers. It is also partially funded via EPSRC.
DESIRED STUDENT BACKGROUND:
Candidates should hold a minimum of an upper second-class (2.1) honours degree (or equivalent) in a mathematical science such as physics, mathematics, engineering, or computer science.
Coding skills are also required, and previous experience with virtual reality and 3D environments would be advantageous.
Stipend: Standard Research Council Maintenance Award: £15,285 + £2,571.43 top-up (top-up is subject to funding)
Funding Eligibility: UK nationals only (must be eligible for home fees status).