
Building Full Body Illusion in Immersive Virtual Reality with Physically Based Avatars


Project Description

This project aims to apply the paradigm of physically based animation to control virtual reality (VR) avatars – characters that represent the player in VR.

Players immersed in VR depend heavily on the response of the environment, which must be physically reliable, predictable, and thus dependable. Current VR experiences often look simplistic and cartoon-like, yet the way they interact with the player is nearly always highly realistic, based on a precise physical model. The slightest deviation easily breaks the sense of immersion.
However, avatars are often exempt from the physically based simulation that is otherwise ubiquitous in virtual reality environments. They simply follow the player’s movement; in the absence of full-body information (systems typically track only the head and hand positions), they are simplified, often reduced to hands alone.

In this project you will attempt to reconstruct a full-body pose from the fragmentary input available from typical VR controllers (head and hand positions, 6 DOF each), using a dynamic, time-constrained bio-physical model of human body movement. This inherently underdetermined problem will be constrained by contextual information based on knowledge (or a guess) of the pose the player intends – such as running, skiing or fighting – leading to a physically reliable recreation of the player’s intentions. For example, a player may mimic fighting or skiing by performing inherently limited versions of the actual movements; in VR, she will then experience combat or downhill skiing in a physically reliable setting based on a biomechanical model of the activity.
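As a rough illustration (not the project’s prescribed method), the reconstruction can be framed as an optimisation: fit joint angles so that the avatar’s forward kinematics matches the tracked controller positions, while a prior term pulls the underdetermined solution toward the pose suggested by the guessed activity. The C# sketch below does this for a toy two-joint planar arm; the link lengths, tracked target, prior pose and weights are all illustrative assumptions, and a real system would use a full biomechanical model with joint limits and temporal constraints.

// Minimal sketch (hypothetical): pose reconstruction as optimisation.
// Tracker mismatch + activity-prior deviation, minimised by gradient descent.
using System;

class PoseSolver
{
    const double L1 = 0.3, L2 = 0.25;            // upper-arm and forearm lengths (m), assumed

    // Forward kinematics: joint angles -> hand position in the shoulder frame.
    static (double x, double y) Hand(double q1, double q2)
    {
        double x = L1 * Math.Cos(q1) + L2 * Math.Cos(q1 + q2);
        double y = L1 * Math.Sin(q1) + L2 * Math.Sin(q1 + q2);
        return (x, y);
    }

    // Cost = squared tracker mismatch + weighted deviation from the activity prior.
    static double Cost(double q1, double q2,
                       double tx, double ty, double p1, double p2, double w)
    {
        var (x, y) = Hand(q1, q2);
        double track = (x - tx) * (x - tx) + (y - ty) * (y - ty);
        double prior = (q1 - p1) * (q1 - p1) + (q2 - p2) * (q2 - p2);
        return track + w * prior;
    }

    static void Main()
    {
        // Tracked hand position (from a VR controller) and a guessed "skiing"
        // prior pose; all values here are illustrative only.
        double tx = 0.35, ty = 0.20;
        double p1 = 0.9, p2 = 0.6;
        double w = 0.05;                          // prior weight

        double q1 = 0.0, q2 = 0.0;                // initial joint angles
        double step = 0.05, eps = 1e-5;

        // Plain finite-difference gradient descent; a real solver would also
        // enforce joint limits and smoothness over time.
        for (int i = 0; i < 2000; i++)
        {
            double g1 = (Cost(q1 + eps, q2, tx, ty, p1, p2, w) -
                         Cost(q1 - eps, q2, tx, ty, p1, p2, w)) / (2 * eps);
            double g2 = (Cost(q1, q2 + eps, tx, ty, p1, p2, w) -
                         Cost(q1, q2 - eps, tx, ty, p1, p2, w)) / (2 * eps);
            q1 -= step * g1;
            q2 -= step * g2;
        }

        var (hx, hy) = Hand(q1, q2);
        Console.WriteLine($"angles: q1={q1:F3}, q2={q2:F3}  hand=({hx:F3},{hy:F3})");
    }
}

In this toy setting the prior weight w plays the role of the contextual guess: a larger w trusts the assumed activity pose more, which is exactly the trade-off the project would have to tune against the sparse tracker data.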
This leads to some interesting questions: can we build bodily self-awareness (in a sense similar to the famous “rubber hand” experiment) even if the reconstructed poses are substantially different from the actual ones, being based instead on the player’s intentions and imagination? Can we enhance immersion this way? Will we create a new “uncanny valley”? How will the proposed approach affect VR sickness (a condition similar to motion sickness, caused by extensive exposure to virtual environments)?

Candidates should have appropriate academic qualifications (a first or upper second class honours degree, or an MSc) in Computer Science, Games Technology, Engineering, Mathematics, Physics or another relevant area, a strong background in programming (C++ or C#), and an interest in game design and psychology.

Funding Notes

There is no funding for this project: applications can only be accepted from self-funded candidates.

References

Martínez del Rincón, J., Lewandowski, M., Nebel, J.-C. and Makris, D. (2014). Generalised Laplacian Eigenmaps for Modelling and Tracking Human Motions. IEEE Transactions on Cybernetics, 44(9), pp. 1646-1660.

Peng, X.B., Abbeel, P., Levine, S. and van de Panne, M. (2018). DeepMimic: Example-Guided Deep Reinforcement Learning of Physics-Based Character Skills. ACM Transactions on Graphics (Proc. SIGGRAPH 2018), 37(4).






