
Deep View Synthesis for Virtual Reality Video

This project is no longer listed on FindAPhD and may not be available.

  • Supervisors: Dr Christian Richardt, Dr Neill Campbell
  • Full or part time
  • Application Deadline: No more applications being accepted
  • Funding: Competition Funded PhD Project (European/UK Students Only)

Project Description

The goal of this project is to capture and reconstruct the visual appearance of dynamic real-world environments using deep-learning techniques to enable more immersive virtual reality video experiences.

State-of-the-art VR video approaches (e.g. Anderson et al., 2016) produce stereoscopic 360° video, which comprises separate 360° videos for the left and right eye (like 3D movies, but in 360°). The videos can, for example, be viewed on YouTube using a VR headset such as Google Cardboard or Daydream. Unfortunately, such videos only allow viewers to look in different directions; they do not respond to head motion such as moving left/right, forwards/backwards or up/down. Truly immersive VR video, on the other hand, requires ‘freedom of motion’ in six degrees-of-freedom (‘6-DoF’), so that viewers see the correct views of an environment regardless of where they are (3 DoF) and where they are looking (+3 DoF).
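To make the 3-DoF vs 6-DoF distinction concrete, a head pose can be written as three translational numbers plus three rotational ones. The sketch below (a minimal illustration, not from the project; the function name and axis conventions are assumptions) builds a 4×4 rigid-transform matrix from such a pose; a rotation-only 360° video effectively ignores the translation part, while a 6-DoF experience uses all six values.

```python
import numpy as np

def head_pose_matrix(position, yaw, pitch, roll):
    """Build a 4x4 rigid transform from a 6-DoF head pose:
    3 translational DoF (position) + 3 rotational DoF (yaw, pitch, roll).
    Angles in radians; the axis conventions here are illustrative."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    # Rotate about y (yaw), then x (pitch), then z (roll).
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Rx @ Ry
    T[:3, 3] = position  # a 3-DoF (rotation-only) video discards this part
    return T

# Head moved 10 cm right, 5 cm back, turned 90 degrees.
pose = head_pose_matrix(np.array([0.1, 0.0, -0.05]), yaw=np.pi / 2, pitch=0.0, roll=0.0)
```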

This project aims to develop novel-view synthesis techniques using deep learning that are capable of producing high-quality, temporally coherent VR video of dynamic real-world environments from one or more standard or 360° video cameras. In particular, the goal is to convincingly reconstruct the visual dynamics of the real world, such as people and moving animals or plants. The reconstructed dynamic geometry can then provide the foundation for a novel video-based rendering approach that synthesises visually plausible novel views with 6 degrees-of-freedom for the specific head position and orientation of a viewer in VR. This experience will provide correct motion parallax and depth perception to the viewer (like Luo et al., 2018) to ensure unparalleled realism and immersion.
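The classical baseline that learned approaches build on is depth-image-based rendering: back-project each source pixel using its depth, apply the viewer's relative 6-DoF transform, and re-project into the novel view. The sketch below (an illustrative simplification, not the project's method; the function name, nearest-neighbour splatting and z-buffering are assumptions) shows this for a grayscale image — real view-synthesis systems handle occlusion, disocclusion and temporal coherence far more carefully.

```python
import numpy as np

def warp_to_novel_view(image, depth, K, T_src_to_novel):
    """Minimal depth-image-based rendering for a grayscale image:
    back-project source pixels with per-pixel depth, transform them by
    the relative 6-DoF pose, and splat into the novel view with a
    z-buffer (nearest pixel wins)."""
    h, w = depth.shape
    ys, xs = np.mgrid[0:h, 0:w]
    pix = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])  # homogeneous pixels
    rays = np.linalg.inv(K) @ pix                             # back-projected rays
    pts = np.vstack([rays * depth.ravel(), np.ones(h * w)])   # 3D points, source frame
    pts_novel = (T_src_to_novel @ pts)[:3]                    # 3D points, novel frame
    proj = K @ pts_novel                                      # re-project
    z = proj[2]
    u = np.round(proj[0] / z).astype(int)
    v = np.round(proj[1] / z).astype(int)
    out = np.zeros_like(image)
    zbuf = np.full((h, w), np.inf)
    valid = (z > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    src = image.ravel()
    for i in np.flatnonzero(valid):  # z-buffered nearest-neighbour splat
        if z[i] < zbuf[v[i], u[i]]:
            zbuf[v[i], u[i]] = z[i]
            out[v[i], u[i]] = src[i]
    return out
```

With the identity transform the warp reproduces the input image; moving the virtual camera exposes disoccluded regions as holes, which is precisely where learned synthesis must hallucinate plausible content.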

Candidates should normally have a very good undergraduate degree (equivalent to First Class), or a Master’s degree in visual computing, computer science, or a related discipline. A strong mathematical background and strong previous programming experience, preferably in C++ and/or Python, is required. Candidates must have a strong interest in visual computing, and previous experience in computer vision, computer graphics, deep learning and image processing is highly desirable.

For more general information on studying for a PhD in computer science at Bath, see:

Candidate –

Applicants should hold, or expect to receive, a First Class or high Upper Second Class UK Honours degree (or the equivalent qualification gained outside the UK) in a relevant subject. A master’s level qualification would also be advantageous.

Applications –

Informal enquiries are welcomed and should be directed to Dr Christian Richardt ([Email Address Removed]).

Formal applications should be made via the University of Bath’s online application form:

Please ensure that you quote the supervisor’s name and project title in the ‘Your research interests’ section.

More information about applying for a PhD at Bath may be found here:

Anticipated start date: 30 September 2019.

Funding Notes

Candidates may be considered for a University Research Studentship which will cover UK/EU tuition fees, a training support fee of £1,000 per annum and a tax-free maintenance allowance at the UKRI Doctoral Stipend rate (£14,777 in 2018-19) for a period of up to 3.5 years.


R. Anderson, D. Gallup, J. T. Barron, J. Kontkanen, N. Snavely, C. Hernández, S. Agarwal and S. M. Seitz, “Jump: Virtual Reality Video”. ACM Transactions on Graphics (Proceedings of SIGGRAPH Asia 2016).

B. Luo, F. Xu, C. Richardt and J.-H. Yong, “Parallax360: Stereoscopic 360° Scene Representation for Head-Motion Parallax”. IEEE Transactions on Visualization and Computer Graphics (IEEE VR 2018).

FindAPhD. Copyright 2005-2019
All rights reserved.