
Optimised acquisition and coding for future immersive formats based on deep scene analysis


Project Description

Background
The University of Bristol, through Bristol Vision Institute (BVI), is a world leader in vision science and vision engineering. Bristol has a long and rich tradition at the forefront of visual communications and image understanding, artificial vision systems, and human and animal vision, all built on interdisciplinary collaborations across engineering, science, medicine and the creative arts (www.bristol.ac.uk/vision-institute).

This project involves two BVI research groups – the Visual Information Laboratory (VI-Lab) and the Experimental Psychology Vision Group (EPVG). VI-Lab (www.bristol.ac.uk/vi-lab) undertakes innovative, collaborative and interdisciplinary research, resulting in world-leading technology in the areas of computer vision, image and video communications, content analysis and distributed sensor systems. VI-Lab is one of the largest groupings of its type in the UK. The group’s research studios provide an excellently equipped facility for this work, containing state-of-the-art cameras, displays and measurement equipment, together with a comprehensive suite for psychophysical experimentation and subjective testing. EPVG has an international reputation for its work on human visual behaviour, with particular strengths in understanding basic perceptual processes, visual attention, visual memory, eye movements and natural scene statistics. This work includes understanding human interactions with real and virtual environments.

Project Description
There is a hunger for new, more immersive video content from consumers, producers and network operators. Efforts in this area have focused on extending the video parameter space with greater dynamic range, increased spatial and temporal resolutions, wider colour gamut, enhanced interactivity through 360-degree content, larger displays and, of course, the full stimulation of peripheral vision through head-mounted displays, which provide a greater sense of immersion for many users. There is, however, only a limited understanding of the interactions between the dimensions of this extended parameter space and their relationship to content statistics, visual engagement and delivery methods. The way we represent these immersive video formats is therefore key to ensuring that content is delivered at an appropriate quality, one that preserves the intended immersive properties of the format while remaining compatible with the bandwidth and variable nature of the transmission channel. Major research innovations are needed to solve this problem.

The research challenges to be addressed are based on the hypothesis that, by exploiting the perceptual properties of the Human Visual System and its content-dependent performance, we can obtain step changes in visual engagement while also managing bit rate. We must therefore:
i) understand the perceptual relationships between video parameters and content type; and
ii) develop new visual content representations that adapt to content statistics and their immersive requirements.
These challenges will be addressed in this project by exploiting machine (deep) learning methods to classify scene content and relate it to the extended video parameter space.
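Purely as an illustration of the kind of deep scene analysis referred to above (and not part of the project specification), the following minimal PyTorch sketch shows a small convolutional network that assigns video frames to hypothetical scene-content categories, whose labels could in principle be mapped to acquisition or coding parameters. The class names, architecture and any such mapping are assumptions made for illustration only.

import torch
import torch.nn as nn

# Hypothetical scene-content categories (illustrative only)
SCENE_CLASSES = ["static_texture", "fast_motion", "low_light", "synthetic"]

class SceneClassifier(nn.Module):
    """Tiny CNN that labels RGB frames with a scene-content class."""
    def __init__(self, num_classes: int = len(SCENE_CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),   # global average pooling
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: batch of RGB frames, shape (N, 3, H, W)
        x = self.features(frames).flatten(1)
        return self.classifier(x)

if __name__ == "__main__":
    model = SceneClassifier()
    dummy_frames = torch.randn(4, 3, 224, 224)          # stand-in for sampled video frames
    probs = model(dummy_frames).softmax(dim=1)           # per-class probabilities
    print(dict(zip(SCENE_CLASSES, probs[0].tolist())))   # e.g. choose bit rate/resolution per class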

The person
We are seeking a candidate with an interest in video communication technologies that exploit perceptual processes. The candidate would ideally have an undergraduate or Master’s degree in a relevant discipline such as Computer Science or Electronic Engineering, or could be a Psychology graduate with a strong mathematical background. Due to the interdisciplinary nature of this work, applicants from other scientific backgrounds will also be considered.

Contact
For further details, or to discuss this or other relevant areas, please contact Professor David Bull by email, including a full CV and any other relevant details.

Funding Notes

The post will start in September/October 2019 and will be funded for 3.5 years. The award will cover an enhanced EPSRC stipend, home student fees and a substantial budget for consumables, travel and subsistence. The successful candidate will also have extensive opportunities to work closely with BBC R&D throughout, including secondments and training. Standard EPSRC eligibility rules apply. This advert will remain open until a suitable candidate is recruited; applicants are encouraged to contact us as soon as possible.
