About the Project
Virtual humans (e.g. Mica, Siren and Senua) and digital characters are the future of interactive immersive experiences, film and video games. However, they are technically difficult to create and remain largely one-off productions, impractical to build at scale or within small companies. Convincing characters are even more challenging in AR and VR, where a user can act in unexpected ways and where real-time performance is key to avoiding adverse visual effects and negative user responses.
Current video games allow fine-grained shape and colour selection across multiple features (hair, mouth, eyes, tattoos, etc.); examples include Star Citizen, Fallout 4, Black Desert, Destiny and Xenoblade Chronicles X. These interfaces can be time-consuming, and they make it difficult to visualise and 'browse' multiple options intelligently, or to be shown options the player may not have thought of, other than through a random generator that 'seeds' the appearance of the character and lets the user edit from that point. In addition, building assets for video game characters can take professional artists days of highly skilled labour. In the research community, Generative Adversarial Networks (GANs) have recently been shown to produce highly impressive results in image synthesis and exploration (e.g. [CycleGAN, CoGAN, PoseGAN]), and have also been used to synthesise stylised anime characters [AnimeGAN]. However, while there is promising work on the semantic control of these latent spaces (e.g. generating faces with controls for smile intensity or hair colour) [CoGAN], these techniques, like other GAN-based image synthesis methods, have yet to be applied to the domain of 3D character models or character creation.
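The semantic controls mentioned above (e.g. smile intensity, hair colour) are commonly realised by moving a generator's latent code along learned attribute directions. The sketch below illustrates the idea with a toy linear "generator" and a hypothetical "smile" direction; the linear model, the direction vector and all names here are illustrative assumptions standing in for a trained GAN, not part of the cited work:

```python
import numpy as np

# Toy stand-in for a trained GAN generator: maps a latent vector z to a
# flat "image". A real generator (e.g. a deep convolutional network) would
# replace this linear map; the editing logic below is unchanged either way.
rng = np.random.default_rng(0)
LATENT_DIM, IMAGE_DIM = 64, 256
W = rng.standard_normal((IMAGE_DIM, LATENT_DIM))

def generate(z: np.ndarray) -> np.ndarray:
    """Synthesise an 'image' from a latent code."""
    return W @ z

# Hypothetical "smile intensity" direction in latent space. In practice such
# directions are often found by fitting a linear classifier on labelled
# samples and taking the normal of its decision boundary.
smile_direction = rng.standard_normal(LATENT_DIM)
smile_direction /= np.linalg.norm(smile_direction)

def edit(z: np.ndarray, direction: np.ndarray, strength: float) -> np.ndarray:
    """Move the latent code along a semantic direction, then regenerate."""
    return generate(z + strength * direction)

z = rng.standard_normal(LATENT_DIM)
neutral = generate(z)
smiling = edit(z, smile_direction, strength=3.0)
print(neutral.shape, smiling.shape)
```

A slider in a character-creation UI would simply map to `strength`, regenerating the preview as the user drags it; extending the same idea from 2D images to 3D character meshes is part of what this project would investigate.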
In this project, we will investigate the use of AI to enable rapid character and asset creation, whether for personalised, highly stylised avatars or for rapid creation of high-quality characters for, e.g., video games. These systems will have semantically meaningful controls for properties such as shape, size, style and appearance, and will allow users to guide the design of the character with additional data, e.g. uploaded photographs, concept art or even voice audio, to personalise or rapidly accelerate character creation.
The project will be based in CAMERA (www.camera.ac.uk) and will also overlap with research and commercial projects within the centre.
Applicants should hold, or expect to receive, a First Class or high Upper Second Class UK Honours degree (or the equivalent qualification gained outside the UK) in a relevant subject and should have strong mathematical and programming ability (e.g. Python, C++, Matlab, Java). A master’s level qualification would also be advantageous. Non-UK applicants must meet our English language entry requirement http://www.bath.ac.uk/study/pg/apply/english-language/index.html.
How to apply:
Informal enquiries are welcomed and should be addressed to Prof Darren Cosker.
Formal applications should be made via the University of Bath's online application form for a PhD in Computer Science.
Please ensure that you quote the supervisor’s name and project title in the ‘Your research interests’ section.
More information about applying for a PhD at Bath may be found here:
Anticipated start date: 28 September 2020.
Funding will cover UK/EU tuition fees, a stipend (£15,285 per annum, 2020/21 rate) and a training support fee of £1,000 per annum for 3.5 years. Early application is strongly recommended.
Applicants classed as Overseas for tuition fee purposes are NOT eligible for funding; however, we welcome all-year-round applications from self-funded candidates and candidates who can source their own funding.
[PoseGAN] Pose Guided Person Image Generation. Ma et al. NIPS 2017. https://arxiv.org/pdf/1705.09368.pdf
[CycleGAN] Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks. Zhu et al. ICCV 2017. https://arxiv.org/pdf/1703.10593.pdf
[CoGAN] Coupled Generative Adversarial Networks. Liu and Tuzel. NIPS 2016. https://arxiv.org/pdf/1606.07536.pdf