Industry Partner: Sony Interactive Entertainment
PhD Supervisor: Damian Murphy; second supervisor tbc.
Description:
Engaging metaverse experiences require significant realism in the acoustics of the virtual environments. Whether creating a plausible virtual reality world or matching real-world acoustics in an augmented reality scene, realistic acoustic rendering is critical to immersion and sense of presence. Modern game engines are starting to provide solutions for geometric acoustic modelling using approximation methods such as the image source method and ray tracing. Wave-based acoustic modelling methods are more physically accurate but carry a significant computational overhead, making them difficult to achieve for real-time interactive audio. This PhD project looks at developing appropriate strategies for combining these techniques in real time, as well as optimisations that can be achieved by exploiting the perceptual limitations of human spatial hearing.
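To illustrate the geometric approximation mentioned above, the sketch below shows the core idea of a first-order image source reflection: the source is mirrored across a wall plane, and the reflection is treated as a direct path from the mirrored "image" source. This is a minimal, self-contained illustration only; the function name, geometry, and speed-of-sound constant are illustrative assumptions, not part of any game engine's API.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, approximate value in air at 20 degrees C


def image_source_delay(source, listener, wall_x):
    """First-order image-source reflection off a rigid wall at x = wall_x.

    Mirrors the source across the wall plane, then treats the reflection
    as a direct path from the mirrored (image) source to the listener.
    Positions are (x, y, z) tuples; returns (path_length_m, delay_s).
    """
    # Mirror the source position across the plane x = wall_x
    image = (2.0 * wall_x - source[0], source[1], source[2])
    # The reflected path length equals the straight-line distance
    # from the image source to the listener
    path = math.dist(image, listener)
    return path, path / SPEED_OF_SOUND


# Example: source 1 m from a wall at x = 0, listener 3 m further along x.
# The image source sits at x = -1, giving a 5 m reflected path.
path_m, delay_s = image_source_delay((1.0, 0.0, 0.0), (4.0, 0.0, 0.0), 0.0)
```

Higher-order reflections repeat the mirroring recursively across multiple surfaces, which is where the computational cost of geometric methods grows and where hybrid wave-based/geometric strategies, as targeted by this project, become relevant.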
The project is well suited to a student with strong acoustics knowledge and programming skills, looking to explore acoustic rendering using 3D game engines. The project is supported by Sony Interactive Entertainment, and will include an opportunity to develop the research through a placement at their office in London.
Skills required:
Strong knowledge of acoustics and audio propagation effects. Familiarity with game development techniques using 3D game engines such as Unity or Unreal.
Sound Interactions in the Metaverse (SIM)
This project is part of the Sound Interactions in the Metaverse (SIM) PhD Training Programme - to find out more details, please visit the SIM website.
Who should apply?
Candidates must have (or expect to obtain) a minimum of a UK upper second-class honours degree (2.1) or equivalent in Computer Science, Electronic Engineering, Music Technology or a related subject. Prior research or industry experience would also be an advantage. Candidates are expected to have a strong interest and experience in sound and audio technology, but may have formal training and qualifications from disciplines not directly associated with audio engineering or metaverse technologies.
Successful applicants will be chosen based on their potential to do excellent research and contribute to our goal of developing interactive sound technologies for a future metaverse that is beneficial to society. We especially welcome applications from candidates belonging to groups that are currently under-represented in metaverse-related industries; these include (but are not limited to) women, individuals from under-represented ethnicities, members of the LGBTQ+ community, people from low-income backgrounds, and people with physical disabilities.
The successful applicant should have a strong interest in sound, music and immersive audio technology, and good programming skills. This project is highly multidisciplinary in nature and we welcome applicants from a broad range of core research backgrounds and interests, extending from audio signal processing and machine learning to user experience design, human-computer interaction and relevant creative practice.
How to apply
Please read the application guidance first so that you understand the various steps in the application process. To apply, please select the PhD in Music Technology for October 2023 entry.
On the postgraduate application form, please select ‘CDT in Sound Interactions in the Metaverse’ as your source of funding. You do not need to provide a research proposal; simply enter the name of the project you wish to apply for.
We expect strong competition and encourage those interested to submit applications as early as possible. The deadline for applications is 12:00 noon (GMT) on Monday 20th March 2023.