
  The neuroaesthetics of immersive sounds - neurochemical profiling and neurophysiological characterization


   School of Stage and Screen

This project is no longer listed on FindAPhD.com and may not be available.

Supervisors: Dr Emma Margetson, Prof Saak Victor Ovsepian, Prof Stephen Kennedy
No more applications being accepted
Competition Funded PhD Project (Students Worldwide)

About the Project

This interdisciplinary PhD scholarship (Vice Chancellor Scholarship) aims to elucidate neurochemical dynamics and mechanisms underlying the aesthetic emotions elicited by exposure to different immersive sound scenarios. 

There is a substantial body of research into the neural and biochemical mechanisms underlying perceptual and cognitive musical processes, which drive the activity of higher limbic and cortical networks into various emotional states and generate aesthetic feelings. However, significant unexplored terrain remains regarding the specific characteristics of music directly linked to aesthetic experiences. Likewise, little is known about how the context, environment or source of the music modulates these experiences.

We are looking for a PhD candidate with cross-disciplinary knowledge and experience in neuroscience and spatial sound (musical) experiences to undertake a research project into the neural mechanisms responsible for processing immersive soundscapes, improving our understanding of the mechanisms and dynamics by which aesthetic encounters are processed in the brain.

The results of our experiments should yield important insights into the neurobiology and mechanisms of aesthetic experiences, providing more quantitative means for improving self-connection, creativity, prosocial behaviours and cognitive flexibility, and ultimately supporting better mental health.

Working with a supervisory team spanning the SOUND/IMAGE Research Centre and the Centre for Organized and Functional Molecules (FES), the candidate will draw on the required skills in spatiality and creativity, molecular and cellular neurobiology with biomarker analysis, and neuroimaging for this interdisciplinary research project.

The focus of this project will feed into and complement ongoing research activities in the SOUND/IMAGE Research Centre, building upon the AHRC infrastructure investment in SHIFT (Shared Hub for Immersive Future Technologies) and adding a new strand to the sharing of spatial and immersive experiences. The facilities and equipment used as part of this research project may include:

  • Spatial Audio Studio and VR Lab (a controlled listening environment supporting technologies such as Higher Order Ambisonics, Dolby Atmos, Auro-3D and DTS:X) – a 32.4 system of Genelec 8331 loudspeakers
  • Digital Immersive Theatre – offering 360-degree visuals and sound
  • Loudspeaker Orchestra (range of Genelec loudspeakers which can be set up in different spatial formats in a required setting)
  • IKO Loudspeaker – a single loudspeaker comprising twenty drivers for projecting Higher Order Ambisonic sound spaces
  • State-of-the-art analytical laboratories and research facilities for quantitative biochemistry, molecular biology and sensing

Funding Notes

Bursary available (subject to satisfactory performance):
Year 1: £18,622 (FT) or pro-rata (PT)
Year 2: In line with UKRI rate
Year 3: In line with UKRI rate

In addition, the successful candidate will receive a contribution to tuition fees equivalent to the university’s Home rate, currently £4,712 (FT) or pro-rata (PT), for the duration of their scholarship. International applicants will need to pay the remainder of the tuition fee for the duration of their scholarship.
This fee is subject to an annual increase.
