
Neuronal Basis of Cross-modal Sensory Integration for Navigation

School of Biological & Chemical Sciences

This project is no longer listed and may not be available.

Supervisor: Dr Guifen Chen
Funded PhD Project (European/UK Students Only). No more applications are being accepted.

About the Project

A PhD position is available to start in September 2020 at Queen Mary University of London (QMUL) in Dr Guifen Chen’s group, to work on the project “Neuronal Basis of Cross-modal Sensory Integration for Navigation”.
Research environment
The School of Biological and Chemical Sciences at Queen Mary is one of the UK’s elite research centres, according to the 2014 Research Excellence Framework (REF). We offer a multi-disciplinary research environment and have approximately 160 PhD students working on projects in the biological, chemical and psychological sciences. Our students have access to a variety of research facilities supported by experienced staff, as well as a range of student support services.

Further details about Dr Guifen Chen’s group are available on her staff profile.
Training and development
Our PhD students become part of Queen Mary’s Doctoral College which provides training and development opportunities, advice on funding, and financial support for research. Our students also have access to a Researcher Development Programme designed to help recognise and develop key skills and attributes needed to effectively manage research, and to prepare and plan for the next stages of their career.
Project details
The ability to navigate is an essential skill required by both humans and animals. Neural activity in the hippocampus and an adjacent cortical area, the entorhinal cortex, has been shown to correlate with the physical location and orientation of an organism. However, it is not clear how an internal unified representation of space, which supports navigation, is generated by integrating sensory inputs across modalities. The overall aim of this project is to decipher the neural mechanisms of cross-modal navigation in the hippocampal-entorhinal network.

Several types of spatial cell have been discovered, including place cells in the hippocampus and grid cells in the medial entorhinal cortex. The activity of these cells provides an animal with an internal representation of space as it explores an environment. However, a key unsolved problem is how spatial cells integrate multiple sensory inputs (visual, auditory, olfactory, and others) to form spatial representations.

Virtual reality (VR) offers a powerful tool for investigating spatial cognition, allowing environmental manipulations that are impossible in the real world. We have recently developed a two-dimensional VR (2D VR) system for mice with mainly visual and motor/proprioceptive inputs, allowing a close approximation of spatial representation in the real world.

The goal of this project is to study the integration of sensory inputs across modalities during spatial navigation. In particular, we aim to:
1) Develop a new 2D VR system in which visual and auditory cues can be individually controlled and manipulated, and train animals to navigate in this multisensory VR.
2) Characterise the activity of spatial cells in this multisensory VR, in particular the topography of these cells with respect to their responses to sensory inputs.
3) Examine how these spatial cells interact to integrate visual and auditory inputs with motion-related cues, forming a unified spatial representation.
The primary techniques will be in vivo electrophysiological single-unit recording with tetrodes and Neuropixels probes. Theoretical models will be constructed from the experimental results and used to explain experimental outcomes and to generate predictions and hypotheses for future directions.

See more details on Dr Chen’s webpage.
Eligibility and applying
Applications are invited from outstanding candidates with a keen interest in neuroscience who hold, or expect to receive, a first or upper-second class honours degree in an area relevant to the project (for example Neuroscience, Life Sciences, Medicine, Psychology, Physics, Maths or Computer Science). A master’s degree is desirable but not essential.
Programming skills and/or a strong mathematical background are also desirable.

Applicants from outside of the UK are required to provide evidence of their English language ability. Please see our English language requirements page for details.

Informal enquiries about the project can be sent to Dr Guifen Chen ([Email Address Removed]). Formal applications must be submitted through our online form by the stated deadline.

The School of Biological and Chemical Sciences is committed to promoting diversity in science; we have been awarded an Athena Swan Silver Award. We positively welcome applications from underrepresented groups.

Funding Notes

This studentship is open to UK/EU applicants and is funded by QMUL. It will cover tuition fees, and provide an annual tax-free maintenance allowance for 3 years at the Research Council rate (£17,285 in 2020/21).


References

1. Rowland, D. C., Roudi, Y., Moser, M.-B. & Moser, E. I. Ten Years of Grid Cells. Annu Rev Neurosci 39, 1–22 (2015).
2. Chen, G., Lu, Y., King, J. A., Cacucci, F. & Burgess, N. Differential influences of environment and self-motion on place and grid cell firing. Nat Commun 10, 630 (2019).
3. Chen, G., King, J. A., Lu, Y., Cacucci, F. & Burgess, N. Spatial cell firing during virtual navigation of open arenas by head-restrained mice. Elife 7, e34789 (2018).
4. Aronov, D., Nevers, R. & Tank, D. W. Mapping of a non-spatial dimension by the hippocampal–entorhinal circuit. Nature 543, 719–722 (2017).
5. Doeller, C. F., Barry, C. & Burgess, N. Evidence for grid cells in a human memory network. Nature 463, 657–661 (2010).