Dr Joost Rommers (University of Aberdeen) http://www.joostrommers.com
Dr Soren K Andersen (University of Aberdeen) https://www.abdn.ac.uk/psychology/people/profiles/skandersen
Human language is the most advanced communication system that evolution has produced. It unfolds on a rapid time scale (listeners process around 200 syllables per minute), necessitating research methods with high temporal resolution. Previous work has successfully applied one such method, eye-tracking, to the study of language using the ‘visual world’ paradigm, wherein listeners hear spoken words or sentences while viewing a visual display of objects (e.g., Tanenhaus et al., 1995). This paradigm has revealed that listeners rapidly relate language to the surrounding world: within a few hundred milliseconds of hearing a spoken word, the eyes fixate on a corresponding object or on related objects.
However, speakers and listeners can process objects even before fixating on them, because attention can also be allocated covertly, without moving the eyes. The neural mechanisms underlying such language-mediated shifts of attention remain unknown, and uncovering them should provide deeper insight into how language influences attention. For example, does language enhance attention to relevant objects in a display, or does it suppress attention to irrelevant objects?
Fortunately, measures of electrical brain activity (EEG) provide direct indices of covert attention allocation that distinguish between enhancement and suppression. These include time-domain measures based on event-related potential components as well as spectro-temporal measures based on steady-state visual evoked potentials (SSVEPs) elicited by stimuli flickering at different frequencies (e.g., Andersen & Müller, 2010). Recent work has shown that these techniques can be exploited even in speaking participants, despite the added noise from speech muscle activity (Rommers, Meyer, & Praamstra, 2017).
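To illustrate the frequency-tagging logic behind SSVEP measures, the sketch below simulates EEG containing responses to two stimuli flickering at different rates and reads off the amplitude at each tagged frequency from the spectrum. This is a hypothetical toy example, not project code: the 8 Hz and 12 Hz flicker rates, the sampling rate, and the simulated amplitudes are all assumptions, and the project itself would use Matlab/R rather than Python.

```python
import numpy as np

# Hypothetical parameters: two stimuli tagged at 8 Hz and 12 Hz,
# 4 s of simulated EEG sampled at 500 Hz.
fs = 500
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(0)

# Simulated signal: the "attended" 8 Hz stimulus evokes a larger
# response than the "unattended" 12 Hz stimulus, plus noise.
eeg = (2.0 * np.sin(2 * np.pi * 8 * t)
       + 0.5 * np.sin(2 * np.pi * 12 * t)
       + rng.normal(0, 1, t.size))

# Amplitude spectrum via FFT; the epoch length is an integer number
# of flicker cycles, so each tag frequency falls on an exact bin.
amp = np.abs(np.fft.rfft(eeg)) * 2 / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

# Amplitude at each tagged frequency (nearest-bin lookup).
amp_8 = amp[np.argmin(np.abs(freqs - 8))]
amp_12 = amp[np.argmin(np.abs(freqs - 12))]
print(amp_8, amp_12)  # larger SSVEP amplitude at the attended frequency
```

In practice, comparing such tag-frequency amplitudes across attention conditions is what allows enhancement of relevant objects to be distinguished from suppression of irrelevant ones.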
The present project will focus on the listener side. Specifically, the PhD candidate will use advanced temporal and spectral analyses of electrophysiological signals to investigate how the brain relates ongoing language input to the surrounding world. This project will involve developing a high level of expertise in digital signal processing and programming (Matlab, R), both skills highly useful in industry and academia. The project would be suitable for candidates with a background in psychology, neuroscience, or engineering who have interests in language, attention and electrophysiology.
Application Procedure: http://www.eastscotbiodtp.ac.uk/how-apply-0
Please send your completed EASTBIO application form, along with academic transcripts and CV, to Alison McLeod at [email protected]. Two references should be provided by the deadline using the EASTBIO reference form. Please advise your referees to return the reference form to [email protected].
Andersen, S. K., & Müller, M. M. (2010). Behavioral performance follows the time course of neural facilitation and suppression during cued shifts of feature-selective attention. Proceedings of the National Academy of Sciences, 107(31), 13878–13882.
Rommers, J., Meyer, A. S., & Praamstra, P. (2017). Lateralized electrical brain activity reveals covert attention allocation during speaking. Neuropsychologia, 95, 101–110.
Tanenhaus, M. K., Spivey-Knowlton, M. J., Eberhard, K. M., & Sedivy, J. C. (1995). Integration of visual and linguistic information in spoken language comprehension. Science, 268(5217), 1632–1634.