About the Project
Gaze-control technology has proved invaluable in assisting people with motor disabilities arising from conditions such as Motor Neuron Disease (MND) or Cerebral Palsy (CP), at times dramatically improving their quality of life by extending their ability to communicate independently. Although there is an extensive literature on gaze control focused on performance (e.g. gaze-typing speed), it generally ignores the goal-directed movements that underlie this performance. A wealth of physiological data (from eye movement, pupil and head movement tracking) could be recorded during day-to-day use of gaze-control technology and used to infer cognitive function. Understanding how these data relate to cognitive function would improve our ability to detect change in neurodegenerative conditions, whether deterioration or improvement due to the use of gaze-control technology [1, 2].
The aims of the project will be to:
(i) link behaviour recorded while using gaze-control technology to performance in standardised eye movement tasks measuring motor control and cognitive function (e.g. memory, visual attention and executive control). We will do so by extracting low-level metrics (e.g. eye movement latency) and higher-level metrics (e.g. coefficient K, an index of focal versus ambient viewing behaviour) developed in investigations of scene exploration [e.g. 3].
(ii) use a bottom-up approach based on machine learning to detect eye movement features that are predictive of changes in performance in standardised measures of cognitive and motor control. This approach will be developed in collaboration with Professors Huiyu Zhou and Yu-Dong Zhang from the School of Informatics, University of Leicester.
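As an illustration of the higher-level metrics mentioned in aim (i), coefficient K contrasts the duration of each fixation with the amplitude of the saccade that follows it, both z-scored over the recording; positive values indicate focal viewing and negative values ambient viewing [3]. A minimal sketch (function and variable names are illustrative, and real pipelines would typically average K within moving time windows):

```python
import statistics

def coefficient_k_series(fix_durations_ms, next_sacc_amps_deg):
    """Per-fixation coefficient K, simplified after Krejtz et al. [3].

    K_i = z(duration of fixation i) - z(amplitude of the saccade after it).
    K_i > 0 marks a focal moment (long fixation, short saccade);
    K_i < 0 marks an ambient moment (short fixation, long saccade).
    """
    # z-score parameters computed over the whole recording
    mu_d = statistics.mean(fix_durations_ms)
    sd_d = statistics.stdev(fix_durations_ms)
    mu_a = statistics.mean(next_sacc_amps_deg)
    sd_a = statistics.stdev(next_sacc_amps_deg)
    return [
        (d - mu_d) / sd_d - (a - mu_a) / sd_a
        for d, a in zip(fix_durations_ms, next_sacc_amps_deg)
    ]
```

For example, a 400 ms fixation followed by a 1 degree saccade yields a positive K (focal), while a 100 ms fixation followed by a 10 degree saccade yields a negative K (ambient).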
Specifically, there is a wealth of gaze data (eye and head movements) that could be used to detect changes in motor control indicative of clinically significant outcomes related to loss of mobility or cognition, which is especially relevant in rapidly evolving conditions such as MND. Prior studies of oculomotor function in neurodegenerative disease [4] have understandably used a small number of trials to avoid testing fatigue, whereas recording everyday use of a gaze-controlled interface would generate several times this amount of data in a single day, greatly increasing the ability to detect change reliably.
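The gain from larger trial counts can be made concrete with a small simulation: assuming a hypothetical 10 ms shift in mean saccade latency (all parameter values illustrative, Gaussian latencies assumed), a handful of trials rarely reaches significance, while a day's worth of gaze-control use almost always does. A sketch:

```python
import random
import statistics

def detect_shift(n_trials, shift_ms=10.0, sd_ms=40.0, n_sims=200, seed=1):
    """Fraction of simulated before/after session pairs in which a
    shift_ms increase in mean saccade latency is flagged by a one-sided
    two-sample z-test at alpha = .05.  Illustrative only: latencies are
    drawn as Gaussians with a fixed standard deviation."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        baseline = [rng.gauss(200.0, sd_ms) for _ in range(n_trials)]
        followup = [rng.gauss(200.0 + shift_ms, sd_ms) for _ in range(n_trials)]
        # standard error of the difference between the two session means
        se = (statistics.variance(baseline) / n_trials
              + statistics.variance(followup) / n_trials) ** 0.5
        z = (statistics.mean(followup) - statistics.mean(baseline)) / se
        hits += z > 1.645  # one-sided critical value at alpha = .05
    return hits / n_sims
```

Under these assumed parameters, `detect_shift(20)` flags the change in only a minority of simulated sessions, whereas `detect_shift(2000)` flags it almost always.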
In this project, we will adopt a multidisciplinary approach, combining gaze-control technology, optometry, experimental psychology, artificial intelligence, and augmentative and alternative communication (AAC) technology. The student will work with a multidisciplinary team, meshing the latest AAC and machine learning technology to deliver clinically relevant insights, assist users with interface use, and inform the design of communication interfaces that are resilient to disease progression and to variability between individuals and between conditions.
UK/EU applicants only.
Applicants are required to hold, or expect to obtain, a UK Bachelor's degree at 2:1 or better in a relevant subject.
The University of Leicester English language requirements apply where applicable: https://le.ac.uk/study/research-degrees/entry-reqs/eng-lang-reqs/ielts-65
How to apply:
To apply for the PhD please refer to the guidelines and use the application link at https://le.ac.uk/study/research-degrees/funded-opportunities/bbsrc-mibtp
Please also submit your MIBTP notification form at https://warwick.ac.uk/fac/cross_fac/mibtp/pgstudy/phd_opportunities/application/
Project / Funding Enquiries: [Email Address Removed]
Application enquiries to [Email Address Removed]
1. Souto D, Marsh O, Paterson KB. Use-dependent plasticity in assistive interfaces: gaze-typing improves inhibitory control. Poster presented at the European Conference on Eye Movements, Alicante, Spain, 2019.
2. Nelles G, Pscherer A, De Greiff A, Forsting M, Gerhard H, Esser J, Diener HC. Eye-movement training-induced plasticity in patients with post-stroke hemianopia. Journal of Neurology 256: 726–733, 2009.
3. Krejtz K, Duchowski A, Krejtz I, Szarkowska A, Kopacz A. Discerning ambient/focal attention with coefficient K. ACM Transactions on Applied Perception (TAP) 13: 11, 2016.
4. Anderson TJ, MacAskill MR. Eye movements in patients with neurodegenerative disorders. Nature Reviews Neurology 9: 74–85, 2013.