
Estimating driver state from facial behaviour and other signals

This project is no longer listed in the FindAPhD database and may not be available.

  • Supervisor: Prof T Cootes
  • Full or part time
  • Application Deadline: No more applications being accepted
  • Funded PhD Project (European/UK Students Only)

Project Description

Automated driving refers to the capability of a vehicle to handle a set of driving tasks normally performed by a human. This set ranges from individual manoeuvres, such as emergency braking or obstacle avoidance in safety and driver assistance applications, to handling all driving tasks that a human driver can manage in fully automated driving. Increasing levels of automation require the automated driving system to be aware of the cognitive and physiological state of the driver, for instance when control must be handed over to the driver, or when the driver wishes to take over control from the system. Further situations include medical emergencies, during which a driver may become incapacitated and unable to control the vehicle, requiring immediate action by the system.
In this PhD project we propose to build a driver state estimation method capable of inferring cognitive and physical driver states from driving behaviour as well as from the driver’s facial appearance and cardiac, electro-dermal, respiratory and electro-muscular activity.
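To give a flavour of the kind of signal processing involved, the sketch below extracts one simple feature (mean heart rate) from a cardiac signal. It assumes, purely for illustration, that the raw ECG has already been reduced to R-peak timestamps; the function name and the example values are hypothetical, not part of the project specification.

```python
# Illustrative feature extraction from a cardiac signal. We assume
# (hypothetically) that R-peak detection has already been run on the
# driver's ECG, yielding peak timestamps in seconds.

def heart_rate_bpm(r_peak_times):
    """Mean heart rate in beats per minute from R-peak timestamps."""
    # Inter-beat intervals between consecutive R peaks.
    intervals = [b - a for a, b in zip(r_peak_times, r_peak_times[1:])]
    mean_ibi = sum(intervals) / len(intervals)
    return 60.0 / mean_ibi

# A perfectly regular 0.8 s inter-beat interval corresponds to 75 bpm.
peaks = [0.0, 0.8, 1.6, 2.4, 3.2]
rate = heart_rate_bpm(peaks)
```

In practice such scalar features (heart rate, skin-conductance level, respiration rate, facial action intensities) would be computed over sliding windows and stacked into a feature vector for the state estimator.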
The challenges of this PhD project will relate to extracting and combining the information from the sensory data provided by the vehicle using methods from Computer Vision and Signal Processing (face tracking, deep neural networks) and Bayesian Machine Learning (probabilistic inference).
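The probabilistic-inference side can be sketched as follows: a minimal Bayesian fusion of per-modality evidence over a small set of discrete driver states, assuming (purely for illustration) conditional independence between modalities. The state names, priors and likelihood values are hypothetical placeholders, not anything specified by the project.

```python
# A minimal sketch of Bayesian fusion of multimodal evidence over
# discrete driver states. All numbers below are illustrative only.

STATES = ["vigilant", "stressed", "incapacitated"]

def fuse(prior, likelihoods):
    """Combine a prior over states with per-modality likelihoods via
    Bayes' rule, assuming modalities are conditionally independent."""
    posterior = dict(prior)
    for lik in likelihoods:
        for s in STATES:
            posterior[s] *= lik[s]
    total = sum(posterior.values())
    return {s: p / total for s, p in posterior.items()}

# Hypothetical evidence: the face tracker and the cardiac features both
# weakly favour "stressed", overcoming a prior that favours "vigilant".
prior = {"vigilant": 0.80, "stressed": 0.15, "incapacitated": 0.05}
face_lik = {"vigilant": 0.2, "stressed": 0.7, "incapacitated": 0.1}
cardiac_lik = {"vigilant": 0.3, "stressed": 0.6, "incapacitated": 0.1}
posterior = fuse(prior, [face_lik, cardiac_lik])
```

A full solution would of course replace the hand-set likelihoods with learned models (e.g. deep networks over facial and physiological features) and add temporal dynamics, but the fusion step follows the same pattern.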
The first stage of the project should focus on the development of a model capable of predicting driver states such as “vigilant”, “excited”, “stressed” or “incapacitated” before moving on to more subtle emotional states.
A significant amount of existing driving simulator data will be available from the outset of the project and there is the potential for testing the developed methods during further driving simulator sessions.

Funding Notes

Studentship funded by Toyota Motor Europe will cover fees and stipend for 3.5 years. Expected start date September 2018.

Closing date for applications 24 May 2018.

Applicants are expected to hold, or be about to obtain, a minimum upper second class undergraduate degree (or equivalent) in a numerate discipline (such as mathematics, physics, engineering or computer science). A Master's degree in a relevant subject and/or experience in computer vision and programming is desirable.

Please select PhD Biomedical Imaging Sciences on the online application form.




FindAPhD. Copyright 2005-2018
All rights reserved.
