
Explainable Deep Learning for Situational Understanding Problems

This project is no longer listed in the FindAPhD database and may not be available.

  • Supervisors: Prof AD Preece, Dr Y Hicks
  • Full or part time
  • Application Deadline: No more applications being accepted
  • Funded PhD Project (Students Worldwide)

Project Description

Background
Huge advances have been made in machine learning in recent years, driven by breakthroughs in deep neural networks, an approach known as Deep Learning (DL).
However, a key problem with DL approaches is that they are generally seen as being "black boxes": while they may work well in particular applications, it is usually unclear how they work, leading to challenges in improving their performance when they fail, and issues of user trust.
There is consequently great interest in techniques that improve the interpretability of DL approaches, allowing systems to generate explanations of how they reached a decision.
To be useful, such explanations need to be generated in human-understandable terms, for example, identifying image features that were significant in a classification decision, or providing a brief textual description.
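For concreteness, the sketch below illustrates one widely used, model-agnostic way to produce such an explanation: occlusion sensitivity, which highlights the image regions a classifier relied on by measuring how much its confidence drops when each region is masked. The `classify` function is a hypothetical placeholder for any trained image model; this is a generic technique, not the project's specific method.

```python
# Occlusion-sensitivity sketch: slide a grey patch over the image and
# record how much the classifier's confidence in the target class drops.
# Regions whose occlusion hurts confidence most were most significant.
import numpy as np

def classify(image: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in: returns class probabilities for one image."""
    raise NotImplementedError  # plug in a real trained DL model here

def occlusion_map(image, target_class, patch=8, stride=4):
    h, w = image.shape[:2]
    base = classify(image)[target_class]  # unoccluded confidence
    heat = np.zeros(((h - patch) // stride + 1,
                     (w - patch) // stride + 1))
    for i, y in enumerate(range(0, h - patch + 1, stride)):
        for j, x in enumerate(range(0, w - patch + 1, stride)):
            occluded = image.copy()
            occluded[y:y + patch, x:x + patch] = image.mean()  # grey patch
            # large confidence drop = this region mattered to the decision
            heat[i, j] = base - classify(occluded)[target_class]
    return heat  # overlay on the image to show influential features
```

The resulting heat map can be shown to a human decision maker as a direct, visual answer to "which parts of the image drove this classification?".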
Project aims and methods
The goal of this PhD is to make progress in this challenging area of DL, with a particular focus on situational understanding problems where the DL system is intended to assist a human decision maker in domains such as emergency response, security and policing.
Situational understanding requires three key machine learning capabilities, all of which need to be explainable:
• learning of temporal relationships, including predictions of likely future states (i.e., given the current situation, what is likely to happen next);
• learning at multiple hierarchical scales, from detection of low-level objects to identification of high-level relationships;
• learning from the fusion of multiple data streams of different modalities (e.g., imagery, text, GPS).
As an example, consider the problem of managing a large-scale event in a city centre, where streams of CCTV imagery, social media text, and real-time location data may be used to predict potential overcrowding and consequential disruption.
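The sketch below shows, in PyTorch, one plausible shape for this fusion-and-prediction pattern: per-modality features (CCTV frame features, social-media text embeddings, location counts) are projected into a shared space, concatenated per time step, and fed to a recurrent model that forecasts a future crowding level. All dimensions, names, and the architecture itself are illustrative assumptions, not the project's actual design.

```python
# Illustrative multimodal temporal fusion model (assumed architecture).
import torch
import torch.nn as nn

class FusionPredictor(nn.Module):
    def __init__(self, img_dim=512, txt_dim=300, loc_dim=16, hidden=128):
        super().__init__()
        # per-modality projections into a shared 64-d space
        self.img_proj = nn.Linear(img_dim, 64)
        self.txt_proj = nn.Linear(txt_dim, 64)
        self.loc_proj = nn.Linear(loc_dim, 64)
        # temporal model over the fused sequence (3 x 64 = 192 inputs)
        self.gru = nn.GRU(input_size=192, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # predicted crowding score

    def forward(self, img_seq, txt_seq, loc_seq):
        # each input: (batch, time, modality_dim), pre-extracted features
        fused = torch.cat([self.img_proj(img_seq),
                           self.txt_proj(txt_seq),
                           self.loc_proj(loc_seq)], dim=-1)
        out, _ = self.gru(fused)
        return self.head(out[:, -1])  # forecast from the last time step

# usage with random stand-in features for a 10-step observation window
model = FusionPredictor()
score = model(torch.randn(2, 10, 512),
              torch.randn(2, 10, 300),
              torch.randn(2, 10, 16))
```

Making each stage of such a pipeline explainable (which modality, which time step, which objects drove the forecast) is exactly the open challenge this project targets.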
This PhD will be carried out within the Distributed Analytics and Information Sciences International Technology Alliance (DAIS ITA), a collaboration between Cardiff University, IBM, Airbus, BAE Systems, University College London, University of California Los Angeles, and other UK and US partners.
At Cardiff University the PhD will be supervised by members of the Crime and Security Research Institute, the School of Computer Science and Informatics, and the School of Engineering’s Sensors, Signals and Imaging Group.

Funding Notes

Funding is available only for UK/EU students. Stipend is payable at the RCUK rate (currently £14,553).

How good is research at Cardiff University in Computer Science and Informatics?

FTE Category A staff submitted: 13.73

Research output data provided by the Research Excellence Framework (REF)

