About the Project
Project Reference Number: SCEBE-20-025-SNSF
Background
There are many imaging modalities, such as conventional imaging, Ultrasound (US), Magnetic Resonance Imaging (MRI), and Computed Tomography (CT), that are routinely used by clinicians to determine the presence and nature of a disease. Artificial Intelligence (AI) applications in medical imaging are based on complex models that provide state-of-the-art results in image classification, helping to positively identify the onset of a disease. Recent AI techniques have transformed the classification process and can provide results comparable to, or better than, those of a human expert. However, unlike earlier machine learning models that offered a degree of explainability, it is no longer possible to observe and trace the steps followed by such a model.
The complexity of these models and their ‘black box’ nature make it difficult to understand why a model made a particular decision, which in turn affects the trust placed in it. This understanding is especially important given the criticality of correct diagnosis, where an incorrect diagnosis can be fatal or can result in unnecessary or inappropriate treatment that affects patient outcomes. Explainable AI aims to make it easier for the human in the loop to understand not only ‘what’ decision was made but also ‘why’ it was made. This not only makes the process understandable but also increases the degree of trust and confidence in the output of an AI-based system.
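By way of illustration only (this example is not taken from the project description), the short Python sketch below applies one widely used post-hoc explanation technique, Grad-CAM, to an off-the-shelf ResNet-18 classifier and produces a heatmap of the image regions that contributed most to the predicted class. The choice of model, the hooked layer (layer4), and the input file name scan.png are placeholder assumptions; a real medical imaging model would require domain-specific training and clinical validation.

```python
# Minimal Grad-CAM sketch (illustrative only; model, layer and file are assumptions).
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

# Off-the-shelf classifier used as a stand-in for a medical imaging model.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()

# Capture the feature maps of the last convolutional block via a forward hook.
features = {}
def save_features(module, inputs, output):
    features["maps"] = output          # keep the graph so gradients can be taken
model.layer4.register_forward_hook(save_features)

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
image = preprocess(Image.open("scan.png").convert("RGB")).unsqueeze(0)

scores = model(image)                              # 'what' was decided
predicted_class = scores.argmax(dim=1).item()

# Gradient of the predicted-class score w.r.t. the feature maps ('why').
grads = torch.autograd.grad(scores[0, predicted_class], features["maps"])[0]

# Weight each feature map by its average gradient and combine into a heatmap.
weights = grads.mean(dim=(2, 3), keepdim=True)
cam = F.relu((weights * features["maps"].detach()).sum(dim=1, keepdim=True))
cam = F.interpolate(cam, size=image.shape[2:], mode="bilinear", align_corners=False)
cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)   # normalise to [0, 1]

print("Predicted class index:", predicted_class)
print("Heatmap shape:", tuple(cam.shape))
```

The resulting heatmap can be overlaid on the original image so that a clinician can inspect ‘why’ the classifier favoured a particular class, complementing the ‘what’ of the predicted label.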
Aims
This project will develop algorithms and models for making the medical image classification process understandable to medical professionals. The major objectives are as follows:
(i) Critically analysing the state of the art and the gaps in knowledge in medical image classification and explainable AI in medical applications;
(ii) Designing a medical image analysis framework with justifiable explainability levels for medical experts;
(iii) Experimentally evaluating the effectiveness and performance of an explainable medical image classification system.
There is also a possibility of working with National Health Service (NHS) Scotland for this project.
How to Apply
This project is available as a three-year full-time or six-year part-time PhD study programme.
Candidates are encouraged to contact the research supervisors for the project before applying.
Please note that emails to the supervisory team or enquiries submitted via this project advert do not constitute formal applications; applicants should apply using our Application Process page, choosing Computing and their preferred intake date.
Please send any other enquiries regarding your application to: researchapplications@gcu.ac.uk
Supervisory Team
Director of Studies: Dr Sajid Nazir - sajid.nazir@gcu.ac.uk
GCU Research Online URL: https://researchonline.gcu.ac.uk/en/persons/sajid-nazir
2nd Supervisor: Dr Diane Dickson - Diane.Dickson@gcu.ac.uk
GCU staff profile URL: https://www.gcu.ac.uk/hls/staff/dianedickson/
3rd Supervisor: Professor Mike Mannion - M.A.G.Mannion@gcu.ac.uk
GCU staff profile URL: https://www.gcu.ac.uk/cebe/staff/prof%20mike%20mannion/