The burden of multiple long-term conditions and an ageing population is placing ever-growing pressure on health care systems globally. Health care systems struggle to coordinate care, to identify those most in need, and to diagnose early enough in the disease pathway to prevent illness or optimise treatment. Health care is rapidly digitalising through the use of electronic health records, digital diagnostics and, increasingly, mobile and sensor-based technologies, creating new opportunities for early diagnosis and person-centred care.
Artificial Intelligence (AI) is set to transform health care. Innovation projects are underway in Aberdeen to develop computer-assisted technology that detects abnormalities in X-ray and CT radiology images, assisting in the early diagnosis of breast cancer (iCAIRD North - Industrial Centre for Artificial Intelligence Research in Digital Diagnostics). We are also developing clinical tools to better predict the risk of readmission to hospital, using AI to draw together information from electronic health records and laboratory test results (Aberdeen Centre for Health Data Science).
However, there is increasing recognition that AI systems should not be treated as a “black box”. For successful translation into clinical practice, trust, transparency and accountability of AI systems are emerging as key components [1]. Clear definition of an AI system’s strengths and weaknesses, accuracy, bias and reliability is important. The optimal methods for explaining AI systems in health care remain uncertain, but there is growing evidence from other domains, including the automobile industry, and Aberdeen has a programme of work developing and evaluating methods for AI accountability in driverless-car technology.
This PhD offers a mixed methods approach, bringing together a multidisciplinary supervisory team with expertise in computing science, qualitative methodology and health informatics within the Centre for Health Data Science. It comprises three phases:
1. Evidence review: using systematic literature review methodology to synthesise the growing published discussion about trust in AI for health care and to identify the key emerging themes.
2. Explanation and simulation: building on the computing science methodologies of argumentation [2], dialogue [3] and natural language generation [4], the PhD will apply and advance techniques for explaining why diagnostic actions are, or are not, taken in an AI-driven diagnostic care pathway (workflow). The PhD will use ‘desktop’ simulations, both theoretical and through a simple diagnostic AI system ‘model’ built on synthetic health care data, allowing a care pathway to be simulated. Explanatory methods will then be applied to real-world exemplars drawn from existing AI projects within the Aberdeen Centre for Health Data Science, including, for example, iCAIRD, which is developing AI diagnostics with a focus on radiology-driven cancer diagnosis, and the Modernising Outpatients programme, which is developing AI to support patient-centred care at the interface between hospitals and general practitioners.
3. Qualitative interviews: the PhD will use qualitative interviews to understand what would support users to trust AI in clinical decision-making, and will use co-design techniques to work with clinicians, patients and healthcare planners to refine the AI system explanations for different audiences.
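As an illustration of the argumentation-based approach in phase 2, the grounded (sceptically acceptable) set of arguments in a Dung-style abstract argumentation framework [2] can be computed in a few lines of Python. This is a minimal sketch only; the argument names and the clinical scenario below are hypothetical examples, not drawn from any existing project:

```python
def grounded_extension(arguments, attacks):
    """Compute the grounded extension of an abstract argumentation
    framework (Dung, 1995): repeatedly accept every argument whose
    attackers are all counter-attacked by already-accepted arguments."""
    accepted = set()
    changed = True
    while changed:
        changed = False
        for arg in arguments - accepted:
            attackers = {a for (a, b) in attacks if b == arg}
            # An argument is defended if each of its attackers is itself
            # attacked by an accepted argument (vacuously true if unattacked).
            if all(any((d, att) in attacks for d in accepted)
                   for att in attackers):
                accepted.add(arg)
                changed = True
    return accepted

# Hypothetical diagnostic scenario:
#   recommend_biopsy  <- attacked by image_artefact (scan may be unreliable)
#   image_artefact    <- attacked by quality_check (image quality verified)
arguments = {"recommend_biopsy", "image_artefact", "quality_check"}
attacks = {("image_artefact", "recommend_biopsy"),
           ("quality_check", "image_artefact")}

print(sorted(grounded_extension(arguments, attacks)))
# -> ['quality_check', 'recommend_biopsy']
```

The accepted set shows which arguments survive all challenges; a natural language generation component [4] could then turn this structure into a textual explanation of why the diagnostic action was, or was not, taken.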
The PhD also offers the opportunity for an industrial placement with Canon Medical (Dominic Ashmole, Aberdeen Office) to understand how the methodology developed in the PhD could be introduced into state-of-the-art AI-assisted digital health technologies. The PhD will include public engagement activity as part of the Aberdeen Centre for Health Data Science engagement programme.
As the diagnosis process becomes increasingly automated, the techniques developed in this PhD will be fundamental to improving acceptance by regulatory bodies, professionals and society at large. The opportunity for impact is high, and the PhD student will be able to engage with the UK health data science landscape through our Health Data Research UK (HDR UK) partner site status.
This project is advertised in relation to the research areas of APPLIED HEALTH SCIENCE. Formal applications can be completed online: https://www.abdn.ac.uk/pgap/login.php. You should apply for the Degree of Doctor of Philosophy in Applied Health Science to ensure that your application is passed to the correct person for processing.
NOTE CLEARLY THE NAME OF THE SUPERVISOR AND EXACT PROJECT TITLE ON THE APPLICATION FORM.
Candidates should contact the lead supervisor to discuss the project in advance of submitting an application, as supervisors will be expected to provide a letter of support for suitable applicants. Candidates will be informed after the application deadline if they have been shortlisted for interview.
[1] Academy of Medical Sciences. 2018. Our data-driven future in healthcare. December 2018. https://acmedsci.ac.uk/file-download/45020825
[2] Phan Minh Dung. 1995. On the acceptability of arguments and its fundamental role in nonmonotonic reasoning, logic programming and n-person games. Artificial Intelligence 77, 2 (September 1995), 321-357.
[3] Henry Prakken, Chris Reed, and Douglas Walton. 2005. Dialogues about the burden of proof. In Proceedings of the 10th International Conference on Artificial Intelligence and Law (ICAIL '05). ACM, New York, NY, USA, 115-124.
[4] Ehud Reiter and Robert Dale. 2000. Building Natural Language Generation Systems. Cambridge University Press, New York, NY, USA.