Risk CDT - Balancing ethicality, legality, security and safety critical decisions
PLEASE APPLY ONLINE TO THE SCHOOL OF ENGINEERING, PROVIDING THE PROJECT TITLE AND THE NAME OF THE PRIMARY SUPERVISOR, AND SELECTING THE PROGRAMME CODE "EGPR" (PHD - SCHOOL OF ENGINEERING)
This is a project within the multi-disciplinary EPSRC and ESRC Centre for Doctoral Training (CDT) on Quantification and Management of Risk & Uncertainty in Complex Systems & Environments, within the Institute for Risk and Uncertainty. The studentship is granted for 4 years and includes, in the first year, a Master's in Decision Making under Risk & Uncertainty. The project includes extensive collaboration with leading industry partners to build a strong basis for employability.
Managing critical and major incidents such as natural and environmental disasters requires emergency services to make decisions that balance conflicting ethical, legal, security and safety considerations; one example is deciding whether to send emergency responders into a potentially hostile zone in order to rescue members of the public. These dynamic environments are characterised by time pressure, risk, uncertainty, variable availability and trustworthiness of information, accountability and shifting goals, all of which place key decision-makers under undue cognitive load, compromising decision-making and public safety. Our long-term aim is to improve decision-making in emergency situations by developing an “autonomous decision assistant” that will take the information, goals, and ethical requirements of such human decision-makers, and propose the courses of action it judges most appropriate at any given moment.
The PhD project provides an integral step toward this aim by capturing a subset of the key decisions made during emergency management within an intelligent software agent and modifying decisions based on considerations of ethicality, legality, security and safety. In order to assess the efficacy of the advisory and decision-making capabilities of the software, the student will develop and evaluate case studies with the support of the industrial partner, Merseyside Fire and Rescue Service, and associated emergency response agencies. Accordingly, the PhD project will address the following goals:
1. Analyse, understand, and model human decision-making from critical incident exercises, and develop a basic decision-making agent;
2. Evaluate (using both experts and researchers) the agent's decision-making in an “off-line” mode, varying the quality/veracity of information, the priority of goals, etc.;
3. Analyse, understand, and model ethical (and security) concerns from critical incident exercises, how these impact upon decisions, and extend the decision-making agent to incorporate these aspects; and
4. Conduct a full “off-line” evaluation (again, by both experts and researchers) of the agent's decision-making, varying the ethical constraints, the context, the reliability of the outcomes, etc.
Achieving these goals will require a multidisciplinary approach spanning Computer Science and Psychology in order to model human emergency management decisions within an intelligent software agent.
Applicants should have either a first-class undergraduate degree or a Distinction in a master's degree. The project would suit students with a background in Computer Science and a strong interest in human decision-making, or with a background in Psychology and a strong interest in Computer Science and Artificial Intelligence. Experience of working with emergency services is desirable but not essential.
The PhD Studentship (tuition fees + stipend of £14,296 per annum over 4 years) is available for Home/EU students. In addition, a research budget for use at the student's own discretion will be provided.