Dr Neill Campbell (Department of Computer Science, University of Bath)
Prof Alistair Forbes (National Physical Laboratory)
Organisations across all sectors are interested in employing machine learning algorithms, but can these methods be trusted? Modern deep learning is computationally powerful and scalable but is unable to offer guarantees on confidence and data efficiency. We will overcome these issues by combining deep learning with the desirable properties (quantified uncertainty and incorporation of domain knowledge) of traditional generative probabilistic models.
Traditional (generative) data analysis methods rely on validated deterministic and statistical models of the physical system under study. These confer good predictive capability and, importantly, present each prediction with a supporting quantification of the associated uncertainty.
In contrast, many recent (discriminative) black-box deep learning and AI approaches make inferences from training data with no underlying physical model of the system and no supporting uncertainty quantification. They employ extremely flexible, intrinsically empirical models that can fit any training set, but cannot place guarantees on their predictive capability.
With a sufficiently comprehensive set of training data, the trained model can be expected to approximate the true underlying physical model well, at least in the region from which the training data are drawn, giving some confidence in the inferences the algorithm makes. In practical applications, however, there is no way of knowing how well the trained response reflects real attributes of the physical system, and the uncertainties associated with its outputs are unquantifiable.
This project will investigate theoretical, algorithmic and practical issues associated with data analytics based on learning paradigms that a) make efficient use of the data and any prior model constraints and b) support uncertainty quantification. In essence, we will seek to combine the desirable properties of generative probabilistic models with the efficiency and computational power of modern deep learning.
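As a minimal illustration of the kind of quantified uncertainty a generative probabilistic model provides (a sketch for intuition only, not part of the project specification), the snippet below implements Gaussian process regression in NumPy. The posterior variance grows away from the training data, exactly the self-reported confidence that a black-box discriminative model lacks. The kernel choice and hyperparameter values here are arbitrary illustrative assumptions.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    sq_dist = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dist / lengthscale**2)

def gp_predict(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean and marginal variance of a GP regressor at x_test."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    # Cholesky factorisation for a stable solve of K^{-1} y.
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v**2, axis=0)
    return mean, var

# Toy data: a few noisy observations of a smooth function.
x_train = np.array([-2.0, -1.0, 0.0, 1.0])
y_train = np.sin(x_train)

# Predict near the data (x=0.5) and far from it (x=5.0).
mean, var = gp_predict(x_train, y_train, np.array([0.5, 5.0]))
# The predictive variance is small near the training data and large
# far from it, flagging that the far-field prediction is untrustworthy.
```

The same query to a plain neural network would return two point estimates with no indication that the second extrapolates far beyond the training region.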
This project will be supervised jointly, as a new strategic collaboration, with the National Physical Laboratory, which represents a range of stakeholders (Advanced Manufacturing, Energy & Environment, and Life Sciences & Health) with a strong demand for verifiable and trusted machine learning.
Informal enquiries should be directed to Dr Neill Campbell, [Email Address Removed]
Formal applications should be made via the University of Bath’s online application form for a PhD in Computer Science:
More information about applying for a PhD at Bath may be found here:
Anticipated start date: 1 October 2018
Note: Applications may close early if a suitable candidate is found; therefore, early application is strongly recommended.
UK and EU students applying for this project may be considered for a University Research Studentship with the National Physical Laboratory. The studentship will cover Home/EU tuition fees, a training support fee of £1,000 per annum and a tax-free maintenance allowance at the RCUK Doctoral Stipend rate (£14,777 in 2018-19) for a period of 3.5 years.
Note: ONLY UK and EU applicants are eligible for this studentship; unfortunately, applicants who are classed as Overseas for fee paying purposes are NOT eligible for funding.
Lawrence, A., Ek, C. H., Campbell, N.D.F., “Latent Structure Learning using Gaussian and Dirichlet Processes”, NIPS 2017 Workshop on Advances in Modelling and Learning Interactions from Complex Data
Bodin, E., Malik, I., Ek, C. H., Campbell, N.D.F., “Nonparametric Inference for Auto-Encoding Variational Bayes”, NIPS 2017 Workshop on Advances in Approximate Bayesian Inference