
Mathematical methods for differential privacy in clinical research

Department of Mathematical Sciences

This project is no longer listed and may not be available.

Supervisors: Dr Matthew Nunes, Dr Sandipan Roy
No more applications being accepted. Funded PhD Project (European/UK Students Only).
Location: Bath, United Kingdom
Subjects: Applied Statistics, Data Analysis, Data Science, Machine Learning, Medical Statistics, Probability, Statistics

About the Project

The University of Bath is inviting applications for the following PhD project commencing in October 2021.

Funding is available to candidates who qualify for ‘Home’ fee status. Following the UK’s departure from the European Union, the rules governing fee status have changed and, therefore, candidates from the EU/EEA are advised to check their eligibility before applying. Please see the Funding Eligibility section below for more information.

When working with patient data in a healthcare setting, respecting patient consent and privacy is imperative and subject to strong legal and regulatory constraints. Moreover, not all data modalities collected in modern clinical studies are uniformly suited to anonymization/de-identification: for example, only a small number of genetic markers, or peripheral information present in medical images such as MRIs, may be sufficient to uniquely re-identify individuals contributing information to a dataset. Patient privacy requirements may therefore severely constrain our ability to link data across studies and to build complex data assets that continuously improve our understanding of disease development and of how patients respond to treatments.

Differential privacy provides a framework for protecting patient privacy when computing a specific set of data summaries over datasets containing sensitive personal information, by means of a randomized algorithm. In this setting, the level of privacy guaranteed can be characterized by the level of randomization applied. This project focuses on applying differential privacy to interrogate large pooled, multi-modal patient-level datasets to compute, for example, clinical endpoints or treatment outcomes. Pools of clinical studies present both unique challenges and unique opportunities in this setting: on one hand, knowledge about randomization through study design may allow us to suppress columns with little impact on the summary to be calculated. On the other hand, patient datasets, with only hundreds or thousands of participants, are much smaller and more expensive to generate than other types of data commonly considered in the differential privacy literature.
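The canonical randomized algorithm of this kind is the Laplace mechanism: clip each value to a known range, compute the summary, and add Laplace noise scaled to the summary's sensitivity divided by the privacy budget ε. The sketch below is purely illustrative (the function name, clipping bounds, and ε value are our own choices, not part of the project):

```python
import numpy as np

rng = np.random.default_rng(0)

def laplace_mean(values, lower, upper, epsilon):
    """epsilon-differentially private mean via the Laplace mechanism.

    Clipping every value to [lower, upper] bounds the effect of any one
    patient on the mean by (upper - lower) / n, which is the sensitivity
    used to scale the noise.
    """
    values = np.clip(np.asarray(values, dtype=float), lower, upper)
    n = len(values)
    sensitivity = (upper - lower) / n
    # Smaller epsilon => more noise => stronger privacy guarantee.
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return values.mean() + noise
```

With small cohorts (hundreds of participants) the noise scale grows quickly as ε shrinks, which is exactly the tension between utility and privacy budget that this project addresses.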

The aim is to develop methods which can adapt their level of randomization, i.e. their privacy budget, to settings occurring in clinical trial and life-science scenarios. We will consider the following approaches: (i) methods based on random projections to represent information in a way that makes individuals harder to identify, (ii) machine learning algorithms, and (iii) other probabilistic/statistical approaches. The key idea will be to explore mechanisms through which we can improve the analytic utility of our datasets under a limited privacy budget.
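Approach (i) can be illustrated with a generic Johnson–Lindenstrauss-style Gaussian random projection, which compresses each patient record into a lower-dimensional sketch while approximately preserving pairwise distances. This is a textbook sketch under our own assumptions, not the project's actual method:

```python
import numpy as np

rng = np.random.default_rng(1)

def random_projection(X, k):
    """Project the d-dimensional rows of X into k dimensions.

    Uses a Gaussian random matrix scaled by 1/sqrt(k), so that squared
    distances between rows are preserved in expectation while no single
    original column is directly exposed.
    """
    d = X.shape[1]
    R = rng.normal(size=(d, k)) / np.sqrt(k)
    return X @ R
```

On its own a random projection is not differentially private, but it reduces the dimensionality (and hence the sensitivity) of the data before a noise-adding mechanism is applied, which is one way such representations could stretch a limited privacy budget.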

Candidate Requirements:

Applicants should hold, or expect to receive, a First Class or good Upper Second Class Honours degree (or the equivalent) in Mathematics, Statistics or another relevant discipline. A master’s level qualification in Mathematics or Statistics would be beneficial. Some experience with machine learning is desirable.

Enquiries and Applications:

Informal enquiries are welcomed and should be directed to Dr Matthew Nunes, [Email Address Removed]. 

Formal applications should be made via the University of Bath’s online application form for a PhD in Statistics (full-time).

More information about applying for a PhD at Bath may be found on our website.

Funding Eligibility:

In order to be considered for a studentship, you must qualify as a ‘Home’ student. In determining ‘Home’ student status, we follow the UK government’s fee regulations and guidance which, when available, will be set out by the UK Council for International Student Affairs (UKCISA) on their website. At the time of advertising this project, the fee regulations for 2021/22 have not yet been published, but we expect (subject to confirmation) that the main categories of students generally eligible for ‘Home’ fee status will be:

  • UK nationals (who have lived in the UK, EU, EEA or Switzerland continuously since September 2018)
  • Irish nationals (who have lived in the UK or Ireland continuously since September 2018)
  • EU/EEA applicants with settled status in the UK under the EU Settlement Scheme (who have lived in the UK continuously since September 2018)
  • EU/EEA applicants with pre-settled status in the UK under the EU Settlement Scheme (who have lived in the UK, EU, EEA, Switzerland or Gibraltar continuously since September 2018)
  • Applicants with indefinite leave to enter/remain in the UK (who have been resident in the UK continuously since September 2018)

EU/EEA citizens who live outside the UK are unlikely to be eligible for ‘Home’ fees and funding.

Additional information may be found on our fee status guidance webpage, on the GOV.UK website and on the UKCISA website.

Funding Notes

A studentship is available for up to 4 years funded by the Centre for Doctoral Training in Statistical Applied Maths at Bath (SAMBa) and Novartis. Funding covers ‘Home’ tuition fees, a stipend (£15,285 per annum, 2020/21 rate) and research/training expenses (£1,000 per annum). Eligibility criteria apply - see above.


References

Dwork, C., Roth, A. (2014). The Algorithmic Foundations of Differential Privacy.
Wood, A. et al. (2019). Differential Privacy: A Primer for a Non-Technical Audience.
Chaudhuri, K., Monteleoni, C., Sarwate, A. (2011). Differentially Private Empirical Risk Minimization.
Dankar, F., Emam, K. (2013). Practicing Differential Privacy in Health Care: A Review. Transactions on Data Privacy.
O'Keefe, C., Rubin, D. (2015). Individual privacy versus public good: protecting confidentiality in health research.
Taylor, L., Zhou, X., Rise, P. (2018). A tutorial in assessing disclosure risk in microdata.
Jordon, J., Yoon, J., Schaar, M. (2019). PATE-GAN: Generating Synthetic Data with Differential Privacy Guarantees.
Homer, N. et al. (2008). Resolving Individuals Contributing Trace Amounts of DNA to Highly Complex Mixtures Using High-Density SNP Genotyping Microarrays.
FindAPhD. Copyright 2005-2021
All rights reserved.