
  Mathematical methods for differential privacy in clinical research


   Department of Mathematical Sciences

This project is no longer listed on FindAPhD.com and may not be available.

Supervisors: Dr Matthew Nunes, Dr Sandipan Roy
Application status: No more applications being accepted
Funding: Funded PhD Project (European/UK Students Only)

About the Project

The University of Bath is inviting applications for the following PhD project commencing in October 2021.

Funding is available to candidates who qualify for Home fee status. Following the UK’s departure from the European Union, the rules governing fee status have changed and, therefore, candidates from the EU/EEA are advised to check their eligibility before applying. Please see the Funding Eligibility section below.

Machine learning has great potential to improve data-driven decision-making in healthcare and drug development by enabling deep analytics on complex data types (for example, medical images) when these are collected at scale. Such data can greatly enrich our understanding of the mechanisms behind disease and treatment. This PhD project is set up in collaboration with Novartis, a global pharmaceutical company, and provides a unique opportunity both to develop new methods and to take part in applying them to advance the science of medicine.

Respecting patient privacy is imperative and subject to strong legal and regulatory constraints; however, not all data types are uniformly usable in a privacy-preserving fashion. For example, a small number of genetic markers, or peripheral information present in medical images such as MRIs, may be sufficient to uniquely re-identify individuals contributing to a dataset. Differential privacy provides a framework for protecting individual privacy when computing a specific set of data summaries over datasets containing sensitive personal information, by means of a randomized algorithm; the level of privacy guaranteed can be characterized by the level of randomization applied. This project focuses on applying differential privacy to interrogate large pooled and multi-modal patient-level datasets to compute, for example, clinical endpoints or treatment outcomes. Pools of clinical studies present both unique challenges and opportunities in this setting: on the one hand, knowledge about randomization through study design may allow us to suppress columns with little impact on the summary to be calculated; on the other hand, patient datasets are much smaller and more expensive to generate than other types of data commonly considered in the differential privacy setting, with only hundreds or thousands of participants.
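As a toy illustration of the randomized-algorithm idea (not part of the project description itself), the classical Laplace mechanism from Dwork and Roth (2014) releases a noisy version of a summary statistic, with the noise scale calibrated to the statistic's sensitivity and the privacy parameter epsilon. The function name and parameters below are illustrative choices, not anything specified by the project:

```python
import numpy as np

def laplace_mean(values, lower, upper, epsilon, rng=None):
    """Release an epsilon-differentially private mean of bounded values.

    After clipping each value to [lower, upper], changing one
    participant's record changes the mean by at most
    (upper - lower) / n, so Laplace noise with scale
    sensitivity / epsilon yields epsilon-differential privacy.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.clip(np.asarray(values, dtype=float), lower, upper)
    sensitivity = (upper - lower) / len(x)
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return x.mean() + noise

# Smaller epsilon means stronger privacy and therefore more noise.
ages = [34, 45, 29, 61, 50]  # hypothetical patient ages
print(laplace_mean(ages, lower=18, upper=90, epsilon=1.0))
```

Note the trade-off the project aims to optimize: with only hundreds of participants, the sensitivity term (and hence the noise) is comparatively large, so the privacy budget must be spent carefully.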

The aim is to develop methods that can adapt their level of randomization (privacy budget) to healthcare scenarios, optimizing privacy and data utility simultaneously. We will consider (i) methods based on random projections, which represent information in a way that makes individuals harder to identify, (ii) machine learning algorithms, and (iii) other probabilistic/statistical approaches.
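A minimal sketch of idea (i), under assumptions of my own (the data shapes and function below are hypothetical): a Gaussian random projection mixes all original columns into a lower-dimensional representation while approximately preserving pairwise distances (the Johnson-Lindenstrauss lemma), so no single feature is released directly. On its own this does not give a formal differential privacy guarantee; calibrated noise would still need to be added, which is part of what such methods would investigate:

```python
import numpy as np

def random_projection(X, k, rng=None):
    """Project an n x d data matrix X down to n x k dimensions.

    Entries of the projection matrix are drawn i.i.d. from
    N(0, 1/k), which approximately preserves pairwise distances
    (Johnson-Lindenstrauss) while blending the original columns.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = X.shape[1]
    R = rng.normal(loc=0.0, scale=1.0 / np.sqrt(k), size=(d, k))
    return X @ R

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 50))   # hypothetical patient-level features
Z = random_projection(X, k=10, rng=rng)
print(Z.shape)
```

The design choice here is that utility (distance structure) degrades gracefully as k shrinks, which is one concrete handle for trading privacy against data utility.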

Candidate Requirements:

Applicants should hold, or expect to receive, a First Class or good Upper Second Class Honours degree (or the equivalent) in Mathematics, Statistics or another relevant discipline. A master’s level qualification in Mathematics or Statistics would be beneficial. Some experience with machine learning is desirable.

Enquiries and Applications:

Informal enquiries are welcomed and should be directed to Dr Matthew Nunes, [Email Address Removed]. 

Formal applications should be made via the University of Bath’s online application form for a PhD in Statistics (full-time).

See our website for more information about applying for a PhD at Bath.

NOTE: Applications may close earlier than the advertised deadline if a suitable candidate is found; therefore early application is recommended.

Funding Eligibility:

In order to be considered for a studentship, you must qualify as a Home student. In determining Home student status, we follow the UK government’s fee regulations and guidance which, when available, will be set out by the UK Council for International Student Affairs (UKCISA) on their website. Although not yet confirmed, we expect that the main categories of students generally eligible for Home fee status will be:

  • UK nationals (who have lived in the UK, EU, EEA or Switzerland continuously since September 2018)
  • Irish nationals (who have lived in the UK or Ireland continuously since September 2018)
  • EU/EEA applicants with pre-settled status or settled status in the UK under the EU Settlement Scheme (who have lived in the UK, EU, EEA, Switzerland or Gibraltar continuously since September 2018)
  • Applicants with indefinite leave to enter/remain in the UK (who have been resident in the UK continuously since September 2018)

EU/EEA citizens who live outside the UK are unlikely to be eligible for Home fees and funding.

Additional information may be found on our fee status guidance webpage, on the GOV.UK website and on the UKCISA website.



Funding Notes

A studentship is available for up to 4 years funded by the Centre for Doctoral Training in Statistical Applied Maths at Bath (SAMBa) and Novartis. Funding covers Home tuition fees, a stipend (£15,609 per annum, 2021/22 rate) and research/training expenses (£1,000 per annum). Eligibility criteria apply - see above.

References

Dwork, C., Roth, A. (2014). The Algorithmic Foundations of Differential Privacy, https://dx.doi.org/10.1561/0400000042
Wood, A. et al. (2019). Differential Privacy: A Primer for a Non-Technical Audience, https://dx.doi.org/10.2139/ssrn.3338027
Chaudhuri, K., Monteleoni, C., Sarwate, A. (2011). Differentially Private Empirical Risk Minimization, https://www.ncbi.nlm.nih.gov/pubmed/21892342
Dankar, F., El Emam, K. (2013). Practicing Differential Privacy in Health Care: A Review, Transactions on Data Privacy.
O'Keefe, C., Rubin, D. (2015). Individual privacy versus public good: protecting confidentiality in health research, https://dx.doi.org/10.1002/sim.6543
Taylor, L., Zhou, X., Rise, P. (2018). A tutorial in assessing disclosure risk in microdata, https://dx.doi.org/10.1002/sim.7667
Jordon, J., Yoon, J., van der Schaar, M. (2019). PATE-GAN: Generating Synthetic Data with Differential Privacy Guarantees.
Homer, N. et al. (2008). Resolving Individuals Contributing Trace Amounts of DNA to Highly Complex Mixtures Using High-Density SNP Genotyping Microarrays, https://dx.doi.org/10.1371/journal.pgen.1000167
