
  Foundations of Stochastic Gradient Descent (and Generalization)


   Division of Medical Sciences

Assoc Prof Patrick Rebeschini · No more applications being accepted · Funded PhD Project (Students Worldwide)

About the Project

The DPhil in Computational Discovery is a multidisciplinary programme spanning projects in Advanced Molecular Simulations, Machine Learning and Quantum Computing to develop new tools and methodologies for life sciences discovery.

This innovative course has been developed in close partnership between Oxford University and IBM Research. Each research project has been co-developed by Oxford academics working with IBM scientists. Students will have one or more named IBM supervisors and many opportunities for collaboration with IBM throughout the studentship.

The scientific focus of the programme is at the interface between the Physical and Life Sciences. By bringing together advances in data and computing science with large, complex sets of experimental data, more realistic and predictive computational models can be developed. These new tools and methodologies for computational discovery can drive advances in our understanding of fundamental cellular biology and drug discovery. Projects will span the emerging fields of Advanced Molecular Simulations, Machine Learning and Quantum Computing, addressing both fundamental questions in each of these fields and questions at their interfaces.

Students will benefit from the interdisciplinary nature of the course cohort as well as from close interactions with IBM scientists.

Applicants who are offered places will receive a funding package covering fees at the Home rate and a stipend at the standard Research Council rate (currently £17,668 p.a.) plus £2,400, for four years.

There are 16 projects available, and you may nominate up to three projects to be considered for in your application. The details of Project 11 are listed below.

There is no application fee to apply to this course. For information on how to apply and entry requirements, please see DPhil in Computational Discovery | University of Oxford.

Project 11

Foundations of Stochastic Gradient Descent (and Generalization)

Stochastic gradient descent is one of the most widely used algorithmic paradigms in modern machine learning. Despite its popularity, many open questions remain about its generalization capabilities. For instance, while there is preliminary evidence that early-stopped gradient descent applied to over-parameterized models is robust to label misspecification, a complete theory that can account for this phenomenon is currently lacking. The goal of this project is to rigorously investigate the robustness properties of early-stopped gradient descent from a theoretical point of view in simplified settings involving linear models, and to establish novel connections between this methodology and the field of distributionally robust optimization. The project will combine tools from the study of random structures in high-dimensional probability (e.g., concentration inequalities, the theory of optimal transport) with the general framework of gradient and mirror descent methods from optimization and online learning (e.g., regularization).

The project is mathematically oriented and based on topics covered in the lecture notes of the Oxford course https://www.stats.ox.ac.uk/~rebeschi/teaching/AFoL/22/.

Numerical experiments may be used, though not necessarily, to validate theoretical findings, most likely in Python.
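The kind of simplified linear setting described above can be sketched numerically. The following is a minimal, illustrative Python example (not part of the project materials; the data model, dimensions, step size and noise level are assumptions chosen for illustration): full-batch gradient descent on an over-parameterized linear model with noisy labels, where tracking the test error along the optimization path typically reveals that an early-stopped iterate generalizes better than the interpolating solution reached at convergence.

```python
import numpy as np

rng = np.random.default_rng(0)

# Over-parameterized linear regression: more parameters (d) than samples (n).
n, d = 50, 200
X = rng.standard_normal((n, d)) / np.sqrt(d)   # training inputs
w_star = rng.standard_normal(d)                # ground-truth parameter
y = X @ w_star + rng.standard_normal(n)        # noisy ("misspecified") labels

X_test = rng.standard_normal((5000, d)) / np.sqrt(d)
y_test = X_test @ w_star                       # noise-free test labels

# Full-batch gradient descent on the squared loss, started at zero.
w = np.zeros(d)
lr, T = 1.0, 3000
test_errors = []
for t in range(T):
    w -= lr * X.T @ (X @ w - y) / n            # gradient step
    test_errors.append(float(np.mean((X_test @ w - y_test) ** 2)))

best_t = int(np.argmin(test_errors))
print(f"early-stopped test MSE (t={best_t}): {test_errors[best_t]:.3f}")
print(f"test MSE at convergence (t={T}):     {test_errors[-1]:.3f}")
```

In typical runs at this noise level the test error is U-shaped in the number of iterations, so the iterate at `best_t` outperforms the final interpolating solution, consistent with the implicit regularization of early stopping that the project studies theoretically.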
