
  Instruction Following for Resource Efficient NLP Models


   School of Engineering Mathematics and Technology

   Monday, January 27, 2025  Competition Funded PhD Project (Students Worldwide)

About the Project

The project:

Large language models (LLMs) are known for their ability to follow instructions and learn from examples provided inside a prompt. This provides an extremely powerful interface for people to interact with machines and ask them to perform tasks, such as to collate and summarise information in a particular format. However, LLMs demand a huge amount of resources in terms of computing power, memory and energy, and can be outperformed by smaller, specialised models for tasks like text classification in a specific domain [1]. This project explores how to develop instruction following capabilities for smaller neural network models, aiming to create efficient models that can be adapted to specialised tasks by end users through prompts. The project will investigate approaches such as distillation from LLMs to small models [2] to combine multiple user instructions, and the potential to learn from explanations [3] provided by a human-in-the-loop to correct a model’s errors. The supervisor has several ongoing industry and academic partnerships, which will allow the PhD student to target their research towards a real-world information processing application in healthcare, medical research, finance, or intelligence analysis.
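To illustrate the distillation idea mentioned above (a general sketch of logit-based knowledge distillation, not the specific method of [2]), a small model can be trained to match the temperature-softened output distribution of a larger teacher model. A minimal, self-contained sketch of that objective:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of raw logits."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) on softened distributions: the core
    training signal in logit-based knowledge distillation. A higher
    temperature exposes more of the teacher's 'dark knowledge' about
    the relative likelihood of non-top classes."""
    p = softmax(teacher_logits, temperature)  # teacher soft labels
    q = softmax(student_logits, temperature)  # student predictions
    return sum(pi * (math.log(pi) - math.log(qi)) for pi, qi in zip(p, q))
```

The loss is zero when the student reproduces the teacher's distribution exactly and grows as the two diverge; in practice it is typically combined with a standard cross-entropy term on any available gold labels.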

References:

  • [1] J. Kocoń, et al., “ChatGPT: Jack of all trades, master of none,” Information Fusion, Volume 99, p. 101861, 2023.
  • [2] L. Vöge, V. Gurgul, S. Lessmann, “Leveraging Zero-Shot Prompting for Efficient Language Model Distillation,” arXiv preprint arXiv:2403.15886, 2024.
  • [3] R. Menon, S. Ghosh, S. Srivastava, “CLUES: A benchmark for learning classifiers using natural language explanations,” ACL, 2022.
  • Related recent work from our lab on interactive machine learning for NLP:
  • H. Fang, J. Gor, E. Simpson, “Efficiently acquiring human feedback with Bayesian deep learning,” in Proc. 1st Workshop on Uncertainty-aware NLP, 2024.
  • Y. Ye, E. Simpson, “Towards abstractive timeline summarisation using preference-based reinforcement learning,” in Proc. ECAI, 2023.

Candidate requirements:   

Applicants must hold/achieve a minimum of a merit at master’s degree level (or international equivalent) in a science, mathematics or engineering discipline. Applicants without a master's qualification may be considered on an exceptional basis, provided they hold a first-class undergraduate degree. Please note, acceptance will also depend on evidence of readiness to pursue a research degree. 

If English is not your first language, you need to meet this profile level: Profile E 

Further information about English language requirements and profile levels is available from the University.

Contacts:  

For questions about the research topic, please contact the project supervisor.

For questions about eligibility and the application process please contact Engineering Postgraduate Research Admissions  

How to apply:  

Prior to submitting an online application, you will need to contact the project supervisor to discuss the project.

Online applications are made at http://www.bris.ac.uk/pg-howtoapply. Please select Engineering Mathematics (PhD) on the Programme Choice page. You will be prompted to enter details of any studentship you would like to be considered for in the Funding and Research Details sections of the form. 


Funding Notes

This project does not guarantee funding. Please discuss with your potential supervisor if you would like to be considered for any available studentships.

