
PhD Studentship in Machine Listening - investigating tools for sound event detection and audio context recognition from everyday sound scenes

  • Supervisor: Dr E Benetos
  • Full or part time
  • Application Deadline: No more applications being accepted
  • Funded PhD Project (Students Worldwide)

About This PhD Project

Project Description

Applications are invited for a fully-funded PhD studentship in Machine Listening / Computer Audition within the School of Electronic Engineering and Computer Science at Queen Mary University of London, to conduct research in the area of computational sound scene analysis. This research will investigate and prototype tools for sound event detection and audio context recognition from everyday sound scenes. This PhD position is linked with the EPSRC-funded project “Integrating sound and context recognition for acoustic scene analysis” on developing technologies for context-aware sound recognition. The successful candidate will investigate, propose and develop machine learning and digital signal processing methods for sound recognition, suitable for complex and time-varying acoustic environments.

All nationalities are eligible to apply for this studentship, which will start in Autumn 2018. The studentship is for three years, and covers student fees as well as a tax-free stipend of £16,777 per annum.

Candidates must have a first-class honours degree or equivalent, and/or a good MSc degree in Computer Science, Electronic Engineering, Audio/Music Technology, Acoustics, or a related discipline. Candidates should have good programming experience in Python, Matlab, C/C++ or similar. Knowledge of machine learning and/or digital signal processing is desirable. Experience in research and a track record of publications are advantageous. There is scope to tailor the research to the interests and skills of the successful candidate.

The PhD supervisor will be Dr Emmanouil Benetos. This project is based in the Centre for Digital Music (C4DM) and the Centre for Intelligent Sensing (CIS) at Queen Mary University of London. C4DM is a world-leading multidisciplinary research group in the field of digital music and audio technology; CIS has highly reputed research expertise in multi-sensor data processing, distributed signal processing, and vision and audio analysis. Both groups are part of the School of Electronic Engineering and Computer Science (EECS). Details about the School can be found at http://www.eecs.qmul.ac.uk; details about C4DM at http://c4dm.eecs.qmul.ac.uk; and details about CIS at http://cis.eecs.qmul.ac.uk/. Informal enquiries about the studentship can be made by email to Dr Benetos ([Email Address Removed]).

To apply, please follow the online process at https://www.qmul.ac.uk/postgraduate/research/subjects/: click on the list of Research Degree Subjects, select ‘Electronic Engineering’ in the ‘A-Z list of research opportunities’, and follow the instructions on the right-hand side of the web page.

Please note that instead of a ‘Research Proposal’ we request a ‘Statement of Research Interests’. Your statement should answer two questions: (i) Why are you interested in the topic described above? (ii) What relevant experience do you have? Your statement should be brief: no more than 500 words or one side of A4 paper. In addition, we would like you to send a sample of your written work (e.g. an excerpt of a final-year dissertation or a published academic paper). More details can be found at http://www.eecs.qmul.ac.uk/phd/how-to-apply

The closing date for applications is 6 July 2018.
Interviews are expected to take place in mid-July 2018.
