
Markerless Tracking for the Clinical Analysis of Patients with Symptomatic Gait

This project is no longer listed and may not be available.

  • Full or part time
    Dr T Stewart
    Dr H Wang
  • Application Deadline
    No more applications being accepted
  • Competition Funded PhD Project (European/UK Students Only)

Project Description

Clinical gait analysis has several limitations, including high BMI (skin artefact), patient sensitivity, co-morbidity (other health issues) and, importantly, the time it takes to apply markers to the patient. Recent advances in gaming technologies that use markerless tracking have the potential to improve efficiency in the clinical setting. The student will apply advances in machine learning to investigate markerless tracking in a clinical setting.

Biotribology involves the analysis of motion and load to optimise friction, lubrication and wear in natural and artificial joints. The complex movement patterns that occur locally between the joint surfaces can be analysed using gait analysis with subsequent multibody dynamic modelling of joint reaction forces. These data can then be used in the design of joint replacements or other tissue-engineered solutions that enable movement to be returned to the patient following joint disease. For these solutions to work effectively, the patient requires a dynamic movement pattern that encourages improved lubrication mechanisms. Presently, this is limited by the ability to collect movement data from large numbers of patients across a broad range of activities.
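To illustrate the kind of calculation such modelling involves, the sketch below shows a single, highly simplified inverse-dynamics step: given a measured ground reaction force and an assumed foot segment mass (both values here are illustrative, not from any real dataset), Newton's second law yields the force transmitted through the ankle joint. Full multibody models extend this segment by segment up the limb and include dynamic (accelerating) terms.

```python
import numpy as np

g = 9.81                            # gravitational acceleration, m/s^2
foot_mass = 1.2                     # kg, illustrative segment mass
grf = np.array([30.0, 750.0])       # ground reaction force, N (x, y)
foot_accel = np.array([0.0, 0.0])   # quasi-static assumption: no acceleration

# Force balance on the foot segment: F_ankle + GRF + weight = m * a,
# so the ankle joint reaction force is
weight = np.array([0.0, -foot_mass * g])
f_ankle = foot_mass * foot_accel - grf - weight
print(f_ankle)                      # force the shank exerts on the foot, N
```

The large vertical component simply mirrors the ground reaction force; in a dynamic trial the acceleration terms and joint moments would also be computed at each frame of the gait cycle.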

Pose reconstruction, or tracking, has been widely studied in computer vision. It aims to recover the skeletal motions of people in videos captured in different settings, e.g. monocular vs multi-view cameras, controlled environments vs in-the-wild. Research in this field has been driven by data-driven approaches, from traditional machine learning (statistical and other methods) to cutting-edge deep learning. Its applications include action recognition for surveillance, pedestrian behaviour prediction for autonomous vehicles, action sensing in Kinect for games, etc. This line of research is superior in terms of its physical non-invasiveness, as opposed to other motion tracking methods, but inferior in terms of accuracy, being subject to occlusions, lighting and other conditions.
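Once a pose estimator has recovered skeletal keypoints, clinically relevant gait quantities follow from simple geometry. The sketch below, using hypothetical 2D pixel coordinates (no particular pose-estimation library is assumed), computes a knee flexion angle from hip, knee and ankle keypoints, the sort of per-frame measurement a markerless gait pipeline would track over a walking trial.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle at point b (degrees) formed by the segments b->a and b->c,
    e.g. the angle at the knee between the thigh and shank."""
    v1 = np.asarray(a, float) - np.asarray(b, float)
    v2 = np.asarray(c, float) - np.asarray(b, float)
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# Hypothetical keypoints (pixel coordinates) from a single video frame
hip, knee, ankle = (320, 240), (330, 340), (325, 440)
flexion = 180.0 - joint_angle(hip, knee, ankle)  # 0 deg = fully extended leg
print(round(flexion, 1))
```

Repeating this over every frame gives a flexion-angle curve across the gait cycle; deviations from normative curves are one way pathological gait is characterised.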

The potential exists to utilise markerless tracking to open up gait analysis to a broader clinical spectrum of patients, enabling more patients to benefit from improved joint therapies. Building on existing pose reconstruction research, this project aims to propose a new reliable, accurate and non-invasive (e.g. camera-based) method for reconstructing poses and motions, with the main purpose of supporting medical diagnosis of injuries and potential pathological motion abnormalities. The project will lead to direct benefits for patients and the healthcare system.

Funding Notes

UK/EU – Engineering & Physical Sciences Research Council Collaborative Studentships paying academic fees of £4,600 for Session 2020/21, together with a maintenance grant of £15,009 for Session 2020/21 paid at standard Research Council rates for 3.5 years. UK applicants will be eligible for a full award paying tuition fees and maintenance. European Union applicants will be eligible for an award paying tuition fees only, except in exceptional circumstances, or where residency has been established for more than 3 years prior to the start of the course. Funding is awarded on a competitive basis.

How good is research at University of Leeds in General Engineering?

FTE Category A staff submitted: 44.80

Research output data provided by the Research Excellence Framework (REF)


FindAPhD. Copyright 2005-2020
All rights reserved.