
Vision-based AI for automatic detection of individual and social behaviour in Rodents


   UKRI Centre for Doctoral Training in Socially Intelligent Artificial Agents

  Mr Jared de Bruin | Funded PhD Project (Students Worldwide) | No more applications being accepted

About the Project

For instructions on how to apply, please see: PhD Studentships: UKRI Centre for Doctoral Training in Socially Intelligent Artificial Agents

Supervisors:

  • Marwa Mahmoud: School of Computing Science
  • Cassandra Sampaio Batista: School of Psychology

Rodents are the most extensively used models for understanding the cellular and molecular underpinnings of behaviour and of neurodegenerative and psychiatric disorders, as well as for developing interventions and pharmacological treatments. Screening behavioural phenotypes in rodents is very time-consuming, as a large battery of cognitive and motor behavioural tests is necessary. Further, standard behavioural testing usually requires removing the animal from its home-cage environment and testing it individually, which excludes the assessment of spontaneous social interactions. Monitoring spontaneous home-cage behaviours, such as eating, grooming, sleeping and social interaction, has already proven sensitive to different models of neurodevelopmental and neurodegenerative disorders. For instance, home-cage monitoring can distinguish different mouse strains and models of autistic-like behaviour (Jhuang et al., 2010) and detect early alterations in sleep patterns, before overt behavioural changes, in a rodent model of amyotrophic lateral sclerosis (ALS) (Golini et al., 2020).

Most traditional home-cage monitoring systems use sensors and are therefore restricted in the types of activity they can detect, requiring the animals to interact with the sensors (Goulding et al., 2008; Kiryk et al., 2020; Voikar and Gaburro, 2020). Recent developments in computer vision and machine learning open up the possibility of monitoring and potentially labelling all home-cage behaviours automatically (Jhuang et al., 2010; Mathis et al., 2018). Still, most machine learning-based work on automatic detection has focused on movement, mainly joint positions and movement trajectories (Mathis et al., 2018), rather than on social or group behaviour.
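
For context, the sketch below shows a minimal markerless pose-estimation workflow of the kind DeepLabCut supports (Mathis et al., 2018). It is illustrative only: the project name, experimenter name and video paths are hypothetical, and this PhD is not committed to this particular toolchain.

    # Minimal DeepLabCut pose-estimation sketch (illustrative; paths and names are hypothetical).
    import deeplabcut

    # Create a project from one or more home-cage recordings; returns the path to config.yaml.
    config_path = deeplabcut.create_new_project(
        "homecage-social",        # hypothetical project name
        "phd-student",            # hypothetical experimenter name
        ["videos/cage01.mp4"],    # hypothetical home-cage video
        copy_videos=True,
    )

    # Extract frames and label body parts by hand (label_frames opens a GUI).
    deeplabcut.extract_frames(config_path, mode="automatic", algo="kmeans")
    deeplabcut.label_frames(config_path)

    # Train and evaluate the keypoint detector, then track a new video.
    deeplabcut.create_training_dataset(config_path)
    deeplabcut.train_network(config_path)
    deeplabcut.evaluate_network(config_path)
    deeplabcut.analyze_videos(config_path, ["videos/cage02.mp4"], save_as_csv=True)

The output of such a pipeline is per-frame keypoint tracks, i.e. the movement-level data that most current work stops at; this project aims to go further, towards labels for individual and social behaviour.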

Aims/objectives/novelty

The aim of this PhD is to leverage advances in computer vision for animal behaviour understanding (Pessanha et al., 2020) and to build machine learning models that can automatically interpret and classify different individual and social behaviours by analysing videos collected through continuous monitoring.

Objectives:

  1. Define a set of behavioural and social cues that are relevant to understanding the animals' interactions and group behaviour. This will include building a dataset of their spontaneous social behaviour.
  2. Develop computer vision and machine learning models to automatically detect and classify these behaviours (a minimal illustrative sketch follows this list).
  3. Validate and evaluate the developed tools on disorder models (e.g. learning deficits, stroke).
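
As a rough illustration of objective 2, the sketch below classifies per-window behaviour labels from tracked keypoints using simple sliding-window statistics and an off-the-shelf classifier. The window length, feature layout, file names and label set are assumptions made for illustration, not the project's actual design.

    # Illustrative behaviour classification from pose tracks (assumed inputs and labels).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    WINDOW = 30  # frames per window (about 1 s at an assumed 30 fps)

    def windowed_features(keypoints):
        """keypoints: (n_frames, n_bodyparts * 2) array of x/y coordinates.
        Returns one feature row (means and standard deviations) per window."""
        n = (len(keypoints) // WINDOW) * WINDOW
        windows = keypoints[:n].reshape(-1, WINDOW, keypoints.shape[1])
        return np.concatenate([windows.mean(axis=1), windows.std(axis=1)], axis=1)

    # Hypothetical training data: keypoint tracks plus one behaviour label per window
    # (e.g. "grooming", "eating", "social contact").
    X_train = windowed_features(np.load("cage01_keypoints.npy"))
    y_train = np.load("cage01_labels.npy")

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_train, y_train)

    # Predict behaviours, window by window, for a new home-cage recording.
    X_new = windowed_features(np.load("cage02_keypoints.npy"))
    print(clf.predict(X_new))

In practice the project would likely need richer spatio-temporal features or sequence models, particularly for multi-animal social behaviour, but the window-then-classify structure above conveys the basic idea.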

Expected outcome/impact

The models developed in this project will have wide applications in both academic research and industry, not only by providing tools for automatic behavioural phenotyping but also as a means of measuring animal welfare during experiments and procedures.


References

Golini, E., Rigamonti, M., Iannello, F., De Rosa, C., Scavizzi, F., Raspa, M., Mandillo, S., 2020. A Non-invasive Digital Biomarker for the Detection of Rest Disturbances in the SOD1G93A Mouse Model of ALS. Front Neurosci 14, 896.
Goulding, E.H., Schenk, A.K., Juneja, P., MacKay, A.W., Wade, J.M., Tecott, L.H., 2008. A robust automated system elucidates mouse home cage behavioral structure. Proc Natl Acad Sci U S A 105, 20575-20582.
Jhuang, H., Garrote, E., Mutch, J., Yu, X., Khilnani, V., Poggio, T., Steele, A.D., Serre, T., 2010. Automated home-cage behavioural phenotyping of mice. Nat Commun 1, 68.
Kiryk, A., Janusz, A., Zglinicki, B., Turkes, E., Knapska, E., Konopka, W., Lipp, H.P., Kaczmarek, L., 2020. IntelliCage as a tool for measuring mouse behavior – 20 years perspective. Behav Brain Res 388, 112620.
Mathis, A., Mamidanna, P., Cury, K.M., Abe, T., Murthy, V.N., Mathis, M.W., Bethge, M., 2018. DeepLabCut: markerless pose estimation of user-defined body parts with deep learning. Nat Neurosci 21, 1281-1289.
Pessanha, F., McLennan, K., Mahmoud, M., 2020. Towards automatic monitoring of disease progression in sheep: a hierarchical model for sheep facial expressions analysis from video. IEEE International Conference on Automatic Face and Gesture Recognition, Buenos Aires, May 2020.
Voikar, V., Gaburro, S., 2020. Three Pillars of Automated Home-Cage Phenotyping of Mice: Novel Findings, Refinement, and Reproducibility Based on Literature and Experience. Front Behav Neurosci 14, 575434.