About the Project
Earables are devices that can be worn in, on, or around the ear and represent the next generation of commodity wearable devices. There is a plethora of phenomena that can be sensed in, on, or around the ear, and earables can be utilised for a wide variety of applications [1]. These include physiological sensing for health applications (e.g., heart rate [2] or respiration [3]), activity detection for fitness (e.g., exercise detection [4]), and novel methods of interaction (e.g., facial expressions [5], ear rumbling [6], or mouth gestures [7]). Earables are portable, and their small, lightweight form factor allows them to be worn for prolonged periods throughout the day whilst remaining unobtrusive, discreet, and non-stigmatising. Additionally, earables can leverage an existing ubiquitous platform that many people already use in the form of earphones or hearing aids.
There are several exciting opportunities to explore in the earable space [1], one of which is investigating how earables fit into the commodity wearable ecosystem, which primarily consists of smartphones and smartwatches. Due to their unique position on the head, earables can complement other data streams to provide additional context and a more holistic view of the individual during a wide range of activities [8]. In this project, we will push the boundaries of what is possible on the earable platform by developing novel sensing and interactive systems that combine data from multiple wearable sensors. We will explore their application and usage through both lab studies and in-the-wild deployments. Please get in touch for more details and to discuss the project in more depth.
Informal enquiries about the project should be directed to Dr Clarke.
Applicants should hold, or expect to receive, a First or Upper Second Class Honours degree or a Master's degree in a relevant subject.
Formal applications should be accompanied by a research proposal and made via the University of Bath’s online application form for the PhD in Computer Science programme. Further information about the application process can be found here.
Start date: On or after 8 January 2024.
[1] Röddiger, T., Clarke, C., Breitling, P., Schneegans, T., Zhao, H., Gellersen, H., & Beigl, M. (2022). Sensing with Earables: A Systematic Literature Review and Taxonomy of Phenomena. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 6(3), 1-57. Available at: https://dl.acm.org/doi/pdf/10.1145/3550314.
[2] Passler, S., Müller, N., & Senner, V. (2019). In-Ear Pulse Rate Measurement: A Valid Alternative to Heart Rate Derived from Electrocardiography? Sensors, 19(17), 3641.
[3] Röddiger, T., Wolffram, D., Laubenstein, D., Budde, M., & Beigl, M. (2019, September). Towards Respiration Rate Monitoring Using an In-Ear Headphone Inertial Measurement Unit. In Proceedings of the 1st International Workshop on Earable Computing (pp. 48-53).
[4] Strömbäck, D., Huang, S., & Radu, V. (2020). MM-Fit: Multimodal Deep Learning for Automatic Exercise Logging across Sensing Devices. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 4(4), 1-22.
[5] Matthies, D. J., Strecker, B. A., & Urban, B. (2017, May). EarFieldSensing: A Novel In-Ear Electric Field Sensing to Enrich Wearable Gesture Input through Facial Expressions. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (pp. 1911-1922).
[6] Röddiger, T., Clarke, C., Wolffram, D., Budde, M., & Beigl, M. (2021, May). EarRumble: Discreet Hands- and Eyes-Free Input by Voluntary Tensor Tympani Muscle Contraction. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-14).
[7] Sun, W., Li, F. M., Steeper, B., Xu, S., Tian, F., & Zhang, C. (2021, April). TeethTap: Recognizing Discrete Teeth Gestures Using Motion and Acoustic Sensing on an Earpiece. In 26th International Conference on Intelligent User Interfaces (pp. 161-169).
[8] Wang, Y., Ding, J., Chatterjee, I., Salemi Parizi, F., Zhuang, Y., Yan, Y., ... & Shi, Y. (2022, April). FaceOri: Tracking Head Position and Orientation Using Ultrasonic Ranging on Earphones. In CHI Conference on Human Factors in Computing Systems (pp. 1-12).