
Combining Earables with Commodity Wearables for Novel Sensing and Interaction

Centre for Accountable, Responsible and Transparent AI

This project is no longer listed on FindAPhD.com and may not be available.

Supervisors: Dr Christopher Clarke, Dr Christof Lutteroth, Dr Adwait Sharma
Application status: No more applications being accepted
Funding: Self-Funded PhD Students Only

About the Project

Earables are devices worn in, on, or around the ear and represent the next generation of commodity wearable devices. A plethora of phenomena can be sensed at the ear, and earables can be utilised for a wide variety of applications [1], including physiological sensing for health (e.g., heart rate [2] or respiration [3]), activity detection for fitness (e.g., exercise detection [4]), and novel methods of interaction (e.g., facial expressions [5], ear rumbling [6], or mouth gestures [7]). Earables are portable, and their small, lightweight form factor allows them to be worn for prolonged periods throughout the day whilst remaining unobtrusive, discreet, and non-stigmatising. Additionally, earables can leverage an existing ubiquitous platform that many people already use in the form of earphones or hearing aids.

There are several exciting opportunities to explore in the earable space [1], one of which is investigating how earables fit into the commodity wearable ecosystem, which primarily consists of smartphones and smartwatches. Due to their unique position on the head, earables can complement other data streams to provide additional context and a more holistic view of the individual during a wide range of activities [8]. In this project, we will push the boundaries of what is possible on the earable platform by developing novel sensing and interactive systems that combine data from multiple wearable sensors, and we will evaluate these systems through both lab studies and in-the-wild deployments. Please get in touch for more details and to discuss the project in more depth.
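To give a flavour of the kind of multi-device sensing involved, the sketch below aligns accelerometer streams from an earable and a smartwatch onto a shared clock so they can be fused for downstream activity recognition. It is purely illustrative: the sample rates, the synthetic signals, and the fuse_streams helper are hypothetical assumptions, not part of any particular earable SDK or of the project's actual method.

import numpy as np

def resample(timestamps, values, target_times):
    """Linearly interpolate a 1-D sensor stream onto target timestamps."""
    return np.interp(target_times, timestamps, values)

def fuse_streams(ear_t, ear_acc, watch_t, watch_acc, rate_hz=50.0):
    """Hypothetical fusion step: resample both accelerometer magnitude
    streams onto a common 50 Hz clock and stack them as feature columns."""
    start = max(ear_t[0], watch_t[0])
    end = min(ear_t[-1], watch_t[-1])
    shared_t = np.arange(start, end, 1.0 / rate_hz)
    ear_on_clock = resample(ear_t, ear_acc, shared_t)
    watch_on_clock = resample(watch_t, watch_acc, shared_t)
    return shared_t, np.column_stack([ear_on_clock, watch_on_clock])

# Synthetic example: earable sampled at 100 Hz, smartwatch at 25 Hz.
ear_t = np.arange(0, 10, 0.01)
watch_t = np.arange(0, 10, 0.04)
ear_acc = 1.0 + 0.1 * np.sin(2 * np.pi * 1.5 * ear_t)      # small head bob
watch_acc = 1.0 + 0.3 * np.sin(2 * np.pi * 1.5 * watch_t)  # larger arm swing

t, features = fuse_streams(ear_t, ear_acc, watch_t, watch_acc)
print(features.shape)  # (n_samples, 2): one fused feature row per 50 Hz tick

In practice, cross-device clock drift, dropped samples, and per-axis calibration would all need handling on real hardware; questions like these are part of what the studentship would explore.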

Informal enquiries about the project should be directed to Dr Clarke. 

Applicants should hold, or expect to receive, a First or Upper Second Class Honours degree or a Master's degree in a relevant subject.

Formal applications should be accompanied by a research proposal and made via the University of Bath's online application form for the PhD in Computer Science programme. Further information about the application process is available on the University of Bath website.

Start date: On or after 8 January 2024.


Funding Notes

We welcome applications from candidates who can source their own funding. Tuition fees for the 2023/4 academic year are £4,700 (full-time) for Home students and £26,600 (full-time) for International students. For information about eligibility for Home fee status: https://www.bath.ac.uk/guides/understanding-your-tuition-fee-status/.

References

[1] Röddiger, T., Clarke, C., Breitling, P., Schneegans, T., Zhao, H., Gellersen, H., & Beigl, M. (2022). Sensing with Earables: A Systematic Literature Review and Taxonomy of Phenomena. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 6(3), 1-57. Available at: https://dl.acm.org/doi/pdf/10.1145/3550314.
[2] Passler, S., Müller, N., & Senner, V. (2019). In-ear pulse rate measurement: a valid alternative to heart rate derived from electrocardiography?. Sensors, 19(17), 3641.
[3] Röddiger, T., Wolffram, D., Laubenstein, D., Budde, M., & Beigl, M. (2019, September). Towards respiration rate monitoring using an in-ear headphone inertial measurement unit. In Proceedings of the 1st International Workshop on Earable Computing (pp. 48-53).
[4] Strömbäck, D., Huang, S., & Radu, V. (2020). MM-Fit: Multimodal deep learning for automatic exercise logging across sensing devices. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 4(4), 1-22.
[5] Matthies, D. J., Strecker, B. A., & Urban, B. (2017, May). EarFieldSensing: A novel in-ear electric field sensing to enrich wearable gesture input through facial expressions. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (pp. 1911-1922).
[6] Röddiger, T., Clarke, C., Wolffram, D., Budde, M., & Beigl, M. (2021, May). EarRumble: Discreet Hands- and Eyes-Free Input by Voluntary Tensor Tympani Muscle Contraction. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-14).
[7] Sun, W., Li, F. M., Steeper, B., Xu, S., Tian, F., & Zhang, C. (2021, April). TeethTap: Recognizing discrete teeth gestures using motion and acoustic sensing on an earpiece. In 26th International Conference on Intelligent User Interfaces (pp. 161-169).
[8] Wang, Y., Ding, J., Chatterjee, I., Salemi Parizi, F., Zhuang, Y., Yan, Y., ... & Shi, Y. (2022, April). FaceOri: Tracking Head Position and Orientation Using Ultrasonic Ranging on Earphones. In CHI Conference on Human Factors in Computing Systems (pp. 1-12).
