
Combining Earables with Commodity Wearables for Novel Sensing and Interaction


   Department of Computer Science

Supervisors: Dr Christopher Clarke, Dr Christof Lutteroth, Dr Adwait Sharma
Application status: No more applications being accepted
Funding: Competition Funded PhD Project (UK Students Only)

About the Project

The University of Bath is inviting applications for the following PhD project commencing in October 2023.

Overview of the Research:

Earables are devices worn in, on, or around the ear, and they represent the next generation of commodity wearable devices. A rich set of phenomena can be sensed in, on, or around the ear, and earables can be utilised for a wide variety of applications [1]. These include physiological sensing for health applications (e.g., heart rate [2] or respiration [3]), activity detection for fitness (e.g., exercise detection [4]), and novel methods of interaction (e.g., facial expressions [5], ear rumbling [6], or mouth gestures [7]). Earables are portable, and their small, lightweight form factor allows them to be worn for prolonged periods throughout the day whilst remaining unobtrusive, discreet, and non-stigmatising. Additionally, earables can build on a platform that is already ubiquitous: the earphones and hearing aids that many people wear every day.

There are several exciting opportunities to explore in the earable space [1], one of which is investigating how earables fit in the commodity wearable ecosystem, which primarily consists of smartphones and smartwatches. Due to their unique position on the head, earables can complement other data streams to provide additional context and a more holistic view of the individual during a wide range of activities [8]. In this project, we will push the boundaries of what is possible on the earable platform by developing novel sensing and interactive systems that combine data from multiple wearable sensors. We will explore their application and usage using both lab studies and in-the-wild deployments. Please get in touch for more details and to discuss the project in more depth.
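As a minimal, hypothetical sketch of what combining wearable data streams can involve, the Python snippet below aligns an earable signal and a smartwatch signal, sampled at different rates, onto a common timeline and stacks them into a single feature matrix. The function names, sampling rates, and synthetic NumPy signals are assumptions chosen purely for illustration, not part of the project description.

import numpy as np

def fuse_streams(earable, watch, rate_hz=50.0):
    """Align two (timestamps, values) streams onto a shared clock and
    stack them into one feature matrix for a downstream model."""
    t_ear, x_ear = earable
    t_watch, x_watch = watch
    # Shared timeline covering only the interval where both streams overlap.
    start, end = max(t_ear[0], t_watch[0]), min(t_ear[-1], t_watch[-1])
    target = np.arange(start, end, 1.0 / rate_hz)
    # Linear interpolation resamples each stream onto the shared timeline.
    ear_rs = np.interp(target, t_ear, x_ear)
    watch_rs = np.interp(target, t_watch, x_watch)
    return target, np.stack([ear_rs, watch_rs], axis=1)

# Synthetic one-axis signals standing in for an earable IMU (~100 Hz)
# and a smartwatch IMU (~25 Hz).
t_e = np.linspace(0.0, 10.0, 1000)
t_w = np.linspace(0.0, 10.0, 250)
times, features = fuse_streams((t_e, np.sin(2 * np.pi * t_e)),
                               (t_w, np.cos(np.pi * t_w)))
print(features.shape)  # roughly (500, 2): 10 s of fused data at 50 Hz

In practice, device clocks drift and each wearable provides many channels, so alignment of this kind is only a starting point for the richer multimodal sensing and interaction systems the project will explore.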

Project keywords: Earables; Wearables; Human-computer interaction; Interaction; Sensing.

Candidate Requirements:

Applicants should hold, or expect to receive, a First Class or good Upper Second Class Honours degree (or the equivalent) in a relevant subject. A master’s level qualification would also be advantageous.

Non-UK applicants must meet our English language entry requirement.

Enquiries and Applications:

Informal enquiries are welcomed and should be directed to Dr Christopher Clarke on email address [Email Address Removed].

Formal applications should be submitted via the University of Bath’s online application form for a PhD in Computer Science prior to the application deadline of Sunday 22 January 2023.

More information about applying for a PhD at Bath may be found on our website.

Funding Eligibility:

To be eligible for funding, you must qualify as a Home student. The eligibility criteria for Home fee status are detailed and too complex to summarise here in full; however, as a general guide, the following applicants will normally qualify, subject to meeting residency requirements: UK and Irish nationals (living in the UK or EEA/Switzerland), those with Indefinite Leave to Remain, and EU nationals with pre-settled or settled status in the UK under the EU Settlement Scheme. This is not an exhaustive list. Additional information may be found on our fee status guidance webpage, on the GOV.UK website, and on the UKCISA website.

Exceptional Overseas students (e.g., with a UK Master’s Distinction or international equivalent and relevant research experience) who are interested in this project should contact the lead supervisor in the first instance to discuss the possibility of applying for supplementary funding.

Equality, Diversity and Inclusion:

We value a diverse research environment and aim to be an inclusive university, where difference is celebrated and respected. We welcome and encourage applications from under-represented groups.

If you have circumstances that you feel we should be aware of that have affected your educational attainment, please feel free to tell us about them in your application form. The best way to do this is to add a short paragraph at the end of your personal statement.


Funding Notes

A studentship includes Home tuition fees, a stipend (£17,668 per annum, 2022/23 rate) and research/training expenses (£1,000 per annum) for up to 3.5 years. Eligibility criteria apply – see Funding Eligibility section above.

References

[1] Röddiger, T., Clarke, C., Breitling, P., Schneegans, T., Zhao, H., Gellersen, H., & Beigl, M. (2022). Sensing with Earables: A Systematic Literature Review and Taxonomy of Phenomena. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 6(3), 1-57. Available at: https://dl.acm.org/doi/pdf/10.1145/3550314.
[2] Passler, S., Müller, N., & Senner, V. (2019). In-ear pulse rate measurement: a valid alternative to heart rate derived from electrocardiography?. Sensors, 19(17), 3641.
[3] Röddiger, T., Wolffram, D., Laubenstein, D., Budde, M., & Beigl, M. (2019, September). Towards respiration rate monitoring using an in-ear headphone inertial measurement unit. In Proceedings of the 1st International Workshop on Earable Computing (pp. 48-53).
[4] Strömbäck, D., Huang, S., & Radu, V. (2020). MM-Fit: Multimodal deep learning for automatic exercise logging across sensing devices. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 4(4), 1-22.
[5] Matthies, D. J., Strecker, B. A., & Urban, B. (2017, May). EarFieldSensing: A novel in-ear electric field sensing to enrich wearable gesture input through facial expressions. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (pp. 1911-1922).
[6] Röddiger, T., Clarke, C., Wolffram, D., Budde, M., & Beigl, M. (2021, May). EarRumble: Discreet Hands- and Eyes-Free Input by Voluntary Tensor Tympani Muscle Contraction. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-14).
[7] Sun, W., Li, F. M., Steeper, B., Xu, S., Tian, F., & Zhang, C. (2021, April). TeethTap: Recognizing discrete teeth gestures using motion and acoustic sensing on an earpiece. In 26th International Conference on Intelligent User Interfaces (pp. 161-169).
[8] Wang, Y., Ding, J., Chatterjee, I., Salemi Parizi, F., Zhuang, Y., Yan, Y., ... & Shi, Y. (2022, April). FaceOri: Tracking Head Position and Orientation Using Ultrasonic Ranging on Earphones. In CHI Conference on Human Factors in Computing Systems (pp. 1-12).
