SenseScene – ambient sensing and mobile assistive intelligent agents to aid individuals with visual sensory impairments


   Faculty of Computing, Engineering and the Built Environment

  Dr Joseph Rafferty  Monday, February 26, 2024  Competition Funded PhD Project (Students Worldwide)

About the Project

Globally, around 1.3 billion people live with some level of visual impairment, ranging from complete blindness to mild vision loss. Even mild vision problems can affect a person's ability to carry out everyday tasks. Visual impairments can be present from birth or develop later in life, often due to factors such as ageing, disease, injury, or genetics. People over the age of 50 are more prone to vision problems, particularly macular degeneration and diabetes-related conditions.

These visual impairments are a significant concern because they are linked to injury-related complications. For example, elderly individuals may fall, and diabetics can sustain foot injuries; such complications can reduce both lifespan and quality of life. Because visual impairment increases the risk of these injuries, solutions that help reduce that risk are essential.

Technology has long been used to assist people with visual impairments. Solutions have included software that reads text aloud, voice control, electronic vision enhancement systems, and eye implants. However, these solutions have limitations, such as being unable to assist with complex activities or requiring wearable devices or surgery.

This project aims to explore how pervasive computing and ambient assistive technologies can be used as a solution to improve the lives of people with visual impairments.

The goal of the project is to support the completion of daily tasks and the navigation of unfamiliar spaces while addressing the shortcomings of current solutions for individuals with visual impairments.

The project will aim to provide a technological solution that will support individuals with visual impairments within their daily lives. More specifically, the project will focus on creating a conversational assistive agent that can interact with an ambient sensing system. This agent will make indoor navigation and interacting with the environment more natural for individuals with visual impairments.
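To make the intended architecture concrete, the sketch below shows one minimal way a conversational assistive agent might consume events from an ambient sensing system and turn them into spoken-style guidance. All names, event kinds, and thresholds here are illustrative assumptions, not part of the project specification:

```python
from dataclasses import dataclass


@dataclass
class SensorEvent:
    """A reading from an ambient sensor (fields are illustrative)."""
    sensor_id: str
    kind: str      # hypothetical event kinds: "obstacle", "door"
    location: str  # room or zone label
    value: float   # normalised reading in [0.0, 1.0]


class AssistiveAgent:
    """Maps ambient sensor events to natural-language guidance
    suitable for text-to-speech output."""

    def __init__(self, obstacle_threshold: float = 0.5):
        # Readings at or above this value are treated as hazards
        # (an assumed tuning parameter).
        self.obstacle_threshold = obstacle_threshold

    def describe(self, event: SensorEvent) -> str:
        if event.kind == "obstacle" and event.value >= self.obstacle_threshold:
            return f"Caution: obstacle detected in the {event.location}."
        if event.kind == "door":
            state = "open" if event.value >= 0.5 else "closed"
            return f"The door in the {event.location} is {state}."
        return f"No hazards reported in the {event.location}."


agent = AssistiveAgent()
print(agent.describe(SensorEvent("s1", "obstacle", "hallway", 0.8)))
```

In a real deployment the `describe` output would feed a speech synthesiser and the event stream would arrive over a sensor network rather than being constructed in code; this fragment only illustrates the event-to-guidance mapping at the core of such an agent.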

