The Robotic Perception Lab at the University of Sydney, Australia, is offering a fully funded PhD position in tracking and mapping for augmented reality for the visually impaired, in partnership with ARIA LLC.
Applicants with a strong background in mechatronic engineering, computer science, machine learning, or a similar discipline are encouraged to apply.
Project ARIA, Augmented Reality in Audio, seeks to endow the visually impaired with a richer sense of their surroundings using a wearable augmented reality device. Building on technologies from robotics, augmented reality, and spatialised audio display, ARIA will deliver next-generation solutions to improve quality of life for millions of people affected by vision impairment worldwide.
There are multiple Postdoctoral and PhD roles open as part of Project ARIA; all postings will be available at https://ariaresearch.com.au. This Tracking and Mapping-focused position will develop algorithms for robust detection and tracking of static and moving objects from images, as well as their integration into an egocentric map representation. The main challenge is to develop a scene representation that can be easily translated into pleasant and informative sound. Other key constraints include limited power consumption and compute, and tolerance to latency.
The project's aims include:
- Developing robust techniques for object segmentation and tracking,
- Integrating information from object recognition and localisation modules,
- Proposing a novel representation of the dynamic scenes to be translated to sound, and
- Meeting the specifications of the device and embedded processing.