Pointing Tasks using Spatial Audio on Smartphones for People with Vision Impairments

Anonymous

19 Dec 2021 (modified: 05 May 2023) · Submitted to GI 2022 · Readers: Everyone
Keywords: Human-centered computing - Auditory Feedback, Computing methodologies - Mixed / Augmented reality, Human-centered computing - Empirical studies in accessibility
Abstract: We present an experimental investigation of spatial audio feedback on smartphones to support direction localization in pointing tasks for people with visual impairments (PVIs). We do this using a mobile game based on a bow-and-arrow metaphor. Our game provides a combination of spatial and non-spatial (sound beacon) audio to help the user locate the direction of the target. Our experiments with sighted, sighted-blindfolded, and visually impaired users show that (a) spatial audio is more effective for PVIs than for blindfolded sighted users during the initial reaction phase of direction localization, (b) the overall behavior of PVIs and blindfolded sighted users is statistically similar, and (c) removing spatial audio significantly reduces localization performance even for blindfolded sighted users. Based on our findings, we discuss system and interaction design implications for making future mobile-based spatial interactions accessible to PVIs.
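The abstract describes combining a spatialized cue with a non-spatial sound beacon to convey target direction. As a rough illustration only, the sketch below maps the angular offset between the phone's heading and the target bearing to equal-power stereo panning gains and a beacon pulse rate that rises as the user converges on the target. The function name, the panning scheme, and all numeric parameters are assumptions for illustration; the paper's actual audio pipeline is not specified in the abstract.

```python
import math

def direction_cue(device_heading_deg: float, target_bearing_deg: float):
    """Hypothetical mapping from angular offset to stereo gains and beacon rate.

    Illustrative only: not the paper's implementation or parameters.
    """
    # Signed offset in (-180, 180]; negative means the target is to the left.
    offset = (target_bearing_deg - device_heading_deg + 180.0) % 360.0 - 180.0

    # Equal-power pan: clamp offset in [-90, 90] degrees to a pan position in [-1, 1].
    pan = max(-1.0, min(1.0, offset / 90.0))
    theta = (pan + 1.0) * math.pi / 4.0          # 0 .. pi/2
    left_gain, right_gain = math.cos(theta), math.sin(theta)

    # Non-spatial beacon: pulse faster as the angular error shrinks.
    beacon_hz = 1.0 + 4.0 * (1.0 - abs(offset) / 180.0)

    return left_gain, right_gain, beacon_hz

# Example: target 30 degrees to the right of the current heading.
print(direction_cue(device_heading_deg=0.0, target_bearing_deg=30.0))
```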