Gaze is more than just a point: Rethinking visual attention analysis using peripheral vision-based gaze mapping

Published: 01 Jan 2023 · Last Modified: 06 Mar 2025 · ETRA 2023 · CC BY-SA 4.0
Abstract: In mobile eye-tracking, visual attention is commonly evaluated using fixation-based measures, which can be mapped to predefined objects of interest for task-specific attention analysis. Even though attention can be directed independently of the fovea, little research has addressed quantifying peripheral vision for attention analysis. In this work, we discuss the benefits of enhancing traditional mapping methods with near-peripheral information and expand previous research by presenting a novel machine learning-based gaze measure, the visual attention index (VAI), for the analysis of visual attention using dynamic stimuli. Results are discussed using the data of two multi-object mobile eye-tracking use cases and visualized using radar graphs. We show that, by combining foveal and peripheral vision, the VAI is effective for comparing visual attention across multiple tasks, trials, and subjects, which offers new possibilities for a more realistic and detailed depiction of visual attention in multi-object tasks.
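To make the idea of combining foveal and near-peripheral information concrete, the sketch below computes a simple per-object attention index from gaze and object positions and displays it as a radar graph. This is not the authors' machine learning-based VAI: it is a minimal illustration assuming a hypothetical eccentricity-based weighting (full weight within an assumed foveal radius, Gaussian falloff through the near periphery), with all constants and function names invented for the example.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical weighting parameters (assumptions, not from the paper).
FOVEAL_DEG = 2.0        # assumed foveal radius, degrees of visual angle
PERIPHERAL_SIGMA = 8.0  # assumed falloff scale for near-peripheral vision

def attention_weight(ecc_deg):
    """Weight a gaze sample's contribution to an object by its eccentricity."""
    ecc = np.asarray(ecc_deg, dtype=float)
    # Full weight inside the foveal radius, smooth Gaussian falloff beyond it.
    return np.exp(-0.5 * (np.maximum(ecc - FOVEAL_DEG, 0.0) / PERIPHERAL_SIGMA) ** 2)

def per_object_attention(gaze_deg, object_pos_deg):
    """Aggregate a normalized attention index per object over all frames.

    gaze_deg:       (T, 2) gaze position per frame, degrees of visual angle
    object_pos_deg: dict mapping object name -> (T, 2) object positions
    Returns a dict of per-object indices that sum to 1.
    """
    raw = {}
    for name, pos in object_pos_deg.items():
        ecc = np.linalg.norm(np.asarray(pos) - np.asarray(gaze_deg), axis=1)
        raw[name] = attention_weight(ecc).mean()
    total = sum(raw.values()) or 1.0
    return {name: value / total for name, value in raw.items()}

def radar_plot(indices, title="Per-object attention (illustrative)"):
    """Visualize per-object indices on a radar graph."""
    labels = list(indices)
    values = [indices[k] for k in labels]
    angles = np.linspace(0, 2 * np.pi, len(labels), endpoint=False).tolist()
    values += values[:1]   # close the polygon
    angles += angles[:1]
    ax = plt.subplot(polar=True)
    ax.plot(angles, values)
    ax.fill(angles, values, alpha=0.25)
    ax.set_xticks(angles[:-1])
    ax.set_xticklabels(labels)
    ax.set_title(title)
    plt.show()

if __name__ == "__main__":
    # Simulated data: a noisy gaze trace and three static object tracks.
    rng = np.random.default_rng(0)
    T = 500
    gaze = rng.normal(0.0, 3.0, size=(T, 2))
    objects = {
        "object A": np.tile([1.0, 0.0], (T, 1)),
        "object B": np.tile([10.0, 5.0], (T, 1)),
        "object C": np.tile([-20.0, -15.0], (T, 1)),
    }
    radar_plot(per_object_attention(gaze, objects))
```

In this toy setup, object A (near the gaze center) receives most of the index, object B still accrues some near-peripheral weight, and object C receives almost none; the paper's VAI replaces the hand-tuned falloff with a learned model, but the radar-graph comparison across objects follows the same pattern.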