Neuromorphic-Enabled Implementation of Extremely Low-Power Gaze Estimation

Published: 01 Jan 2024 · Last Modified: 13 May 2025 · VR Workshops 2024 · CC BY-SA 4.0
Abstract: Event cameras have great potential for eye tracking, but current event-based gaze estimation methods suffer from complex imaging setups and reliance on the RGB modality. We propose a novel architecture for fully event-based, low-power spiking gaze estimation that uses the signal from only one eye. Our architecture employs a wake-up module to judge the state of the incoming events and then routes processing to one of three modules, hibernation, a lightweight SNN segmentation network, or an image processing module, to produce the gaze estimate. We show that our method achieves better accuracy and lower power consumption on Angelopoulos's dataset.
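The routing described in the abstract can be sketched as a simple gating step: a wake-up module inspects the incoming event window and dispatches to one of the three modules. The sketch below is illustrative only; the module names, thresholds, and interfaces are assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the wake-up gating described in the abstract.
# Module names, thresholds, and interfaces are illustrative assumptions only.
from enum import Enum, auto

import numpy as np


class Route(Enum):
    HIBERNATION = auto()       # negligible event activity: stay idle to save power
    SNN_SEGMENTATION = auto()  # new eye movement: run the lightweight SNN segmentation
    IMAGE_PROCESSING = auto()  # recent segmentation mask available: refine gaze directly


def wake_up(events: np.ndarray, idle_thresh: int = 50, mask_is_fresh: bool = False) -> Route:
    """Judge the state of the incoming event window and pick a processing route.

    `events` is assumed to be an (N, 4) array of (x, y, t, polarity) tuples;
    `idle_thresh` and `mask_is_fresh` are placeholder parameters.
    """
    if events.shape[0] < idle_thresh:
        return Route.HIBERNATION
    if mask_is_fresh:
        return Route.IMAGE_PROCESSING
    return Route.SNN_SEGMENTATION


# Example: a burst of 200 events with no cached mask triggers the SNN path.
events = np.zeros((200, 4))
print(wake_up(events))  # Route.SNN_SEGMENTATION
```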