GAZEL: Runtime Gaze Tracking for Smartphones

Joonbeom Park, Seonghoon Park, Hojung Cha

Published: 01 Jan 2021 · Last Modified: 16 Apr 2025 · PerCom 2021 · CC BY-SA 4.0
Abstract: Although work has been conducted on smartphone gaze tracking, existing techniques are not pervasively used because of their heavy computational load and low accuracy. Our preliminary analysis shows that these techniques would work better if their models were trained with data from tablets, which have larger screens. In this paper, we propose GAZEL, a runtime smartphone gaze-tracking scheme that achieves high accuracy on real devices. The key idea of GAZEL, tablet-to-smartphone transfer learning, is to train a CNN model with data collected from tablets and then transplant the model to smartphones. To this end, we designed a new CNN-based model architecture that is head-pose resilient and light enough to operate at runtime. We also exploit implicit calibration to alleviate errors caused by differences in users' visual and device characteristics. Experimental results with commercial smartphones show that GAZEL achieves 27.5% better accuracy than state-of-the-art techniques on smartphones and provides gaze tracking at up to 18 fps, which is practically usable at runtime.
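The tablet-to-smartphone transfer described above can be illustrated with a minimal PyTorch sketch: pre-train a lightweight gaze CNN on tablet data, then freeze the convolutional features and adapt only the regression head on smartphone data. The architecture, layer sizes, and the `tablet_loader`/`phone_loader` names below are hypothetical placeholders, not the paper's actual model or pipeline.

```python
import torch
import torch.nn as nn

# Hypothetical lightweight gaze CNN: eye-image features concatenated with
# a head-pose vector, regressing a 2-D on-screen gaze point. Sizes are
# illustrative only.
class GazeNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Sequential(
            nn.Linear(32 + 3, 64), nn.ReLU(),  # +3 for head-pose angles
            nn.Linear(64, 2),                  # (x, y) gaze point
        )

    def forward(self, eye_img, head_pose):
        return self.head(torch.cat([self.features(eye_img), head_pose], dim=1))

def train_step(model, batch, optimizer, loss_fn=nn.SmoothL1Loss()):
    eye, pose, gaze = batch
    optimizer.zero_grad()
    loss = loss_fn(model(eye, pose), gaze)
    loss.backward()
    optimizer.step()
    return loss.item()

model = GazeNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
# 1) Pre-train on tablet data (larger screens cover a wider gaze range):
# for batch in tablet_loader: train_step(model, batch, opt)

# 2) Transplant to the smartphone domain: freeze convolutional features,
#    fine-tune only the regression head.
for p in model.features.parameters():
    p.requires_grad = False
opt = torch.optim.Adam(model.head.parameters(), lr=1e-4)
# for batch in phone_loader: train_step(model, batch, opt)
```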
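Implicit calibration could likewise be sketched as a per-user correction fitted from pairs of raw model predictions and implicitly observed ground-truth points (e.g., screen touches) collected during normal use. The affine least-squares form below is an assumption for illustration, not the paper's exact calibration procedure.

```python
import numpy as np

def fit_implicit_calibration(pred, touch):
    """Fit an affine correction mapping raw gaze predictions to screen
    coordinates from implicitly collected (prediction, touch-point) pairs.

    pred, touch: (N, 2) arrays of predicted gaze points and tap locations.
    Returns a (3, 2) parameter matrix W.
    """
    A = np.hstack([pred, np.ones((len(pred), 1))])   # homogeneous coords
    W, *_ = np.linalg.lstsq(A, touch, rcond=None)    # least-squares fit
    return W

def apply_calibration(W, pred):
    # Correct new predictions with the fitted per-user transform.
    return np.hstack([pred, np.ones((len(pred), 1))]) @ W
```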