Appearance-Based Gaze Estimation for Driver Monitoring

Published: 20 Oct 2022, Last Modified: 05 May 2023. Gaze Meets ML 2022 Poster.
Keywords: Driver attention, L3 autonomy, Takeover request, CNN-based gaze estimation
TL;DR: We study CNN-based gaze estimation using real and synthetic data and discuss the influence of evaluation metrics on assessing driver attention for takeover requests in Level-3 autonomy.
Abstract: Driver inattention is a leading cause of road accidents through its impact on reaction time in the face of incidents. In the case of Level-3 (L3) vehicles, inattention adversely impacts the quality of driver takeover and therefore the safe performance of L3 vehicles. There is a high correlation between a driver's visual attention and eye movement, so gaze angle is an excellent surrogate for assessing driver attention zones, in both cabin-interior and on-road scenarios. We propose appearance-based gaze estimation approaches using convolutional neural networks (CNNs) to estimate gaze angle directly from eye images and also from eye landmark coordinates. The goal is to improve learning by utilizing synthetic data with more accurate annotations. Performance analysis shows that our proposed landmark-based model, trained synthetically, is capable of predicting gaze angle on real data with a reasonable angular error. In addition, we discuss how evaluation metrics are application-specific: beyond the common mean angular error, a more reliable assessment metric is needed to measure the driver's gaze direction in L3 autonomy, so that a control takeover request can be issued at the proper time, corresponding to the driver's attention focus, and ambiguities avoided.
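As a point of reference for the metric discussed above, the sketch below shows one common way mean angular error is computed in gaze-estimation work: yaw/pitch predictions are converted to 3D unit gaze vectors and the angle between predicted and ground-truth vectors is averaged. The coordinate convention (camera-facing gaze along negative z) is an assumption for illustration, not necessarily the convention used in this paper.

```python
import numpy as np

def angles_to_vector(yaw, pitch):
    # Convert gaze yaw/pitch (radians) to a 3D unit gaze vector,
    # assuming a camera-centered frame with gaze along -z at (0, 0).
    x = -np.cos(pitch) * np.sin(yaw)
    y = -np.sin(pitch)
    z = -np.cos(pitch) * np.cos(yaw)
    return np.stack([x, y, z], axis=-1)

def mean_angular_error(pred, true):
    # Mean angular error in degrees between predicted and ground-truth
    # gaze, each given as an (N, 2) array of (yaw, pitch) in radians.
    v_pred = angles_to_vector(pred[:, 0], pred[:, 1])
    v_true = angles_to_vector(true[:, 0], true[:, 1])
    cos = np.clip(np.sum(v_pred * v_true, axis=1), -1.0, 1.0)
    return float(np.degrees(np.arccos(cos)).mean())
```

A low mean angular error can still hide errors that matter for the application: a few degrees of error near the boundary of an attention zone can flip the zone classification, which motivates the paper's call for metrics beyond mean angular error.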
Submission Type: Full Paper