To Explain or Not to Explain: A Study on the Necessity of Explanations for Autonomous Vehicles

Published: 23 Nov 2022, Last Modified: 07 Jul 2024, TEA
Keywords: autonomous vehicle, explanation necessity, explainable AI, advanced driver assistance system
Abstract: Explainable AI, in the context of autonomous systems like self-driving cars, has drawn broad interest from researchers. Recent studies have found that providing explanations for an autonomous vehicle's actions has many benefits (e.g., increased trust and acceptance), but they place little emphasis on when an explanation is needed and how the content of an explanation changes with driving context. In this work, we investigate in which scenarios people need explanations and how the critical degree of explanation shifts with situations and driver types. Through a user experiment, we ask participants to evaluate how necessary an explanation is and measure its impact on their trust in self-driving cars in different contexts. Moreover, we present a self-driving explanation dataset with first-person explanations and associated measures of explanation necessity for 1103 video clips, augmenting the Berkeley Deep Drive Attention dataset. Our research reveals that driver types and driving scenarios dictate whether an explanation is necessary. In particular, people tend to agree on the necessity of explanations for near-crash events but hold differing opinions on ordinary or anomalous driving situations.
TL;DR: We conduct a preliminary study on when an explanation for an AI decision is necessary in the self-driving context.
Community Implementations: 1 code implementation (CatalyzeX)