Exploring Practitioner Perspectives On Training Data Attribution Explanations

Published: 27 Oct 2023, Last Modified: 22 Nov 2023, NeurIPS XAIA 2023
TL;DR: Interviewing ML practitioners to explore the human factor of training data attribution explanations.
Abstract: Explainable AI (XAI) aims to give humans insight into opaque model reasoning and is therefore interdisciplinary by nature. In this paper, we interviewed 10 practitioners to understand the potential usability of training data attribution (TDA) explanations and to explore the design space of such an approach. We confirmed that training data quality is often the most important factor for high model performance in practice, and that model developers mainly rely on their own experience to curate data. End-users expect explanations to enhance their interaction with the model; they do not necessarily prioritise training data as a means of explanation but are open to it. Among our participants, we found that TDA explanations are not well known and therefore not used. We urge the community to focus on the utility of TDA techniques from the human-machine collaboration perspective and to broaden TDA evaluation to reflect common use cases in practice.
Submission Track: Full Paper Track
Application Domain: None of the above / Not applicable
Clarify Domain: Human Computer Interaction
Survey Question 1: Training data attribution (TDA) explains model behaviour by attributing a model decision to training samples. We conducted a qualitative interview study on practitioners' perspectives on TDA explanations to study the human factor of TDA.
Survey Question 2: We study an explainability method itself (training data attribution). Our objective is to understand what kind of practical challenges TDA could address and identify the steps required for bridging the gap between research and practical application.
Survey Question 3: In our work, we discuss training data attribution (TDA), which includes methods like influence functions.
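To make the idea of TDA concrete, the following is a minimal illustrative sketch (not from the paper): a TracIn-style gradient-similarity attribution on a toy linear model with squared loss, scoring each training sample by the dot product between its loss gradient and the test point's loss gradient. The model, data, and scoring rule are all assumptions for illustration.

```python
import numpy as np

# Toy setup (assumed for illustration): a linear model fitted with least
# squares on noisy synthetic data.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5])
X_train = rng.normal(size=(5, 3))                      # 5 samples, 3 features
y_train = X_train @ w_true + 0.1 * rng.normal(size=5)  # noisy labels
w = np.linalg.lstsq(X_train, y_train, rcond=None)[0]   # fitted weights

def loss_grad(x, y, w):
    """Gradient of 0.5 * (x @ w - y)**2 with respect to w."""
    return (x @ w - y) * x

# A test point whose prediction we want to attribute to training samples.
x_test = rng.normal(size=3)
y_test = x_test @ w_true

# TracIn-style score: similarity between the test-point loss gradient and
# each training-point loss gradient. Higher score = more "helpful" sample.
g_test = loss_grad(x_test, y_test, w)
scores = np.array([loss_grad(x, y, w) @ g_test
                   for x, y in zip(X_train, y_train)])

ranking = np.argsort(-scores)  # indices of training samples, most influential first
print(ranking)
```

Influence functions, as discussed in the paper, refine this idea by additionally weighting gradients with the inverse Hessian of the training loss; the dot-product sketch above keeps only the gradient-similarity part.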
Submission Number: 52