Position: Explainable AI Cannot Advance Without Better User Studies

Published: 01 May 2025, Last Modified: 23 Jul 2025 · ICML 2025 Position Paper Track (poster) · CC BY 4.0
TL;DR: A position paper arguing for better-designed user studies in explainable AI.
Abstract: In this position paper, we argue that user studies are key to understanding the value of explainable AI methods, because the end goal of explainable AI is to satisfy societal desiderata. We also argue that the current state of user studies is detrimental to the advancement of the field. We support this argument with a review of general and explainable AI-specific challenges, as well as an analysis of 607 explainable AI papers featuring user studies. We demonstrate how most user studies lack reproducibility, discussion of limitations, comparison with a baseline, or placebo explanations, and are of low fidelity to real-world users and application contexts. This, combined with an overreliance on functional evaluation, results in a lack of understanding of the value of explainable AI methods, which hinders the progress of the field. To address this issue, we call for higher methodological standards for user studies, greater appreciation of high-quality user studies in the AI community, and reduced reliance on functional evaluation.
Lay Summary: In this paper we argue that user studies are essential for assessing the value of explainable AI, as the ultimate goal of explainable AI is to meet societal needs. We show how current user studies fall short: they are poorly designed, hard to reproduce, and rarely reflect real-world users or applications. This, combined with an overreliance on evaluation without users, weakens our understanding of explainable AI’s actual impact. We call for better user studies, more recognition of high-quality human-subject research, and less dependence on purely functional evaluations.
Verify Author Names: My co-authors have confirmed that their names are spelled correctly both on OpenReview and in the camera-ready PDF. (If needed, please update ‘Preferred Name’ in OpenReview to match the PDF.)
No Additional Revisions: I understand that after the May 29 deadline, the camera-ready submission cannot be revised before the conference. I have verified with all authors that they approve of this version.
Pdf Appendices: My camera-ready PDF file contains both the main text (not exceeding the page limits) and all appendices that I wish to include. I understand that any other supplementary material (e.g., separate files previously uploaded to OpenReview) will not be visible in the PMLR proceedings.
Latest Style File: I have compiled the camera ready paper with the latest ICML2025 style files <https://media.icml.cc/Conferences/ICML2025/Styles/icml2025.zip> and the compiled PDF includes an unnumbered Impact Statement section.
Paper Verification Code: ODE2N
Permissions Form: pdf
Primary Area: Research Priorities, Methodology, and Evaluation
Keywords: explainability, interpretability, user study, research methodology
Submission Number: 187