Two-Stage Holistic and Contrastive Explanation of Image Classification

Published: 08 May 2023, Last Modified: 03 Nov 2024 · UAI 2023
Keywords: Explainability, Explainable AI, whole-output explanation, contrastive explanation, image classification
TL;DR: We propose a contrastive whole-output explanation method for image classification.
Abstract: The need to explain the output of a deep neural network classifier is now widely recognized. While previous methods typically explain a single class in the output, we advocate explaining the whole output, which is a probability distribution over multiple classes. A whole-output explanation can help a human user gain an overall understanding of model behaviour, rather than only one aspect of it. It also provides a natural framework in which to examine the evidence used to discriminate between competing classes, and thereby obtain contrastive explanations. In this paper, we propose a contrastive whole-output explanation (CWOX) method for image classification, and evaluate it using quantitative metrics and through human subject studies. The source code of CWOX is available at https://github.com/vaynexie/CWOX.