Keywords: Actual Causality, Explainable AI, Medical AI
TL;DR: We present a definition of, and an algorithm for computing, explanations for the absence of features in medical datasets
Abstract: The use of AI models in healthcare is gaining popularity. To improve clinician confidence in the results of automated triage and to
provide further information about the suggested diagnosis, the classification of an AI model is often accompanied by an explanation produced by a separate post-hoc explainability tool. If no abnormalities are detected, however, it is not clear what an explanation should be. A human clinician might be able to describe certain salient features of tumors that are not in the scan, but existing Explainable AI tools cannot do that, as they cannot point to features that are absent from the input. In this paper, we present a definition of and an algorithm for providing explanations of absence, that is, explanations of negative classifications, in the context of healthcare AI.
Our approach is rooted in the concept of explanations in actual causality. It treats the model as a black box and is hence portable and works with proprietary models. Moreover, the computation is done in the preprocessing stage, based on the model and the dataset; during execution, the algorithm only projects the precomputed explanation template onto the current image.
We implemented this approach in a tool, nito, and evaluated it on several medical datasets to demonstrate its utility for the
classification of solid tumors. We discuss the differences between the theoretical approach and its implementation in the domain of classifying solid tumors and address the additional complications posed by this domain. Finally, we discuss the assumptions our algorithm makes and its possible extensions to explanations of absence for general image classifiers.
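The two-stage pipeline the abstract describes (a preprocessing stage that builds an explanation template from the black-box model and the dataset, and an execution stage that projects the template onto the current image) might be sketched roughly as follows. All function names and the averaging heuristic below are illustrative assumptions for the sake of the sketch, not the paper's actual algorithm:

```python
import numpy as np

def precompute_template(model, dataset, threshold=0.5):
    """Preprocessing stage (hypothetical sketch): query the black-box
    model on the dataset, aggregate the positively classified images,
    and derive a mask of regions where salient (tumor) features
    typically appear -- the 'explanation template'."""
    positives = [img for img in dataset if model(img) == 1]
    # Average the positive images; bright regions of the mean serve as
    # a crude stand-in for where expected features are located.
    mean_img = np.mean(positives, axis=0)
    return mean_img > threshold

def explain_absence(template, image, model):
    """Execution stage: for a negatively classified image, project the
    precomputed template onto it. The returned mask marks the regions
    where the expected features are absent from the input."""
    assert model(image) == 0, "explanations of absence apply to negative classifications"
    return template
```

The design point the abstract emphasizes is that `model` is only ever called, never inspected, so the same procedure works with proprietary classifiers, and the expensive aggregation happens once, offline.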
Latex Source Code: zip
Signed PMLR Licence Agreement: pdf
Readers: auai.org/UAI/2025/Conference, auai.org/UAI/2025/Conference/Area_Chairs, auai.org/UAI/2025/Conference/Reviewers, auai.org/UAI/2025/Conference/Submission384/Authors, auai.org/UAI/2025/Conference/Submission384/Reproducibility_Reviewers
Submission Number: 384