Feature Perturbation Augmentation for Reliable Evaluation of Importance Estimators

Published: 04 Mar 2023, Last Modified: 14 Oct 2024
ICLR 2023 Workshop on Trustworthy ML Poster
Keywords: deep learning, neural network, post-hoc interpretability, explainability, trustworthy ml, data augmentation, perturbation artifacts
Abstract: Post-hoc explanation methods attempt to make the inner workings of deep neural networks, which otherwise act as black boxes, more comprehensible and trustworthy. However, since ground truth is generally lacking, local post-hoc explanation methods, which assign importance scores to input features, are challenging to evaluate. One of the most popular evaluation frameworks is to perturb the features deemed important by an explanation and to measure the change in prediction accuracy. Intuitively, a large decrease in prediction accuracy indicates that the explanation has correctly quantified the importance of features with respect to the prediction outcome (e.g., logits). However, the change in the prediction outcome may instead stem from perturbation artifacts: perturbed samples in the test dataset are out of distribution (OOD) with respect to the training dataset and can therefore disturb the model in unexpected ways. To overcome this challenge, we propose feature perturbation augmentation (FPA), which creates and adds perturbed images during model training. Our computational experiments suggest that FPA makes the considered models more robust against perturbations. Overall, FPA is an intuitive and straightforward data augmentation technique that renders the evaluation of post-hoc explanations more trustworthy.
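
To make the evaluation framework from the abstract concrete, here is a minimal sketch of a perturbation test: mask the pixels an importance estimator ranks highest and record how much the predicted-class logit drops. The function name, the zero baseline, and the choice of perturbation fractions are illustrative assumptions, not the paper's exact configuration.

```python
import torch

def perturbation_curve(model, x, importance, baseline=0.0, fractions=(0.1, 0.3, 0.5)):
    """Mask the top-k% most important pixels and record the drop in the
    predicted-class logit. `x` is a C x H x W image, `importance` an
    H x W saliency map; the zero baseline is an illustrative choice."""
    model.eval()
    with torch.no_grad():
        logits = model(x.unsqueeze(0))
        cls = logits.argmax(dim=1).item()
        ref = logits[0, cls].item()

        order = importance.flatten().argsort(descending=True)  # most important first
        drops = []
        for frac in fractions:
            k = int(frac * order.numel())
            xp = x.clone().reshape(x.shape[0], -1)   # C x (H*W)
            xp[:, order[:k]] = baseline              # perturb top-k pixels in all channels
            xp = xp.reshape_as(x)
            drops.append(ref - model(xp.unsqueeze(0))[0, cls].item())
    return drops
```

A steep curve of logit drops is then read as evidence that the estimator ranked truly important features first; the paper's point is that this reading is only reliable if the drops are not perturbation artifacts.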
TL;DR: Feature Perturbation Augmentation (FPA) removes perturbation artifacts and enables reliable evaluation of post-hoc interpretability methods for DNNs.
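
FPA itself amounts to a data augmentation applied during training. Below is a minimal PyTorch-style sketch, assuming the perturbation replaces a random fraction of pixels with a fixed baseline value; the class name, baseline, and sampling scheme are assumptions, not the paper's exact recipe.

```python
import torch

class FeaturePerturbationAugment:
    """Sketch of the FPA idea: during training, replace a random fraction
    of pixels with a baseline value, so that perturbed inputs seen during
    evaluation are no longer out of distribution."""
    def __init__(self, max_fraction=0.5, baseline=0.0, p=0.5):
        self.max_fraction = max_fraction  # upper bound on perturbed pixel fraction
        self.baseline = baseline          # value used for perturbed pixels
        self.p = p                        # probability of applying the augmentation

    def __call__(self, img: torch.Tensor) -> torch.Tensor:
        if torch.rand(1).item() > self.p:
            return img
        _, h, w = img.shape
        frac = torch.rand(1).item() * self.max_fraction
        mask = torch.rand(h, w) < frac    # random pixel locations to perturb
        out = img.clone()
        out[:, mask] = self.baseline      # perturb all channels at those pixels
        return out
```

Such a transform could be dropped into a standard training pipeline (e.g., after `ToTensor()` in a torchvision `Compose`), so the model is exposed to perturbed inputs during training.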
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/feature-perturbation-augmentation-for/code)