Keywords: anomaly-detection, deep-learning, explanations, interpretability, xai, one-class-classification, deep-anomaly-detection, novelty-detection, outlier-detection
Abstract: Deep one-class classification variants for anomaly detection learn a mapping that concentrates nominal samples in feature space, causing anomalies to be mapped away. Because this transformation is highly non-linear, finding interpretations poses a significant challenge. In this paper we present an explainable deep one-class classification method, Fully Convolutional Data Description (FCDD), where the mapped samples are themselves an explanation heatmap. FCDD yields competitive detection performance and provides reasonable explanations on common anomaly detection benchmarks with CIFAR-10 and ImageNet. On MVTec-AD, a recent manufacturing dataset offering ground-truth anomaly maps, FCDD sets a new state of the art in the unsupervised setting. Our method can incorporate ground-truth anomaly maps during training, and even a few of them (~5) significantly improve performance. Finally, using FCDD's explanations we demonstrate the vulnerability of deep one-class classification models to spurious image features such as image watermarks.
One-sentence Summary: We introduce an approach to explainable deep anomaly detection based on fully convolutional neural networks.
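Below is a minimal PyTorch sketch of the core idea described in the abstract; it is not the authors' implementation (see the linked liznerski/fcdd repository for that). A fully convolutional network produces a low-resolution score map, a pseudo-Huber transform of that map drives the one-class objective, and the same map, upsampled, doubles as the explanation heatmap. The backbone, kernel sizes, the bilinear upsampling (standing in for the paper's fixed Gaussian-kernel upsampling), and all hyperparameters are illustrative assumptions.

```python
# Sketch of the FCDD idea (illustrative, not the authors' exact code):
# a fully convolutional net maps an image to a low-resolution score map,
# the pseudo-Huber of that map drives the one-class objective, and the
# same map is upsampled to full resolution to serve as the heatmap.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyFCDD(nn.Module):
    def __init__(self):
        super().__init__()
        # Any fully convolutional backbone works; this tiny one is a placeholder.
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 1, 1),  # 1-channel low-resolution output map
        )

    def forward(self, x):
        z = self.features(x)
        # Pseudo-Huber: sqrt(z^2 + 1) - 1, entrywise non-negative anomaly scores
        return torch.sqrt(z ** 2 + 1) - 1

def fcdd_loss(scores, labels):
    """scores: (B, 1, u, v) pseudo-Huber map; labels: (B,) with 1 = anomalous."""
    per_sample = scores.flatten(1).mean(dim=1)  # mean over the spatial map
    nominal_term = per_sample                   # pull nominal scores toward 0
    anomalous_term = -torch.log(-torch.expm1(-per_sample) + 1e-9)  # push anomalies away
    return torch.where(labels.bool(), anomalous_term, nominal_term).mean()

def heatmap(scores, out_size):
    # The paper upsamples with a fixed Gaussian kernel; bilinear interpolation
    # is used here for brevity.
    return F.interpolate(scores, size=out_size, mode="bilinear", align_corners=False)

# Usage sketch
model = ToyFCDD()
x = torch.randn(8, 3, 224, 224)       # batch of images
y = torch.randint(0, 2, (8,))         # 1 = (synthetic) anomaly, 0 = nominal
scores = model(x)
loss = fcdd_loss(scores, y)
maps = heatmap(scores, x.shape[-2:])  # full-resolution explanation heatmaps
```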
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Code: [liznerski/fcdd (GitHub)](https://github.com/liznerski/fcdd) + [1 community implementation (Papers with Code)](https://paperswithcode.com/paper/?openreview=A5VV3UyIQz)
Data: [CIFAR-10](https://paperswithcode.com/dataset/cifar-10), [CIFAR-100](https://paperswithcode.com/dataset/cifar-100), [Fashion-MNIST](https://paperswithcode.com/dataset/fashion-mnist), [ImageNet](https://paperswithcode.com/dataset/imagenet), [MNIST](https://paperswithcode.com/dataset/mnist), [MVTecAD](https://paperswithcode.com/dataset/mvtecad)
Community Implementations: [1 code implementation (CatalyzeX)](https://www.catalyzex.com/paper/arxiv:2007.01760/code)