Hierarchical Gaussian Mixture Normalizing Flows Modeling for Multi-Class Anomaly Detection

22 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Multi-Class Anomaly Detection, Hierarchical Gaussian Mixture Normalizing Flows Modeling
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We propose HGAD, a hierarchical Gaussian mixture normalizing flow model with stronger unified AD performance; HGAD performs substantially better on multi-class anomaly detection.
Abstract: One of the key challenges in anomaly detection (AD) is how to design a unified AD model that is trained with normal instances from multiple classes and aims to detect anomalies in all of these classes. For such a challenging task, popular normalizing flow (NF) based AD methods may fall into a "homogeneous mapping" issue, where the NF-based AD models are biased to generate large log-likelihoods for both normal and abnormal samples, thereby leading to a high miss rate of anomalies. In this paper, we propose HGAD, a hierarchical Gaussian mixture normalizing flow modeling method with stronger unified AD performance. HGAD achieves much better multi-class anomaly detection through three key improvements. First, we propose to model NF-based AD networks with an inter-class Gaussian mixture prior to more effectively capture the complex multi-class distribution. Second, we propose a mutual information maximization loss that introduces a class-repulsion property to the model for better structuring the latent feature space: the class centers are repulsed from each other, so different class centers become more distinguishable, which helps avoid the bias issue. Third, we introduce an intra-class mixed class centers learning strategy that prompts the model to learn diverse normal patterns even within one class. Together with the inter-class Gaussian mixture modeling, this forms a hierarchical Gaussian mixture normalizing flows modeling method for the multi-class AD task. We evaluate our method on four real-world AD benchmarks, where we significantly improve previous NF-based AD methods and also outperform the SOTA unified AD methods. Code will be available online.
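
To make the abstract's core idea concrete, below is a minimal sketch (not the authors' released code) of how an inter-class Gaussian mixture prior over normalizing-flow latents can be turned into a log-likelihood and an anomaly score. It assumes learnable class centers `mu` with identity covariance and mixture logits `log_pi`; all function and variable names are illustrative assumptions, not from the paper.

```python
import math
import torch
import torch.nn.functional as F

def gaussian_log_prob(z: torch.Tensor, mu: torch.Tensor) -> torch.Tensor:
    """log N(z; mu_k, I) for every sample/center pair.
    z: (B, D) flow latents, mu: (K, D) class centers -> (B, K)."""
    d = z.size(-1)
    sq_dist = ((z.unsqueeze(1) - mu.unsqueeze(0)) ** 2).sum(-1)  # (B, K)
    return -0.5 * (sq_dist + d * math.log(2.0 * math.pi))

def mixture_log_likelihood(z: torch.Tensor,
                           log_jac_det: torch.Tensor,
                           mu: torch.Tensor,
                           log_pi: torch.Tensor) -> torch.Tensor:
    """log p(x) = logsumexp_k [log pi_k + log N(z; mu_k, I)] + log|det J|."""
    comp = gaussian_log_prob(z, mu) + F.log_softmax(log_pi, dim=0)  # (B, K)
    return torch.logsumexp(comp, dim=1) + log_jac_det               # (B,)

def anomaly_score(z, log_jac_det, mu, log_pi):
    """Higher score = more anomalous (negative log-likelihood)."""
    return -mixture_log_likelihood(z, log_jac_det, mu, log_pi)

# Toy usage: 8 latent vectors of dim 64, 5 hypothetical class centers.
if __name__ == "__main__":
    B, D, K = 8, 64, 5
    z = torch.randn(B, D)                          # latents from a trained flow f(x)
    log_jac_det = torch.zeros(B)                   # log|det df/dx| from the flow
    mu = torch.randn(K, D, requires_grad=True)     # learnable class centers
    log_pi = torch.zeros(K, requires_grad=True)    # learnable mixture logits
    loss = -mixture_log_likelihood(z, log_jac_det, mu, log_pi).mean()
    print(float(loss), anomaly_score(z, log_jac_det, mu, log_pi).shape)
```

In the paper's full method, this inter-class mixture is further combined with a mutual information maximization term that repulses the class centers and with intra-class mixed class centers; those components are omitted from this sketch.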
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: pdf
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 4336