DiffCont: Continual Anomaly Detection with Diffusion Models and Outlier Rejection

ICLR 2026 Conference Submission18770 Authors

19 Sept 2025 (modified: 08 Oct 2025)
Keywords: Continual Learning, Anomaly Detection, Diffusion Models, UNet, Outlier Rejection, Open-set Recognition
TL;DR: Continual Anomaly Detection: Efficient distillation of diffusion U-Nets with outlier rejection achieves competitive performance on MVTec-AD
Abstract: We address continual anomaly detection in industrial inspection, where new data arrives periodically in the form of new categories or new types of defective samples. In this setting, the model should generalize to new categories and shifting data distributions while maintaining its performance on previously learned categories. Building on a diffusion-based U-Net with conditional generation and an auxiliary classifier, we introduce a Weibull-guided pipeline that serves two roles: (i) a representative conditional generative distillation model, in which a per-class Weibull model is fit on a projected embedding space and only strong exemplar replay samples are accepted via a threshold; (ii) a per-class anomaly detection model, in which the same heavy-tailed Weibull, fit on the tail of normal embeddings, converts distances into outlier probabilities for open-set anomaly detection. On the MVTec-AD dataset split into 15 class-incremental experiences, our method achieves superior performance across experiences while offering a lightweight anomaly detection workflow suited to industrial use.
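The tail-fitted Weibull scoring described in the abstract can be sketched as follows. This is an illustrative sketch only: the function names, the centroid-distance choice, and the tail fraction are assumptions for exposition, not the authors' implementation.

```python
# Sketch (assumed, not the paper's code): fit a Weibull on the tail of
# normal-embedding distances to a class centroid, then convert a test
# distance into an outlier probability via the fitted CDF.
import numpy as np
from scipy.stats import weibull_min

def fit_tail_weibull(embeddings, tail_frac=0.1):
    """Fit a Weibull on the largest distances to the class centroid."""
    centroid = embeddings.mean(axis=0)
    dists = np.linalg.norm(embeddings - centroid, axis=1)
    n_tail = max(2, int(tail_frac * len(dists)))
    tail = np.sort(dists)[-n_tail:]          # heavy-tail region only
    shape, loc, scale = weibull_min.fit(tail, floc=0.0)
    return centroid, (shape, loc, scale)

def outlier_probability(x, centroid, params):
    """Weibull CDF at the test distance serves as the outlier score."""
    d = np.linalg.norm(x - centroid)
    return weibull_min.cdf(d, *params)

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(500, 8))   # stand-in "normal" embeddings
centroid, params = fit_tail_weibull(normal)
p_in = outlier_probability(rng.normal(0.0, 1.0, size=8), centroid, params)
p_out = outlier_probability(np.full(8, 6.0), centroid, params)  # far sample
```

The same fitted CDF supports both roles in the pipeline: thresholding it low rejects weak exemplar-replay candidates, while a high value flags open-set anomalies at test time.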
Primary Area: generative models
Submission Number: 18770