Reconstruction as Sequence for Efficient Unified Unsupervised Anomaly Detection

21 Sept 2023 (modified: 25 Mar 2024), ICLR 2024 Conference Withdrawn Submission
Keywords: Anomaly Detection, Reconstruction, Transformer, Unsupervised Learning
Abstract: Unsupervised anomaly detection is highly desirable in industrial manufacturing, where anomalies are rare in real-world scenarios. Recent research has focused on developing a unified framework for multi-class anomaly detection. However, existing feature-reconstruction-based methods often lack sufficient contextual awareness, which compromises reconstruction quality. To address this challenge, we introduce a novel Reconstruction as Sequence (RAS) framework, which strengthens contextual correspondence during feature reconstruction by adopting a sequence-modelling perspective. In particular, building on the transformer architecture, we integrate a specialized RASFormer block into the RAS framework. This block captures spatial relationships among different image regions and enhances temporal dependencies throughout the reconstruction process. With the RASFormer block, our RAS method achieves superior contextual awareness, leading to strong detection performance and faster inference. Experimental results show that RAS significantly outperforms competing methods while delivering up to a 29% improvement in inference throughput. These results demonstrate a favorable trade-off between effectiveness and efficiency, underscoring the practicality of our method.
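The core idea sketched in the abstract, reconstructing patch features via attention so that each region is re-expressed in terms of all others, and scoring anomalies by the reconstruction residual, can be illustrated with a minimal example. This is not the authors' RASFormer implementation; the function and weight names below are illustrative assumptions, using a single plain self-attention step in NumPy:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_reconstruct(tokens, Wq, Wk, Wv):
    """Illustrative reconstruction step: each patch token is rebuilt as an
    attention-weighted mix of all tokens, giving it spatial context."""
    q, k, v = tokens @ Wq, tokens @ Wk, tokens @ Wv
    scores = softmax(q @ k.T / np.sqrt(k.shape[-1]))  # (n_patches, n_patches)
    return scores @ v

rng = np.random.default_rng(0)
n_patches, d = 16, 8                     # hypothetical patch grid and feature dim
feats = rng.standard_normal((n_patches, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))

recon = attention_reconstruct(feats, Wq, Wk, Wv)
# Per-patch anomaly score: residual between original and reconstructed features
anomaly_score = np.linalg.norm(feats - recon, axis=-1)
```

In a trained reconstruction-based detector, normal patches are rebuilt accurately (low residual) while anomalous patches are not, so the residual map localizes defects; the sequence-modelling contribution of RAS lies in how such reconstruction steps are chained, which this single-step sketch does not capture.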
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3654