AnoFormer: Time Series Anomaly Detection using Transformer-based GAN with Two-Step Masking

16 May 2022 (modified: 05 May 2023) NeurIPS 2022 Submitted
Keywords: Anomaly detection, Transformer, Masking, Time series, Entropy
Abstract: Time series anomaly detection is the task of determining whether an unseen signal is normal or abnormal, and it is a crucial function in various real-world applications. A typical approach is to learn a normal data representation using generative models, such as the Generative Adversarial Network (GAN), to discriminate between normal and abnormal signals. Recently, a few studies have actively adopted transformers to model time series data, but there has been no transformer-based GAN framework for time series anomaly detection. As a pioneering work, we propose a new transformer-based GAN framework, called AnoFormer, and an effective training strategy for better representation learning. Specifically, we improve the detection ability of our model by introducing a two-step masking strategy. The first step is \textit{Random masking}: we design a random mask pool to randomly hide parts of the signal, which allows our model to learn the representation of normal data. The second step is \textit{Exclusive and Entropy-based Re-masking}: we propose a novel refinement step that provides feedback to accurately model the parts excluded or left uncertain in the first step. We empirically demonstrate the effectiveness of the re-masking step, showing that our model robustly generates more normal-like signals. Extensive experiments on various datasets show that AnoFormer significantly outperforms state-of-the-art methods in time series anomaly detection.
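To make the two-step masking strategy concrete, the following is a minimal PyTorch sketch. It assumes a generator that emits per-timestep categorical probabilities over discretized signal bins; the mask-pool size, masking ratios, and the exact way the exclusive and entropy-based criteria are combined are illustrative assumptions, not the paper's specification.

import torch

def random_mask(seq_len, mask_ratio=0.5, pool_size=16, generator=None):
    # Step 1 (Random masking): sample a binary mask from a random mask pool.
    # pool_size, mask_ratio, and the sampling scheme are illustrative assumptions.
    pool = (torch.rand(pool_size, seq_len, generator=generator) > mask_ratio).float()
    idx = torch.randint(pool_size, (1,), generator=generator).item()
    return pool[idx]  # 1 = visible, 0 = masked

def entropy_remask(probs, first_mask, remask_ratio=0.25):
    # Step 2 (Exclusive and entropy-based re-masking): re-mask positions that were
    # left visible (excluded) in step 1 and whose predictions have high entropy,
    # i.e. the uncertain parts. `probs` is a hypothetical (seq_len, num_bins)
    # categorical output of the generator; the paper's exact scoring may differ.
    entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=-1)   # (seq_len,)
    entropy = entropy.masked_fill(first_mask == 0, float("-inf"))  # exclusive: only step-1 visible positions
    k = int(remask_ratio * int(first_mask.sum().item()))
    new_mask = first_mask.clone()
    if k > 0:
        top_uncertain = entropy.topk(k).indices
        new_mask[top_uncertain] = 0.0                              # hide the most uncertain positions
    return new_mask

# Toy usage: a 128-step signal and a stand-in for the generator's output distribution.
seq_len, num_bins = 128, 32
m1 = random_mask(seq_len)
probs = torch.softmax(torch.randn(seq_len, num_bins), dim=-1)
m2 = entropy_remask(probs, m1)

In this reading, the second pass feeds back the model's own uncertainty: positions that survived the random mask but were reconstructed poorly get hidden again, so the generator is pushed toward normal-like reconstructions of exactly those parts.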
TL;DR: We propose AnoFormer, a simple yet effective transformer-based GAN framework with a generator and a discriminator for unsupervised time series anomaly detection.
Supplementary Material: pdf