Abstract: In complex industrial environments, unsupervised time series anomaly detection (AD) on instrumentation and measurement data has significant applications in enhancing reliability and safety. Differentiating noise from anomalies has been a critical topic in AD, and indiscriminately fitting the data with deep learning models is susceptible to overfitting. We assume that noise is ubiquitous, whereas anomalies manifest within small segments. To this end, we propose the hypothesis of multigranularity structural deviation in anomalous data from the perspective of suddenness in anomaly signals. Based on this hypothesis, we design a multigranularity time series AD model that builds global- and local-granularity reconstructions of the data and measures anomalies through the deviation between these reconstructions. The global-granularity reconstruction weights all observed points equally, whereas the local-granularity reconstruction is computed mainly from nearby points. Building on transformers, we propose a multigranularity attention (MGA) module with global and local attention to generate the multigranularity outputs. The module employs a Gaussian kernel to control the weights of nearby points in local attention and applies Kullback-Leibler (KL) divergence to keep the attention matrices of the two reconstructions from degenerating. The deviation between the two reconstructions not only detects anomalies with sudden changes but also provides intuitive interpretability. Results on five public datasets show that our model obtains an 11.38% improvement on average and a 38% improvement on the Server Machine Dataset (SMD).
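To make the mechanism concrete, the following is a minimal PyTorch sketch of the multigranularity attention idea summarized above: a shared attention score matrix yields a global reconstruction, a Gaussian kernel over temporal distance biases a second, local reconstruction toward nearby points, and a KL term compares the two attention matrices. The module name, the single-head simplification, the kernel width `sigma`, and the exact placement of the KL term are illustrative assumptions, not the authors' implementation.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiGranularityAttention(nn.Module):
    """Sketch of a multigranularity attention block (assumed structure)."""

    def __init__(self, d_model: int, sigma: float = 3.0):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        self.sigma = sigma  # assumed width of the Gaussian kernel over time lags

    def forward(self, x: torch.Tensor):
        # x: (batch, seq_len, d_model)
        B, L, D = x.shape
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = q @ k.transpose(-1, -2) / math.sqrt(D)          # (B, L, L)

        # Global granularity: standard attention, every observed point can
        # contribute to the reconstruction based on content alone.
        attn_global = F.softmax(scores, dim=-1)

        # Local granularity: a Gaussian kernel over |i - j| emphasizes nearby
        # points before the softmax.
        pos = torch.arange(L, device=x.device, dtype=x.dtype)
        dist2 = (pos.unsqueeze(0) - pos.unsqueeze(1)) ** 2       # (L, L)
        gauss = torch.exp(-dist2 / (2 * self.sigma ** 2))
        attn_local = F.softmax(scores + torch.log(gauss + 1e-8), dim=-1)

        # Two reconstructions of the same input at different granularities.
        rec_global = attn_global @ v
        rec_local = attn_local @ v

        # KL divergence between the two attention matrices; during training it
        # would serve as a regularizer keeping them from degenerating.
        kl = F.kl_div(attn_local.clamp_min(1e-8).log(), attn_global,
                      reduction="batchmean")
        return rec_global, rec_local, kl
```

Under these assumptions, a per-point anomaly score could be taken as the deviation between the two reconstructions, e.g. `(rec_global - rec_local).abs().mean(dim=-1)`, which is large where a sudden change makes local and global evidence disagree.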