Abstract: Real-world multivariate time series have great application value, and accurate anomaly identification in such data has become an important research topic. Although existing methods have achieved promising performance, they still face limitations such as modeling the temporal dependencies of long sequences and the complex relationships among multiple features. To address these problems, this paper proposes ALAE, a self-attention-based unsupervised multivariate reconstruction network. ALAE first establishes an LSTM autoencoder framework to capture the temporal dependencies of long sequences. A multi-head weighted self-attention mechanism is then proposed to explore the complex relationships among multiple features in high-dimensional data; this attention mechanism also resolves the information loss caused by the fixed-length vector between the encoder and decoder. Finally, a more robust strategy is used to compute the anomaly score. Extensive experiments on six public multivariate time series datasets, covering financial risk, social governance, and NASA, show that ALAE outperforms seven baseline algorithms. Notably, ablation experiments confirm the validity of integrating the autoencoder with the attention mechanism.
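To make the described architecture concrete, below is a minimal PyTorch sketch of a reconstruction network of this general shape: an LSTM encoder whose full sequence of hidden states is passed through multi-head self-attention (instead of a single fixed context vector) before an LSTM decoder reconstructs the window, with per-timestep reconstruction error as the anomaly score. This is not the authors' ALAE implementation; the layer sizes, the use of standard `nn.MultiheadAttention` rather than the paper's weighted variant, and the simple mean-squared-error scoring rule are all illustrative assumptions.

```python
# Illustrative sketch only; not the ALAE code from the paper.
import torch
import torch.nn as nn


class AttentionLSTMAutoencoder(nn.Module):
    def __init__(self, n_features: int, hidden_dim: int = 64, n_heads: int = 4):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden_dim, batch_first=True)
        # Attending over all encoder states avoids compressing the whole
        # window into one fixed-length vector between encoder and decoder.
        self.attention = nn.MultiheadAttention(hidden_dim, n_heads, batch_first=True)
        self.decoder = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
        self.output = nn.Linear(hidden_dim, n_features)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, window_length, n_features)
        enc_states, _ = self.encoder(x)
        context, _ = self.attention(enc_states, enc_states, enc_states)
        dec_states, _ = self.decoder(context)
        return self.output(dec_states)  # reconstruction of x


def anomaly_score(model: nn.Module, x: torch.Tensor) -> torch.Tensor:
    """Per-timestep reconstruction error used as a simple anomaly score."""
    with torch.no_grad():
        recon = model(x)
    return ((x - recon) ** 2).mean(dim=-1)  # (batch, window_length)


if __name__ == "__main__":
    model = AttentionLSTMAutoencoder(n_features=25)
    windows = torch.randn(8, 100, 25)  # 8 windows of 100 steps, 25 features
    print(anomaly_score(model, windows).shape)  # torch.Size([8, 100])
```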