Frequency-enhanced Comprehensive Dependency Attention for Time Series Anomaly Detection

Published at ICASSP 2025 · Last modified: 08 Oct 2025 · License: CC BY-SA 4.0
Abstract: Deep time series anomaly detection (TSAD) essentially relies on learning the "normality" of data. Current approaches leverage various neural network architectures, including RNNs, CNNs, Transformers, and graph neural networks, to effectively model temporal and inter-variable dependencies within time series data. However, these approaches often overlook the equally crucial asynchronous inter-variable dependencies, potentially missing anomalous patterns that arise from intricate, temporally offset interactions among variables. We therefore propose a novel time series anomaly detection method that perceives the comprehensive intrinsic dependencies of time series data. Specifically, we propose a Comprehensive Dependency-aware Attention (CDA) mechanism that computes attention between each element and its criss-cross counterparts (i.e., intra-variable values across the whole temporal dimension and synchronous values of different variables) and captures asynchronous inter-variable dependencies by repeating this step. Additionally, to facilitate more effective downstream dependency learning, we propose a Frequency Domain Rectifier (FDR) that captures the signal's underlying patterns and reduces noise in the frequency domain. We replace the self-attention in the classic Transformer with CDA and introduce the FDR module to construct a new method for TSAD, i.e., Frequency-enhanced Comprehensive Dependency Attention-based Time Series Anomaly Detection (FCDATA). Extensive experiments show that: (i) our detection method outperforms state-of-the-art competitors by 15% in F1-score; and (ii) by incorporating the proposed network architecture, existing anomaly detectors achieve over a 20% performance gain compared to current temporal network structures.
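
The abstract only names the CDA mechanism, so the following is a minimal PyTorch sketch of the criss-cross attention idea it describes, under stated assumptions: the class name, the (batch, time, variable, feature) tensor layout, and the joint softmax over the two branches are illustrative choices, not the authors' implementation. Each element attends to its own variable across all time steps and to all variables at its own time step; applying the block twice lets information reach asynchronous (temporally offset) cross-variable positions.

```python
import torch
import torch.nn as nn

class CrissCrossAttention(nn.Module):
    """Hypothetical sketch of the CDA idea: each cell of the
    (time x variable) grid attends to its own variable across all time
    steps and to all variables at its own time step."""

    def __init__(self, d_model):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        self.scale = d_model ** -0.5

    def forward(self, x):
        # x: (B, T, C, d) -- one embedding per (time step, variable) cell.
        q, k, v = self.q(x), self.k(x), self.v(x)
        # Intra-variable branch: scores along the time axis, per variable.
        att_t = torch.einsum("btcd,bscd->btcs", q, k) * self.scale  # (B,T,C,T)
        # Synchronous branch: scores across variables at the same time step.
        att_v = torch.einsum("btcd,btkd->btck", q, k) * self.scale  # (B,T,C,C)
        # One softmax over the whole criss-cross neighborhood (T + C scores).
        att = torch.softmax(torch.cat([att_t, att_v], dim=-1), dim=-1)
        w_t, w_v = att.split([att_t.size(-1), att_v.size(-1)], dim=-1)
        return (torch.einsum("btcs,bscd->btcd", w_t, v)
                + torch.einsum("btck,btkd->btcd", w_v, v))

x = torch.randn(2, 96, 8, 64)   # 96 time steps, 8 variables, d_model = 64
cda = CrissCrossAttention(64)
out = cda(cda(x))               # second pass reaches asynchronous pairs
```

After one pass, every position summarizes its own cross; the second application is what the abstract calls "repeating this step," since it connects any two positions in the time-variable grid through a shared cross point.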
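Likewise, the FDR is described only as capturing underlying patterns and reducing noise in the frequency domain. A common way to realize that behavior is a top-k spectral filter; the sketch below is an assumption in that spirit, not the paper's module, and the function name, the `keep_ratio` parameter, and the magnitude-based selection are all hypothetical.

```python
import torch

def frequency_domain_rectifier(x, keep_ratio=0.25):
    """Hypothetical FDR-style denoising: keep only the strongest frequency
    components (assumed to carry the underlying pattern), zero the rest,
    and transform back. `keep_ratio` is an illustrative hyperparameter."""
    # x: (B, T, C) real-valued multivariate series.
    spec = torch.fft.rfft(x, dim=1)             # (B, T//2 + 1, C), complex
    k = max(1, int(keep_ratio * spec.size(1)))
    # Indices of the k largest-magnitude frequencies per (series, variable).
    topk = spec.abs().topk(k, dim=1).indices    # (B, k, C)
    mask = torch.zeros(spec.shape, dtype=torch.bool).scatter_(1, topk, True)
    filtered = torch.where(mask, spec, torch.zeros_like(spec))
    return torch.fft.irfft(filtered, n=x.size(1), dim=1)

x = torch.randn(4, 128, 8)
x_rectified = frequency_domain_rectifier(x)
print(x_rectified.shape)  # torch.Size([4, 128, 8])
```

Feeding such a rectified series into the attention stack matches the abstract's stated purpose for FDR: cleaner inputs for the downstream dependency learning.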