Time Series Anomaly Detection with a Transformer Residual Autoencoder-Decoder

Published: 01 Jan 2023 · Last Modified: 09 Feb 2025 · ICONIP (4) 2023 · CC BY-SA 4.0
Abstract: Time series anomaly detection is of great importance in a variety of domains such as financial fraud detection, industrial production, and information systems. However, due to the complexity and multiple periodicities of time series, extracting global and local information from different perspectives remains a challenge. In this paper, we propose a novel Transformer Residual Autoencoder-Decoder model, called \({\textbf {TRAD}}\), for time series anomaly detection. It is based on a multi-interval sampling strategy combined with residual learning and stacked autoencoder-decoders to strengthen the learning of both global and local information. Prediction errors from the proposed model at different scales are used to compute anomaly scores, and the aggregated anomaly scores are used to infer outliers in the time series. Extensive experiments on five datasets demonstrate that the proposed model outperforms previous state-of-the-art baselines.
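The multi-scale scoring idea summarized in the abstract can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: the per-scale predictor (predict_next), the sampling intervals, and the window length are hypothetical placeholders standing in for the Transformer residual autoencoder-decoder, and squared prediction error averaged across scales is one assumed way to aggregate the per-scale scores.

# Minimal sketch of multi-interval, prediction-error-based anomaly scoring.
# The per-scale model here is a naive placeholder; TRAD would instead use its
# Transformer residual autoencoder-decoder as the predictor at each scale.
import numpy as np

def multi_interval_samples(x, interval, window):
    """Sliding windows over x sub-sampled at the given interval."""
    sub = x[::interval]
    return np.stack([sub[i:i + window] for i in range(len(sub) - window)])

def predict_next(windows):
    """Hypothetical one-step predictor; a last-value forecast as a stand-in."""
    return windows[:, -1]

def anomaly_scores(x, intervals=(1, 2, 4), window=16):
    """Aggregate prediction errors across sampling intervals into one score per point."""
    scores = np.zeros_like(x, dtype=float)
    counts = np.zeros_like(x, dtype=float)
    for interval in intervals:
        windows = multi_interval_samples(x, interval, window)
        preds = predict_next(windows)
        # The target of each window is the next sub-sampled point.
        targets = x[::interval][window:window + len(preds)]
        errors = (preds - targets) ** 2
        # Map each error back to its index in the original series.
        idx = (np.arange(len(errors)) + window) * interval
        scores[idx] += errors
        counts[idx] += 1
    return scores / np.maximum(counts, 1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    series = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.05 * rng.standard_normal(2000)
    series[1500] += 3.0  # injected point anomaly
    print("most anomalous index:", int(np.argmax(anomaly_scores(series))))

In this toy setting, the aggregated score peaks at the injected anomaly; the paper's contribution lies in the per-scale predictor and how residual learning and stacked autoencoder-decoders shape those predictions.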
