Quantile Long Short-Term Memory: A Robust Time-Series Anomaly Detection Method

Published: 01 Jan 2024 · Last Modified: 18 Apr 2025 · IEEE Trans. Artif. Intell. 2024 · CC BY-SA 4.0
Abstract: Anomalies are departures of a system or device from its normal behavior under standard operating conditions. An anomaly in an industrial device can indicate an upcoming failure, often preceding it in time. In this article, we make two contributions: 1) we estimate conditional quantiles within the popular long short-term memory (LSTM) architecture, propose a novel anomaly detection method, quantile-long short-term memory (q-LSTM), and consider three different ways to define anomalies based on the estimated quantiles; and 2) we use a new learnable activation function (AF), the parametric Elliot function (PEF), in the q-LSTM architecture to model long-range temporal dependency. Unlike sigmoid and tanh, the derivative of the PEF depends on the input as well as on a learnable parameter, which helps mitigate the vanishing-gradient problem and therefore facilitates escaping early saturation. The proposed algorithms are compared with other well-known anomaly detection algorithms, such as isolation forest (iForest), elliptic envelope, and autoencoders, and with modern deep learning models, namely the deep autoencoding Gaussian mixture model (DAGMM) and generative adversarial networks (GANs). The algorithms are evaluated using performance metrics such as precision and recall, and have been tested on multiple industrial time-series datasets, including Yahoo, Amazon Web Services (AWS), General Electric (GE), and machine-sensor data. We find that the LSTM-based quantile algorithms are very effective and outperform the existing algorithms in identifying anomalies.
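
The abstract page does not include code, so the following is a minimal, hypothetical sketch of the two ideas it describes: training an LSTM with the pinball (quantile) loss to estimate conditional quantiles, and a parametric Elliot activation with a learnable parameter. The PEF form x / (1 + a·|x|) is an assumed parameterization, and in the paper the PEF replaces the gate nonlinearities inside the LSTM cell, which a stock nn.LSTM does not expose; here it is applied to the LSTM output purely as a simplification. All class and function names (ParametricElliot, pinball_loss, QuantileLSTM) are illustrative, not the authors'.

```python
# Hypothetical sketch, not the authors' implementation.
import torch
import torch.nn as nn


class ParametricElliot(nn.Module):
    """Elliot-style activation x / (1 + a*|x|) with a learnable slope a.
    (Assumed form; the paper's exact PEF may differ.)"""
    def __init__(self):
        super().__init__()
        self.a = nn.Parameter(torch.ones(1))  # learnable parameter

    def forward(self, x):
        return x / (1.0 + torch.abs(self.a * x))


def pinball_loss(y_pred, y_true, tau):
    """Quantile (pinball) loss for a target quantile tau in (0, 1)."""
    err = y_true - y_pred
    return torch.mean(torch.maximum(tau * err, (tau - 1.0) * err))


class QuantileLSTM(nn.Module):
    """LSTM with one linear head per target quantile (e.g., 0.05, 0.5, 0.95)."""
    def __init__(self, n_features, hidden=32, quantiles=(0.05, 0.5, 0.95)):
        super().__init__()
        self.quantiles = quantiles
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.act = ParametricElliot()  # simplification: applied after the LSTM
        self.heads = nn.ModuleList(nn.Linear(hidden, 1) for _ in quantiles)

    def forward(self, x):                  # x: (batch, time, features)
        h, _ = self.lstm(x)
        h = self.act(h[:, -1, :])          # representation at the last time step
        return [head(h) for head in self.heads]


# Training minimizes the sum of pinball losses over all quantile heads:
#   loss = sum(pinball_loss(pred, y, tau)
#              for pred, tau in zip(model(x), model.quantiles))
# A point can then be flagged as anomalous when it falls outside the
# predicted [0.05, 0.95] band -- a paraphrase of one of the paper's three
# quantile-based anomaly definitions, assumed here for illustration.
```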