Skip-patching spatial-temporal discrepancy-based anomaly detection on multivariate time series

Published: 01 Jan 2024, Last Modified: 13 Nov 2024 · Neurocomputing 2024 · CC BY-SA 4.0
Abstract: Anomaly detection in the Industrial Internet of Things (IIoT) is a challenging task that relies heavily on efficient learning of multivariate time series representations. We introduce Skip-patching and Spatial–Temporal discrepancy mechanisms to improve the efficiency of detecting anomalies. Traditional feature extraction is hindered by redundant information in limited datasets; in particular, features generated from stable operational processes tend to yield low-quality representations. To address this challenge, we propose the Skip-Patching mechanism. This approach selectively extracts features from partial data patches, prompting the model to learn more meaningful knowledge through self-supervised learning. It also effectively doubles the training sample size by creating independent sub-groups of patches. Despite the complex spatial and temporal relationships in IIoT systems, existing methods mainly extract features from a single domain, either temporal or spatial (sensor-wise), or simply cascade the two feature types, one after the other, which limits anomaly detection capabilities. To address this, we introduce the Spatial–Temporal Association Discrepancy component, which leverages discrepancies between spatial and temporal features to enhance latent representation learning. Our Skip-Patching Spatial–Temporal Anomaly Detection (SSAD) framework combines these two components to provide a more diverse and comprehensive learning process. Tested on four multivariate time series anomaly detection benchmarks, SSAD demonstrates superior performance, confirming the efficacy of combining Skip-patching and Spatial–Temporal features to enhance anomaly detection in IIoT systems.
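To make the patch-grouping idea concrete, the minimal sketch below shows one plausible reading of Skip-Patching: a multivariate series is cut into non-overlapping patches, which are then interleaved into two independent sub-groups, doubling the number of training samples drawn from a single window. The function name `skip_patch`, the even/odd grouping rule, and the shapes are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def skip_patch(x, patch_len):
    """Split a multivariate time series into non-overlapping patches and
    form two interleaved sub-groups (an illustrative sketch only).

    x: array of shape (T, C) -- T time steps, C sensors/channels.
    Returns two patch groups, each of shape (n_patches // 2, patch_len, C).
    """
    T, C = x.shape
    n_patches = T // patch_len
    patches = x[: n_patches * patch_len].reshape(n_patches, patch_len, C)
    # Interleave patches into two independent sub-groups (even / odd indices),
    # so one window yields two training samples instead of one.
    group_a = patches[0::2]
    group_b = patches[1::2]
    return group_a, group_b

# Example: 8 sensors, 512 time steps, patch length 16
x = np.random.randn(512, 8)
g_a, g_b = skip_patch(x, patch_len=16)
print(g_a.shape, g_b.shape)  # (16, 16, 8) (16, 16, 8)
```

Each sub-group would then be encoded separately under the self-supervised objective described in the abstract; how the spatial and temporal branches consume these patches is specified in the full paper, not here.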