Keywords: Multivariate Time Series, Self-supervised Learning, Contrastive Learning, Representation Learning
TL;DR: We propose PLanTS, a periodicity-aware self-supervised framework that models latent states and their transitions, achieving state-of-the-art performance across diverse multivariate time series tasks.
Abstract: Multivariate time series (MTS) data are ubiquitous in domains such as healthcare, climate science, and industrial monitoring, but their high dimensionality, scarce labels, and non-stationary nature pose significant challenges for conventional machine learning methods. While recent self-supervised learning (SSL) approaches mitigate label scarcity through data augmentations or time-point-based contrastive strategies, they overlook the intrinsic periodic structure of MTS and fail to capture the dynamic evolution of latent states. We propose PLanTS, a periodicity-aware self-supervised learning framework that explicitly models irregular latent states and their transitions. We first design a periodicity-aware multi-granularity patching mechanism and a generalized contrastive loss to preserve both instance-level and state-level similarities across multiple temporal resolutions. To further capture temporal dynamics, we design a next-transition prediction pretext task that encourages representations to encode predictive information about future state evolution. We evaluate PLanTS on a wide range of downstream tasks, including classification, forecasting, trajectory tracking, and anomaly detection. PLanTS consistently improves representation quality over existing SSL methods and demonstrates superior computational efficiency compared to baselines.
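The abstract describes a periodicity-aware multi-granularity patching mechanism. The paper's actual design is not reproduced here, but the core idea can be sketched: estimate a series' dominant period from its frequency spectrum, then segment the series into non-overlapping patches whose lengths are multiples of that period. Everything below (function names, the choice of FFT-based period detection, the granularity levels) is a hypothetical illustration, not the authors' implementation.

```python
import numpy as np

def dominant_period(x, min_period=2):
    """Estimate the dominant period of a 1-D series from its FFT amplitude spectrum.
    (Illustrative heuristic; the paper's actual period-detection method may differ.)"""
    spec = np.abs(np.fft.rfft(x - x.mean()))
    freqs = np.fft.rfftfreq(len(x))
    spec[0] = 0.0  # ignore the DC component
    k = int(np.argmax(spec))
    if freqs[k] == 0:
        return min_period
    return max(min_period, int(round(1.0 / freqs[k])))

def multi_granularity_patches(x, levels=(1, 2)):
    """Segment a series into non-overlapping patches sized by multiples of its
    dominant period, one array of patches per granularity level."""
    p = dominant_period(x)
    patches = {}
    for m in levels:
        size = m * p
        n = len(x) // size
        patches[m] = x[: n * size].reshape(n, size)
    return patches

# Example: a noisy sine wave with period 20
rng = np.random.default_rng(0)
t = np.arange(200)
x = np.sin(2 * np.pi * t / 20) + 0.05 * rng.standard_normal(200)
patches = multi_granularity_patches(x)
# patches[1] holds period-length patches, patches[2] holds double-period patches
```

Aligning patch boundaries to the detected period means each patch spans a comparable phase of the underlying cycle, which is what makes state-level comparisons across patches meaningful.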
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 14544