Self-Supervised Time Series Representation Learning with Temporal-Instance Similarity Distillation

26 May 2022 (modified: 04 May 2025) · ICML 2022 Pre-training Workshop
Keywords: self-supervised learning, time series, classification, anomaly detection, forecasting, similarity distillation, teacher-student architecture
Abstract: We propose a self-supervised method for pre-training universal time series representations, in which contrastive representations are learned via similarity distillation along the temporal and instance dimensions. We analyze the effectiveness of each dimension and evaluate our pre-trained representations on three downstream tasks: time series classification, anomaly detection, and forecasting.
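The abstract does not spell out the distillation objective, but a common formulation of teacher-student similarity distillation is to have the student match the teacher's softmax-normalized similarity distributions, computed once across time steps within each series (temporal dimension) and once across series at each time step (instance dimension). The sketch below is an illustrative NumPy implementation under those assumptions; the function names, temperature, and exact loss form are not taken from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def similarity_distillation_loss(teacher_z, student_z, tau=0.1):
    """Cross-entropy between teacher and student similarity distributions.

    teacher_z, student_z: arrays of shape (batch, time, dim) holding
    L2-normalized representations from the teacher and student encoders.
    Returns the sum of a temporal term (pairwise similarities across time
    steps within each series) and an instance term (pairwise similarities
    across series at each time step). Illustrative only; not the paper's
    exact objective.
    """
    def distill(t, s):
        # Pairwise similarity logits over the second-to-last axis,
        # softmax-normalized and compared with cross-entropy.
        p_t = softmax(t @ np.swapaxes(t, -1, -2) / tau)            # teacher targets
        log_p_s = np.log(softmax(s @ np.swapaxes(s, -1, -2) / tau) + 1e-12)
        return -(p_t * log_p_s).sum(-1).mean()

    temporal = distill(teacher_z, student_z)            # (B, T, T) similarities
    instance = distill(np.swapaxes(teacher_z, 0, 1),    # (T, B, B) similarities
                       np.swapaxes(student_z, 0, 1))
    return temporal + instance
```

Because cross-entropy is minimized when the two distributions coincide, the loss pushes the student's temporal and instance similarity structure toward the teacher's; real implementations typically also mask the diagonal self-similarities, which this sketch omits for brevity.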
TL;DR: We propose a self-supervised method for pre-training universal time series representations using similarity distillation.
Community Implementations: [8 code implementations](https://www.catalyzex.com/paper/self-supervised-time-series-representation/code)