Iterative Bilinear Temporal-Spectral Fusion for Unsupervised Representation Learning in Time Series

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission
Keywords: Time Series, Unsupervised Learning, Representation Learning
Abstract: Unsupervised representation learning for multivariate time series has practical significance, but it is also challenging because of complex dynamics and sparse annotations. Existing works mainly adopt the contrastive learning framework and employ data augmentation techniques to sample positives and negatives for contrastive training. However, their representation learning frameworks have two drawbacks. First, we revisit the augmentation methods of existing works and note that they mostly use segment-level augmentation derived from time slicing, which can introduce sampling bias and incorrect optimization with false negatives due to the loss of global context. Second, none of them incorporates spectral information or temporal-spectral relations into the feature representation. To address these problems, we propose a novel framework, Bilinear Temporal-Spectral Fusion (BTSF). In contrast to segment-level augmentation, we use instance-level augmentation, simply applying dropout to the entire time series, to better preserve global context and capture long-term dependencies. We further devise an iterative bilinear temporal-spectral fusion module that explicitly encodes the affinities of abundant time-frequency pairs and iteratively refines time series representations through cross-domain interactions with Spectrum-to-Time (S2T) and Time-to-Spectrum (T2S) aggregation modules. Finally, we assess alignment and uniformity to demonstrate the effectiveness of the bilinear feature representations produced by BTSF. Extensive experiments are conducted on three major practical tasks for time series: classification, forecasting, and anomaly detection; to our knowledge, this is the first work evaluated on all three tasks. Results show that BTSF outperforms state-of-the-art methods by a large margin across downstream tasks. Code will be released.
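To make the two key ideas in the abstract concrete, below is a minimal illustrative sketch (not the authors' released code) of instance-level dropout augmentation and a single bilinear temporal-spectral fusion step. The encoder architectures, feature dimensions, pooling, and projection are assumptions for illustration, and the iterative S2T/T2S refinement is omitted.

```python
# Minimal sketch of BTSF's two core ideas, under assumed architectural choices.
import torch
import torch.nn as nn


class BTSFSketch(nn.Module):
    def __init__(self, in_channels: int, feat_dim: int = 64, dropout_p: float = 0.1):
        super().__init__()
        # Instance-level augmentation: dropout over the entire series (no time slicing).
        self.augment = nn.Dropout(p=dropout_p)
        # Separate temporal and spectral encoders (simple 1D convs as stand-ins).
        self.temporal_enc = nn.Conv1d(in_channels, feat_dim, kernel_size=7, padding=3)
        self.spectral_enc = nn.Conv1d(in_channels, feat_dim, kernel_size=7, padding=3)
        self.proj = nn.Linear(feat_dim * feat_dim, feat_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time)
        x_aug = self.augment(x)                              # instance-level view
        t_feat = self.temporal_enc(x_aug).mean(dim=-1)       # (batch, feat_dim)
        spec = torch.fft.rfft(x_aug, dim=-1).abs()           # amplitude spectrum
        s_feat = self.spectral_enc(spec).mean(dim=-1)        # (batch, feat_dim)
        # Bilinear fusion: outer product encodes pairwise time-frequency affinities.
        fused = torch.einsum('bi,bj->bij', t_feat, s_feat)   # (batch, feat_dim, feat_dim)
        return self.proj(fused.flatten(1))                   # fused representation
```

For example, BTSFSketch(in_channels=9)(torch.randn(8, 9, 128)) returns an (8, 64) fused embedding; in the full method, such embeddings would be refined iteratively and trained with a contrastive objective over instance-level views.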
One-sentence Summary: A general and novel unsupervised representation learning framework for time series.
Supplementary Material: zip