Time Series Representation Models for Multivariate Time Series Forecasting and Imputation

27 Sept 2024 (modified: 05 Feb 2025) · Submitted to ICLR 2025 · CC BY 4.0
Keywords: time series, attention, forecasting, imputation, multivariate
TL;DR: A multilayered representation learning architecture for multivariate time series forecasting and imputation.
Abstract: We introduce a multilayered representation learning architecture called Time Series Representation Model (TSRM) for multivariate time series forecasting and imputation. The architecture is structured around hierarchically ordered encoding layers, each dedicated to an independent representation learning task. Each encoding layer contains a representation layer designed to capture diverse temporal patterns and an aggregation layer responsible for combining the learned representations. The architecture is fundamentally based on a Transformer encoder-like configuration, with self-attention mechanisms at its core. On most of the seven established benchmark datasets in our empirical evaluation, the TSRM architecture outperforms state-of-the-art approaches for both forecasting and imputation tasks while requiring significantly fewer learnable parameters. The source code is available at https://anonymous.4open.science/r/TSRM-D7BE.
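The abstract describes a stack of encoding layers, each pairing a self-attention-based representation layer with an aggregation layer that combines the learned representations. The toy sketch below illustrates that layered structure only; it is not the authors' implementation, and all function names (`self_attention`, `encoding_layer`, `tsrm_encoder`) and the residual-sum aggregation are illustrative assumptions. It uses a single attention head with no learned projections, in plain Python for self-containment.

```python
import math

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def softmax(row):
    """Numerically stable softmax over one list of scores."""
    m = max(row)
    exps = [math.exp(v - m) for v in row]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(X):
    """Scaled dot-product self-attention over time steps.

    X is a T x d series (T time steps, d variables). For illustration,
    queries, keys, and values are all X itself (no learned projections).
    """
    d = len(X[0])
    Xt = [list(col) for col in zip(*X)]           # transpose: d x T
    scores = matmul(X, Xt)                        # T x T similarity scores
    scale = math.sqrt(d)
    weights = [softmax([s / scale for s in row]) for row in scores]
    return matmul(weights, X)                     # T x d attended output

def encoding_layer(X):
    """One encoding layer: representation step, then aggregation step.

    Aggregation is sketched here as a residual sum; the actual
    aggregation in the paper may differ.
    """
    rep = self_attention(X)                       # representation layer
    return [[x + r for x, r in zip(xr, rr)]       # aggregation layer
            for xr, rr in zip(X, rep)]

def tsrm_encoder(X, num_layers=3):
    """Hierarchically ordered encoding layers applied in sequence."""
    for _ in range(num_layers):
        X = encoding_layer(X)
    return X

# Example: a series with T=4 time steps and d=2 variables.
series = [[0.1, 0.5], [0.3, 0.2], [0.9, 0.4], [0.0, 0.7]]
encoded = tsrm_encoder(series)
```

Each layer preserves the T x d shape, so layers can be stacked to any depth; the output representation could then feed a forecasting or imputation head.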
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 9697