Reimagining Time Series Foundation Models: Metadata and State-Space Model Perspectives

Published: 10 Oct 2024, Last Modified: 26 Nov 2024 · NeurIPS 2024 TSALM Workshop · CC BY 4.0
Keywords: time series foundation model; state-space model; transformer; time series modeling
TL;DR: We compare state-space models with transformer models for time series forecasting, and explore language metadata and timestamps as side channels for time series foundation models.
Abstract: The success of foundation models in natural language processing has sparked a growing interest in developing analogous models for time series (TS) analysis. These time series foundation models (TSFM), pre-trained on vast amounts of TS data, demonstrate zero-shot and few-shot inference capabilities on unseen datasets. However, the intrinsic heterogeneity of TS data presents unique challenges: accurate inference often necessitates a deep understanding of the underlying data-generating process and the sensing apparatus, which cannot be readily inferred from the raw data alone. Furthermore, recent advances in state-space models raise the question of whether they may offer advantages over transformer-based architectures for TS analysis. This paper investigates these questions in two key areas: (a) a fair comparison of methods for integrating metadata into TSFMs and (b) the comparative effectiveness of state-space models (SSM) versus transformer models for TS forecasting. Our results, based on experiments across 11 datasets, suggest advantages for SSM building blocks as well as for incorporating the notion of real-world timestamps. More specifically, on our curated in-domain and out-of-domain datasets, an SSM approach incorporating timestamps outperforms three existing TSFMs on forecasting tasks while using 6,000$\times$ fewer trainable parameters and 10$\times$ less training data. The paper aims to highlight the potential of SSM building blocks and to suggest general directions for future TSFM research.
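To make the idea of "incorporating the notion of real-world timestamps" into an SSM forecaster more concrete, here is a minimal hypothetical sketch (not the paper's code or architecture): calendar information derived from UNIX timestamps is encoded as sinusoidal features and concatenated with the observed values before being fed to a tiny diagonal linear state-space layer. All names, shapes, and design choices below (feature set, state size, forecast horizon) are illustrative assumptions.

```python
# Hypothetical sketch, NOT the paper's method: timestamps as a side channel
# for a tiny diagonal state-space forecaster.
import math
import torch
import torch.nn as nn


def timestamp_features(ts: torch.Tensor) -> torch.Tensor:
    """Encode UNIX timestamps (seconds) as sinusoidal calendar features:
    hour-of-day and day-of-week, each as a (sin, cos) pair."""
    hour = (ts % 86400) / 86400.0        # fraction of the day
    dow = ((ts // 86400) % 7) / 7.0      # fraction of the week
    two_pi = 2 * math.pi
    return torch.stack(
        [torch.sin(two_pi * hour), torch.cos(two_pi * hour),
         torch.sin(two_pi * dow), torch.cos(two_pi * dow)], dim=-1
    )


class DiagonalSSMForecaster(nn.Module):
    """Minimal diagonal linear SSM: x_t = a * x_{t-1} + B u_t, y = C x_T.
    The input u_t is the observed value concatenated with timestamp features."""

    def __init__(self, d_in: int = 5, d_state: int = 32, horizon: int = 24):
        super().__init__()
        self.log_a = nn.Parameter(torch.randn(d_state) * 0.1 - 1.0)
        self.B = nn.Linear(d_in, d_state, bias=False)
        self.C = nn.Linear(d_state, horizon)

    def forward(self, values: torch.Tensor, ts: torch.Tensor) -> torch.Tensor:
        # values: (batch, length); ts: (batch, length) UNIX seconds
        u = torch.cat([values.unsqueeze(-1), timestamp_features(ts)], dim=-1)
        a = torch.exp(-nn.functional.softplus(self.log_a))  # stable decay in (0, 1)
        x = torch.zeros(values.shape[0], a.shape[0], device=values.device)
        for t in range(values.shape[1]):                     # plain recurrence, no parallel scan
            x = a * x + self.B(u[:, t])
        return self.C(x)                                      # forecast of length `horizon`


# Toy usage: forecast 24 steps from 96 hourly observations.
model = DiagonalSSMForecaster()
values = torch.randn(8, 96)
ts = torch.arange(96).repeat(8, 1) * 3600.0
print(model(values, ts).shape)  # torch.Size([8, 24])
```

The sketch only illustrates the general mechanism the abstract refers to: timestamps act as an auxiliary input channel alongside the raw series, rather than being inferred from the data, and the sequence model is a state-space recurrence rather than attention.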
Submission Number: 61