Keywords: Time Series, Time Series Forecasting, TSFM, Foundation Models, State Space Models, Gift-Eval, Chronos
Abstract: Foundation models (FMs) have transformed natural language processing (NLP), but their successes have not yet translated to the time series domain. Existing time series foundation models (TSFMs) struggle to generalize across varying context and target lengths, lack adaptability to different sampling rates, and are computationally inefficient. We introduce FlowState, a novel TSFM architecture that addresses these challenges through two key innovations: a state space model (SSM) based encoder and a functional basis decoder. This design enables continuous-time modeling, adaptation to different sampling rates, and flexible forecasting horizons without retraining, paving the way for a "BERT moment" for TSFMs.
We further propose a parallel training strategy that enhances robustness and accelerates training. Despite being the smallest model, FlowState achieves state-of-the-art results on the GIFT-Eval and Chronos benchmarks, while demonstrating superior adaptability to unseen sampling rates.
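
The abstract describes a functional basis decoder that allows forecasts of arbitrary length without retraining. The sketch below is a hypothetical illustration of that general idea, not the authors' implementation: it assumes the decoder predicts coefficients of continuous basis functions from a fixed-size encoder state and evaluates them at any set of future time points. All names (FunctionalBasisDecoder, coeff_head, n_basis) and the choice of a sine basis are assumptions made for illustration.

```python
# Hypothetical sketch (not the paper's code): a decoder that maps a fixed-size
# encoder state to forecasts of arbitrary horizon length by predicting
# coefficients of continuous basis functions.
import torch
import torch.nn as nn


class FunctionalBasisDecoder(nn.Module):
    def __init__(self, d_model: int, n_basis: int = 16):
        super().__init__()
        self.n_basis = n_basis
        # Map the encoder state to one coefficient per basis function (assumed design).
        self.coeff_head = nn.Linear(d_model, n_basis)

    def basis(self, t: torch.Tensor) -> torch.Tensor:
        # Sine basis evaluated at continuous times t in [0, 1];
        # the actual model may use a different basis.
        k = torch.arange(1, self.n_basis + 1, device=t.device, dtype=t.dtype)
        return torch.sin(torch.pi * t[:, None] * k[None, :])  # (horizon, n_basis)

    def forward(self, state: torch.Tensor, horizon: int) -> torch.Tensor:
        # state: (batch, d_model) summary, e.g. produced by an SSM encoder.
        coeffs = self.coeff_head(state)                        # (batch, n_basis)
        t = torch.linspace(0.0, 1.0, horizon, device=state.device)
        return coeffs @ self.basis(t).T                        # (batch, horizon)


# The same decoder then serves any forecast length without retraining:
decoder = FunctionalBasisDecoder(d_model=64)
state = torch.randn(8, 64)                 # stand-in for encoder output
print(decoder(state, horizon=24).shape)    # torch.Size([8, 24])
print(decoder(state, horizon=96).shape)    # torch.Size([8, 96])
```

Because the basis is defined over continuous time rather than discrete output positions, evaluating it on a finer or coarser grid is one plausible way to accommodate different sampling rates as well.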
Submission Number: 33