Mamba4Cast: Efficient Zero-Shot Time Series Forecasting with State Space Models

Published: 10 Oct 2024, Last Modified: 11 Nov 2024
NeurIPS 2024 TSALM Workshop Spotlight
License: CC BY 4.0
Keywords: Zero-shot Forecasting, Foundation Models, State Space Models, Prior-Data Fitted Networks, Pre-trained Models, Linear RNNs, Synthetic Data, Time Series Forecasting, Fast Inference
TL;DR: Mamba4Cast is a Mamba-based zero-shot foundation model for time series forecasting that is trained entirely on synthetic data, performing competitively against other state-of-the-art foundation models while maintaining fast inference speeds.
Abstract: This paper introduces Mamba4Cast, a zero-shot foundation model for time series forecasting. Based on the Mamba architecture and inspired by Prior-data Fitted Networks (PFNs), Mamba4Cast generalizes robustly across diverse time series tasks without the need for dataset-specific fine-tuning. Mamba4Cast's key innovation lies in its ability to achieve strong zero-shot performance on real-world datasets while having much lower inference times than time series foundation models based on the transformer architecture. Trained solely on synthetic data, the model generates forecasts for entire horizons in a single pass, outpacing traditional auto-regressive approaches. Our experiments show that Mamba4Cast performs competitively against other state-of-the-art foundation models across various datasets while scaling significantly better with the prediction length. The source code can be accessed at https://anonymous.4open.science/r/mamba4cast
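To make the single-pass claim concrete, here is a minimal illustrative sketch (not Mamba4Cast's actual architecture; the scalar parameters `A`, `B`, `C` and the function name are assumptions) of a linear state-space recurrence that encodes the context once and then reads out the entire forecast horizon from the final state, with no feedback of predicted values as in an auto-regressive decoder:

```python
import numpy as np

def single_pass_forecast(context, horizon, A=0.9, B=1.0, C=1.0):
    """Toy scalar linear state-space model: scan over the context,
    then decode the full horizon from the final state in one shot.
    Illustrative only -- not the paper's model."""
    h = 0.0
    for x in context:          # encode context with a linear RNN scan
        h = A * h + B * x
    # Decode every horizon step from the final state at once,
    # instead of feeding each prediction back in auto-regressively.
    return np.array([C * (A ** (k + 1)) * h for k in range(horizon)])

context = np.array([1.0, 0.5, 0.25])
forecast = single_pass_forecast(context, horizon=4)
print(forecast.shape)  # (4,)
```

Because the whole horizon is produced from one forward pass over the context, the cost of lengthening the prediction horizon is a cheap readout rather than repeated full decoder calls, which is the scaling advantage the abstract refers to.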
Submission Number: 83