LLM-Integrated Bayesian State Space Models for Multimodal Time-Series Forecasting

Published: 23 Sept 2025, Last Modified: 09 Oct 2025 · BERT2S · CC BY 4.0
Keywords: Multimodal Time-series Forecasting, Large Language Models, Bayesian State Space Models
TL;DR: We integrate LLMs with a Bayesian state space model to jointly perform numeric and textual forecasting.
Abstract: Forecasting in the real world often requires combining structured time-series data with unstructured textual information, yet most existing methods treat these modalities in isolation. We address this gap with the LLM-integrated Bayesian State Space Model (LBS), a probabilistic framework for multimodal temporal forecasting. At a high level, LBS consists of two components: (1) a state space model (SSM) backbone that captures the temporal dynamics of latent states from which both numerical and textual observations are generated, and (2) a pretrained large language model (LLM) adapted to encode textual inputs for posterior state estimation and to decode textual forecasts consistent with the latent trajectory. This design enables flexible lookback and forecast windows, principled uncertainty quantification, and improved temporal generalization, owing to the inductive bias of SSMs toward modeling dynamical systems. Experiments on the TimeText Corpus benchmark show that LBS improves on the previous state of the art by 13.20% while producing human-readable textual summaries. To our knowledge, this is the first work to unify LLMs and SSMs for joint numerical and textual prediction, offering a novel foundation for multimodal temporal reasoning.
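The abstract does not spell out the model equations, so the following is only a minimal sketch of the two-component design it describes, under strong simplifying assumptions: a linear-Gaussian SSM with Kalman-filter inference, and the LLM text encoding fused as a pseudo-observation of the latent state. All names here (`encode_text`, `W_txt`, `lbs_step`) are hypothetical stand-ins, not the paper's actual architecture or fusion mechanism.

```python
# Sketch of an LBS-style filtering step (assumed linear-Gaussian SSM).
import numpy as np

rng = np.random.default_rng(0)

# Latent dynamics: z_t = A z_{t-1} + w,  w ~ N(0, Q)
# Numeric emission: y_t = C z_t + v,     v ~ N(0, R)
d_z, d_y, d_txt = 4, 2, 8
A = 0.9 * np.eye(d_z)
Q = 0.01 * np.eye(d_z)
C = rng.normal(size=(d_y, d_z))
R = 0.1 * np.eye(d_y)

# Assumed text channel: an LLM embedding projected into latent space and
# treated as a second, noisier observation of z_t.
W_txt = rng.normal(size=(d_z, d_txt)) / np.sqrt(d_txt)
R_txt = 0.5 * np.eye(d_z)

def encode_text(text: str) -> np.ndarray:
    """Placeholder for the adapted LLM encoder (fixed-size embedding)."""
    seed = abs(hash(text)) % (2**32)
    return np.random.default_rng(seed).normal(size=d_txt)

def kalman_update(mu, Sigma, H, obs, R_obs):
    """Standard Kalman measurement update for obs = H z + noise."""
    S = H @ Sigma @ H.T + R_obs
    K = Sigma @ H.T @ np.linalg.solve(S, np.eye(len(obs)))
    mu_new = mu + K @ (obs - H @ mu)
    Sigma_new = (np.eye(len(mu)) - K @ H) @ Sigma
    return mu_new, Sigma_new

def lbs_step(mu, Sigma, y_t, text_t):
    # Predict the latent state forward through the SSM dynamics.
    mu_pred = A @ mu
    Sigma_pred = A @ Sigma @ A.T + Q
    # Condition on the numeric observation.
    mu_post, Sigma_post = kalman_update(mu_pred, Sigma_pred, C, y_t, R)
    # Condition on the text as a projected pseudo-observation of z_t.
    z_txt = W_txt @ encode_text(text_t)
    return kalman_update(mu_post, Sigma_post, np.eye(d_z), z_txt, R_txt)

mu, Sigma = np.zeros(d_z), np.eye(d_z)
mu, Sigma = lbs_step(mu, Sigma, y_t=np.array([1.2, -0.3]),
                     text_t="Demand spiked after the product launch.")
print("posterior mean:", mu.round(3))
```

The posterior covariance `Sigma` carries the abstract's "principled uncertainty quantification"; decoding textual forecasts from the latent trajectory would require the adapted LLM decoder, which this sketch omits.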
Submission Number: 22