Enhance Time Series Modeling by Integrating LLM

Published: 10 Oct 2024 · Last Modified: 26 Nov 2024 · NeurIPS 2024 TSALM Workshop · License: CC BY 4.0
Keywords: LLM; Time Series
TL;DR: We propose LLM-TS Integrator, a framework that effectively integrates the capabilities of LLMs with traditional TS modeling.
Abstract: Time series~(TS) modeling is critical in dynamic systems such as weather prediction and anomaly detection. Recent work leverages Large Language Models (LLMs) for TS modeling due to their strong pattern recognition abilities. However, these approaches often treat the LLM as the predictive backbone, neglecting the mathematical structure that traditional TS models capture, such as periodicity. Conversely, discarding LLMs entirely forfeits their pattern recognition strengths. To bridge this gap, we propose \textit{LLM-TS Integrator}, a framework that integrates LLM capabilities with traditional TS modeling. At its core is the \textit{mutual information} module, in which a traditional TS model is enhanced with LLM-derived insights: predictive performance improves by maximizing the mutual information between TS representations and their LLM-generated textual counterparts. Because samples vary in importance for the prediction objective and for mutual information maximization, we further introduce the \textit{sample reweighting} module, which assigns each sample two weights, one for the prediction loss and one for the mutual information loss, both dynamically optimized through bi-level optimization. Our method achieves state-of-the-art or comparable performance across five key TS tasks: short-term forecasting, long-term forecasting, imputation, classification, and anomaly detection. Our code is \href{https://anonymous.4open.science/r/llm_ts_anonymous-F07D/README.MD}{available}.
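To make the two objectives in the abstract concrete, here is a minimal NumPy sketch of (a) an InfoNCE-style contrastive term, a standard lower-bound surrogate for the mutual information between paired TS and text representations, and (b) a combined loss with dual per-sample weights. All function and variable names are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def info_nce_term(ts_repr, text_repr, temperature=0.1):
    """InfoNCE-style contrastive term: a common surrogate for maximizing
    mutual information between paired representations. Rows of the two
    matrices are assumed to be matched (row i of ts_repr pairs with
    row i of text_repr)."""
    # Normalize rows so the score is cosine similarity.
    ts = ts_repr / np.linalg.norm(ts_repr, axis=1, keepdims=True)
    tx = text_repr / np.linalg.norm(text_repr, axis=1, keepdims=True)
    logits = ts @ tx.T / temperature  # (B, B) similarity matrix
    # Log-softmax over candidates; diagonal entries are the positive pairs.
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return np.diag(log_probs).mean()  # higher = stronger alignment

def combined_loss(pred_loss, mi_term, w_pred, w_mi):
    """Dual per-sample weights, as in the sample reweighting module:
    one weight vector for the prediction loss, another for the
    (negated, since we maximize it) mutual-information term."""
    return float((w_pred * pred_loss).mean() - (w_mi * mi_term).mean())

# Toy example: text representations built as noisy copies of the TS ones,
# so the paired (diagonal) similarities should dominate.
rng = np.random.default_rng(0)
B, d = 4, 8
ts_repr = rng.normal(size=(B, d))
text_repr = ts_repr + 0.1 * rng.normal(size=(B, d))
mi = info_nce_term(ts_repr, text_repr)
```

In the full method the dual weights `w_pred` and `w_mi` are not fixed; the abstract states they are optimized through bi-level optimization, which this sketch omits.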
Submission Number: 6