Keywords: Large Language Models; Temporal Knowledge Graph; Time Series Forecasting
Abstract: Large Language Models (LLMs) based on the Transformer architecture have shown advantages across various domains owing to their powerful representation learning and context understanding capabilities. Recently, researchers have begun to explore their application to time series forecasting. Although existing methods can embed time series into LLMs across modalities, self-attention in LLMs is essentially “token-to-token”: position encoding reflects only the sequential relationships between tokens, so the model cannot capture the temporal dependencies and correlations between features, limiting forecasting accuracy. We therefore propose the Temporal Knowledge Graph with LLM (TKG-LLM), which innovatively designs a TKG to capture temporal structural information. We first build the TKG with temporal edges that capture dependencies within the time series and feature edges that capture relationships between features. Next, we apply a Graph Convolutional Network (GCN) to encode the graph, generating node embeddings rich in temporal structural information. Finally, we fuse the time series embeddings with the graph node embeddings to enhance their representational capability, and we use the enhanced embeddings for dynamic prompt selection to improve forecasting performance. Additionally, to better capture the multi-scale characteristics of time series and thereby improve forecasting accuracy, we decompose the time series into three components, trend, season, and residual, via Wavelet Decomposition (Daubechies 4) before feeding them into TKG-LLM, enabling it to capture multi-scale temporal features and sudden changes accurately. Visualizations of our experimental results show that Wavelet Decomposition performs particularly well on non-stationary time series. Our empirical experiments on multiple benchmark datasets demonstrate that the proposed TKG-LLM achieves superior forecasting performance compared to baselines.
Furthermore, our ablation results verify the effectiveness of using the Temporal Knowledge Graph for enhanced prompt learning.
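The decomposition step described in the abstract can be illustrated with a minimal, self-contained sketch. This is not the authors' implementation: it shows only a single level of a Daubechies-4 discrete wavelet transform in pure Python, splitting a signal into a smooth approximation (trend-like) part and a detail part that reacts to sudden changes; the multi-level trend/season/residual separation and the boundary-handling choice (periodic wrap here) are assumptions for illustration.

```python
import math

# Daubechies-4 analysis filters (standard db4 coefficients).
s3 = math.sqrt(3.0)
lo = [(1 + s3) / (4 * math.sqrt(2.0)),   # low-pass (smoothing) filter
      (3 + s3) / (4 * math.sqrt(2.0)),
      (3 - s3) / (4 * math.sqrt(2.0)),
      (1 - s3) / (4 * math.sqrt(2.0))]
hi = [lo[3], -lo[2], lo[1], -lo[0]]      # quadrature-mirror high-pass filter

def dwt_step(x):
    """One DWT level with periodic boundary handling.

    Returns (approx, detail): the low-frequency approximation
    (trend-like component) and the high-frequency detail that
    captures abrupt changes in the signal.
    """
    n = len(x)
    approx, detail = [], []
    for i in range(0, n, 2):             # downsample by 2
        approx.append(sum(lo[k] * x[(i + k) % n] for k in range(4)))
        detail.append(sum(hi[k] * x[(i + k) % n] for k in range(4)))
    return approx, detail
```

A sanity check on the filters: for a constant signal the detail coefficients vanish (the high-pass filter taps sum to zero), while the approximation is the constant scaled by √2, which is why repeated application isolates progressively coarser trend components. In practice a library such as PyWavelets would replace this hand-rolled loop.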
Supplementary Material: zip
Primary Area: learning on time series and dynamical systems
Submission Number: 4400