Transformer-PLM Enhanced Multimodal Time Series Forecasting via Decoupled Dual-Temporal Graph Adaptation

Published: 2026 · Last Modified: 27 Mar 2026 · IEEE Signal Process. Lett. 2026 · CC BY-SA 4.0
Abstract: With the proliferation of multimodal data in real-world applications, integrating time series with auxiliary modalities has become critical for accurate forecasting. Although Transformers and pre-trained language models (PLMs) have enabled initial explorations of multi-domain multimodal time series analysis, several pressing challenges remain. Specifically, coarse-grained alignment may hinder long-range semantic capture, while intra-modality distribution shifts introduce fluctuating noise. Inspired by GNNs' capability to model spatio-temporal dependencies and contextual interactions, we propose the Decoupled Dual Adaptive Temporal Graph (DDATG), a universal GNN plugin for Transformer-PLM based adaptive text-time series bimodal learning. Our framework: (1) reconstructs global temporal patterns from decoupled local residual terms in the temporal modality, enhancing local-global semantic discovery and diversifying attention mechanisms; and (2) explicitly constructs pointwise contextual connections and strengthens aggregation in the textual modality, facilitating inter-modal semantic alignment. Extensive experiments across Transformer variants and domain-specific datasets demonstrate the effectiveness of DDATG.
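The abstract does not spell out the plugin's internals, so the following is only a minimal sketch of one plausible reading of its two stated components: decoupling a series into a trend and local residuals and reconstructing global patterns via a graph over the residuals, and building pointwise contextual connections over PLM token embeddings for stronger aggregation. The class name DualTemporalGraphPlugin, the decouple helper, the moving-average trend estimate, and the attention-style dense adjacency are all assumptions for illustration, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F


class DualTemporalGraphPlugin(nn.Module):
    """Hypothetical DDATG-style plugin; all structural choices are assumed."""

    def __init__(self, d_model: int, kernel: int = 25):
        super().__init__()
        self.kernel = kernel
        self.res_proj = nn.Linear(1, d_model)      # lift scalar residuals to node features
        self.txt_proj = nn.Linear(d_model, d_model)

    def decouple(self, x: torch.Tensor):
        # x: (B, T) univariate series; a moving average stands in for the
        # trend so that x - trend yields the local residual term.
        pad = self.kernel // 2
        trend = F.avg_pool1d(
            F.pad(x.unsqueeze(1), (pad, pad), mode="replicate"),
            self.kernel, stride=1,
        ).squeeze(1)
        return trend, x - trend

    def forward(self, x: torch.Tensor, text_emb: torch.Tensor):
        # x: (B, T) raw series; text_emb: (B, L, d) PLM token embeddings.
        trend, resid = self.decouple(x)
        h = self.res_proj(resid.unsqueeze(-1))      # (B, T, d) residual nodes

        # Temporal graph: dense pairwise scores over residual nodes, then one
        # round of message passing to reconstruct global temporal patterns.
        adj = torch.softmax(h @ h.transpose(1, 2) / h.size(-1) ** 0.5, dim=-1)
        h_global = adj @ h                          # (B, T, d)

        # Textual graph: pointwise contextual connections between tokens,
        # followed by neighborhood aggregation to strengthen alignment.
        t = F.normalize(self.txt_proj(text_emb), dim=-1)
        txt_adj = torch.softmax(t @ t.transpose(1, 2), dim=-1)
        text_agg = txt_adj @ text_emb               # (B, L, d)

        return trend, h_global, text_agg


# Example: plug the module between a PLM encoder and a forecasting head.
plugin = DualTemporalGraphPlugin(d_model=64)
series = torch.randn(8, 96)                         # batch of 96-step series
tokens = torch.randn(8, 32, 64)                     # mock PLM token embeddings
trend, h_global, text_agg = plugin(series, tokens)

As a plugin, the module leaves the backbone untouched: h_global and text_agg can be fed back into the host Transformer's attention layers, which is consistent with the paper's claim of universality across Transformer variants.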