Keywords: Time-Series Forecasting, Contextual Enhancement, Foundation Models
Abstract: Time-series evolution is driven by intrinsic dynamics and heterogeneous contextual factors whose relevance can be spurious or unstable across settings, which limits correlation-driven designs that impose fixed structures on contextual inputs. We propose a causally grounded foundation model for context-aware forecasting that augments a generative pre-trained transformer with a context-aware attention module to adaptively integrate external signals. From an information-theoretic perspective, a query-modulation mechanism conditions temporal queries on global context summaries, reducing redundancy and sharpening focus, while a neural structural-equation component injects inductive bias toward genuinely influential features. To align attribution with behavior, counterfactual perturbations enforce consistency between structural importance and predictive responses; an entropy-based regularizer further encourages sparse, interpretable attributions. The overall design targets accuracy, robustness to distributional shifts, and explanatory clarity in the presence of noisy or weakly relevant context. Extensive experiments on diverse real-world benchmarks demonstrate consistent gains over strong baselines in point forecasting and calibration, alongside clearer and more stable explanations of context contributions.
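The query-modulation mechanism described in the abstract can be illustrated with a minimal NumPy sketch. This is a hypothetical reconstruction, not the authors' implementation: all names (`Wq`, `Wk`, `Wv`, `Wg`, the sigmoid gate, mean-pooled context summary) are assumptions chosen to show the general idea of conditioning temporal queries on a global context summary and penalizing high-entropy (diffuse) attributions.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
T, C, d = 8, 4, 16          # time steps, context features, model dim

X = rng.standard_normal((T, d))   # temporal token embeddings
E = rng.standard_normal((C, d))   # embedded contextual features

# Global context summary: here simply mean-pooled context embeddings.
g = E.mean(axis=0)

# Hypothetical projection matrices (learned parameters in the full model).
Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
Wg = rng.standard_normal((d, d)) / np.sqrt(d)

# Query modulation: a sigmoid gate derived from the context summary
# rescales each query dimension, conditioning temporal queries on context.
gate = 1.0 / (1.0 + np.exp(-(g @ Wg)))           # (d,)
Q = (X @ Wq) * gate                               # modulated queries, (T, d)
K, V = E @ Wk, E @ Wv                             # context keys/values, (C, d)

# Attention of each time step over the contextual features.
A = softmax(Q @ K.T / np.sqrt(d), axis=-1)        # (T, C), rows sum to 1
ctx_out = A @ V                                   # context-enhanced output

# Entropy-based regularizer: adding this term to the loss pushes the
# attention rows toward sparse, interpretable context attributions.
entropy_reg = -(A * np.log(A + 1e-9)).sum(axis=-1).mean()
```

In a trained model the attention matrix `A` would double as a per-time-step attribution over context features; the entropy term ranges from 0 (one-hot, fully sparse) to log(C) (uniform, uninformative).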
Primary Area: learning on time series and dynamical systems
Submission Number: 24086