Influence-Aware Forecasting: Breaking the Self-Stimulation Barrier in Time Series

18 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: Time Series Forecasting, Multimodal Forecasting
Abstract: The field of time series forecasting faces a critical performance plateau, where even billion-parameter foundation models struggle to outperform simple linear baselines. We argue this stagnation stems not from model architecture but from a universally adopted yet flawed 'self-stimulation' assumption, under which models predict the future solely from a series's own historical values, ignoring the external influences that drive real-world systems. Through a control-theoretic lens, we formally prove that this assumption imposes a hard mathematical barrier on forecasting accuracy. To break this barrier, we introduce Influence-Aware Time Series Forecasting (IATSF), a new paradigm that reframes the task from correlation-based inference to dynamic system modeling. To operationalize this paradigm, we make two foundational contributions. First, we introduce a leak-free, temporally synced benchmark—a critical resource for the community—that incorporates textual influences to capture the qualitative or uncertain dynamics missed by traditional variables. Second, we develop FIATS, a lightweight, principled model engineered to interpret these influences. Its novel channel-aware mechanisms allow it to adjust its sensitivity to both textual signals and historical data in a channel-specific manner. Our results demonstrate that explicitly modeling external influences is not merely an incremental improvement but the primary path forward for meaningful progress in time series forecasting.
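The abstract describes channel-aware mechanisms that blend textual signals with historical values at a per-channel sensitivity. As a minimal sketch of what such a mechanism could look like—this is purely illustrative, since the abstract does not specify FIATS's architecture, and all function names, shapes, and the gating scheme below are our assumptions—one learned gate per channel can interpolate between a history-only head and a shared text-embedding head:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_aware_forecast(history, text_emb, W_h, W_t, gate_logits):
    """Hypothetical channel-wise gating (not the actual FIATS design).

    history:     (C, L) past values for C channels over lookback L
    text_emb:    (D,)   embedding of the textual influence
    W_h:         (L, H) linear forecast head over history
    W_t:         (D, H) linear forecast head over the text embedding
    gate_logits: (C,)   one learned sensitivity gate per channel
    Returns a (C, H) forecast over horizon H.
    """
    h_head = history @ W_h                                   # (C, H) history-only forecast
    t_head = np.broadcast_to(text_emb @ W_t, h_head.shape)   # (C, H) shared text-driven forecast
    g = sigmoid(gate_logits)[:, None]                        # (C, 1) channel-specific weight
    return g * h_head + (1.0 - g) * t_head                   # per-channel convex blend

rng = np.random.default_rng(0)
C, L, H, D = 3, 24, 8, 16
y = channel_aware_forecast(
    rng.normal(size=(C, L)), rng.normal(size=D),
    rng.normal(size=(L, H)), rng.normal(size=(D, H)),
    np.array([-2.0, 0.0, 2.0]),  # channels weight text vs. history differently
)
print(y.shape)
```

A channel with a large negative gate logit leans almost entirely on the textual signal, while a large positive logit recovers a standard history-only linear forecaster, so the self-stimulation baseline is a special case of this blend.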
Primary Area: learning on time series and dynamical systems
Submission Number: 11179