Keywords: Time series forecasting, Unmixing, Mamba network, Shared mechanism.
Abstract: Recent time series forecasting methods increasingly combine model-driven formulations with data-driven inference, leveraging their complementary strengths in interpretability and pattern learning, and have achieved remarkable results in many practical tasks. However, most existing methods use the forecasting target only as a training supervision signal rather than incorporating it explicitly into the modeling process, so models cannot exploit future information to impose structural constraints or guide the inference path. As a result, the modeling structure and the inference path remain isolated, lack a clear optimization objective, and ultimately degenerate into an uncontrollable, uninterpretable static mapping. To address these challenges, we propose the Proximal Deep Unrolling Network (PDUNet), a coupled, closed-loop forecasting framework that starts from an explicit model, solves it with data-driven mechanisms, and progressively optimizes the inference path. Specifically, we formulate an optimizable forecasting equation that couples future variables with historical inputs, and adopt a proximal optimization algorithm to decouple and update them step by step. This optimization process is then unfolded into a dual-branch state space structure, in which the Temporal-SSD captures temporal evolution through causal modeling while the Channel-SSD employs a non-causal mechanism to model global interactions among variables, jointly enabling progressive inference and dynamic prediction. Experiments on eight public benchmark datasets show that PDUNet outperforms existing state-of-the-art models in long-term forecasting tasks.
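As background for the step-wise proximal decoupling the abstract describes, the sketch below shows the classical proximal-gradient (ISTA-style) iteration that deep unrolling networks turn into learnable layers. This is a generic illustration, not the paper's method: the forecasting equation, the Temporal-SSD/Channel-SSD branches, and any learned proximal operator are not reproduced; `A`, `b`, `lam`, and `soft_threshold` are assumed names for a toy sparse least-squares problem.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def unrolled_proximal_steps(A, b, lam=0.05, n_steps=200):
    """Run n_steps of proximal gradient descent on
        min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
    In a deep-unrolling network, each iteration of this loop would
    become one layer with learnable step size / proximal mapping."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2          # step size 1/L
    x = np.zeros(A.shape[1])
    for _ in range(n_steps):
        grad = A.T @ (A @ x - b)                    # gradient of smooth term
        x = soft_threshold(x - step * grad, step * lam)  # proximal update
    return x

# Toy usage: recover a sparse vector from noiseless measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true
x_hat = unrolled_proximal_steps(A, b)
```

Unfolding fixes the iteration count in advance and makes each step's parameters trainable end-to-end, which is the general mechanism the abstract's "step-wise decoupling and update" builds on.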
Supplementary Material: zip
Primary Area: learning on time series and dynamical systems
Submission Number: 6447