Decoupling Permutation-Invariant and Permutation-Sensitive Dependencies for Time-Series Forecasting

16 Sept 2025 (modified: 27 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: deep learning, machine learning
Abstract: Real-world time series often exhibit both stable patterns and dynamic variations, corresponding to fixed structures and evolving dependencies, respectively. This disparity can introduce interference when modeled jointly. We find that unifying permutation-invariant and permutation-sensitive dependencies within a single framework tends to cause gradient conflicts, leading to the loss of critical information and degraded model performance. To address these challenges, we propose \textbf{Permutation Dependency Decoupling (PDD)}, a gradient-level framework that automatically separates permutation-invariant from permutation-sensitive dependencies, thereby eliminating gradient conflicts and retaining essential information. The proposed framework integrates two specialized modules. The \textbf{Permutation-Invariant Encoder (PIE)} captures permutation invariance through perspective switching over the input data, enabling fine-grained modeling via parameter-free routing among three specialized experts. The \textbf{Permutation-Sensitive Encoder (PSE)} shifts from the traditional history-to-future mapping paradigm to a correction-based paradigm grounded in the predicted sequence. By extending the receptive field to the joint history--prediction sequence, it enables global permutation-sensitive modeling. In addition, we introduce the \textbf{Temporal Order Sensitivity Test (TOST)}, a rigorous evaluation tool designed to distinguish genuine temporal dependency modeling from mere memorization. Extensive experiments on eight real-world datasets demonstrate that PDD achieves state-of-the-art forecasting accuracy, efficiency, and robustness, while serving as a non-intrusive solution that significantly enhances the predictive performance of mainstream models. Code is anonymously available at \url{https://anonymous.4open.science/r/PDD-BAC2}.
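The Temporal Order Sensitivity Test described above can be illustrated with a minimal sketch: evaluate a forecaster on its original inputs, then on inputs whose time steps are randomly shuffled, and measure the relative increase in error. Everything here (the `tost_score` helper, the toy data, and the two toy forecasters) is an illustrative assumption, not the paper's actual implementation; a genuinely order-sensitive model should degrade under shuffling, while a permutation-invariant one should not.

```python
import numpy as np

rng = np.random.default_rng(0)

def tost_score(model, x, y, n_perms=20, rng=rng):
    """Illustrative Temporal Order Sensitivity Test (hypothetical helper):
    compare forecast MSE on original inputs vs. inputs whose time steps
    are randomly permuted. A score near 0 means the model ignores
    temporal order; a large score means it relies on the ordering."""
    base = np.mean((model(x) - y) ** 2)
    shuffled_errs = []
    for _ in range(n_perms):
        perm = rng.permutation(x.shape[1])  # shuffle the time axis
        shuffled_errs.append(np.mean((model(x[:, perm]) - y) ** 2))
    return (np.mean(shuffled_errs) - base) / (base + 1e-12)

# Toy data: a noisy upward trend; the target is the next step of the trend.
t = np.arange(48, dtype=float)
x = t[None, :] + rng.normal(0, 0.1, (64, 48))   # 64 series, 48 steps
y = 48.0 + rng.normal(0, 0.1, (64, 1))          # one-step-ahead target

last_value = lambda h: h[:, -1:] + 1.0           # order-sensitive forecaster
mean_value = lambda h: h.mean(1, keepdims=True)  # permutation-invariant forecaster

print(tost_score(last_value, x, y))  # large: accuracy collapses when shuffled
print(tost_score(mean_value, x, y))  # exactly 0: the mean is shuffle-invariant
```

In this toy setup the mean-based forecaster scores exactly zero (the sample mean is unchanged by any permutation), while the last-value forecaster's error explodes once the final time step no longer carries the most recent observation, which is the kind of separation the test is designed to expose.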
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 6780