TL;DR: Morphing-Flow (MoF) stabilizes fat-tailed time series while preserving extremes. Using flow-based reshaping and test-time adaptation, it achieves superior accuracy across eight datasets.
Abstract: Temporal sequences, even after stationarization, often exhibit leptokurtic distributions with fat tails and persistent distribution shifts. These properties destabilize feature dynamics, amplify model variance, and hinder convergence in time series forecasting. To address this, we propose Morphing-Flow (MoF), a framework that combines a spline-based transform layer (Flow) with a test-time training method (Morph), adaptively normalizing non-stationary, fat-tailed distributions while preserving critical extreme features. MoF ensures that inputs remain within a network’s effective activation space—a structured, normal-like distribution—even under distributional drift. Experiments across eight datasets show that MoF achieves state-of-the-art performance: with a simple linear backbone, it matches leading models on datasets such as Electricity and ETTh2. When paired with a patch-based Mamba architecture, MoF outperforms its closest competitor by 6.3% on average and reduces forecasting errors on fat-tailed datasets such as Exchange by 21.7%. Moreover, MoF acts as a plug-and-play module, boosting the performance of existing models without architectural changes.
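The sketch below is a conceptual stand-in, not the authors' MoF implementation: it illustrates the two ingredients the abstract describes—a monotone, invertible reshaping of a fat-tailed window toward a normal-like distribution, and test-time adaptation by refitting that map on each incoming window. MoF itself learns a spline-based flow; here an empirical-CDF-to-inverse-normal map (a hypothetical, rank-based Gaussianization) plays that role purely for illustration.

```python
# Conceptual sketch only (assumed stand-in, not the MoF layer from the paper):
# a monotone, invertible map from a fat-tailed window to a normal-like scale,
# refit at test time for every new window.
import numpy as np
from scipy.stats import norm


def fit_monotone_gaussianizer(window: np.ndarray):
    """Fit a rank-based monotone map from the window's distribution to N(0, 1)."""
    xs = np.sort(window)
    n = len(xs)
    # Plotting-position quantiles stay strictly inside (0, 1), so extreme
    # observations map to finite values and remain the most extreme points.
    qs = (np.arange(1, n + 1) - 0.5) / n
    zs = norm.ppf(qs)

    def forward(x: np.ndarray) -> np.ndarray:
        # Piecewise-linear monotone interpolation through the fitted knots.
        # np.interp holds the end values outside the fitted range; a learnable
        # spline flow (as in MoF) would instead extrapolate smoothly.
        return np.interp(x, xs, zs)

    def inverse(z: np.ndarray) -> np.ndarray:
        return np.interp(z, zs, xs)

    return forward, inverse


# "Test-time adaptation" in this toy version: refit the map on each new window
# before forecasting, then map predictions back to the original scale.
rng = np.random.default_rng(0)
window = rng.standard_t(df=2, size=512)      # fat-tailed synthetic input
forward, inverse = fit_monotone_gaussianizer(window)
reshaped = forward(window)                    # approximately N(0, 1)
recovered = inverse(reshaped)                 # invertibility check
print(np.abs(recovered - window).max())
```

Because the map is monotone and invertible, the reshaped window keeps the ordering of extreme events while presenting the backbone with a normal-like input; refitting per window is the simplest analogue of the test-time adaptation the abstract attributes to Morph.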
Lay Summary: Time series data in the real world—like traffic or energy usage—often contains frequent extreme events that destabilize AI models. We developed a method called Morphing-Flow that reshapes such unpredictable data into a more balanced form while preserving important signals. This helps AI models train more reliably and make better long-term forecasts, especially in chaotic environments where surprises are the norm.
Primary Area: General Machine Learning->Sequential, Network, and Time Series Modeling
Keywords: time series forecasting, long-term prediction, model convergence, fat-tailed distributions
Submission Number: 3073