TimeGEN: A Cross-Domain and Generative Model for Time Series Forecasting

15 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: time series forecasting, transfer learning, generative models, deep learning
TL;DR: A generative model for time series forecasting that leverages transfer learning with cross-domain representations and multiscale decoding
Abstract: We propose TimeGEN, a lightweight, MLP-based generative deep learning architecture for transfer learning in time series forecasting. We use a variational encoder to capture high-level temporal representations across diverse series and domains. To further strengthen this generalization, we combine reconstruction and forecasting losses, which shapes the latent space to retain local detail while capturing global predictive dependencies. In addition, temporal normalization ensures robustness to varying input scales and noise. To capture multiscale dynamics, we integrate a modular decoder that combines neural basis expansion with multi-rate interpolation, balancing long-range trends with high-frequency variations. Extensive empirical results across ten public datasets demonstrate that TimeGEN consistently outperforms state-of-the-art (SOTA) methods in zero-shot and cross-domain settings. In cross-domain settings, it reduces forecasting error by more than 8% and up to 38%, while achieving a 2-30x speedup in training time compared to SOTA MLP and Transformer methods.
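As a rough illustration of two ingredients the abstract names, the sketch below shows per-series temporal normalization and a combined reconstruction-plus-forecasting objective with a KL term for the variational encoder. Function names, loss weights (`alpha`, `beta`), and the exact form of the normalization are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def temporal_normalize(x, eps=1e-5):
    """Normalize each series by its own mean and std over the time axis,
    giving robustness to varying input scales (hypothetical helper)."""
    mu = x.mean(axis=-1, keepdims=True)
    sigma = x.std(axis=-1, keepdims=True)
    return (x - mu) / (sigma + eps), mu, sigma

def combined_loss(x_recon, x_true, y_pred, y_true, mu_z, logvar_z,
                  alpha=0.5, beta=1e-3):
    """Reconstruction + forecasting MSE plus a KL regularizer on the
    variational latent; the weights alpha and beta are illustrative."""
    recon = np.mean((x_recon - x_true) ** 2)        # local detail
    forecast = np.mean((y_pred - y_true) ** 2)      # predictive dependencies
    kl = -0.5 * np.mean(1 + logvar_z - mu_z ** 2 - np.exp(logvar_z))
    return alpha * recon + (1 - alpha) * forecast + beta * kl
```

The KL term vanishes when the encoder posterior matches a standard normal (`mu_z = 0`, `logvar_z = 0`), so the objective then reduces to the weighted sum of the two MSE terms.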
Primary Area: learning on time series and dynamical systems
Submission Number: 5804