RAYQUAZA : Input-Conditioned Radial Basis Decomposition for Efficient Univariate Time-Series Forecasting

ICLR 2026 Conference Submission21895 Authors

19 Sept 2025 (modified: 08 Oct 2025) · CC BY 4.0
Keywords: time-series forecasting, deep learning, parameter efficiency
TL;DR: We built RAYQUAZA, a tiny time-series model that beats huge, state-of-the-art models by learning to adapt its core building blocks to each specific forecast.
Abstract: Time-series forecasting presents a persistent trade-off between simple, scalable linear models that struggle with complex dynamics and large neural architectures that offer high accuracy at steep computational cost. We introduce RAYQUAZA, a parameter-efficient architecture designed to fill this gap. RAYQUAZA learns an adaptive basis decomposition of the signal into three complementary components: a smooth trend extractor, a residual correction branch, and a novel input-conditioned radial basis function (iRBF) layer. The iRBF module dynamically learns a compact set of localized Gaussian atoms for each input sequence, enabling it to model transient, non-stationary patterns such as structural breaks and spikes that challenge simpler methods. With fewer than 0.12M parameters, RAYQUAZA achieves state-of-the-art accuracy on large-scale public benchmarks and performs consistently well across a diverse range of forecasting domains. Crucially, it outperforms lightweight linear baselines in the majority of long-horizon forecasting scenarios while remaining two to three orders of magnitude smaller than transformer-based models. These results establish RAYQUAZA as a practical, interpretable, and efficient model, demonstrating that adaptive basis representations can deliver high accuracy without sacrificing efficiency.
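The abstract's central mechanism, Gaussian atoms whose parameters are predicted from each input window, can be illustrated with a minimal sketch. Everything below is a hypothetical reconstruction from the abstract alone, not the authors' implementation: the hypernetwork shape `(W, b)`, the `(center, log-width, amplitude)` parameterization, and all dimensions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def irbf_forward(x, W, b, horizon):
    """Forward pass of a hypothetical input-conditioned RBF (iRBF) layer.

    A linear hypernetwork (W, b) maps the input window x to per-atom
    (center, log-width, amplitude) triples; the forecast is the sum of
    the resulting Gaussian atoms evaluated on a normalized horizon grid.
    """
    K = W.shape[0] // 3
    params = (W @ x + b).reshape(K, 3)
    centers = 1.0 / (1.0 + np.exp(-params[:, 0]))   # squash centers into [0, 1]
    widths = np.exp(params[:, 1]) + 1e-3            # keep widths strictly positive
    amps = params[:, 2]                             # unconstrained amplitudes
    t = np.linspace(0.0, 1.0, horizon)              # normalized forecast grid
    # Each row of `atoms` is one Gaussian bump over the horizon.
    atoms = np.exp(-0.5 * ((t[None, :] - centers[:, None]) / widths[:, None]) ** 2)
    return amps @ atoms                             # shape: (horizon,)

# Toy usage: a 96-step lookback window, 24-step horizon, 8 Gaussian atoms.
lookback, horizon, K = 96, 24, 8
x = rng.standard_normal(lookback)
W = rng.standard_normal((3 * K, lookback)) * 0.01
b = np.zeros(3 * K)
yhat = irbf_forward(x, W, b, horizon)
print(yhat.shape)  # (24,)
```

Because the atom parameters depend on `x`, each input sequence gets its own localized basis, which is how the paper motivates modeling transient events like spikes and structural breaks; in the full model this forecast would be added to the trend and residual branches.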
Supplementary Material: zip
Primary Area: learning on time series and dynamical systems
Submission Number: 21895