Time-Gated Multi-Scale Flow Matching for Time-Series Imputation

ICLR 2026 Conference Submission 20132 Authors

19 Sept 2025 (modified: 08 Oct 2025) · CC BY 4.0
Keywords: Time-series imputation, Flow matching, ODE-based generative models, Transformers, Multi-scale modeling
Abstract: We address multivariate time-series imputation by learning the velocity field of a data-conditioned ordinary differential equation (ODE) via flow matching. Our method, Time-Gated Multi-Scale Flow Matching (TG-MSFM), conditions the flow on a structured endpoint comprising observed values, a per-time visibility mask, and short left/right context, processed by a time-aware Transformer whose self-attention is masked to aggregate only from observed timestamps. To reconcile global trends with local details along the trajectory, we introduce time-gated multi-scale velocity heads on a fixed 1D pyramid and blend them through a time-dependent gate; a mild anti-aliasing filter stabilizes the finest branch. At inference, we use a second-order Heun integrator with a per-step data-consistency projection that keeps observed coordinates exactly on the straight path from the initial noise to the endpoint, reducing boundary artifacts and drift. Training adopts gap-only supervision of the velocity on missing coordinates, with small optional regularizers for numerical stability. Across standard benchmarks, Time-Gated Multi-Scale Flow Matching attains competitive or improved MSE/MAE with favorable speed-quality trade-offs, and ablations isolate the contributions of the time-gated multi-scale heads, masked attention, and the data-consistent ODE integration.
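
The abstract describes two concrete mechanisms: gap-only flow-matching supervision of the velocity on missing coordinates, and Heun integration with a per-step data-consistency projection onto the straight noise-to-data path. The following is a minimal PyTorch sketch of those two pieces under stated assumptions, not the authors' implementation; `velocity_model` (a stand-in for the time-gated multi-scale, mask-aware Transformer), its call signature, and the tensor shapes are assumptions made for illustration.

```python
import torch


def gap_only_fm_loss(velocity_model, x1, mask, cond):
    """Flow-matching loss restricted to missing coordinates.
    x1: (B, T, D) target series; mask: (B, T, D), 1 = observed, 0 = missing."""
    x0 = torch.randn_like(x1)                            # initial noise
    t = torch.rand(x1.size(0), 1, 1, device=x1.device)   # path time in [0, 1]
    x_t = (1.0 - t) * x0 + t * x1                        # straight (linear) path
    u = x1 - x0                                          # target velocity of that path
    v = velocity_model(x_t, t, cond)                     # predicted velocity
    missing = 1.0 - mask
    return ((v - u) ** 2 * missing).sum() / missing.sum().clamp(min=1.0)


@torch.no_grad()
def heun_impute(velocity_model, x_obs, mask, cond, n_steps=20):
    """Integrate dx/dt = v(x, t) from noise (t=0) to an imputed series (t=1)
    with a second-order Heun step, projecting observed coordinates back onto
    the straight path from the initial noise to the observed endpoint."""
    x0 = torch.randn_like(x_obs)
    x = x0.clone()
    ts = torch.linspace(0.0, 1.0, n_steps + 1, device=x_obs.device)
    for i in range(n_steps):
        t, t_next = ts[i], ts[i + 1]
        dt = t_next - t
        t_b = torch.full((x.size(0), 1, 1), float(t), device=x.device)
        t_next_b = torch.full((x.size(0), 1, 1), float(t_next), device=x.device)
        v1 = velocity_model(x, t_b, cond)                # slope at current point
        x_euler = x + dt * v1                            # predictor (Euler) step
        v2 = velocity_model(x_euler, t_next_b, cond)     # slope at predicted point
        x = x + 0.5 * dt * (v1 + v2)                     # Heun (trapezoid) corrector
        # Data-consistency projection: observed entries are pinned exactly to
        # the straight path (1 - t) * x0 + t * x_obs at the new time.
        x_path = (1.0 - t_next) * x0 + t_next * x_obs
        x = mask * x_path + (1.0 - mask) * x
    return x
```

In this sketch the projection only overwrites observed coordinates, so the generated values at missing positions remain those produced by the ODE solve; at t = 1 the observed entries coincide with the data by construction.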
Supplementary Material: zip
Primary Area: learning on time series and dynamical systems
Submission Number: 20132