Continuous Time Series Generation with Irregular Observations

20 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: Irregular time series, continuous time series generation, deep learning
TL;DR: We propose a new paradigm that decouples observations from the updating rule, unifying discrete generative modeling with continuous-time representations to enable faithful and flexible TSG under irregular sampling.
Abstract: Time series generation (TSG) supports diverse fields (e.g., healthcare), but most methods assume regularly sampled data and fixed-length outputs, a mismatch with real-world settings where observations are irregular and sparse. This mismatch is especially problematic in domains such as clinical monitoring, where irregularly recorded data must support downstream tasks requiring continuous, high-resolution signals. Neural Controlled Differential Equations (NCDEs) show strong potential for handling irregular time series, but still face challenges in learning dynamic temporal patterns and in continuous TSG. To address this, we propose MN-TSG, a framework that introduces a Mixture-of-Experts NCDE (MoE-NCDE) and integrates it with existing TSG models for irregular and continuous TSG tasks. The key designs of MoE-NCDE are dynamics functions built from a mixture of experts and a decoupled design that better optimizes the MoE dynamics. Further, we employ an existing TSG model to learn the joint distribution of the expert mixture and the time series. In this way, the model can not only generate new samples but also produce suitable experts for them, enabling MoE-NCDE to perform refined continuous TSG. We validate our method on ten public and synthetic datasets, outperforming advanced TSG baselines on both irregular-to-regular and irregular-to-continuous generation tasks.
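As a rough illustration of the mechanism the abstract describes, the sketch below shows how a gated mixture-of-experts vector field might plug into an Euler-discretized NCDE update, where the path increments dX carry the irregular time stamps so no resampling is needed. This is a minimal toy sketch under our own assumptions, not the authors' MN-TSG implementation; all parameter names, shapes, and the linear experts are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
HIDDEN, CHANNELS, N_EXPERTS = 4, 2, 3

# Hypothetical experts: each maps the hidden state z to a (HIDDEN x CHANNELS)
# matrix, i.e. a candidate vector field f_k(z) for the controlled ODE dz = f(z) dX.
W_experts = rng.normal(size=(N_EXPERTS, HIDDEN * CHANNELS, HIDDEN)) * 0.1
b_experts = rng.normal(size=(N_EXPERTS, HIDDEN * CHANNELS)) * 0.1
W_gate = rng.normal(size=(N_EXPERTS, HIDDEN)) * 0.1

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def moe_vector_field(z):
    """Gated mixture of expert vector fields: f(z) = sum_k g_k(z) * f_k(z)."""
    gates = softmax(W_gate @ z)                                   # (N_EXPERTS,)
    fields = (W_experts @ z + b_experts).reshape(N_EXPERTS, HIDDEN, CHANNELS)
    return np.tensordot(gates, fields, axes=1)                    # (HIDDEN, CHANNELS)

def euler_ncde(z0, xs):
    """Euler step through an NCDE driven by the observed path X:
    z_{i+1} = z_i + f(z_i) (X(t_{i+1}) - X(t_i)).  Irregular sampling enters
    only through the increments dX, which may include time as a channel."""
    z = z0
    for i in range(len(xs) - 1):
        dx = xs[i + 1] - xs[i]                                    # (CHANNELS,)
        z = z + moe_vector_field(z) @ dx
    return z

# Five irregularly spaced observations of a 2-channel path.
xs = rng.normal(size=(5, CHANNELS))
z_final = euler_ncde(np.zeros(HIDDEN), xs)
print(z_final.shape)
```

A full NCDE would interpolate the observations into a continuous path (e.g. cubic splines) and use an adaptive ODE solver rather than fixed Euler steps; the gating network here is the toy analogue of learning which expert dynamics govern each sample.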
Primary Area: learning on time series and dynamical systems
Submission Number: 23515