DeNOTS: Stable Deep Neural ODEs for Time Series

Published: 26 Jan 2026, Last Modified: 26 Feb 2026 · ICLR 2026 Poster · CC BY 4.0
Keywords: Neural ODE, Time series, Gaussian Processes
TL;DR: DeNOTS boosts Neural CDEs for irregular time series by scaling the integration horizon, adding negative-feedback for input-to-state stability, and providing provable bounds on interpolation and integration errors.
Abstract: Neural Controlled Differential Equations (Neural CDEs) provide a principled framework for modelling irregular time series in continuous time. Their number of function evaluations (NFEs) acts as a natural analogue of depth in discrete neural networks and is typically controlled indirectly via solver tolerances. However, tightening tolerances increases numerical precision without necessarily improving expressiveness. We propose a simple alternative: scaling the integration time horizon to increase NFEs and thereby "deepen" the model. Since enlarging the interval can cause uncontrolled growth in standard vector fields, we introduce a Negative Feedback (NF) mechanism that ensures provable stability without limiting flexibility. We further establish general risk bounds for Neural CDEs and quantify discretization error using Gaussian process theory, improving robustness to integration and interpolation error. On four public benchmarks, our method, **DeNOTS**, outperforms existing approaches—including Neural RDEs and state space models—by up to $20$%. DeNOTS combines expressiveness, stability, and robustness for reliable continuous-time modelling.
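The two ideas in the abstract, treating a longer integration horizon as extra "depth" and adding a negative-feedback term so the hidden state cannot blow up, can be illustrated with a minimal numerical sketch. This is an assumption-laden toy, not the paper's actual method: the feedback form `-alpha * h`, the `tanh` field, and fixed-step Euler integration are all illustrative stand-ins.

```python
import numpy as np

def vector_field(h, x, W, U, alpha=1.0):
    """Toy dynamics with a negative-feedback term:
    dh/dt = -alpha * h + tanh(W h + U x).
    The -alpha*h term (a hypothetical form; the paper's NF mechanism
    may differ) damps the hidden state, so it stays bounded even over
    a long integration horizon."""
    return -alpha * h + np.tanh(W @ h + U @ x)

def integrate(h0, xs, W, U, T=10.0, alpha=1.0):
    """Fixed-step Euler integration over a scaled horizon [0, T].
    A longer T with more steps plays the role of network depth."""
    dt = T / len(xs)
    h = h0.copy()
    for x in xs:  # each step consumes one (interpolated) input sample
        h = h + dt * vector_field(h, x, W, U, alpha=alpha)
    return h

rng = np.random.default_rng(0)
d, k, n = 4, 2, 200                      # state dim, input dim, steps
W = rng.normal(size=(d, d))
U = rng.normal(size=(d, k))
xs = rng.normal(size=(n, k))             # stand-in for an irregular series
h_T = integrate(np.zeros(d), xs, W, U, T=10.0, alpha=1.0)
# Since tanh is bounded by 1 and -alpha*h pulls the state back toward
# the origin, ||h_T|| remains bounded regardless of the horizon T.
print(np.linalg.norm(h_T))
```

Without the `-alpha * h` term, an unconstrained vector field integrated over an enlarged interval can grow without bound, which is the instability the paper's Negative Feedback mechanism is designed to rule out.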
Primary Area: learning on time series and dynamical systems
Submission Number: 17510