DeNOTS: Stable Deep Neural ODEs for Time Series

ICLR 2026 Conference Submission 17510 Authors

19 Sept 2025 (modified: 08 Oct 2025) · CC BY 4.0
Keywords: Neural ODE, Time series, Gaussian Processes
TL;DR: DeNOTS enhances Neural CDE expressiveness for irregular time series by scaling the integration horizon (instead of lowering tolerance) and making it input-to-state stable via Negative Feedback, and provides provable epistemic uncertainty bounds.
Abstract: Neural CDEs provide a natural way to process the temporal evolution of irregular time series. The number of function evaluations (NFE) is these systems' natural analog of depth (the number of layers in traditional neural networks). It is usually regulated via solver error tolerance: lower tolerance means higher numerical precision, requiring more integration steps. However, lowering tolerances does not adequately increase the models' expressiveness. We propose a simple yet effective alternative: scaling the integration time horizon to increase NFEs and "deepen" the model. Increasing the integration interval causes uncontrollable growth in conventional vector fields, so we also propose a way to stabilize the dynamics via Negative Feedback (NF). It ensures provable stability without constraining flexibility, and it also implies robustness: we provide theoretical bounds on Neural ODE risk using Gaussian process theory. Experiments on four open datasets demonstrate that our method, DeNOTS, outperforms existing approaches, including recent Neural RDEs and state-space models, achieving up to 20% improvement in metrics. DeNOTS combines expressiveness, stability, and robustness, enabling reliable modelling in continuous-time domains.
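The two ideas in the abstract can be illustrated with a toy sketch (this is an illustrative reconstruction, not the authors' implementation): a scalar ODE integrated with a fixed-step Euler solver, where the `vector_field` function, the feedback form `dz/dt = f(z, u) - z`, and all constants are assumptions chosen for the demonstration. Lengthening the horizon `T` directly increases the NFE ("depth"), and the negative feedback term keeps the state bounded where the raw vector field would drift.

```python
import math

def vector_field(z, u):
    # Hypothetical learned dynamics; tanh keeps the output in (-1, 1).
    return math.tanh(0.5 * z + u)

def integrate(T, dt=0.01, u=1.0, feedback=True):
    """Fixed-step Euler integration over the horizon [0, T].

    With Negative Feedback the dynamics are dz/dt = f(z, u) - z,
    so the state is attracted to the bounded fixed point z* = f(z*, u)
    instead of growing with the integration interval.
    Returns the final state and the number of function evaluations (NFE).
    """
    z, nfe = 0.0, 0
    for _ in range(int(T / dt)):
        dz = vector_field(z, u) - (z if feedback else 0.0)
        z += dt * dz
        nfe += 1
    return z, nfe

# Scaling the horizon scales the NFE (the "depth" of the model).
z_short, nfe_short = integrate(T=1.0)
z_long, nfe_long = integrate(T=10.0)

# Without feedback the state drifts with T; with feedback it stays bounded.
z_drift, _ = integrate(T=10.0, feedback=False)
```

Here a 10x longer horizon yields 10x more function evaluations, while `z_long` stays near the bounded fixed point and `z_drift` grows roughly linearly with `T`, mirroring the "uncontrollable growth" the abstract attributes to conventional vector fields.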
Primary Area: learning on time series and dynamical systems
Submission Number: 17510