Mitigating Discretization Bias in Neural Stochastic Differential Equations via Inference-Time Dropout
Keywords: Neural stochastic differential equations, discretization error, inference-time dropout, uncertainty estimation
Abstract: Neural stochastic differential equations (Neural SDEs) provide a principled framework for modeling complex, continuous-time dynamics by combining deep learning with Itô calculus. However, when the drift and diffusion functions are nonlinear, the common practice of replacing the noise process with a Gaussian distribution introduces non-negligible approximation errors. We show that these errors are not merely technical but fundamental: they can directly cause failures in fitting certain classes of stochastic distributions. To address this issue, we propose the Uncertainty-Aware Neural SDE (UA-NSDE) framework, which leverages inference-time dropout to approximate an implicit uncertainty distribution. By keeping dropout active during both training and inference, UA-NSDE reduces discretization bias without imposing restrictive parametric assumptions such as Gaussianity. Empirical evaluations on synthetic and real-world benchmarks demonstrate that our method achieves more accurate and robust modeling.
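The core mechanism the abstract describes (a dropout mask resampled on every forward pass, at inference as well as training, inside an Euler-Maruyama rollout) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the one-hidden-layer drift network, its fixed random weights, the constant diffusion, and all step sizes are hypothetical choices made here for self-containment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy drift network: one hidden layer with fixed random weights,
# used only to make the sketch self-contained and runnable.
W1 = rng.normal(size=(1, 16))
W2 = rng.normal(size=(16, 1)) / 16.0

def drift(x, p_drop=0.2):
    """Drift network with dropout kept ACTIVE at inference:
    a fresh Bernoulli mask is sampled on every call."""
    h = np.tanh(x @ W1)
    mask = rng.random(h.shape) > p_drop   # fresh mask each call
    h = h * mask / (1.0 - p_drop)         # inverted-dropout rescaling
    return h @ W2

def diffusion(x):
    return 0.1 * np.ones_like(x)          # constant diffusion, for simplicity

def euler_maruyama(x0, dt=0.01, steps=100):
    """Euler-Maruyama rollout; dropout noise inside `drift` adds model
    uncertainty on top of the Gaussian Brownian increments."""
    x = np.full((1, 1), x0, dtype=float)
    for _ in range(steps):
        dW = rng.normal(scale=np.sqrt(dt), size=x.shape)
        x = x + drift(x) * dt + diffusion(x) * dW
    return x.item()

# Repeated rollouts differ both through dW and through the dropout masks,
# yielding an empirical predictive distribution rather than a single path.
samples = [euler_maruyama(0.5) for _ in range(50)]
print(float(np.mean(samples)), float(np.std(samples)))
```

The point of the sketch is the contrast with standard practice: an ordinary Neural SDE would disable dropout at inference, so all rollout variability would come from the Gaussian increments `dW` alone.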
Supplementary Material: zip
Primary Area: learning on time series and dynamical systems
Submission Number: 19155