Closed-form uncertainty quantification of deep residual neural networks

ICLR 2026 Conference Submission 20498 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: uncertainty quantification, sinusoidal, stochastic, analytic, residual
TL;DR: Just select an activation function with closed-form Gaussian integrals.
Abstract: We study the problem of propagating the mean and covariance of a general multivariate Gaussian distribution through a deep (residual) neural network using layer-by-layer moment matching. Our method computes the mean and covariance of representations by using exact definite integrals for the sine and probit activation functions. On random networks, we find orders-of-magnitude improvements in the KL divergence error metric, up to a billionfold, over popular alternatives. On real data, we find competitive statistical calibration for inference under noisy or missing features. We also give an _a priori_ bound on goodness of approximation and a preliminary analysis of stochastic activation functions, which have recently attracted general interest.
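The submission's exact moment-matching integrals are in the paper itself, but the standard closed-form Gaussian identities for the sine and probit activations that make such layer-by-layer propagation possible can be sketched as follows (a minimal illustration using well-known formulas, not the authors' implementation; function names are ours):

```python
import math
import random

def gaussian_mean_sin(mu, sigma):
    """Closed-form mean of sin(X) for X ~ N(mu, sigma^2):
    E[sin(X)] = sin(mu) * exp(-sigma^2 / 2)."""
    return math.sin(mu) * math.exp(-sigma**2 / 2)

def gaussian_mean_probit(mu, sigma):
    """Closed-form mean of the probit (standard normal CDF) of X ~ N(mu, sigma^2):
    E[Phi(X)] = Phi(mu / sqrt(1 + sigma^2))."""
    z = mu / math.sqrt(1.0 + sigma**2)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Monte Carlo sanity check of both identities.
rng = random.Random(0)
mu, sigma = 0.7, 1.3
samples = [rng.gauss(mu, sigma) for _ in range(200_000)]
mc_sin = sum(math.sin(x) for x in samples) / len(samples)
mc_probit = sum(0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
                for x in samples) / len(samples)
```

Because these expectations are exact, a moment-matching pass through a sine or probit layer needs no sampling or series truncation, which is consistent with the abstract's claim of large accuracy gains over approximate alternatives.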
Supplementary Material: zip
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Submission Number: 20498