Functional Distribution Networks (FDN)

ICLR 2026 Conference Submission 13946 Authors

18 Sept 2025 (modified: 24 Nov 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Bayesian Deep Learning; Uncertainty Quantification; Hypernetworks; Generative Weight Distributions; Function-Space Learning; Epistemic Uncertainty; Out-of-Distribution Generalization; Calibration
TL;DR: Functional Distribution Networks learn input-conditioned weight distributions for calibrated, reliable, and OOD-robust predictions.
Abstract: Modern probabilistic regressors often remain overconfident under distribution shift. Functional Distribution Networks (FDN) place input-conditioned distributions over network weights, producing predictive mixtures whose dispersion adapts to the input; we train them with a Monte Carlo $\beta$-ELBO objective. We pair FDN with an evaluation protocol that separates interpolation from extrapolation and emphasizes simple OOD sanity checks. On controlled 1D tasks and small/medium UCI-style regression benchmarks, FDN remains competitive in accuracy with strong Bayesian, ensemble, dropout, and hypernetwork baselines, while providing strongly input-dependent, shift-aware uncertainty and competitive calibration under matched parameter and update budgets.
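To make the abstract's description concrete, here is a minimal PyTorch sketch of the core idea: a hypernetwork maps each input to the mean and scale of a Gaussian over the weights of a small linear read-out, sampled weights yield a predictive mixture, and training maximizes a Monte Carlo $\beta$-ELBO. All names, layer sizes, the standard-normal prior, and the fixed observation noise are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FDNRegressor(nn.Module):
    """Sketch of an FDN: input-conditioned Gaussian over the weights of a
    small linear read-out layer (hypothetical minimal target network)."""
    def __init__(self, in_dim=1, hidden=64, out_dim=1):
        super().__init__()
        self.in_dim, self.out_dim = in_dim, out_dim
        n_w = in_dim * out_dim + out_dim  # read-out weights + bias
        self.n_w = n_w
        # Hypernetwork: maps x to mean and scale of the weight distribution.
        self.hyper = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * n_w),
        )

    def forward(self, x, n_samples=8):
        mu, log_sigma = self.hyper(x).chunk(2, dim=-1)   # each (B, n_w)
        sigma = F.softplus(log_sigma) + 1e-5
        # Monte Carlo weight samples via the reparameterization trick.
        eps = torch.randn(n_samples, *mu.shape, device=x.device)
        w = mu + sigma * eps                             # (S, B, n_w)
        k = self.in_dim * self.out_dim
        W = w[..., :k].reshape(n_samples, -1, self.out_dim, self.in_dim)
        b = w[..., k:]                                   # (S, B, out_dim)
        # Predictive mixture: one linear read-out per sampled weight vector.
        y = torch.einsum('sboi,bi->sbo', W, x) + b       # (S, B, out_dim)
        return y, mu, sigma

def beta_elbo(y_samples, y_true, mu, sigma, beta=0.1, noise_std=0.1):
    """MC beta-ELBO: expected Gaussian log-likelihood minus a beta-weighted
    KL(q(w|x) || N(0, I)); beta, prior, and noise_std are illustrative."""
    ll = torch.distributions.Normal(y_samples, noise_std).log_prob(y_true).mean()
    kl = (-torch.log(sigma) + 0.5 * (sigma**2 + mu**2) - 0.5).sum(-1).mean()
    return ll - beta * kl

# Usage sketch: minimize the negative beta-ELBO on a toy batch.
model = FDNRegressor()
x, y = torch.randn(32, 1), torch.randn(32, 1)
y_samples, mu, sigma = model(x)
loss = -beta_elbo(y_samples, y, mu, sigma)
loss.backward()
```

In this sketch $\beta$ trades data fit against the KL penalty, mirroring the abstract's $\beta$-ELBO objective; because the weight distribution is conditioned on $x$, the spread of the predictive mixture can grow away from the training data, which is the mechanism behind the input-dependent, shift-aware uncertainty the abstract describes.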
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Submission Number: 13946