Keywords: Bayesian Deep Learning; Uncertainty Quantification; Hypernetworks; Generative Weight Distributions; Function-Space Learning; Epistemic Uncertainty; Out-of-Distribution Generalization; Calibration
TL;DR: Functional Distribution Networks learn input-conditioned weight distributions for calibrated, reliable, and OOD-robust predictions.
Abstract: Modern probabilistic regressors often remain overconfident under distribution shift. We present Functional Distribution Networks (FDN), which place an input-conditioned distribution over network weights, inducing predictive mixtures whose dispersion adapts to the input. FDN is trained with a $\beta$-ELBO and Monte Carlo sampling. We further propose an evaluation protocol that cleanly separates interpolation from extrapolation and stresses OOD sanity checks, e.g., that predictive likelihood degrades under shift while in-distribution accuracy and calibration are maintained. On standard regression tasks, we benchmark against strong Bayesian, ensemble, dropout, and hypernetwork baselines under matched parameter and update budgets, and assess accuracy, calibration, and shift-awareness with standard diagnostics. Together, the framework and protocol aim to make OOD-aware, well-calibrated neural regression practical and modular.
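The core idea in the abstract, a hypernetwork mapping each input to a Gaussian over head weights, Monte Carlo weight samples inducing a predictive mixture, and a $\beta$-ELBO objective, can be sketched in a few lines. This is an illustrative toy only: the hypernetwork here is a fixed random linear map, and all names (`weight_posterior`, `predictive_mixture`, `neg_beta_elbo`), shapes, the unit-variance Gaussian likelihood, and the $\mathcal{N}(0, I)$ weight prior are assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
D, K = 3, 64  # input/weight dimension; number of Monte Carlo weight samples

# Toy "hypernetwork": fixed random linear maps from x to (mu, log_var) of q(w | x).
H_mu = 0.3 * rng.normal(size=(D, D))
H_lv = 0.3 * rng.normal(size=(D, D))

def weight_posterior(x):
    """Input-conditioned Gaussian over head weights: q(w|x) = N(mu(x), diag(exp(log_var(x))))."""
    return H_mu @ x, H_lv @ x - 1.0  # the -1.0 shift keeps sampled variances moderate

def predictive_mixture(x):
    """Draw K weight samples from q(w|x); the K head outputs w.x form the predictive mixture."""
    mu, log_var = weight_posterior(x)
    w = mu + np.exp(0.5 * log_var) * rng.normal(size=(K, D))  # reparameterized samples
    preds = w @ x                                             # (K,) mixture components
    return preds.mean(), preds.std()  # mixture mean and input-dependent dispersion

def neg_beta_elbo(y, x, beta=0.1):
    """Negative beta-ELBO: MC estimate of E_q[-log p(y|x,w)] under a unit-variance
    Gaussian likelihood (up to a constant), plus beta * KL(q(w|x) || N(0, I))."""
    mu, log_var = weight_posterior(x)
    w = mu + np.exp(0.5 * log_var) * rng.normal(size=(K, D))
    nll = 0.5 * np.mean((w @ x - y) ** 2)
    kl = 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)  # closed-form Gaussian KL
    return nll + beta * kl
```

Because `mu` and `log_var` are functions of `x`, the predictive spread returned by `predictive_mixture` varies with the input, which is the mechanism the abstract credits for shift-adaptive dispersion; `beta` trades data fit against the KL regularizer exactly as in a standard $\beta$-ELBO.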
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Submission Number: 13946