Keywords: Functional Bayesian Neural Networks, Wasserstein distance, Functional Variational Inference
Abstract: Bayesian neural networks (BNNs) have substantially improved the robustness and uncertainty quantification of deep neural networks, but they suffer from problematic priors over network weights. We propose a new kind of indirect functional BNN (IFBNN) built on a Wasserstein bridge, which combines a 2-Wasserstein distance between the approximate posterior and a bridging distribution over network weights with a 1-Wasserstein distance between the distribution over functions induced by the bridging weight distribution and a functional Gaussian process (GP) prior. This construction avoids the risk of an ill-defined or infinite functional KL divergence, on which most existing functional BNNs rely. We demonstrate the improved extrapolation and predictive performance of the proposed IFBNN empirically on both synthetic and real-world datasets.
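The two distances in the bridge objective can be illustrated with standard closed-form and empirical estimators. The sketch below is not the paper's implementation; it assumes diagonal-Gaussian weight distributions (for which the 2-Wasserstein distance has a closed form) and one-dimensional function evaluations compared via sorted samples (the empirical 1-Wasserstein distance); all function names here are hypothetical.

```python
import numpy as np

def w2_diag_gaussians(m1, s1, m2, s2):
    # Closed-form 2-Wasserstein distance between diagonal Gaussians
    # N(m1, diag(s1^2)) and N(m2, diag(s2^2)):
    # W2^2 = ||m1 - m2||^2 + ||s1 - s2||^2.
    m1, s1, m2, s2 = map(np.asarray, (m1, s1, m2, s2))
    return float(np.sqrt(np.sum((m1 - m2) ** 2) + np.sum((s1 - s2) ** 2)))

def w1_empirical_1d(x, y):
    # Empirical 1-Wasserstein distance between two equal-size 1-D samples:
    # average absolute difference of the sorted samples.
    x, y = np.sort(np.asarray(x)), np.sort(np.asarray(y))
    return float(np.mean(np.abs(x - y)))
```

In this reading, `w2_diag_gaussians` plays the role of the weight-space term (approximate posterior vs. bridging distribution), while `w1_empirical_1d` stands in for the function-space term (function values sampled from the bridging distribution vs. draws from the GP prior at shared input points).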