Abstract: We propose a general-purpose variational algorithm that forms a natural analogue
of Stein variational gradient descent (SVGD) in function space. While SVGD
successively updates a set of particles to match a target density, the method introduced here, Stein functional variational gradient descent (SFVGD), updates a set
of particle functions to match a target stochastic process (SP). The update step is
found by minimizing the functional derivative of the Kullback-Leibler divergence
between SPs. SFVGD can be used either to train Bayesian neural networks (BNNs) or for ensemble gradient boosting. We demonstrate the efficacy of training BNNs with
SFVGD on various real-world datasets.
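
To make the particle-function idea concrete, below is a minimal sketch of an SVGD-style kernelized update applied to the values of an ensemble of particle functions evaluated at a minibatch of inputs. The RBF kernel, bandwidth, and the assumed availability of a score function for the target are illustrative choices, not the paper's specification; the actual SFVGD update is derived from the functional derivative of the KL divergence between stochastic processes.

```python
import numpy as np

def rbf_kernel(F, h=1.0):
    """RBF kernel matrix over stacked function values F of shape (n_particles, d),
    together with the gradient of k(f_j, f_i) with respect to f_j."""
    diffs = F[:, None, :] - F[None, :, :]            # pairwise differences f_j - f_i
    sq_dists = np.sum(diffs ** 2, axis=-1)
    K = np.exp(-sq_dists / (2 * h ** 2))
    grad_K = -diffs / h ** 2 * K[..., None]          # d k(f_j, f_i) / d f_j
    return K, grad_K

def svgd_step_on_function_values(F, score_fn, step_size=1e-2, h=1.0):
    """One SVGD-style update on the evaluations F of an ensemble of particle
    functions at a minibatch of inputs.

    F        : array (n_particles, d) of stacked function evaluations
    score_fn : callable returning grad log p(F) of the target at the evaluation
               points -- assumed available here purely for illustration
    """
    n = F.shape[0]
    K, grad_K = rbf_kernel(F, h)
    scores = score_fn(F)                             # (n_particles, d)
    # phi(f_i) = (1/n) * sum_j [ k(f_j, f_i) * score(f_j) + grad_{f_j} k(f_j, f_i) ]
    phi = (K @ scores + grad_K.sum(axis=0)) / n
    return F + step_size * phi
```

In practice the update on function values would be pulled back to the parameters of each particle network (e.g. via the chain rule through the network Jacobian); the sketch above only illustrates the kernelized attraction-plus-repulsion structure that SFVGD shares with standard SVGD.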