Keywords: Stein's method, variational inference, sampling, optimization
TL;DR: We provide the first finite-particle convergence rate for Stein variational gradient descent (SVGD).
Abstract: We provide the first finite-particle convergence rate for Stein variational gradient descent (SVGD). Specifically, whenever the target distribution is sub-Gaussian with a Lipschitz score, SVGD with $n$ particles and an appropriate step size sequence drives the kernel Stein discrepancy to zero at an order $1/\sqrt{\log\log n}$ rate. We suspect that the dependence on $n$ can be improved, and we hope that our explicit, non-asymptotic proof strategy will serve as a template for future refinements.
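For context, the SVGD update under analysis moves each particle along the kernelized Stein direction, $x_i \leftarrow x_i + \epsilon_t \cdot \frac{1}{n}\sum_{j=1}^{n}\big[k(x_j, x_i)\,\nabla \log p(x_j) + \nabla_{x_j} k(x_j, x_i)\big]$ (Liu and Wang, 2016). The NumPy sketch below is illustrative only: the RBF kernel, fixed bandwidth, constant step size, Gaussian target, and function names are our assumptions, not the kernel or step size sequence used in the paper's analysis.

```python
import numpy as np

def svgd_step(X, score, step_size, bandwidth=1.0):
    """One SVGD update with an RBF kernel (illustrative choices, not the paper's).

    X:     (n, d) array of particles
    score: function returning grad log p at each particle, shape (n, d)
    """
    diffs = X[:, None, :] - X[None, :, :]            # diffs[j, i] = x_j - x_i
    sq_dists = np.sum(diffs ** 2, axis=-1)
    K = np.exp(-sq_dists / (2 * bandwidth ** 2))     # K[j, i] = k(x_j, x_i)
    grad_K = -diffs * K[..., None] / bandwidth ** 2  # grad_K[j, i] = grad_{x_j} k(x_j, x_i)
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) * score(x_j) + grad_{x_j} k(x_j, x_i) ]
    phi = (K.T @ score(X) + grad_K.sum(axis=0)) / len(X)
    return X + step_size * phi

# Example: particles initialized far from a standard Gaussian target, whose score is -x.
rng = np.random.default_rng(0)
X = rng.normal(loc=3.0, size=(50, 2))                # n = 50 particles in d = 2
for _ in range(500):
    X = svgd_step(X, score=lambda x: -x, step_size=0.1)
print(X.mean(axis=0))                                # drifts toward the target mean [0, 0]
```

The attractive term $k(x_j, x_i)\,\nabla \log p(x_j)$ pulls particles toward high-density regions, while the repulsive term $\nabla_{x_j} k(x_j, x_i)$ keeps them spread out; the paper's rate concerns how quickly such iterates drive the kernel Stein discrepancy to zero as a function of $n$.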