## A Finite-Particle Convergence Rate for Stein Variational Gradient Descent

02 Oct 2022, 17:24 (modified: 23 Nov 2022, 20:17) · OPT 2022 Poster
Keywords: Stein's method, variational inference, sampling, optimization
TL;DR: We provide a first finite-particle convergence rate for Stein variational gradient descent (SVGD).
Abstract: We provide a first finite-particle convergence rate for Stein variational gradient descent (SVGD). Specifically, whenever the target distribution is sub-Gaussian with a Lipschitz score, SVGD with $n$ particles and an appropriate step size sequence drives the kernel Stein discrepancy to zero at an order $1/\sqrt{\log\log n}$ rate. We suspect that the dependence on $n$ can be improved, and we hope that our explicit, non-asymptotic proof strategy will serve as a template for future refinements.
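To make the object of study concrete, below is a minimal NumPy sketch of the standard SVGD update analyzed in the abstract: $n$ particles, an RBF kernel, and the update $\phi(x_i) = \frac{1}{n}\sum_j [k(x_j, x_i)\nabla\log p(x_j) + \nabla_{x_j} k(x_j, x_i)]$. The Gaussian target, fixed bandwidth, and constant step size here are illustrative assumptions, not the step size sequence or setting from the paper.

```python
import numpy as np

def svgd_step(x, score, h=1.0):
    """One SVGD direction for particles x of shape (n, d).

    score: array (n, d) of target score values grad log p(x_j).
    Uses an RBF kernel k(a, b) = exp(-||a - b||^2 / (2h)).
    """
    n = x.shape[0]
    diffs = x[:, None, :] - x[None, :, :]            # diffs[i, j] = x_i - x_j
    k = np.exp(-np.sum(diffs**2, axis=-1) / (2 * h))  # (n, n) kernel matrix
    # Driving term: sum_j k(x_j, x_i) * score(x_j); repulsion term:
    # sum_j grad_{x_j} k(x_j, x_i) = sum_j (x_i - x_j) / h * k.
    phi = (k @ score + (diffs * k[:, :, None]).sum(axis=1) / h) / n
    return phi

# Toy run: standard normal target, so grad log p(x) = -x.
x = np.linspace(3.0, 5.0, 50).reshape(-1, 1)  # particles start far from target
for _ in range(500):
    x = x + 0.1 * svgd_step(x, -x)            # fixed step size (illustrative)
```

After these steps the particle cloud should have drifted toward the target's mean at zero, with the kernel repulsion term keeping the particles spread out rather than collapsed to the mode.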