A Finite-Particle Convergence Rate for Stein Variational Gradient Descent

Published: 21 Sept 2023, Last Modified: 02 Nov 2023
NeurIPS 2023 poster
Keywords: Stein Variational Gradient Descent, SVGD, variational inference, sampling, optimization, Stein's method
TL;DR: We provide the first finite-particle convergence rate for Stein variational gradient descent (SVGD).
Abstract: We provide the first finite-particle convergence rate for Stein variational gradient descent (SVGD), a popular algorithm for approximating a probability distribution with a collection of particles. Specifically, whenever the target distribution is sub-Gaussian with a Lipschitz score, SVGD with $n$ particles and an appropriate step size sequence drives the kernel Stein discrepancy to zero at an order $1/\sqrt{\log\log n}$ rate. We suspect that the dependence on $n$ can be improved, and we hope that our explicit, non-asymptotic proof strategy will serve as a template for future refinements.
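For readers unfamiliar with the algorithm the abstract analyzes, the standard SVGD particle update can be sketched as follows. This is a generic illustration (not code from the paper), assuming an RBF kernel with bandwidth `h`; the function names, step size, and target distribution are illustrative choices.

```python
import numpy as np

def svgd_step(x, score, h=1.0):
    """One SVGD update direction for particles x of shape (n, d).

    score(x) returns the target's score, grad log p, evaluated rowwise.
    Uses an RBF kernel k(a, b) = exp(-||a - b||^2 / (2 h^2)).
    """
    diff = x[:, None, :] - x[None, :, :]              # diff[j, i] = x_j - x_i
    K = np.exp(-np.sum(diff**2, axis=-1) / (2 * h**2))
    grad_K = -diff / h**2 * K[:, :, None]             # grad wrt x_j of k(x_j, x_i)
    # phi_i = (1/n) * sum_j [ k(x_j, x_i) * score(x_j) + grad_{x_j} k(x_j, x_i) ]
    return (K @ score(x) + grad_K.sum(axis=0)) / x.shape[0]

# Illustrative run: drive 50 particles toward a standard Gaussian target.
rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=1.0, size=(50, 1))      # initialize far from the target
score = lambda x: -x                                  # grad log density of N(0, 1)
for _ in range(500):
    x = x + 0.1 * svgd_step(x, score)
print(x.mean())                                       # close to 0 after convergence
```

The first term in the update pulls each particle toward high-density regions of the target; the kernel-gradient term acts as a repulsive force that keeps the particles spread out, which is what lets a finite collection of particles approximate the whole distribution.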
Submission Number: 5497