Convergence of Stein variational gradient descent under a weaker smoothness condition

Lukang Sun, Avetik Karagulyan, Peter Richtárik

29 Jan 2023 (modified: 14 Apr 2023)
Abstract: Stein Variational Gradient Descent (SVGD) is an important alternative to Langevin-type algorithms for sampling from probability distributions of the form $\pi(x) \propto \exp(-V(x))$. In the existing theory of Langevin-type algorithms and SVGD, the potential function $V$ is often assumed to be $L$-smooth. However, this restrictive condition excludes a large class of potential functions, such as polynomials of degree greater than $2$. Our paper studies the convergence of the SVGD algorithm for distributions with $(L_0, L_1)$-smooth potentials. This relaxed smoothness assumption was introduced by Zhang et al. [2019a] for the analysis of gradient clipping algorithms. With the help of trajectory-independent auxiliary conditions, we provide a descent lemma establishing that the algorithm decreases the KL divergence at each iteration, and we prove a complexity bound for SVGD in the population limit in terms of the Stein Fisher information.
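For reference, a sketch of the two objects named in the abstract, in their standard forms (the kernel $k$, step size $\gamma$, and norm conventions are assumptions here, not taken from the paper itself). The relaxed smoothness condition of Zhang et al. [2019a], for a twice-differentiable potential $V$, reads
\[
\|\nabla^2 V(x)\| \;\le\; L_0 + L_1\,\|\nabla V(x)\| \qquad \text{for all } x \in \mathbb{R}^d,
\]
which recovers ordinary $L$-smoothness when $L_1 = 0$ and, unlike $L$-smoothness, admits polynomial potentials of degree greater than $2$. The SVGD iteration in the population limit pushes the current measure $\mu_t$ forward along a kernelized gradient field:
\[
\mu_{t+1} = \bigl(\mathrm{id} + \gamma\,\phi_{\mu_t}\bigr)_{\#}\,\mu_t,
\qquad
\phi_{\mu}(x) = \mathbb{E}_{y \sim \mu}\bigl[\,k(y, x)\,\nabla \log \pi(y) + \nabla_y k(y, x)\,\bigr].
\]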