Particle-based Variational Inference with Preconditioned Functional Gradient Flow

Published: 29 Nov 2022, Last Modified: 17 Nov 2024
SBM 2022 Poster
Keywords: Posterior Sampling, Particle-based VI
Abstract: Particle-based variational inference (VI) minimizes the KL divergence between model samples and the target posterior using gradient flow estimates. With the popularity of Stein variational gradient descent (SVGD), the focus of particle-based VI algorithms has been on the properties of functions in a Reproducing Kernel Hilbert Space (RKHS) used to approximate the gradient flow. However, the RKHS requirement restricts the function class and limits algorithmic flexibility. This paper remedies the problem by proposing a general framework for obtaining tractable functional gradient flow estimates. The functional gradient flow in our framework can be defined by a general functional regularization term that includes the RKHS norm as a special case. We also use the framework to propose a new particle-based VI algorithm: \emph{preconditioned functional gradient flow} (PFG). Compared with SVGD, the proposed preconditioned functional gradient method has several advantages: larger function classes; greater scalability in large-particle-size scenarios; better adaptation to ill-conditioned target distributions; and provable continuous-time convergence in KL divergence. Both theory and experiments demonstrate the effectiveness of our framework.
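
As context for the SVGD baseline referenced in the abstract, the minimal NumPy sketch below shows the standard kernelized particle update that the paper generalizes: each particle moves along a kernel-smoothed score plus a repulsive kernel-gradient term. This is plain SVGD with an RBF kernel, not the paper's PFG algorithm; the function name `svgd_step`, the fixed bandwidth, and the toy Gaussian target are illustrative assumptions.

```python
import numpy as np

def svgd_step(x, grad_log_p, step_size=0.1, bandwidth=1.0):
    """One SVGD update on an (n, d) array of particles with an RBF kernel.

    grad_log_p: callable mapping (n, d) particles to scores (n, d).
    Note: this is the generic SVGD baseline, not the paper's PFG update.
    """
    n = x.shape[0]
    diff = x[:, None, :] - x[None, :, :]                         # (n, n, d): x_i - x_j
    k = np.exp(-np.sum(diff**2, axis=-1) / (2 * bandwidth**2))   # RBF kernel matrix
    score = grad_log_p(x)                                        # (n, d)
    # Driving term: kernel-smoothed scores pull particles toward high density.
    drive = k @ score
    # Repulsive term: kernel gradients keep particles from collapsing to a mode.
    repulse = np.sum(k[:, :, None] * diff, axis=1) / bandwidth**2
    return x + step_size * (drive + repulse) / n

# Toy usage: approximate samples from a standard 2-D Gaussian target.
rng = np.random.default_rng(0)
particles = rng.normal(size=(100, 2)) * 3.0   # deliberately poor initialization
score_fn = lambda x: -x                        # grad log N(0, I)
for _ in range(500):
    particles = svgd_step(particles, score_fn)
print(particles.mean(axis=0), particles.std(axis=0))  # should approach 0 and 1
```

Per the abstract, PFG replaces this RKHS-constrained update direction with a richer, preconditioned function class; the paper and the linked implementation give the actual algorithm.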
Student Paper: Yes
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/particle-based-variational-inference-with/code)