Wasserstein Gradient Flow over Variational Parameter Space for Variational Inference

Published: 22 Jan 2025 · Last Modified: 11 Mar 2025 · AISTATS 2025 Poster · CC BY 4.0
Abstract: Variational Inference (VI) optimizes variational parameters to closely align a variational distribution with the true posterior; this optimization is typically carried out via vanilla gradient descent in black-box VI or natural-gradient descent in natural-gradient VI. In this work, we reframe VI as the optimization of an objective defined over probability distributions on the variational parameter space. We then propose Wasserstein gradient descent for solving this optimization, under which black-box VI and natural-gradient VI can be interpreted as special cases. To enhance the efficiency of optimization, we develop practical methods for numerically solving the discrete gradient flows. We validate the effectiveness of the proposed methods through experiments on synthetic and real-world datasets, supplemented by theoretical analyses.
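The idea in the abstract can be illustrated with a minimal particle sketch, not the paper's actual method: a cloud of particles lives in the variational parameter space, and a forward-Euler discretization of the (potential-only) Wasserstein gradient flow moves each particle by ordinary gradient descent on a potential V(θ). Here the target posterior N(3, 2²), the Gaussian variational family q_θ = N(m, exp(s)²), and V(θ) = KL(q_θ ‖ p) are all illustrative assumptions chosen so the gradients are available in closed form; with a single particle this reduces to vanilla gradient descent, i.e., black-box VI.

```python
import numpy as np

# Hedged sketch: particles theta_i = (m_i, s_i) in variational parameter space,
# where q_theta = N(m, exp(s)^2). The potential V(theta) = KL(q_theta || p)
# against a Gaussian target p = N(mu, tau^2) has closed-form gradients.
mu, tau = 3.0, 2.0                  # illustrative target posterior N(mu, tau^2)
rng = np.random.default_rng(0)

m = rng.normal(0.0, 1.0, size=16)   # particle means
s = np.zeros(16)                    # particle log standard deviations

eta = 0.1                           # Euler step size for the discrete flow
for _ in range(2000):
    # Gradients of V(m, s) = KL(N(m, e^s) || N(mu, tau)):
    #   dV/dm = (m - mu) / tau^2,   dV/ds = -1 + e^{2s} / tau^2.
    grad_m = (m - mu) / tau**2
    grad_s = -1.0 + np.exp(2 * s) / tau**2
    m -= eta * grad_m               # forward-Euler step on each particle
    s -= eta * grad_s

# All particles concentrate at the KL minimizer (m, exp(s)) = (mu, tau).
print(m.mean(), np.exp(s).mean())
```

With 16 particles the deterministic flow simply drives every particle to the same minimizer; richer behavior (and the distinctions among the paper's methods) would come from interaction or entropy terms in the functional, which this sketch omits.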
Submission Number: 598