Sliced Wasserstein Variational Inference

Published: 29 Jan 2022, Last Modified: 05 May 2023 · AABI 2022 Poster
Keywords: Variational Inference, Wasserstein Distance, Intractable Model
TL;DR: We propose a novel variational inference method using the sliced Wasserstein distance.
Abstract: Variational inference approximates an unnormalized distribution by minimizing the Kullback-Leibler (KL) divergence. Although this divergence is efficient to compute and has been widely used in applications, it has some undesirable properties: it is not a proper metric, i.e., it is non-symmetric and does not satisfy the triangle inequality. Optimal transport distances, on the other hand, have recently shown some advantages over the KL divergence. To exploit these advantages, we propose a new variational inference method that minimizes the sliced Wasserstein distance. This sliced Wasserstein distance can be approximated simply by running a few MCMC steps, without solving any optimization problem. Our approximation also does not require a tractable density for the variational distribution, so the approximating family can be amortized by generators such as neural networks. Experiments on synthetic and real data illustrate the performance of the proposed method.
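The paper's full algorithm is in the PDF; as a rough illustration of the central quantity only, below is a minimal NumPy sketch of the standard Monte Carlo estimator of the sliced Wasserstein distance between two equally sized sample sets (random projections followed by 1D sorting, since optimal transport in one dimension is solved by matching order statistics). The function name and parameters are illustrative, not the authors' code.

```python
import numpy as np

def sliced_wasserstein_distance(x, y, n_projections=100, p=2, rng=None):
    """Monte Carlo estimate of the sliced Wasserstein-p distance
    between two empirical samples x, y of shape (n, d)."""
    assert x.shape == y.shape, "sketch assumes equally sized samples"
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[1]
    # Draw random projection directions uniformly on the unit sphere.
    theta = rng.normal(size=(n_projections, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    # Project both samples onto each direction (1D pushforwards).
    x_proj = x @ theta.T  # shape (n, n_projections)
    y_proj = y @ theta.T
    # In 1D, the optimal coupling sorts both samples, so W_p^p is the
    # mean p-th power gap between matched order statistics; averaging
    # over directions gives the sliced Wasserstein estimate.
    x_sorted = np.sort(x_proj, axis=0)
    y_sorted = np.sort(y_proj, axis=0)
    return np.mean(np.abs(x_sorted - y_sorted) ** p) ** (1.0 / p)
```

In a variational-inference setting of the kind the abstract describes, one would plug in samples drawn from the variational generator for `x` and a few MCMC steps targeting the unnormalized posterior for `y`, then minimize this estimate with respect to the generator's parameters.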