Optimal Latent Transport

26 Sept 2022, 12:09 (modified: 09 Nov 2022, 02:12) · NeurReps 2022 Poster
Keywords: Deep generative models, Wasserstein metric, Earth mover's distance, Latent space geometry, Riemannian manifolds
TL;DR: Using the Wasserstein metric to define a geometry on the latent space of a Variational Autoencoder
Abstract: It is common to assume that the latent space of a generative model is a lower-dimensional Euclidean space. We instead endow the latent space with a Riemannian structure. Previous work obtains such a structure by pulling back either the Euclidean metric of the observation space or the Fisher-Rao metric on the decoder distributions to the latent space. We instead investigate pulling back the Wasserstein metric tensor on the decoder distributions. We develop an efficient realization of this metric and, through proof-of-concept experiments, demonstrate that the approach is viable.
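To make the pullback construction concrete, the following is a minimal sketch (not the paper's implementation) of pulling a metric on decoder outputs back to the latent space. It assumes a hypothetical toy decoder whose output is the mean of a Gaussian decoder distribution with fixed isotropic covariance; under that assumption the Wasserstein-2 distance between nearby decoder distributions reduces to the Euclidean distance between their means, so the pulled-back metric tensor is G(z) = J(z)ᵀJ(z), where J is the decoder Jacobian.

```python
import numpy as np

# Hypothetical toy decoder: maps a 2-D latent z to the mean of a
# Gaussian decoder distribution in 3-D observation space.
def decoder_mean(z):
    W = np.array([[1.0, 0.5],
                  [0.0, 2.0],
                  [1.5, -1.0]])
    return np.tanh(W @ z)

def jacobian(f, z, eps=1e-6):
    # Central finite differences for illustration; in practice an
    # autodiff framework would supply the Jacobian.
    return np.stack([(f(z + eps * e) - f(z - eps * e)) / (2 * eps)
                     for e in np.eye(z.size)], axis=1)

def pullback_metric(z):
    # Pullback of the (here Euclidean-reducing Wasserstein-2) metric:
    # G(z) = J(z)^T J(z), a symmetric positive semi-definite tensor
    # that measures latent displacements by their effect on the decoder.
    J = jacobian(decoder_mean, z)
    return J.T @ J

z = np.array([0.3, -0.7])
G = pullback_metric(z)
print(np.allclose(G, G.T))                  # symmetric
print(bool(np.all(np.linalg.eigvalsh(G) >= -1e-9)))  # positive semi-definite
```

The resulting G(z) can then be used as a Riemannian metric, e.g. for measuring latent curve lengths or computing geodesics; the paper's contribution is an efficient realization of the Wasserstein version of this pullback for general decoder distributions.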