Feature-Based Metrics for Exploring the Latent Space of Generative Models

12 Feb 2018 (modified: 05 May 2023), ICLR 2018 Workshop Submission
Abstract: Several recent papers have treated the latent space of deep generative models, e.g., GANs or VAEs, as a Riemannian manifold. The argument is that operations such as interpolation are better performed along geodesics that minimize path length not in the latent space but in the output space of the generator. However, this implicitly assumes that some simple metric such as L2 is meaningful in the output space, even though L2 is well known to be woefully inadequate for, e.g., semantic comparison of images. In this work, we consider imposing an arbitrary metric on the generator's output space and show both theoretically and experimentally that a feature-based metric can produce much more sensible interpolations than the usual L2 metric. This observation leads to the conclusion that analysis of latent space geometry would benefit from using a suitable, explicitly defined metric.
Keywords: latent space, metric, Riemannian, geodesic, interpolation, generative models
TL;DR: We propose that a feature-based metric provides a more meaningful structure for the latent space of a generative model than the usual L2 norm on the output manifold, and demonstrate this experimentally in the task of image interpolation using GANs.
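To make the central idea concrete, here is a minimal numeric sketch: the length of a latent-space path can be measured either under the usual L2 metric on the generator's output, or under a feature-based metric, i.e., L2 distance after mapping the output through a feature extractor. The toy generator `g` and feature map `phi` below are hypothetical stand-ins for illustration only, not the GANs/VAEs or deep feature networks used in the paper.

```python
import numpy as np

# Hypothetical toy generator g: R^2 (latent) -> R^3 (output), and a
# toy feature map phi: R^3 -> R^2. Both are illustrative stand-ins.
def g(z):
    return np.array([z[0], z[1], z[0] * z[1]])

def phi(x):
    return np.array([np.tanh(x[0] + x[2]), np.tanh(x[1] - x[2])])

def jacobian(f, z, eps=1e-5):
    """Finite-difference Jacobian of f at z."""
    f0 = f(z)
    J = np.zeros((f0.size, z.size))
    for i in range(z.size):
        dz = np.zeros_like(z)
        dz[i] = eps
        J[:, i] = (f(z + dz) - f0) / eps
    return J

def path_length(zs, f):
    """Discrete length of a latent path zs, measured after mapping through f."""
    pts = np.array([f(z) for z in zs])
    return np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))

# Straight-line interpolation in latent space.
zs = np.linspace([-1.0, -1.0], [1.0, 1.0], 50)

# Length under the usual L2 metric on the generator's output ...
len_l2 = path_length(zs, g)
# ... vs. length under a feature-based metric: L2 in feature space,
# i.e., the pullback of L2 through phi composed with g.
len_feat = path_length(zs, lambda z: phi(g(z)))

# The induced Riemannian metric at a latent point z is G(z) = J^T J,
# where J is the Jacobian of phi(g(.)) rather than of g alone.
z0 = np.array([0.5, 0.5])
J = jacobian(lambda z: phi(g(z)), z0)
G_feat = J.T @ J
```

A geodesic under the feature-based metric would then be a latent path minimizing this pulled-back length, rather than the L2 length in output space.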