Abstract: Riemannian geometry provides powerful tools to explore the latent space of generative models while preserving the inherent structure of the data. Distance and volume measures can be computed from a Riemannian metric defined by pulling back the Euclidean metric from the data to the latent manifold.
However, most generative models are stochastic, and so is the pullback metric. Manipulating stochastic objects is at best impractical, and at worst infeasible. To perform operations such as interpolation, or to measure the distance between data points, we need a deterministic approximation of the pullback metric.
In this work, we define a new metric as the expected norm derived from the stochastic pullback metric. We show that this norm defines a Finsler metric. We compare it with the norm induced by the expected pullback metric, and show that in high dimensions the two norms converge to each other at a rate of $\mathcal{O}\left(\frac{1}{D}\right)$.
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: Additional experiments were run to address the two main requests from the reviewers: (1) highlighting the difference between the Finslerian and Riemannian geodesics, especially in areas of high variance, and (2) showing latent spaces of more common datasets.
Assigned Action Editor: ~Bamdev_Mishra1
Submission Number: 825