Identifying latent distances with Finslerian geometry

Published: 07 Nov 2022, Last Modified: 07 Apr 2024, NeurReps 2022 Poster
Keywords: Finsler geometry, Riemannian geometry, Gaussian Processes, High-dimensional data, Latent space
TL;DR: A new metric is introduced to explore the latent space learnt by generative models, and it is compared with the commonly used expected Riemannian metric.
Abstract: Riemannian geometry has been shown to be useful for exploring the latent space of generative models. In effect, we can endow the latent space with the pullback metric obtained from the data space. Because most generative models are stochastic, this metric is itself stochastic, and, as a consequence, a deterministic approximation of the metric is required. Here, we define a new metric as the expectation of the stochastic curve lengths induced by the pullback metric. We show that this metric is, in fact, a Finsler metric. We compare it with a previously studied expected Riemannian metric, and we show that in high dimensions the two metrics converge to each other.
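
The abstract contrasts two deterministic summaries of the stochastic pullback metric: the expectation of the curve-length integrand, E[√(vᵀM(z)v)] (the Finsler candidate), and the norm under the expected metric, √(vᵀE[M(z)]v) (the expected Riemannian metric). The following is a minimal numerical sketch of that comparison, not the authors' implementation: the Gaussian-noise Jacobian model and the helper names `sample_jacobian` / `compare_metrics` are toy assumptions made for illustration.

```python
# Toy comparison of the Finsler-style metric F(v) = E[sqrt(v^T M v)] with the
# expected Riemannian metric R(v) = sqrt(v^T E[M] v), where M = J^T J is the
# stochastic pullback metric of a (stand-in) stochastic decoder.
import numpy as np

rng = np.random.default_rng(0)

def sample_jacobian(latent_dim, data_dim):
    """Stand-in stochastic decoder Jacobian: a fixed mean plus Gaussian noise."""
    mean_jac = rng.standard_normal((data_dim, latent_dim))
    return lambda: mean_jac + 0.5 * rng.standard_normal((data_dim, latent_dim))

def compare_metrics(latent_dim=2, data_dim=3, n_samples=10_000):
    draw = sample_jacobian(latent_dim, data_dim)
    v = rng.standard_normal(latent_dim)                 # tangent vector at a fixed z
    jacs = np.stack([draw() for _ in range(n_samples)]) # (S, D, L) Jacobian samples
    pullbacks = np.einsum("sij,sik->sjk", jacs, jacs)   # M = J^T J for each sample
    finsler = np.mean(np.sqrt(np.einsum("j,sjk,k->s", v, pullbacks, v)))
    riemann = np.sqrt(v @ pullbacks.mean(axis=0) @ v)
    return finsler, riemann

for d in (3, 30, 300):
    f, r = compare_metrics(data_dim=d)
    print(f"data_dim={d:4d}  Finsler={f:9.3f}  expected-Riemannian={r:9.3f}  ratio={f/r:.4f}")
```

In this sketch, Jensen's inequality (the square root is concave) guarantees that the Finsler value never exceeds the expected-Riemannian one, and the printed ratio approaches 1 as `data_dim` grows, mirroring the high-dimensional convergence claimed in the abstract.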
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:2212.10010/code)