Fisher-Rao and pullback Hilbert cone distances on the multivariate Gaussian manifold with applications to simplification and quantization of mixtures

Published: 18 Jun 2023, Last Modified: 23 Jun 2023, TAG-ML 2023 Poster
Keywords: Fisher-Rao distance; multivariate normal distributions; symmetric positive-definite cone; Hilbert projective distance; differential geometry; metric space
TL;DR: Arbitrarily fine approximation of the Fisher-Rao distance between normal distributions and a new fast pullback Hilbert cone distance with applications
Abstract: Data sets of multivariate normal distributions abound in many scientific areas like diffusion tensor medical imaging, structure tensor computer vision, radar signal processing, machine learning, etc. In order to process those data sets for downstream tasks like filtering, classification, or clustering, one needs to define proper notions of dissimilarities between normal distributions and paths joining them. The Fisher-Rao distance, defined as the Riemannian geodesic distance induced by the Fisher information metric, is such a principled distance; however, it is not known in closed form except in a few particular cases. We first report a fast and robust method to approximate arbitrarily finely the Fisher-Rao distance between multivariate normal distributions. Second, we introduce a distance based on a diffeomorphic embedding of the Gaussian manifold into a submanifold of the higher-dimensional symmetric positive-definite cone. We show that the projective Hilbert distance on the cone yields a metric on the embedded Gaussian submanifold, and we pull back that distance, together with the straight-line Hilbert cone geodesics, to obtain a distance and paths between normal distributions. Compared to the Fisher-Rao distance approximation, the pullback Hilbert cone distance is computationally light since it requires computing only the extreme eigenvalues of matrices. Finally, we show how to use those distances in clustering tasks.
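
To make the "extreme eigenvalues only" claim concrete, here is a minimal NumPy/SciPy sketch of a pullback Hilbert cone distance between two normals. It assumes a Calvo–Oller-style embedding of N(mu, Sigma) into the SPD cone of order d+1 and the standard Hilbert projective distance on that cone (log-ratio of the extreme generalized eigenvalues); the function names are illustrative and not taken from the paper or its supplementary code.

```python
import numpy as np
from scipy.linalg import eigvalsh

def embed_gaussian(mu, Sigma):
    # Assumed Calvo-Oller-style embedding of N(mu, Sigma) into the SPD cone
    # of order d+1: P = [[Sigma + mu mu^T, mu], [mu^T, 1]].
    d = len(mu)
    P = np.empty((d + 1, d + 1))
    P[:d, :d] = Sigma + np.outer(mu, mu)
    P[:d, d] = mu
    P[d, :d] = mu
    P[d, d] = 1.0
    return P

def hilbert_cone_distance(P, Q):
    # Hilbert projective distance on the SPD cone:
    #   d_H(P, Q) = log( lambda_max / lambda_min ),
    # where the lambdas are the generalized eigenvalues of the pencil (P, Q).
    # eigvalsh returns them in ascending order; only the two extremes are used.
    lam = eigvalsh(P, Q)
    return np.log(lam[-1] / lam[0])

# Usage example: distance between two bivariate normals.
mu0, Sigma0 = np.zeros(2), np.eye(2)
mu1, Sigma1 = np.array([1.0, 0.5]), np.array([[2.0, 0.3], [0.3, 1.0]])
d01 = hilbert_cone_distance(embed_gaussian(mu0, Sigma0), embed_gaussian(mu1, Sigma1))
print(d01)
```

In this sketch only the smallest and largest generalized eigenvalues matter, which is why the distance is cheaper to evaluate than an iterative approximation of the Fisher-Rao geodesic distance.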
Supplementary Materials: zip
Type Of Submission: Proceedings Track (8 pages)
Submission Number: 6