Abstract: Symmetric Positive Definite (SPD) matrices are pervasive in machine learning, from data features (such as covariance matrices) to optimization processes. These matrices induce a Riemannian structure, in which curvature plays a critical role in the success of geometry-aware approaches. Yet, for ML practitioners wanting to visualize SPD matrices, existing (flat) Euclidean approaches hide the curvature of the manifold.
To overcome this lack of expressivity in existing algorithms, we introduce Riemannian versions of two state-of-the-art techniques, namely t-SNE and Multidimensional Scaling. This allows us to reduce a set of $c \times c$ SPD matrices to a set of $2 \times 2$ SPD matrices, capturing the curvature information and avoiding the distortion induced by flattening the representation in a Euclidean setup. Moreover, our approaches pave the way for more general dimensionality reduction applications that preserve the geometry of the data. We perform experiments on controlled synthetic datasets to ensure that the low-dimensional representation preserves the geometric properties of both SPD Gaussians and geodesics. We also conduct experiments on various real datasets, such as video, anomaly detection, and brain-signal data.
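To make the core ingredient concrete, here is a minimal Python sketch of the geodesic distance on the SPD manifold and the resulting stress objective a Riemannian MDS would minimize. The use of the affine-invariant metric, and the names `spd_distance`, `stress`, `X_high`, and `X_low`, are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def _inv_sqrt(S: np.ndarray) -> np.ndarray:
    """Inverse square root of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return (V / np.sqrt(w)) @ V.T

def spd_distance(A: np.ndarray, B: np.ndarray) -> float:
    """Affine-invariant geodesic distance d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F.
    (One common choice of Riemannian metric on SPD matrices; the paper may use another.)"""
    A_is = _inv_sqrt(A)
    M = A_is @ B @ A_is                              # SPD matrix congruent to B
    w = np.linalg.eigvalsh(M)
    return float(np.sqrt(np.sum(np.log(w) ** 2)))    # ||log M||_F from eigenvalues

def stress(X_high: list, X_low: list) -> float:
    """MDS-style stress: mismatch between pairwise geodesic distances of the
    c x c inputs (X_high) and their 2 x 2 SPD embeddings (X_low)."""
    n = len(X_high)
    return sum(
        (spd_distance(X_high[i], X_high[j]) - spd_distance(X_low[i], X_low[j])) ** 2
        for i in range(n) for j in range(i + 1, n)
    )
```

Minimizing this stress over the $2 \times 2$ SPD embeddings (e.g., with a Riemannian optimizer that stays on the manifold) would yield a low-dimensional representation whose pairwise geodesic distances approximate those of the original $c \times c$ matrices.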
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Bamdev_Mishra1
Submission Number: 3338