Fast and Stable Riemannian Metrics on SPD Manifolds via Cholesky Product Geometry

ICLR 2026 Conference Submission 161 Authors

01 Sept 2025 (modified: 23 Dec 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Cholesky Decomposition, Symmetric Positive Definite (SPD), SPD Manifold, Riemannian Metrics, SPD Neural Networks
TL;DR: We reveal the product structure in the Cholesky manifold and propose two fast and stable SPD metrics, which enable the construction of classifiers and residual blocks in SPD neural networks.
Abstract: Recent advances in Symmetric Positive Definite (SPD) matrix learning show that Riemannian metrics are fundamental to effective SPD neural networks. Motivated by this, we revisit the geometry of Cholesky factors and uncover a simple product structure that enables convenient metric design. Building on this insight, we propose two fast and stable SPD metrics, the Power-Cholesky Metric (PCM) and the Bures-Wasserstein-Cholesky Metric (BWCM), derived via Cholesky decomposition. Compared with existing SPD metrics, the proposed metrics provide closed-form operators, computational efficiency, and improved numerical stability. We further apply our metrics to construct Riemannian Multinomial Logistic Regression (MLR) classifiers and residual blocks for SPD neural networks. Experiments on SPD deep learning, numerical stability analyses, and tensor interpolation demonstrate the effectiveness, efficiency, and robustness of our metrics.
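To make the Cholesky-based approach concrete: the abstract does not spell out the PCM or BWCM formulas, so the sketch below instead illustrates the general recipe the paper builds on, using the well-known Log-Cholesky distance (Lin, 2019) as a representative Cholesky-derived SPD metric. An SPD matrix is mapped to its lower-triangular Cholesky factor, whose strictly lower part is treated as Euclidean and whose positive diagonal is handled logarithmically; the function name `log_cholesky_distance` is our own, illustrative choice, not an identifier from the paper.

```python
# Minimal sketch of a Cholesky-derived SPD metric (illustrative only; this is
# the standard Log-Cholesky distance, not the paper's PCM or BWCM).
import numpy as np

def log_cholesky_distance(A: np.ndarray, B: np.ndarray) -> float:
    """Distance between SPD matrices A and B via their Cholesky factors."""
    L = np.linalg.cholesky(A)  # lower-triangular factor, L @ L.T == A
    M = np.linalg.cholesky(B)
    # Euclidean part: strictly lower-triangular entries.
    strict = np.linalg.norm(np.tril(L, -1) - np.tril(M, -1))
    # Log part: positive diagonal entries compared on a log scale.
    diag = np.linalg.norm(np.log(np.diag(L)) - np.log(np.diag(M)))
    return float(np.sqrt(strict**2 + diag**2))

# Usage: compare two SPD matrices built to be positive definite.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 4))
A = X @ X.T + 4 * np.eye(4)
B = A + 0.1 * np.eye(4)
print(log_cholesky_distance(A, B))
```

Because the Cholesky factorization and the entrywise operations above avoid eigendecomposition, metrics of this family admit closed-form computation and tend to be numerically stable, which is the efficiency argument the abstract makes for PCM and BWCM.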
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 161