Riemannian Networks over Full-Rank Correlation Matrices

Authors: ICLR 2026 Conference Submission 158 Authors (anonymous)

01 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Correlation matrices, Riemannian neural networks, Matrix manifolds, Riemannian manifolds
TL;DR: We extend core deep learning layers to correlation manifolds under five Riemannian geometries, introducing CorNets that outperform existing SPD and Grassmannian networks
Abstract: Representations on the Symmetric Positive Definite (SPD) manifold have garnered significant attention across a range of applications. In contrast, the manifold of full-rank correlation matrices, a normalized counterpart of SPD matrices, remains largely underexplored. This paper introduces Riemannian networks over the correlation manifold, leveraging five recently developed correlation geometries. We systematically extend Multinomial Logistic Regression (MLR), Fully Connected (FC), and convolutional layers to these geometries. Additionally, we present methods for accurate backpropagation under two of these geometries. Experiments comparing our approach against existing SPD and Grassmannian networks demonstrate its effectiveness.
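As context for the abstract: a full-rank correlation matrix is an SPD matrix with unit diagonal, obtained from any SPD matrix by normalizing out its diagonal scale, C = D^{-1/2} Σ D^{-1/2} with D = diag(Σ). Below is a minimal NumPy sketch of this standard projection; it is not code from the paper, and the function name `spd_to_correlation` is our own illustrative choice.

```python
import numpy as np

def spd_to_correlation(sigma: np.ndarray) -> np.ndarray:
    """Project an SPD matrix onto the manifold of full-rank correlation
    matrices: C = D^{-1/2} Sigma D^{-1/2}, where D = diag(Sigma).
    The result is SPD with unit diagonal, hence full rank.
    """
    d = np.sqrt(np.diag(sigma))       # per-coordinate standard deviations
    c = sigma / np.outer(d, d)        # normalize rows and columns
    np.fill_diagonal(c, 1.0)          # guard against floating-point drift
    return c

# Usage: build a random SPD matrix and map it to the correlation manifold.
rng = np.random.default_rng(0)
a = rng.standard_normal((4, 4))
sigma = a @ a.T + 4.0 * np.eye(4)     # SPD by construction
corr = spd_to_correlation(sigma)
assert np.allclose(np.diag(corr), 1.0)
```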
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 158