Equivariant Geodesic Networks: Geometry-Preserving Learning on Riemannian Manifolds

ICLR 2026 Conference Submission 19559 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Riemannian Manifolds, Equivariance, Symmetric Positive Definite (SPD) Matrices, Geometry-Preserving Networks, Equivariant Neural Networks
TL;DR: We introduce Equivariant Geodesic Networks (EGN), a geometry-preserving deep learning architecture that performs equivariant operations on Riemannian manifolds such as SPD spaces, achieving strong results on tasks such as EEG-based emotion recognition.
Abstract: Many high-dimensional data modalities—including covariance descriptors, diffusion tensors, and kernel matrices—naturally reside on Riemannian manifolds such as the space of Symmetric Positive Definite (SPD) matrices. However, conventional deep neural networks often fail to respect the intrinsic geometry of such data, leading to suboptimal representations and generalization. We introduce Equivariant Geodesic Networks (EGN), a novel architecture designed to operate directly on Riemannian manifolds while preserving key geometric properties. EGN incorporates manifold-consistent operations, including equivariant mappings, adaptive geometric bias, and structured low-rank updates that respect the underlying topology. Unlike existing methods that either flatten or project SPD data into Euclidean space, EGN directly learns on the manifold, preserving geometric consistency throughout. We provide theoretical analysis of the manifold-preserving properties of our layers and demonstrate significant empirical gains on tasks involving SPD-valued data, such as EEG-based emotion recognition and imagined speech classification. EGN outperforms existing Euclidean and pseudo-manifold baselines, offering a principled approach to end-to-end learning on Riemannian data manifolds.
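Illustration: EGN's layer definitions are not given on this page, but the equivariant-geodesic idea the abstract describes can be sketched with standard SPD geometry. Under the affine-invariant metric, the geodesic gamma(t) = A^{1/2} (A^{-1/2} B A^{-1/2})^t A^{1/2} between SPD matrices A and B is equivariant under any congruence transformation X -> G X G^T. The minimal Python sketch below checks this numerically; the helper names (spd_power, spd_geodesic, rand_spd) are ours for illustration, and this is a textbook construction, not EGN's actual implementation.

import numpy as np

def spd_power(M, p):
    # Matrix power of an SPD matrix via eigendecomposition; result stays real and symmetric.
    w, V = np.linalg.eigh(M)
    return (V * w**p) @ V.T

def spd_geodesic(A, B, t):
    # Affine-invariant geodesic: gamma(t) = A^{1/2} (A^{-1/2} B A^{-1/2})^t A^{1/2}.
    A_half, A_inv_half = spd_power(A, 0.5), spd_power(A, -0.5)
    return A_half @ spd_power(A_inv_half @ B @ A_inv_half, t) @ A_half

rng = np.random.default_rng(0)

def rand_spd(n):
    # Well-conditioned random SPD sample.
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A, B = rand_spd(4), rand_spd(4)
G = rng.standard_normal((4, 4))  # any invertible congruence (almost surely invertible)

lhs = spd_geodesic(G @ A @ G.T, G @ B @ G.T, 0.3)  # transform inputs, then take geodesic
rhs = G @ spd_geodesic(A, B, 0.3) @ G.T            # take geodesic, then transform output
print(np.max(np.abs(lhs - rhs)))  # ~1e-12: equivariance holds up to floating-point error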
Supplementary Material: zip
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 19559