Keywords: Hyperbolic, SPD, Symmetric Positive Definite, Grassmannian, Riemannian Manifolds, Riemannian Networks
TL;DR: We generalize transformation layers, such as fully connected and convolutional layers, to Riemannian spaces, and instantiate our framework on ten different geometries: three hyperbolic models, five SPD geometries, and two Grassmannian geometries.
Abstract: Recently, deep neural networks on manifold-valued representations have garnered significant attention in various machine learning applications. One recent focus is to generalize Euclidean Fully Connected (FC) and convolutional layers to non-Euclidean geometries. However, previous approaches typically focus on a few selected manifolds and rely on the specific properties of the target manifold. In contrast, this work proposes a framework for constructing FC and convolutional layers over computationally tractable Riemannian spaces, using only Riemannian geometry. This framework incorporates several previous FC layers across different geometries as special cases, and is instantiated over ten representative manifolds, including three hyperbolic models, five geometries of the Symmetric Positive Definite (SPD) manifold, and two Grassmannian perspectives. Experiments on different manifolds demonstrate the effectiveness and applicability of our approach.
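A common way to build a manifold-valued FC layer of the kind the abstract describes is to map points to a tangent space, apply a Euclidean linear map, and map back with the exponential map. The sketch below illustrates this generic construction on the Poincaré ball (curvature -1); it is a minimal illustration of the idea, not the paper's actual layer, and the function names (`expmap0`, `logmap0`, `hyperbolic_fc`) are chosen here for exposition.

```python
import numpy as np

def expmap0(v, eps=1e-9):
    # Exponential map at the origin of the Poincare ball (curvature -1):
    # exp_0(v) = tanh(||v||) * v / ||v||
    n = np.linalg.norm(v, axis=-1, keepdims=True)
    return np.tanh(n) * v / np.maximum(n, eps)

def logmap0(x, eps=1e-9):
    # Logarithmic map at the origin (inverse of expmap0):
    # log_0(x) = artanh(||x||) * x / ||x||
    n = np.linalg.norm(x, axis=-1, keepdims=True)
    return np.arctanh(np.clip(n, 0.0, 1.0 - eps)) * x / np.maximum(n, eps)

def hyperbolic_fc(x, W, b):
    # Tangent-space FC layer: pull points to the tangent space at the
    # origin, apply a Euclidean affine map, and push back to the ball.
    return expmap0(logmap0(x) @ W.T + b)

rng = np.random.default_rng(0)
x = expmap0(rng.normal(size=(4, 3)))      # batch of 4 points in the 3-ball
W = 0.1 * rng.normal(size=(2, 3))         # hypothetical weights
b = np.zeros(2)
y = hyperbolic_fc(x, W, b)                # shape (4, 2), all points inside the ball
```

Because `tanh` maps every tangent vector to a point of norm strictly less than 1, the output of `hyperbolic_fc` always stays inside the ball, regardless of `W` and `b`.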
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 164