Curvature-Dimension Tradeoff for Generalization in Hyperbolic Space

Published: 07 Nov 2023 · Last Modified: 13 Dec 2023 · M3L 2023 Poster
Keywords: Geometric embeddings, Hyperbolic neural networks, Generalization bounds, Negative curvature, Gromov-hyperbolic geometries
TL;DR: We prove the first generalization bounds for models in negatively curved geometry, showing that embedding curvature and dimension play interchangeable roles in generalization.
Abstract: The inclusion of task-relevant geometric embeddings in deep learning models is an important emerging direction of research, particularly when using hierarchical data. For instance, negatively curved geometries such as hyperbolic spaces are known to allow low-distortion embedding of tree-like hierarchical structures, which Euclidean spaces do not afford. Learning techniques for hyperbolic spaces, such as Hyperbolic Neural Networks (HNNs), have shown empirical accuracy improvements over classical Deep Neural Networks in tasks involving semantic or multi-scale information, such as recommender systems or molecular generation. However, no research has investigated generalization properties specific to such geometries. In this work, we introduce the first generalization bounds for learning tasks in hyperbolic spaces. We highlight a previously unnoticed and important difference from Euclidean embedding models: under embeddings into spaces of negative curvature $-\kappa<0$ and dimension $d$, only the product $\sqrt{\kappa}\,d$ influences the generalization bounds. Hence, the curvature parameter of the space can be varied at fixed $d$ with the same effect on generalization as when varying $d$.
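To make the stated tradeoff concrete, here is a minimal numerical sketch. It assumes a complexity term that scales like $\sqrt{\kappa}\,d/\sqrt{n}$ for $n$ training samples; the abstract only states that the product $\sqrt{\kappa}\,d$ is what enters the bounds, so the exact form, constants, and the function name below are hypothetical illustrations, not the paper's actual bound:

```python
import math

def hyperbolic_complexity_term(kappa: float, d: int, n: int) -> float:
    """Hypothetical complexity term scaling as sqrt(kappa) * d / sqrt(n).

    Illustrates the claim that only the product sqrt(kappa) * d
    affects generalization; the paper's bound may include further
    constants and terms not modeled here.
    """
    return math.sqrt(kappa) * d / math.sqrt(n)

n = 10_000  # number of training samples (arbitrary choice)

# Two embedding choices with the same product sqrt(kappa) * d:
# sqrt(1) * 8 == sqrt(4) * 4 == 8, so the terms coincide.
low_curv_high_dim = hyperbolic_complexity_term(kappa=1.0, d=8, n=n)
high_curv_low_dim = hyperbolic_complexity_term(kappa=4.0, d=4, n=n)

print(low_curv_high_dim, high_curv_low_dim)  # identical values
assert math.isclose(low_curv_high_dim, high_curv_low_dim)
```

Under this reading, halving the embedding dimension while quadrupling the curvature magnitude leaves the generalization term unchanged, which is the interchangeability the abstract describes.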
Submission Number: 81