Ergodic and Subhomogeneous Dynamics in Hyperbolic Neural Networks

Published: 03 Feb 2026, Last Modified: 03 Feb 2026 · AISTATS 2026 Poster · CC BY 4.0
TL;DR: A unified, model-invariant framework shows why (and when) hyperbolic neural networks built from subhomogeneous layers stay stable and don’t blow up, even with noise.
Abstract: We analyze the long-term behavior of hyperbolic neural networks through subhomogeneous layer maps, focusing on stability, growth control, and robustness under stochastic perturbations. This work unifies the standard hyperbolic models via explicit isometries and Möbius operations, allowing statements to be transported across representations without loss of geometric meaning. Within this model-invariant view, we study iterated, noise-perturbed transformations and develop an ergodic-theoretic framework that characterizes their asymptotic behavior, including conditions that promote stability and convergence of averaged iterates. Beyond theory, these insights inform practical design choices for training procedures that remain well behaved in the presence of noise and avoid unbounded parameter growth, thereby supporting more reliable use of hyperbolic representations in hierarchical and graph-structured learning tasks.
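As a concrete illustration of the Möbius operations the abstract refers to, the sketch below implements standard Möbius addition on the Poincaré ball with curvature −1 (the classical Ungar formula). This is a generic reference implementation for context, not the paper's own code; the function name and NumPy-based style are our assumptions.

```python
import numpy as np

def mobius_add(x, y):
    """Möbius addition x ⊕ y on the Poincaré ball (curvature -1).

    Standard formula; shown only to illustrate the kind of operation
    used to move between hyperbolic models. Inputs must have norm < 1.
    """
    xy = np.dot(x, y)          # inner product <x, y>
    x2 = np.dot(x, x)          # squared norm ||x||^2
    y2 = np.dot(y, y)          # squared norm ||y||^2
    num = (1 + 2 * xy + y2) * x + (1 - x2) * y
    den = 1 + 2 * xy + x2 * y2
    return num / den

# Basic sanity checks: 0 is the identity, and -x is the inverse of x.
x = np.array([0.3, -0.1])
y = np.array([0.2, 0.4])
zero = np.zeros(2)
print(np.allclose(mobius_add(zero, y), y))        # identity element
print(np.allclose(mobius_add(x, -x), zero))       # left inverse
print(np.linalg.norm(mobius_add(x, y)) < 1.0)     # stays inside the ball
```

Because Möbius addition keeps points inside the unit ball, iterating such layer maps cannot push representations outside the model, which is the kind of growth control the paper's stability analysis formalizes.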
Submission Number: 1331