Lorentz Direct Concatenation for Stable Training in Hyperbolic Neural Networks

Published: 07 Nov 2022, Last Modified: 05 May 2023 — NeurReps 2022 Poster
Keywords: hyperbolic neural networks, geometric deep learning, concatenation, numerical stability
Abstract: Hyperbolic neural networks have achieved considerable success in extracting representations from hierarchical or tree-like data. However, they are known to suffer from numerical instability, which makes it difficult to build networks with deep hyperbolic layers, whether the Poincaré or the Lorentz coordinate system is used. In this note, we study the crucial operation of concatenating hyperbolic representations. We propose the Lorentz direct concatenation and illustrate that it is much more stable than concatenating in the tangent space. We provide some insights and demonstrate the superiority of performing direct concatenation on real tasks.
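
For illustration, below is a minimal sketch of how a Lorentz direct concatenation of this kind could be realized, assuming the hyperboloid (Lorentz) model with curvature -1 and PyTorch tensors; the function and variable names are ours, not taken from the paper. The space-like components of the inputs are concatenated as-is, and the time-like component is recomputed so that the output again satisfies the hyperboloid constraint, avoiding the logarithmic and exponential maps needed when concatenating in the tangent space.

```python
import torch


def lorentz_direct_concat(points):
    """Concatenate points given on the Lorentz model of curvature -1.

    Each point x = (x_t, x_s) in R^{n_i + 1} is assumed to satisfy
    -x_t^2 + ||x_s||^2 = -1. The space-like parts are concatenated directly
    and the time-like part is recomputed so the result stays on the hyperboloid.
    """
    space = torch.cat([p[..., 1:] for p in points], dim=-1)
    time = torch.sqrt(1.0 + space.pow(2).sum(dim=-1, keepdim=True))
    return torch.cat([time, space], dim=-1)


if __name__ == "__main__":
    def lift(v):
        # Lift a Euclidean vector (batch, n) onto the hyperboloid (batch, n + 1).
        t = torch.sqrt(1.0 + v.pow(2).sum(dim=-1, keepdim=True))
        return torch.cat([t, v], dim=-1)

    x, y = lift(torch.randn(4, 3)), lift(torch.randn(4, 5))
    z = lorentz_direct_concat([x, y])
    # The Minkowski inner product <z, z>_L should equal -1 for every row.
    mink = -z[..., 0] ** 2 + z[..., 1:].pow(2).sum(dim=-1)
    print(mink)  # each entry should be close to -1.0
```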