Autoencoding Hyperbolic Representation for Adversarial Generation

Published: 08 Nov 2024, Last Modified: 08 Nov 2024. Accepted by TMLR. License: CC BY 4.0
Abstract: With the recent advance of geometric deep learning, neural networks have been extensively used for data in non-Euclidean domains. In particular, hyperbolic neural networks have proved successful in processing the hierarchical information of data. However, many hyperbolic neural networks are numerically unstable during training, which precludes using complex architectures. This crucial problem makes it difficult to build hyperbolic generative models for real and complex data. In this work, we propose a hyperbolic generative network with a novel architecture and layers designed to improve training stability. Our proposed network contains a hyperbolic autoencoder (AE) that produces hyperbolic embeddings of the input data and a hyperbolic generative adversarial network (GAN) that generates the AE's hyperbolic latent embeddings from simple noise. The final generator combines the decoder of the AE with the generator of the GAN. This architecture fosters expressive and numerically stable representations in hyperbolic space. Theoretically, we validate GAN training in hyperbolic space and prove the stability of the hyperbolic layers used in the AE. Experiments show that our model can generate tree-like graphs as well as complex molecular data with comparable structure-related performance.
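Since the abstract describes a two-stage pipeline (an AE that embeds data in hyperbolic space, a GAN trained over that latent space, and a sampler that composes the GAN generator with the AE decoder), a minimal PyTorch-style sketch of that composition is given below. All module names, dimensions, and the plain Euclidean placeholder layers are assumptions for illustration only; the actual model operates in hyperbolic space with the authors' stable hyperbolic layers (see the linked repository for the real implementation).

```python
# Hypothetical sketch of the two-stage pipeline described in the abstract:
# (1) train an autoencoder on the data, (2) train a GAN to generate the
# autoencoder's latent codes from simple noise, (3) sample by chaining the
# GAN generator with the frozen decoder. Placeholder Euclidean layers stand
# in for the paper's hyperbolic layers.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Placeholder: maps data to latent codes (hyperbolic in the paper)."""
    def __init__(self, in_dim, latent_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(),
                                 nn.Linear(128, latent_dim))
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Placeholder: maps latent codes back to data space."""
    def __init__(self, latent_dim, out_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(),
                                 nn.Linear(128, out_dim))
    def forward(self, z):
        return self.net(z)

class LatentGenerator(nn.Module):
    """Placeholder: maps simple noise to latent codes (GAN generator)."""
    def __init__(self, noise_dim, latent_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(noise_dim, 128), nn.ReLU(),
                                 nn.Linear(128, latent_dim))
    def forward(self, eps):
        return self.net(eps)

def sample(generator, decoder, n, noise_dim=64):
    """Sampling step: compose the GAN generator with the frozen AE decoder."""
    eps = torch.randn(n, noise_dim)   # simple Gaussian noise
    z_fake = generator(eps)           # generated latent embedding
    return decoder(z_fake)            # decoded sample in data space
```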
Submission Length: Regular submission (no more than 12 pages of main content)
Code: https://github.com/EricZQu/HAEGAN
Supplementary Material: zip
Assigned Action Editor: ~Stefano_Teso1
Submission Number: 2491