Keywords: unsupervised learning, hyperbolic embedding, hierarchical representation
TL;DR: We incorporate geometric properties of the embedding space to learn better hyperbolic representations
Abstract: Hyperbolic embeddings are a class of representation-learning methods that offer competitive performance when the data can be abstracted as a tree-like graph. In practice, however, learning hyperbolic embeddings of hierarchical data is difficult because the geometry of hyperbolic space differs from that of Euclidean space. To address these difficulties, we first categorize three kinds of illness that harm embedding quality. We then develop a geometry-aware algorithm to tackle these illnesses, built on three components: a dilation operation, a transitive-closure regularization, and an improved negative-sampling strategy. We validate these techniques empirically and present a theoretical analysis of the mechanism behind the dilation operation. Experiments on synthetic and real-world datasets demonstrate the superior performance of our algorithm.
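To make the geometric setting concrete, the sketch below shows the standard Poincaré-ball geodesic distance together with one plausible reading of a "dilation": radially rescaling a point's hyperbolic norm about the origin. This is an illustrative assumption, not the paper's definition — the authors' actual operation, loss, and regularizers are described in the full text.

```python
import numpy as np

def poincare_dist(u, v, eps=1e-9):
    """Geodesic distance between points u, v in the Poincare ball (norm < 1)."""
    uu, vv = np.dot(u, u), np.dot(v, v)
    duv = np.dot(u - v, u - v)
    arg = 1.0 + 2.0 * duv / ((1.0 - uu) * (1.0 - vv) + eps)
    return np.arccosh(arg)

def dilate(x, s, eps=1e-9):
    """Hypothetical radial dilation: scale the hyperbolic norm of x by s.

    A point at Euclidean norm r sits at hyperbolic distance 2*artanh(r)
    from the origin; multiplying that distance by s maps r to tanh(s*artanh(r)).
    """
    r = np.linalg.norm(x)
    if r < eps:
        return x.copy()
    return (x / r) * np.tanh(s * np.arctanh(r))
```

Under this reading, dilation pushes points toward the boundary of the ball, where hyperbolic volume grows exponentially — the region naturally suited to embedding the leaves of a tree.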