Fully Hyperbolic Representation Learning on Knowledge Hypergraph

23 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Representation Learning, Hyperbolic Space, Knowledge Hypergraph
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Knowledge hypergraphs generalize knowledge graphs by using hyperedges to connect multiple entities and represent the complicated relations among them. Existing methods either transform hyperedges into an easier-to-handle set of binary relations or treat hyperedges as isolated and ignore their adjacencies. Both approaches incur information loss and may lead to sub-optimal models. To address these issues, we propose the Hyperbolic Hypergraph GNN (H2GNN), whose essential component is hyper-star message passing, a novel scheme motivated by a lossless expansion of hyperedges into hierarchies; it implements a direct embedding that explicitly takes adjacent hyperedges and entity positions into account. As the name suggests, H2GNN operates in the fully hyperbolic space, which further reduces distortion and boosts efficiency. We compare H2GNN with 15 baselines on both homogeneous and heterogeneous knowledge hypergraphs, and it outperforms state-of-the-art approaches on both node classification and link prediction tasks.
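To make the idea of expanding hyperedges concrete, the sketch below shows one common way such an expansion can be realized: each hyperedge becomes an auxiliary node linked to its member entities, with each entity's position in the tuple kept as an edge attribute so no information from the original relation is dropped. This is an illustrative assumption, not the authors' implementation; the function name `star_expand` and the data layout are hypothetical.

```python
from collections import namedtuple

# A hyperedge connects several entities under one relation,
# e.g. ("flight", ["Paris", "Berlin", "LH1234"]).
HyperEdge = namedtuple("HyperEdge", ["relation", "entities"])


def star_expand(hyperedges):
    """Expand hyperedges into a bipartite (star) graph.

    Returns (hyperedge_node, entity, position) triples, so that adjacent
    hyperedges become visible through shared entity nodes and the position
    of every entity within its original tuple is preserved.
    """
    edges = []
    for h_id, h in enumerate(hyperedges):
        h_node = f"rel:{h.relation}#{h_id}"      # one node per hyperedge instance
        for pos, entity in enumerate(h.entities):
            edges.append((h_node, entity, pos))  # keep the entity's position
    return edges


if __name__ == "__main__":
    hes = [
        HyperEdge("flight", ["Paris", "Berlin", "LH1234"]),
        HyperEdge("flight", ["Berlin", "Rome", "AZ401"]),
    ]
    for e in star_expand(hes):
        print(e)
```

Message passing over this expanded structure (entity-to-hyperedge and hyperedge-to-entity) is one way to propagate information between hyperedges that share entities; H2GNN's hyper-star scheme additionally performs these operations in fully hyperbolic space, as described in the abstract.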
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 7160