Structure Is All You Need: Structural Representation Learning on Hyper-Relational Knowledge Graphs

Published: 01 May 2025 · Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY-NC-SA 4.0
TL;DR: Thoroughly leveraging the structure of a hyper-relational knowledge graph (HKG) is crucial for reasoning on HKGs, and a purely structure-based representation learning method can achieve state-of-the-art performance on various link prediction tasks.
Abstract: Hyper-relational knowledge graphs (HKGs) enrich knowledge graphs by extending a triplet to a hyper-relational fact, where a set of qualifiers adds auxiliary information to the triplet. While many HKG representation learning methods have been proposed, they often fail to effectively utilize the HKG's structure. This paper demonstrates that thoroughly leveraging the structure of an HKG is crucial for reasoning on HKGs and that a purely structure-based representation learning method can achieve state-of-the-art performance on various link prediction tasks. We propose MAYPL, which learns to initialize representation vectors based on the structure of an HKG and employs attentive neural message passing consisting of fact-level message computation and entity-centric and relation-centric aggregations, thereby computing the representations based solely on the structure. Due to its structure-driven learning, MAYPL can conduct inductive inference on new entities and relations. MAYPL outperforms 40 knowledge graph completion methods on 10 datasets, where different baseline methods are compared on different datasets so that MAYPL is evaluated from diverse perspectives.
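To make the described architecture concrete, below is a minimal, hypothetical PyTorch sketch of one layer of attentive message passing over hyper-relational facts: each fact's constituents (head, relation, tail, and qualifier pairs) are pooled into a fact-level message, which is then aggregated back into entity and relation representations with softmax attention. The class name FactMessagePassing and all module choices are illustrative assumptions, not MAYPL's actual implementation (see the linked code for that).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FactMessagePassing(nn.Module):
    """Hypothetical sketch of one attentive message-passing layer over
    hyper-relational facts: fact-level message computation followed by
    entity-centric and relation-centric aggregation."""
    def __init__(self, dim):
        super().__init__()
        self.fact_proj = nn.Linear(dim, dim)  # fact-level message computation
        self.ent_attn = nn.Linear(dim, 1)     # entity-centric attention score
        self.rel_attn = nn.Linear(dim, 1)     # relation-centric attention score

    def forward(self, ent, rel, facts):
        # facts: list of (head, relation, tail, [(qual_rel, qual_ent), ...])
        ent_msgs = [[] for _ in range(ent.size(0))]
        rel_msgs = [[] for _ in range(rel.size(0))]
        for h, r, t, quals in facts:
            # Fact-level message: pool every element appearing in the fact.
            parts = [ent[h], rel[r], ent[t]]
            for qr, qe in quals:
                parts += [rel[qr], ent[qe]]
            msg = torch.tanh(self.fact_proj(torch.stack(parts).mean(0)))
            # Route the message to every entity and relation in the fact.
            ent_msgs[h].append(msg)
            ent_msgs[t].append(msg)
            rel_msgs[r].append(msg)
            for qr, qe in quals:
                rel_msgs[qr].append(msg)
                ent_msgs[qe].append(msg)

        def aggregate(old, msgs, attn):
            # Entity-/relation-centric aggregation with softmax attention.
            new = old.clone()
            for i, ms in enumerate(msgs):
                if ms:
                    m = torch.stack(ms)
                    w = F.softmax(attn(m).squeeze(-1), dim=0)
                    new[i] = old[i] + (w.unsqueeze(-1) * m).sum(0)
            return new

        return (aggregate(ent, ent_msgs, self.ent_attn),
                aggregate(rel, rel_msgs, self.rel_attn))
```

For example, applying `FactMessagePassing(32)` to entity and relation matrices of width 32 with a single fact `(0, 1, 2, [(2, 3)])` updates the vectors of entities 0, 2, and 3 and relations 1 and 2 in one pass; note that no per-entity parameters are learned, which is what permits the structure-driven, inductive behavior the abstract describes.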
Lay Summary: Hyper-relational knowledge graphs (HKGs) represent human knowledge using facts, each of which consists of a triplet and a set of qualifiers that provides auxiliary information about the triplet. While many representation learning methods have been proposed for HKGs, these methods often fail to effectively utilize the rich structural information of HKGs. We introduce MAYPL, the first structure-oriented representation learning method for HKGs. MAYPL computes entity and relation representations by capturing their interconnections, co-occurrences, and positions, which are then refined by considering the specific facts to which the entities and relations belong. Experimental results show that MAYPL achieves state-of-the-art performance on various link prediction tasks, demonstrating that thoroughly learning and exploiting the structure of an HKG is necessary and sufficient for learning representations on HKGs.
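As a rough illustration of why structure-based initialization enables inductive inference on new entities, here is a hypothetical sketch in which entity vectors are computed by a shared MLP from simple structural features (occurrence counts as head, tail, or qualifier value) instead of being looked up in a per-entity embedding table. The class StructuralInit and the feature choice are assumptions for illustration only; MAYPL's learned initialization is more elaborate.

```python
import torch
import torch.nn as nn

class StructuralInit(nn.Module):
    """Hypothetical structure-based initializer: entity vectors come from a
    shared MLP over structural features, so entities unseen at training time
    still receive representations in a new graph."""
    def __init__(self, dim):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(3, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, facts, num_ent):
        # Count each entity's appearances as head, tail, or qualifier value.
        feats = torch.zeros(num_ent, 3)
        for h, r, t, quals in facts:
            feats[h, 0] += 1
            feats[t, 1] += 1
            for qr, qe in quals:
                feats[qe, 2] += 1
        # Map log-scaled counts to initial embeddings with shared weights.
        return self.mlp(torch.log1p(feats))
```

Because the initializer has no entity-specific parameters, the same trained weights can be applied to a graph with entirely new entities, matching the inductive setting the summary refers to.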
Link To Code: https://github.com/bdi-lab/MAYPL/
Primary Area: Deep Learning->Graph Neural Networks
Keywords: Hyper-Relational Knowledge Graph, Knowledge Graph Representation Learning, Knowledge Graph Completion, Graph Neural Networks, Inductive Link Prediction
Submission Number: 9983