Hierarchical Position Embedding of Graphs with Landmarks and Clustering for Link Prediction

Published: 23 Jan 2024, Last Modified: 23 May 2024, TheWebConf24
Keywords: Link Prediction, Network Science, Graph Neural Networks
TL;DR: We propose a hierarchical representation of positional information using a set of representative nodes called landmarks combined with graph clustering, and show that our method achieves superior link prediction performance on various datasets.
Abstract: Learning positional information of nodes in a graph is important for link prediction tasks. We propose a representation of positional information using representative nodes called landmarks. A small number of nodes with high degree centrality are selected as landmarks, which serve as reference points for the nodes' positions. We justify this selection strategy for well-known random graph models, and derive closed-form bounds on the average path lengths involving landmarks. In a model for scale-free networks, we prove that landmarks provide asymptotically exact information on inter-node distances. We apply these theoretical insights to practical networks and propose Hierarchical Position embedding with Landmarks and Clustering (HPLC). HPLC combines graph clustering and landmark selection: the graph is partitioned into densely connected clusters, and the node with the highest degree in each cluster is selected as its landmark. HPLC leverages the positional information of nodes based on landmarks at various levels of hierarchy, such as nodes' distances to landmarks, inter-landmark distances, and the hierarchical grouping of clusters. Experiments show that HPLC achieves state-of-the-art link prediction performance on various datasets in terms of HIT@K, MRR, and AUC.
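The abstract outlines the core construction: partition the graph into clusters, take the highest-degree node of each cluster as a landmark, and encode each node's position through its distances to landmarks plus the inter-landmark distances. Below is a minimal sketch of that idea, assuming an unweighted NetworkX graph and using greedy modularity communities as a stand-in for the paper's clustering step; the hierarchical grouping of clusters and the downstream GNN integration of HPLC are not reproduced here.

```python
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def landmark_position_features(G):
    """Sketch of landmark-based positional features (simplified view of HPLC):
    cluster the graph, pick the highest-degree node per cluster as a landmark,
    and represent each node by its shortest-path distances to all landmarks."""
    # Partition into densely connected clusters (modularity clustering is an
    # assumed stand-in for the clustering method used in the paper).
    clusters = greedy_modularity_communities(G)

    # Landmark = node with the highest degree within each cluster.
    landmarks = [max(c, key=G.degree) for c in clusters]

    # Node-to-landmark distances serve as reference points for node positions.
    idx = {v: i for i, v in enumerate(G.nodes())}
    feats = np.full((G.number_of_nodes(), len(landmarks)), np.inf)
    for j, lm in enumerate(landmarks):
        for v, d in nx.single_source_shortest_path_length(G, lm).items():
            feats[idx[v], j] = d

    # Inter-landmark distances capture the coarse, cluster-level geometry.
    inter = np.array([
        [nx.shortest_path_length(G, a, b) if nx.has_path(G, a, b) else np.inf
         for b in landmarks]
        for a in landmarks
    ])
    return feats, inter, landmarks

# Example usage on a small scale-free graph.
G = nx.barabasi_albert_graph(200, 3, seed=0)
feats, inter, landmarks = landmark_position_features(G)
print(feats.shape, inter.shape, landmarks[:3])
```

In this simplified sketch the distance features would be fed to a link predictor alongside node features; the paper's method additionally exploits the hierarchy of clusters, which is omitted here.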
Track: Graph Algorithms and Learning for the Web
Submission Guidelines Scope: Yes
Submission Guidelines Blind: Yes
Submission Guidelines Format: Yes
Submission Guidelines Limit: Yes
Submission Guidelines Authorship: Yes
Student Author: Yes
Submission Number: 351