Personalized PageRank meets Graph Attention Networks

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Desk Rejected Submission
Keywords: GNN, Personalized PageRank, Graph Attention Network, Graph Neural Network
Abstract: Interest in graph neural networks (GNNs) for representation learning has risen sharply over the past few years. GNNs provide a general and efficient framework for learning from graph-structured data. However, a GNN typically aggregates information only from a very limited neighborhood of each node, and a larger neighborhood would provide the model with more information. Increasing the neighborhood size is not trivial, since neighborhood aggregation over many layers leads to over-smoothing. In this work, we incorporate the limit distribution of Personalized PageRank (PPR) into graph attention networks (GATs) to address this issue. Intuitively, message aggregation based on Personalized PageRank corresponds to infinitely many neighborhood aggregation layers. We show that our models outperform a variety of baseline models across all datasets used in our experiments. Our implementation is publicly available online.
One-sentence Summary: Personalized PageRank meets Graph Attention Networks.
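The abstract describes replacing many stacked neighborhood-aggregation layers with the limit distribution of Personalized PageRank. The following is a minimal sketch of that general idea, assuming the standard PPNP-style closed form pi = alpha * (I - (1 - alpha) * A_hat)^{-1} applied to the output of an attention layer; the function names, the teleport value alpha = 0.1, and the toy graph are illustrative assumptions, not the authors' published implementation.

# Minimal sketch (not the authors' code): the closed-form PPR limit
# distribution stands in for infinitely many aggregation layers,
# in the spirit of PPNP/APPNP. All names and values below are assumptions.
import numpy as np

def ppr_limit_matrix(adj: np.ndarray, alpha: float = 0.1) -> np.ndarray:
    """Dense PPR limit distribution: alpha * (I - (1 - alpha) * A_hat)^-1,
    where A_hat is the symmetrically normalized adjacency with self-loops."""
    n = adj.shape[0]
    a_tilde = adj + np.eye(n)                      # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_tilde.sum(axis=1)))
    a_hat = d_inv_sqrt @ a_tilde @ d_inv_sqrt      # symmetric normalization
    return alpha * np.linalg.inv(np.eye(n) - (1.0 - alpha) * a_hat)

# Usage: propagate per-node representations H (e.g. the output of a
# GAT-style attention layer) with the PPR matrix instead of repeated
# message passing over many layers.
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 1],
                [0, 1, 0, 1],
                [0, 1, 1, 0]], dtype=float)        # toy undirected graph
h = rng.normal(size=(4, 3))                        # toy node representations
z = ppr_limit_matrix(adj) @ h                      # "infinite-layer" aggregation
print(z.shape)                                     # (4, 3)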