Improving Graph Neural Networks at Scale: Combining Approximate PageRank and CoreRank

Published: 22 Nov 2022, Last Modified: 03 Nov 2024
NeurIPS 2022 GLFrontiers Workshop
Keywords: Graph Neural Networks, PageRank, CoreRank, Node Classification, Message Passing Mechanisms
TL;DR: We propose CorePPR, a scalable, high-performing Graph Neural Network that combines personalised PageRank with CoreRank scores in the message-passing step.
Abstract: Graph Neural Networks (GNNs) have achieved great success in many learning tasks performed on graph structures. Nonetheless, GNNs propagate information through a message passing scheme that can become prohibitively expensive when working with industrial-scale graphs. Inspired by the PPRGo model, we propose the CorePPR model, a scalable solution that utilises a learnable convex combination of the approximate personalised PageRank and the CoreRank to diffuse multi-hop neighbourhood information in GNNs. Additionally, we incorporate a dynamic mechanism to select the most influential neighbours of a given node, which reduces training time while preserving the performance of the model. Overall, we demonstrate that CorePPR outperforms PPRGo, particularly on large graphs, where selecting the most influential nodes is especially relevant for scalability. Our code is publicly available at: https://github.com/arielramos97/CorePPR.
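The diffusion step described in the abstract — a learnable convex combination of approximate personalised PageRank and CoreRank scores, followed by selection of the most influential neighbours per node — can be sketched roughly as below. This is an illustrative reconstruction, not the authors' implementation: the dense score matrices, the combination weight `alpha`, the top-`k` selection, and the function name `combined_diffusion` are all assumptions for the sake of the example.

```python
import numpy as np

def combined_diffusion(ppr, corerank, features, alpha, k):
    """Sketch of a CorePPR-style aggregation (illustrative, not the paper's code).

    ppr, corerank: (n, n) row-normalised importance-score matrices
    features:      (n, d) node representations (e.g. MLP outputs)
    alpha:         scalar in [0, 1], the learnable convex-combination weight
    k:             number of most influential neighbours kept per node
    """
    # Learnable convex combination of the two importance scores
    scores = alpha * ppr + (1.0 - alpha) * corerank

    # Keep only the top-k entries per row (the most influential neighbours)
    sparse = np.zeros_like(scores)
    for i in range(scores.shape[0]):
        top = np.argpartition(scores[i], -k)[-k:]
        sparse[i, top] = scores[i, top]

    # Renormalise rows so each node's neighbour weights sum to 1
    row_sums = sparse.sum(axis=1, keepdims=True)
    sparse = sparse / np.maximum(row_sums, 1e-12)

    # Diffuse features over the selected neighbours
    return sparse @ features
```

In the actual model the combination weight would be learned jointly with the network parameters, and the scores would be held sparsely rather than as dense `n × n` matrices, since materialising dense matrices defeats the scalability goal on large graphs.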