Affinity-Aware Graph Networks

Published: 21 Sept 2023, Last Modified: 07 Nov 2023, NeurIPS 2023 poster
Keywords: graph neural networks, message passing, effective resistance, hitting time
TL;DR: We propose the use of affinity measures (e.g., effective resistances, resistive embeddings, hitting times) as features in graph neural networks and show that they provide good empirical performance.
Abstract: Graph Neural Networks (GNNs) have emerged as a powerful technique for learning on relational data. Owing to the relatively limited number of message passing steps they perform, and hence a smaller receptive field, there has been significant interest in improving their expressivity by incorporating structural aspects of the underlying graph. In this paper, we explore the use of affinity measures as features in graph neural networks, in particular measures arising from random walks, including effective resistances, hitting times, and commute times. We propose message passing networks based on these features and evaluate their performance on a variety of node and graph property prediction tasks. Our architecture has low computational complexity, and our features are invariant to permutations of the underlying graph. The measures we compute allow the network to exploit the connectivity properties of the graph, enabling it to outperform relevant benchmarks on a wide variety of tasks, often with significantly fewer message passing steps. On one of the largest publicly available graph regression datasets, OGB-LSC-PCQM4Mv1, we obtain the best known single-model validation MAE at the time of writing.
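
As a rough illustration of the kind of affinity measure the abstract refers to, the sketch below computes effective resistances and commute times from the pseudoinverse of the graph Laplacian and attaches them to edges, using the standard identities R(u, v) = L⁺[u,u] + L⁺[v,v] − 2 L⁺[u,v] and C(u, v) = vol(G) · R(u, v). The function name `affinity_edge_features`, the NetworkX dependency, and the dense pseudoinverse are illustrative assumptions only, not the paper's implementation; a scalable version would use approximate solvers rather than a full pseudoinverse.

```python
import numpy as np
import networkx as nx

def affinity_edge_features(G: nx.Graph) -> dict:
    """Per-edge effective resistance and commute time for a connected graph G.

    Uses the Moore-Penrose pseudoinverse L+ of the graph Laplacian:
        R(u, v) = L+[u, u] + L+[v, v] - 2 * L+[u, v]
        C(u, v) = vol(G) * R(u, v),  where vol(G) = 2 * |E|
    These scalars can be concatenated to existing edge features before
    message passing. (Hypothetical helper, for illustration only.)
    """
    nodes = list(G.nodes())
    idx = {u: i for i, u in enumerate(nodes)}
    # Dense Laplacian pseudoinverse: O(n^3), fine for small graphs only.
    L = nx.laplacian_matrix(G, nodelist=nodes).toarray().astype(float)
    L_pinv = np.linalg.pinv(L)
    vol = 2 * G.number_of_edges()

    feats = {}
    for u, v in G.edges():
        i, j = idx[u], idx[v]
        r = L_pinv[i, i] + L_pinv[j, j] - 2 * L_pinv[i, j]  # effective resistance
        feats[(u, v)] = {"eff_resistance": r, "commute_time": vol * r}
    return feats

# Example usage on a small benchmark graph.
G = nx.karate_club_graph()
print(affinity_edge_features(G)[(0, 1)])
```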
Supplementary Material: zip
Submission Number: 14315