Keywords: graph representation learning, graph neural networks, regularity lemma, intersecting communities, graphon, scalability
TL;DR: We show how to approximate graphs by intersecting communities, which allows an efficient learning algorithm on large non-sparse graphs.
Abstract: Message Passing Neural Networks (MPNNs) are a staple of graph machine learning. MPNNs iteratively update each node’s representation in an input graph by aggregating messages from the node’s neighbors, which necessitates memory complexity on the order of the __number of graph edges__. This complexity can quickly become prohibitive for large graphs unless they are very sparse. In this paper, we propose a novel approach to alleviate this problem by approximating the input graph as an intersecting community graph (ICG) -- a combination of intersecting cliques. The key insight is that the number of communities required to approximate a graph __does not depend on the graph size__. We develop a new constructive version of the Weak Graph Regularity Lemma to efficiently construct an approximating ICG for any input graph. We then devise an efficient graph learning algorithm operating directly on the ICG in linear memory and time with respect to the __number of nodes__ (rather than edges). This offers a new and fundamentally different pipeline for learning on very large non-sparse graphs, whose applicability is demonstrated empirically on node classification tasks and spatio-temporal data processing.
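The linear-in-nodes aggregation the abstract alludes to can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual method or interface: it assumes the ICG is stored as a 0/1 community-affiliation matrix `Q` with per-community weights `r`, so the implied adjacency is `Q @ diag(r) @ Q.T` and one aggregation step costs O(nK) rather than O(edges).

```python
import numpy as np

# Hypothetical ICG representation (names are illustrative assumptions):
#   Q : (n, K) 0/1 matrix, Q[i, k] = 1 iff node i belongs to community k
#   r : (K,)   per-community edge weights
# The implied adjacency A_icg = Q @ diag(r) @ Q.T is never materialized.

rng = np.random.default_rng(0)
n, K, d = 1000, 8, 16
Q = (rng.random((n, K)) < 0.1).astype(float)  # intersecting memberships
r = rng.random(K)                             # community weights
X = rng.standard_normal((n, d))               # node features

def icg_propagate(Q, r, X):
    """One aggregation step A_icg @ X in O(n*K*d) time and memory,
    independent of the number of edges in the approximated graph."""
    pooled = Q.T @ X                  # (K, d): sum features per community
    return Q @ (r[:, None] * pooled)  # (n, d): broadcast back to nodes

out = icg_propagate(Q, r, X)

# Sanity check against the dense product (feasible only at toy scale).
A_icg = Q @ np.diag(r) @ Q.T
assert np.allclose(out, A_icg @ X)
```

Since `Q` has n·K entries, both the representation and each propagation step scale with the number of nodes, which is the scalability claim made in the abstract.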
Primary Area: Graph neural networks
Submission Number: 18854