Local Clustering Graph Neural Networks

28 Sept 2020, 15:52 (modified: 05 Mar 2021, 23:07) · ICLR 2021 Conference Blind Submission · Readers: Everyone
Reviewed Version (pdf): https://openreview.net/references/pdf?id=Ceh12Lw446
Keywords: Graph Neural Networks, Local Clustering, Random Walk on Graphs, Open Graph Benchmark
Abstract: Graph Neural Networks (GNNs), which benefit various real-world problems and applications, have emerged as a powerful technique for learning graph representations. The depth of a GNN model, denoted by $K$, restricts the receptive field of a node to its $K$-hop neighbors and plays a subtle role in the performance of GNNs. Recent works demonstrate how different choices of $K$ produce a trade-off between increasing representation capacity and avoiding over-smoothing. We establish a theoretical connection between GNNs and local clustering, showing that short random walks in GNNs have a high probability of being stuck in a local cluster. Based on this theoretical analysis, we propose Local Clustering Graph Neural Networks (LCGNN), a GNN learning paradigm that utilizes local clustering to efficiently search for small but compact subgraphs for GNN training and inference. Compared to full-batch GNNs, sampling-based GNNs, and graph partition-based GNNs, LCGNN performs comparably or even better, achieving state-of-the-art results on four Open Graph Benchmark (OGB) datasets. The locality of LCGNN allows it to scale to graphs with 100M nodes and 1B edges on a single GPU.
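The abstract's central observation, that a short random walk started inside a well-connected cluster is unlikely to escape it, can be illustrated with a small simulation. The sketch below is not the paper's code: `make_two_cluster_graph` and `stuck_probability` are hypothetical helpers that build a toy graph of two dense clusters joined by a single bridge edge and estimate, by Monte Carlo, how often a $K$-step walk ends in the seed's own cluster.

```python
import random

def make_two_cluster_graph():
    """Toy graph: two fully connected 10-node clusters joined by one bridge edge.
    Nodes 0-9 form cluster A, nodes 10-19 form cluster B; (9, 10) is the bridge."""
    adj = {v: [] for v in range(20)}
    for lo in (0, 10):
        for i in range(lo, lo + 10):
            for j in range(i + 1, lo + 10):
                adj[i].append(j)
                adj[j].append(i)
    adj[9].append(10)
    adj[10].append(9)
    return adj

def stuck_probability(adj, seed, k, trials=20000, seed_rng=0):
    """Estimate the probability that a k-step simple random walk from `seed`
    ends inside the seed's cluster (cluster A = nodes 0-9)."""
    rng = random.Random(seed_rng)
    stuck = 0
    for _ in range(trials):
        v = seed
        for _ in range(k):
            v = rng.choice(adj[v])  # step to a uniformly random neighbor
        stuck += v < 10  # did the walk end in cluster A?
    return stuck / trials

adj = make_two_cluster_graph()
for k in (2, 4, 8):
    print(f"K={k}: walk stays in the seed cluster with prob ~{stuck_probability(adj, 0, k):.3f}")
```

Because the only way out of cluster A is the single edge at node 9, the escape probability per step is small, and the estimated "stuck" probability remains high even as $K$ grows. This is the behavior that motivates restricting GNN computation to a locally clustered subgraph rather than the full graph.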
One-sentence Summary: A local-clustering-based graph neural network learning paradigm.
Supplementary Material: zip
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics