Neighbor2Seq: Deep Learning on Massive Graphs by Transforming Neighbors to Sequences

28 Sept 2020 (modified: 22 Oct 2023) · ICLR 2021 Conference Blind Submission · Readers: Everyone
Keywords: Graph representation learning, large-scale, sequence
Abstract: Modern Graph Neural Networks (GNNs) follow a recursive neighbor-wise message passing scheme and have achieved great success in many fields. However, this recursive design incurs expensive computation and high memory usage, making such models difficult to deploy on large-scale graphs. In this work, we propose Neighbor2Seq, which transforms the hierarchical neighborhood of each node into an ordered sequence, enabling the subsequent use of general deep learning operations such as convolution and attention. Neighbor2Seq grants our proposed models, i.e., Neighbor2Seq-Conv and Neighbor2Seq-Attn, the ability to learn on arbitrarily large graphs as long as the Neighbor2Seq step can be precomputed. Another potential advantage is that Neighbor2Seq can alleviate the over-squashing issue present in modern GNNs. We conduct thorough experiments on a massive graph with more than 111 million nodes and 1.6 billion edges, as well as on several medium-scale graphs, to evaluate our proposed method. Experimental results demonstrate that our method scales to the massive graph and achieves superior performance across datasets.
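The precomputation described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a dense adjacency matrix and plain (unnormalized) hop-wise aggregation, and simply stacks the per-hop aggregated features into an ordered sequence per node, over which standard convolution or attention could then operate.

```python
import numpy as np

def neighbor2seq(adj, feats, num_hops):
    """Sketch of a Neighbor2Seq-style precomputation.

    adj:   (N, N) dense adjacency matrix (a real implementation would
           likely use sparse matrices and a normalized adjacency;
           those details are assumptions here).
    feats: (N, F) node feature matrix.
    Returns an array of shape (N, num_hops + 1, F): for each node, an
    ordered sequence of aggregated features from hop 0 to num_hops.
    """
    seq = [feats]
    h = feats
    for _ in range(num_hops):
        h = adj @ h          # propagate features one hop further out
        seq.append(h)
    # Stack along a new "hop" axis, yielding one ordered sequence per
    # node, ready for generic sequence models (1D conv, attention).
    return np.stack(seq, axis=1)
```

Because this step involves no learned parameters, it can be run once offline; training then only touches fixed-length sequences, which is what allows the downstream model to scale to arbitrarily large graphs.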
One-sentence Summary: Transforming hierarchical neighborhoods to ordered sequences and enabling general deep learning methods on massive-scale graphs.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Community Implementations: 1 code implementation (https://www.catalyzex.com/paper/arxiv:2202.03341/code)
Reviewed Version (pdf): https://openreview.net/references/pdf?id=9QqOxuiOxN