Local Permutation Equivariance For Graph Neural Networks

Published: 28 Jan 2022, Last Modified: 13 Feb 2023
Venue: ICLR 2022 Submission
Readers: Everyone
Keywords: Graphs, Equivariance, Permutation Equivariance, Graph Neural Networks, Representations
Abstract: In this work we develop a new method, named locally permutation-equivariant graph neural networks, which provides a framework for building graph neural networks that operate on local node neighbourhoods, through sub-graphs, while using permutation-equivariant update functions. The potential benefits of learning on graph-structured data are vast and relevant to many application domains. However, one of the challenges is that graphs are not always of the same size, and nodes within a graph typically differ in connectivity. The update function must therefore be flexible with respect to input size, which is not required in most other domains. Our locally permutation-equivariant graph neural networks retain an expressive update function by using permutation representations, while operating on a lower-dimensional space than that used by global permutation equivariance. Furthermore, the use of local update functions offers a significant reduction in GPU memory compared with global methods. We demonstrate that our method can outperform competing methods on a set of widely used graph benchmark classification tasks.
One-sentence Summary: A framework for building graph neural networks that operate on local node neighbourhoods, through sub-graphs, while using permutation equivariant update functions.
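To illustrate the general pattern the abstract describes (updating each node from its local sub-graph with a function that respects permutation symmetry), here is a minimal sketch assuming PyTorch. It uses a simple sum-based, order-independent aggregation over each node's 1-hop neighbourhood rather than the richer permutation representations proposed in the paper, and the class and parameter names are hypothetical, not the authors' implementation.

```python
import torch
import torch.nn as nn

class LocalPermSymmetricLayer(nn.Module):
    """Sketch of a local, permutation-symmetric node update.

    Each node is updated from the sub-graph formed by itself and its
    1-hop neighbours; the neighbour contribution is a sum, so the output
    does not depend on the ordering of nodes within the sub-graph.
    """

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.phi_self = nn.Linear(in_dim, out_dim)   # transform of the centre node
        self.phi_neigh = nn.Linear(in_dim, out_dim)  # shared transform applied to every neighbour

    def forward(self, x, adj):
        # x: (num_nodes, in_dim) node features; adj: (num_nodes, num_nodes) 0/1 adjacency
        out = []
        for i in range(x.size(0)):
            neigh_idx = adj[i].nonzero(as_tuple=True)[0]          # indices of node i's neighbours
            neigh_sum = self.phi_neigh(x[neigh_idx]).sum(dim=0)   # invariant to neighbour ordering
            out.append(self.phi_self(x[i]) + neigh_sum)
        return torch.relu(torch.stack(out))

# Toy usage on a 4-node path graph.
adj = torch.tensor([[0, 1, 0, 0],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=torch.float)
x = torch.randn(4, 8)
layer = LocalPermSymmetricLayer(8, 16)
print(layer(x, adj).shape)  # torch.Size([4, 16])
```

Because the update only ever sees a node's sub-graph, the layer's cost and memory scale with neighbourhood size rather than with the full graph, which is the motivation the abstract gives for working locally instead of with global permutation-equivariant layers.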