SpeqNets: Sparsity-aware Permutation-equivariant Graph Networks

Published: 25 Mar 2022, Last Modified: 22 Oct 2023 · GTRL 2022 Poster
Keywords: Higher-order structures, GNNs, sparsity, permutation-equivariance
TL;DR: We propose a new hierarchy of graph isomorphism heuristics inducing fully sparsity-aware, permutation-equivariant graph networks, which offer a more fine-grained tradeoff between expressivity and scalability.
Abstract: While graph neural networks have clear limitations in approximating permutation-equivariant functions over graphs, more expressive, higher-order graph neural networks do not scale to large graphs. By introducing new heuristics for the graph isomorphism problem, we devise a class of universal, permutation-equivariant graph networks, which offers fine-grained control of the trade-off between expressivity and scalability and adapts to the sparsity of the graph. These architectures lead to vastly reduced computation times compared to standard higher-order graph networks while significantly improving over standard graph neural network and graph kernel architectures in terms of predictive performance.
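
To give a concrete feel for what "sparsity-aware" means here, below is a minimal Python sketch of one ingredient such architectures can build on: enumerating only those ordered node pairs whose induced subgraph is connected, so the number of 2-tuples grows with the number of edges rather than quadratically with the number of nodes. The function name and graph representation are illustrative assumptions, not the authors' implementation.

```python
from collections import defaultdict

def sparse_two_tuples(num_nodes, edges):
    """Enumerate ordered node pairs (u, v) whose induced subgraph is
    connected, i.e. u == v or {u, v} is an edge. This restricts the
    quadratic set of all 2-tuples to one that grows with the number of
    edges, which is what makes the construction sparsity-aware."""
    adjacency = defaultdict(set)
    for u, v in edges:
        adjacency[u].add(v)
        adjacency[v].add(u)

    tuples = []
    for u in range(num_nodes):
        tuples.append((u, u))          # "diagonal" tuples (single node)
        for v in adjacency[u]:
            tuples.append((u, v))      # tuples along edges
    return tuples

# Example: a path graph on 4 nodes yields 4 + 2*3 = 10 sparse 2-tuples,
# versus 16 dense 2-tuples.
print(sparse_two_tuples(4, [(0, 1), (1, 2), (2, 3)]))
```

On sparse graphs, where the number of edges is far below its quadratic maximum, this kind of restriction is what allows higher-order message passing to remain tractable.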
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:2203.13913/code)