SpeqNets: Sparsity-aware Permutation-equivariant Graph Networks

02 Mar 2022, 12:21 (modified: 22 Apr 2022, 23:05) — GTRL 2022 Poster
Keywords: Higher-order structures, GNNs, sparsity, permutation-equivariance
TL;DR: We propose a new hierarchy of graph isomorphism heuristics inducing fully sparsity-aware, permutation-equivariant graph networks, which offer a more fine-grained tradeoff between expressivity and scalability.
Abstract: While graph neural networks have clear limitations in approximating permutation-equivariant functions over graphs, more expressive, higher-order graph neural networks do not scale to large graphs. By introducing new heuristics for the graph isomorphism problem, we devise a class of universal, permutation-equivariant graph networks that offers fine-grained control over the tradeoff between expressivity and scalability and adapts to the sparsity of the graph. These architectures lead to vastly reduced computation times compared to standard higher-order graph networks, while significantly improving on standard graph neural network and graph kernel architectures in terms of predictive performance.