Invariant Graphon Networks: Approximation and Cut Distance

Published: 23 Oct 2024 · Last Modified: 25 Feb 2025 · NeurReps 2024 Poster · CC BY 4.0
Keywords: Graph neural networks, invariant graph networks, universal approximation, graph limits, graphons, transferability, homomorphism densities, machine learning theory.
Abstract: Graph limit models, such as graphons for limits of dense graphs, have recently been used to study the size transferability of graph neural networks (GNNs). While most existing literature focuses on message passing GNNs (MPNNs), we focus on Invariant Graph Networks (IGNs), a powerful alternative GNN architecture (Maron et al., 2018). We generalize IGNs to graphons, introducing Invariant Graphon Networks (IWNs), which are defined using a subset of the IGN basis corresponding to bounded linear operators. Even with this restricted basis, we show universal approximation for graphon-signals in $\mathcal{L}^p$ distances using signal-weighted homomorphism densities. In contrast to the work of Cai and Wang (2022), our results reveal stronger expressivity and align more closely with the geometry of graphon space. We also highlight that, unlike other architectures such as MPNNs, IWNs are discontinuous with respect to cut distance. Yet, their transferability remains comparable to that of MPNNs.
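For context, two standard notions from graph limit theory underlie the abstract; the following are the classical definitions (after Lovász's graph limits framework), not the paper's signal-weighted variants, which generalize the first. The homomorphism density of a simple graph $F$ in a graphon $W : [0,1]^2 \to [0,1]$ is
$$t(F, W) = \int_{[0,1]^{V(F)}} \prod_{\{i,j\} \in E(F)} W(x_i, x_j) \prod_{v \in V(F)} \mathrm{d}x_v,$$
and the cut distance between graphons $W$ and $W'$ is
$$\delta_\square(W, W') = \inf_{\varphi} \, \sup_{S, T \subseteq [0,1]} \left| \int_{S \times T} \bigl( W(x, y) - W'(\varphi(x), \varphi(y)) \bigr) \, \mathrm{d}x \, \mathrm{d}y \right|,$$
where the infimum runs over invertible measure-preserving maps $\varphi : [0,1] \to [0,1]$.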
Submission Number: 19