Keywords: GNNs, generalization, expressivity
TL;DR: We precisely characterize when more expressive GNNs lead to better generalization than standard MPNNs.
Abstract: The Weisfeiler--Leman algorithm ($1$-WL) is a well-studied heuristic for the graph isomorphism problem. Recently, the algorithm has played a prominent role in understanding the expressive power of message-passing graph neural networks (MPNNs) and has proven effective as a graph kernel. Despite its success, $1$-WL cannot distinguish many non-isomorphic graphs, leading to the development of more expressive MPNN and kernel architectures. However, the relationship between enhanced expressivity and improved generalization performance remains unclear. Here, we focus on augmenting $1$-WL and MPNNs with subgraph information and employ classical margin theory to investigate the conditions under which an architecture's increased expressivity aligns with improved generalization performance. In addition, we show that gradient flow pushes the MPNN's weights toward the maximum-margin solution.
Submission Type: Extended abstract (max 4 main pages).
Submission Number: 107