Message passing all the way up

Published: 25 Mar 2022, Last Modified: 22 Oct 2023
GTRL 2022 Spotlight
Readers: Everyone
Keywords: message passing, graph neural networks, higher-order gnns, weisfeiler-lehman, equivariance
TL;DR: Position paper in defence of the message passing primitive: (almost) everything can be implemented using it, and many papers do use it, so we shouldn't claim we go 'beyond' it.
Abstract: The message passing framework is the foundation of the immense success enjoyed by graph neural networks (GNNs) in recent years. In spite of its elegance, there exist many problems it provably cannot solve over given input graphs. This has led to a surge of research on going 'beyond message passing': building GNNs which do not suffer from those limitations. The term has since become ubiquitous in regular discourse. However, have those methods truly moved beyond message passing? In this position paper, I highlight the dangers of using this term, especially when teaching graph representation learning to newcomers. I show that any function of interest we want to compute over graphs can, in all likelihood, be expressed using pairwise message passing, just over a potentially modified graph, and I argue that most practical implementations subtly perform this kind of trick anyway. Hoping to initiate a productive discussion, I propose replacing 'beyond message passing' with a tamer term: 'augmented message passing'.
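For readers new to the area, the sketch below (not part of the paper) illustrates the pairwise message passing primitive the abstract refers to, in its standard form h_u' = φ(h_u, ⊕_{v∈N(u)} ψ(h_u, h_v)), and how the same primitive can be run over a potentially modified graph, here by attaching a virtual master node. It is a minimal NumPy example under my own naming assumptions (`message_passing_layer`, `psi`, `agg`, `phi` are all illustrative), not the paper's or any library's API.

```python
import numpy as np

def message_passing_layer(H, edges, psi, aggregate, phi):
    """One round of pairwise message passing (illustrative sketch).

    H         : (num_nodes, feat_dim) array of node features
    edges     : iterable of (sender, receiver) pairs defining the graph
    psi       : message function, psi(h_receiver, h_sender) -> message vector
    aggregate : permutation-invariant reduction over a list of messages
    phi       : update function, phi(h_receiver, aggregated_message) -> new features
    """
    num_nodes = H.shape[0]
    inbox = [[] for _ in range(num_nodes)]
    for s, r in edges:
        inbox[r].append(psi(H[r], H[s]))          # compute message along each edge
    return np.stack([
        phi(H[u], aggregate(inbox[u]) if inbox[u] else np.zeros_like(H[u]))
        for u in range(num_nodes)                 # update every node from its inbox
    ])

# Example: sum aggregation on a triangle graph, then the same primitive on a
# modified ("augmented") graph with a hypothetical virtual master node (index 3)
# connected to every original node.
rng = np.random.default_rng(0)
H = rng.normal(size=(3, 4))
edges = [(0, 1), (1, 0), (1, 2), (2, 1), (0, 2), (2, 0)]

psi = lambda h_r, h_s: h_s                        # message = sender's features
agg = lambda msgs: np.sum(msgs, axis=0)           # permutation-invariant sum
phi = lambda h, m: np.tanh(h + m)                 # simple node update

H1 = message_passing_layer(H, edges, psi, agg, phi)

# Same functions, modified graph: add a master node and wire it to all nodes.
H_aug = np.concatenate([H, np.zeros((1, 4))])
edges_aug = edges + [(u, 3) for u in range(3)] + [(3, u) for u in range(3)]
H1_aug = message_passing_layer(H_aug, edges_aug, psi, agg, phi)
```

The second call is the point of the example: nothing about the layer changes, only the graph it is run over, which is the sense in which such methods remain (augmented) message passing.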
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:2202.11097/code)