Track: tiny paper (up to 4 pages)
Keywords: GNNs, MPNNs, Scale, Continuity, Theory
TL;DR: We show that MPNNs, too, can be rendered continuous across graph resolution scales.
Abstract: Standard message passing graph neural networks (MPNNs) assign vastly different latent embeddings to graphs that describe the same underlying object at different resolution scales. As a result, MPNNs generically fail to generalize across resolution scales.
Previous work showed that this issue can be overcome by replacing MPNNs with certain special types of spectral graph neural networks.
In this tiny paper, we show that spectral methods are not necessary for achieving scale continuity. We demonstrate that suitably modified message passing networks can be rendered scale-continuous as well. By identifying the structural requirements for continuity, we derive a class of message passing architectures that provably preserve embeddings across resolution scales. Empirically, we show that these models match the cross-scale generalization performance of spectral approaches while retaining the flexibility and scalability of local message passing methods.
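The abstract does not spell out the modification, so the following is only a minimal illustrative sketch of the general phenomenon, under our own assumption that scale continuity hinges on aggregation respecting a per-node measure `mu` (a hypothetical quantity we introduce here; the paper's actual construction may differ). Plain sum aggregation changes a node's embedding when a neighbor is refined into two half-mass copies at a finer resolution, while measure-weighted aggregation does not:

```python
import numpy as np

def mpnn_layer(x, adj, mu=None):
    """One message passing step: m_i = sum_j A_ij * mu_j * x_j,
    followed by a shared update. `mu` is a hypothetical per-node
    measure; mu=None recovers plain sum aggregation."""
    if mu is None:
        mu = np.ones(len(x))
    messages = adj @ (mu[:, None] * x)  # (measure-weighted) neighbor aggregation
    return np.tanh(x + messages)        # simple shared node update

# Coarse graph: node 0 -- node 1, unit masses.
x_coarse = np.array([[1.0], [2.0]])
adj_coarse = np.array([[0.0, 1.0],
                       [1.0, 0.0]])
mu_coarse = np.array([1.0, 1.0])

# Finer resolution: node 1 is split into two half-mass copies with
# the same feature, both connected to node 0.
x_fine = np.array([[1.0], [2.0], [2.0]])
adj_fine = np.array([[0.0, 1.0, 1.0],
                     [1.0, 0.0, 0.0],
                     [1.0, 0.0, 0.0]])
mu_fine = np.array([1.0, 0.5, 0.5])

# Plain sum aggregation: node 0's embedding differs across scales.
print(mpnn_layer(x_coarse, adj_coarse)[0], mpnn_layer(x_fine, adj_fine)[0])

# Measure-weighted aggregation: node 0's embedding is preserved.
print(mpnn_layer(x_coarse, adj_coarse, mu_coarse)[0],
      mpnn_layer(x_fine, adj_fine, mu_fine)[0])
```

In this toy setup the weighted variant yields identical embeddings on both graphs because the split copies together carry the same mass as the original node; it is meant only to make the notion of "continuity across resolution scales" concrete, not to reproduce the architectures derived in the paper.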
Anonymization: This submission has been anonymized for double-blind review via the removal of identifying information such as names, affiliations, and identifying URLs.
Presenter: ~Christian_Koke1
Format: Yes, the presenting author will attend in person if this work is accepted to the workshop.
Funding: No, the presenting author of this submission does *not* fall under ICLR’s funding aims, or has sufficient alternate funding.
Serve As Reviewer: ~Christian_Koke1
Submission Number: 112