Track: long paper (up to 4 pages)
Keywords: Generalization, (Resolution-)Scale, Graph Neural Networks
TL;DR: Standard graph neural networks turn out not to be continuous, which prevents them from consistently incorporating information across (resolution) scales.
Abstract: Standard graph neural networks assign vastly different latent embeddings to graphs that describe the same object at different resolution scales. This precludes consistency in applications and prevents generalization across scales, as is fundamentally needed in, e.g., AI4Science. We uncover the underlying obstruction, investigate its origin, and show how to overcome it by modifying the message passing paradigm.
Supplementary Material: zip
Anonymization: This submission has been anonymized for double-blind review via the removal of identifying information such as names, affiliations, and identifying URLs.
Submission Number: 28