On Incorporating Scale into Graph Networks

Published: 06 Mar 2025 · Last Modified: 09 Apr 2025 · ICLR 2025 Workshop MLMP (Oral) · CC BY 4.0
Track: New scientific result
Keywords: Generalization, (Resolution-)Scale, Graph Neural Networks
TL;DR: Standard graph neural networks turn out not to be continuous, which precludes them from consistently incorporating varying (resolution-)scale information.
Abstract: Standard graph neural networks assign vastly different latent embeddings to graphs that describe the same physical system at different resolution scales. This precludes consistency in applications and prevents generalization between scales, as is fundamentally needed in many scientific applications. We uncover the underlying obstruction, investigate its origin, and show how to overcome it.
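The abstract's core claim can be illustrated with a minimal sketch (this is an assumption-laden toy example, not the paper's construction): discretize the same 1D signal on a path graph at two resolutions, apply one standard message-passing layer with sum aggregation and a sum readout, and observe that the graph-level embedding changes drastically with the number of nodes. All function names and the choice of signal are hypothetical illustrations.

```python
import numpy as np

def path_graph_adjacency(n):
    # adjacency matrix of a path graph on n nodes
    A = np.zeros((n, n))
    idx = np.arange(n - 1)
    A[idx, idx + 1] = 1.0
    A[idx + 1, idx] = 1.0
    return A

def gnn_embedding(n, W):
    # node features: samples of the same signal f(x) = cos(2*pi*x)
    # on a uniform grid of n points over [0, 1]
    x = np.linspace(0.0, 1.0, n)
    X = np.cos(2 * np.pi * x)[:, None]
    A = path_graph_adjacency(n)
    # one standard message-passing layer: sum over neighbors + self, then ReLU
    H = np.maximum(0.0, (A + np.eye(n)) @ X @ W)
    # sum readout yields a graph-level embedding
    return H.sum(axis=0)

rng = np.random.default_rng(0)
W = rng.standard_normal((1, 4))          # shared weights at both resolutions
coarse = gnn_embedding(20, W)            # coarse discretization
fine = gnn_embedding(200, W)             # fine discretization of the same signal
# both graphs describe the same underlying signal, yet the embeddings
# differ by roughly the ratio of node counts
print(np.linalg.norm(fine - coarse))
```

With sum aggregation and sum readout, the embedding norm grows roughly linearly in the node count, so refining the discretization of the *same* underlying object moves the embedding far away rather than leaving it (approximately) fixed; this is the kind of scale inconsistency the paper addresses.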
Presenter: ~Christian_Koke1
Submission Number: 12
