On Incorporating Scale into Graph Networks

Published: 06 Mar 2025, Last Modified: 19 Mar 2025 · ICLR 2025 Workshop MLMP Oral · CC BY 4.0
Track: New scientific result
Keywords: Generalization, (Resolution-)Scale, Graph Neural Networks
TL;DR: Standard graph neural networks turn out not to be continuous, which precludes them from consistently incorporating information across (resolution) scales.
Abstract: Standard graph neural networks assign vastly different latent embeddings to graphs describing the same physical system at different resolution scales. This precludes consistency in applications and prevents generalization between scales, which is fundamentally needed in many scientific applications. We uncover the underlying obstruction, investigate its origin, and show how to overcome it.
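The phenomenon the abstract describes can be illustrated with a minimal sketch (not the paper's method; all function names and the toy setup are hypothetical): the same one-dimensional system discretized as a path graph at two resolutions is passed through one mean-aggregation message-passing layer with shared weights, and the pooled graph-level embeddings come out different.

```python
import numpy as np

def path_graph_adj(n):
    # Adjacency matrix of a path graph with n nodes; larger n means a
    # finer-resolution discretization of the same underlying interval.
    A = np.zeros((n, n))
    for i in range(n - 1):
        A[i, i + 1] = A[i + 1, i] = 1.0
    return A

def gnn_embedding(n, W):
    # One mean-aggregation message-passing layer, then mean pooling
    # to a graph-level embedding.
    A = path_graph_adj(n)
    deg = A.sum(axis=1, keepdims=True)
    X = np.linspace(0.0, 1.0, n)[:, None]   # node feature: position on [0, 1]
    H = np.tanh((A @ X) / deg @ W)          # standard GNN-style update
    return H.mean(axis=0)                   # pooled embedding

rng = np.random.default_rng(0)
W = rng.normal(size=(1, 1))                 # shared weights across both graphs
coarse = gnn_embedding(4, W)                # coarse discretization
fine = gnn_embedding(16, W)                 # fine discretization, same system
print(coarse, fine)                         # embeddings differ between scales
```

Because the nonlinear update is applied to aggregated neighbor features whose values shift with the discretization (e.g. the boundary nodes average over ever-closer neighbors as n grows), the pooled embeddings do not converge to a common value, matching the inconsistency across resolution scales that the paper identifies.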
Presenter: ~Christian_Koke1
Submission Number: 12