Laplace Transform Filters render spectral Graph Neural Networks transferable

ICLR 2025 Conference Submission 279 Authors

13 Sept 2024 (modified: 28 Nov 2024) · ICLR 2025 Conference Submission · Readers: Everyone · CC BY 4.0
Keywords: Graph Neural Networks, Spectral Graph Theory, Transferability
TL;DR: We develop a novel approach to GNN transferability based on information diffusion on graphs. From this point of view, spectral graph neural networks are transferable if their filters arise as Laplace transforms of certain functions.
Abstract: We introduce a new point of view on the transferability of graph neural networks, based on the intrinsic notion of information diffusion within graphs. Under this notion, two graphs are considered similar if their overall coarse structures agree, while their fine-grained details may differ. Transferability of graph neural networks is then studied between graphs that are similar in this sense. After carefully analysing the transferability of individual filters, the transferability properties of entire networks are reduced to the transferability characteristics of the filters employed inside their convolutional blocks. A rigorous analysis establishes our main theoretical finding: spectral convolutional networks are transferable between graphs whose overall coarse structures align, provided their filters arise as Laplace transforms of certain generalized functions. Numerical experiments illustrate and validate the theoretical findings in practice.
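To make the central object of the abstract concrete, the following is a minimal, illustrative sketch (not taken from the submission) of a spectral graph filter whose frequency response h(λ) is the Laplace transform of a generating function g(t), i.e. h(λ) = ∫₀^∞ g(t) e^{-λt} dt, applied to a graph signal via the Laplacian eigendecomposition. The choice g(t) = e^{-t}, the random graph, and the helper names `laplace_transform_filter` and `apply_spectral_filter` are assumptions for illustration only, not the authors' construction.

```python
import numpy as np
import networkx as nx

def laplace_transform_filter(eigvals, g, t_max=50.0, n_steps=2000):
    """Numerically evaluate h(lambda) = int_0^inf g(t) exp(-lambda t) dt
    for each Laplacian eigenvalue, via trapezoidal quadrature on [0, t_max]."""
    t = np.linspace(0.0, t_max, n_steps)
    dt = t[1] - t[0]
    # integrand[i, j] = g(t_j) * exp(-lambda_i * t_j)
    integrand = g(t)[None, :] * np.exp(-np.outer(eigvals, t))
    weights = np.full_like(t, dt)
    weights[0] *= 0.5
    weights[-1] *= 0.5
    return integrand @ weights

def apply_spectral_filter(L, x, g):
    """Filter a graph signal x with h(L), where h is the Laplace transform of g."""
    eigvals, eigvecs = np.linalg.eigh(L)
    h = laplace_transform_filter(eigvals, g)
    return eigvecs @ (h * (eigvecs.T @ x))

# Example: g(t) = exp(-t) gives h(lambda) = 1 / (1 + lambda), a low-pass response.
g = lambda t: np.exp(-t)

G = nx.erdos_renyi_graph(30, 0.2, seed=0)
L = nx.normalized_laplacian_matrix(G).toarray()
x = np.random.default_rng(0).normal(size=L.shape[0])
y = apply_spectral_filter(L, x, g)
print(y[:5])
```

In this reading, the paper's transferability claim concerns filters of exactly this Laplace-transform form: because h is induced by a function g describing diffusion over time, the filter's action depends on the graph's coarse diffusion behaviour rather than on its fine structural details.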
Supplementary Material: zip
Primary Area: learning on graphs and other geometries & topologies
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 279