Keywords: Topological Neural Networks, Persistent Homology, Positional Encodings, GNNs
TL;DR: We combine positional encodings with persistent homology, offer theoretical insights into the resulting gains in expressivity and its limitations, and provide rigorous empirical validation.
Abstract: Unlike words in sentences, nodes in general graphs do not have canonical positional information. As a result, the local message-passing framework of popular graph neural networks (GNNs) fails to leverage possibly relevant global structures for the task at hand. In this context, positional encoding methods emerge as an efficient approach to enrich the representational power of GNNs, helping them break node symmetries in input graphs. Similarly, multiscale topological descriptors based on persistent homology (PH) have also been integrated into GNNs to boost their expressivity. However, it remains unclear how positional encoding interplays with PH-based topological features and whether we can align the two to improve expressivity further. We address this issue with a novel notion of topological positional encoding (ToPE) that amalgamates the strengths of persistent homology and positional encoding. We establish that ToPE has provable expressivity benefits. Strong empirical assessments further underscore the effectiveness of the proposed method on several graph and language processing applications, including molecular property prediction, out-of-distribution generalization, and synthetic tree tasks.
Primary Area: learning on graphs and other geometries & topologies
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 9289