Keywords: topological machine learning, graph neural networks, heterophily, homophily, filtration, hierarchical graph learning, filtration learning
TL;DR: Two novel topological machine learning approaches using heterophily-based topological filtration and filtration learning for multi-scale GNNs
Abstract: Graph neural networks (GNNs) are a powerful method of learning representations of graph-structured data. While they excel at learning class-discriminative representations of nodes in homophilous graphs, where connected nodes tend to belong to the same class, many GNNs struggle with heterophilous graphs, whose inter-class connections can muddy the message passing. Inspired by this finding, we propose a topological filtration scheme, treating graphs as 1-dimensional simplicial complexes with a filter function based on estimated edge heterophily, and introduce two methodologies that use a backbone GNN to learn from the resulting graph filtration. The first trains a GNN on each graph in the filtration sequence consecutively for a portion of the total training time, using embeddings from previous graphs to initialize node embeddings in subsequent graphs. The second approach uses a novel message passing scheme to pass messages jointly within each graph level in the filtration sequence and between levels that share common nodes. Both methods enhance the influence of early-birth adjacent nodes in homophilous subgraphs, yet allow the model to learn from the full range of heterophilous and homophilous connections in the graph. We further extend our approach to learn the filtration sequence itself through a learnable node filter function. Experiments show that our heterophily-filtered GNNs achieve superior node classification accuracy on heterophilous and homophilous networks alike.
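To make the filtration scheme in the abstract concrete, here is a minimal sketch of how an edge-heterophily filter function could induce a nested sequence of subgraphs. All names and the particular heterophily estimate (one minus the dot product of soft class predictions at an edge's endpoints) are illustrative assumptions, not the paper's actual implementation; lower-heterophily (more homophilous) edges are "born" earlier in the sequence.

```python
def edge_heterophily(edges, probs):
    """Hypothetical heterophily estimate: 1 - <p_u, p_v>, where p_u is
    an (assumed) predicted class distribution for node u. Homophilous
    edges, whose endpoint predictions agree, score near 0."""
    return {
        (u, v): 1.0 - sum(pu * pv for pu, pv in zip(probs[u], probs[v]))
        for u, v in edges
    }

def filtration(edges, probs, thresholds):
    """Nested subgraph sequence: level t keeps edges with heterophily <= t,
    so each level's edge set contains all previous levels' edges."""
    het = edge_heterophily(edges, probs)
    return [[e for e in edges if het[e] <= t] for t in sorted(thresholds)]

# Toy example: 3 nodes with soft predictions over 2 classes.
probs = {0: [0.9, 0.1], 1: [0.8, 0.2], 2: [0.1, 0.9]}
edges = [(0, 1), (1, 2)]
seq = filtration(edges, probs, thresholds=[0.3, 0.6, 1.0])
# The homophilous edge (0, 1) appears at every level; the heterophilous
# edge (1, 2) only enters at the final, fully inclusive level.
```

A backbone GNN would then be trained over `seq` level by level (first method) or with cross-level message passing (second method), per the abstract.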
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Resubmission: No
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5132