Contextualized Messages Boost Graph Representations

Published: 07 Apr 2025, Last Modified: 07 Apr 2025. Accepted by TMLR. License: CC BY 4.0
Abstract: Graph neural networks (GNNs) have gained significant attention in recent years for their ability to process data that may be represented as graphs. This has prompted several studies to explore their representational capability based on the graph isomorphism task. Notably, these works inherently assume a countable node feature representation, potentially limiting their applicability. Interestingly, only a few studies consider GNNs with uncountable node feature representations. In this paper, a new perspective on the representational capability of GNNs is investigated across all levels—node-level, neighborhood-level, and graph-level—when the space of node feature representations is uncountable. Specifically, the injective and metric requirements of previous works are softly relaxed by employing a pseudometric distance on the input space to construct a soft-injective function, such that distinct inputs may produce similar outputs if and only if the pseudometric deems the inputs sufficiently similar on some representation. As a consequence, a simple and computationally efficient soft-isomorphic relational graph convolution network (SIR-GCN) is proposed, which emphasizes the contextualized transformation of neighborhood feature representations via anisotropic and dynamic message functions. Furthermore, a mathematical discussion of the relationship between SIR-GCN and key GNNs in the literature puts the contribution into context, establishing SIR-GCN as a generalization of classical GNN methodologies. To close, experiments on synthetic and benchmark datasets demonstrate the relative superiority of SIR-GCN, which outperforms comparable models in node and graph property prediction tasks.
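The contextualized message passing the abstract describes can be illustrated with a minimal sketch. Here the message from neighbor v to node u is assumed to take the form relu(Wq @ h_u + Wk @ h_v), so that each message is jointly conditioned on both endpoints (anisotropic and dynamic) rather than on h_v alone; the weight names Wq, Wk, Wr and the choice of ReLU are illustrative assumptions, not necessarily the paper's exact formulation.

```python
def relu(v):
    return [max(x, 0.0) for x in v]

def matvec(W, v):
    return [sum(w * x for w, x in zip(row, v)) for row in W]

def vadd(a, b):
    return [x + y for x, y in zip(a, b)]

def sir_gcn_layer(H, neighbors, Wq, Wk, Wr):
    """One SIR-GCN-style layer (illustrative sketch).

    For each node u, the message from neighbor v is contextualized by the
    receiving node's own features before aggregation:
        h_u* = Wr @ sum_{v in N(u)} relu(Wq @ h_u + Wk @ h_v)
    This makes the message anisotropic (it differs per direction of the
    edge) and dynamic (a nonlinear function of both endpoint features).
    """
    out = []
    for u, h_u in enumerate(H):
        agg = [0.0] * len(Wq)
        for v in neighbors[u]:
            # Message depends on both h_u (context) and h_v (content).
            agg = vadd(agg, relu(vadd(matvec(Wq, h_u), matvec(Wk, H[v]))))
        out.append(matvec(Wr, agg))
    return out

# Usage: a 3-node path graph 0 - 1 - 2 with 2-dimensional features.
H = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
neighbors = [[1], [0, 2], [1]]
Wq = [[1.0, 0.0], [0.0, 1.0]]
Wk = [[0.0, 1.0], [1.0, 0.0]]
Wr = [[1.0, 1.0]]
updated = sir_gcn_layer(H, neighbors, Wq, Wk, Wr)
```

Because the nonlinearity is applied before summation, two neighborhoods with the same feature multiset but different central nodes aggregate different messages, which is the behavior an isotropic, static message function (e.g., a shared linear map of h_v only) cannot reproduce.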
Certifications: Reproducibility Certification
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission:
1. Added discussions on metrics in Section 3.
2. Modified Figure 1 to better highlight the proposed concepts.
3. Added discussions on *dynamic* and *static* message functions in Sections 3.1 and 4.4.
4. Added descriptions for Figure 2 to better highlight *dynamic* message functions modeled as MLPs.
5. Added brief remarks on heterophily in Section 4.6.
6. Added a more detailed description of the HeteroEdgeCount dataset in Section 5.1.
7. Added more recent MPNN-based baselines for Benchmarking GNNs in Section 5.2.
8. Added more experimental results on the Open Graph Benchmark in Section 5.2.
9. Added recommendations for future work in Section 6.
10. Added more experimental results on heterophilous datasets in Appendix B.
11. Added in-depth discussions on computational runtime complexity in Appendix D.
Code: https://github.com/briangodwinlim/SIR-GCN
Assigned Action Editor: ~Moshe_Eliasof1
Submission Number: 4138