Oversmoothing as Loss of Sign: Towards Structural Balance in Graph Neural Networks

26 Sept 2024 (modified: 05 Feb 2025) · Submitted to ICLR 2025 · CC BY 4.0
Keywords: graph neural networks, oversmoothing
Abstract: Oversmoothing is a common phenomenon in a wide range of graph neural networks (GNNs), where node representations become homogeneous and model performance worsens as the number of layers increases. Various strategies have been proposed to combat oversmoothing, but they are based on different heuristics and lack a unified understanding of their underlying mechanisms. In this paper, we revisit the concept of signed graphs and show that a wide class of anti-oversmoothing techniques can be viewed as propagation on corresponding signed graphs with both positive and negative edges. Leveraging the classic theory of signed graphs, we characterize the asymptotic behaviors of existing methods and reveal that they deviate from the ideal state of structural balance, which provably prevents oversmoothing and improves node classification performance. Driven by this unified analysis and theoretical insight, we propose Structural Balanced Propagation (SBP), which explicitly enhances the structural balance of the signed graph with the help of label and feature information. We prove, theoretically and empirically, that SBP improves structural balance and alleviates oversmoothing under certain conditions. Experiments on synthetic and real-world datasets demonstrate the effectiveness of our method, highlighting the value of our signed graph framework.
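To make the signed-graph viewpoint concrete, below is a minimal, self-contained sketch of propagation over a graph whose edges carry both positive (attractive) and negative (repulsive) weights. The sign-assignment rule used here (same-label endpoints get +1, different-label endpoints get -1) is a hypothetical stand-in for illustration only; the abstract does not specify how SBP combines label and feature information, so this is not the paper's method.

```python
# Illustrative sketch of signed-graph propagation (not the paper's SBP).
# Assumption: edge signs are set from node labels for demonstration purposes.
import numpy as np

def signed_propagate(features, adj, labels, num_layers=10):
    """Propagate node features over a signed, degree-normalized adjacency."""
    # Assign edge signs: positive between same-label endpoints, negative otherwise.
    signs = np.where(labels[:, None] == labels[None, :], 1.0, -1.0)
    signed_adj = adj * signs
    # Symmetric normalization by unsigned degree: D^{-1/2} A_s D^{-1/2}.
    deg = np.abs(signed_adj).sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    norm_adj = d_inv_sqrt[:, None] * signed_adj * d_inv_sqrt[None, :]

    h = features
    for _ in range(num_layers):
        h = norm_adj @ h  # linear propagation step, no learned weights
    return h

# Toy usage: two 3-node clusters joined by a single cross edge.
adj = np.array([[0, 1, 1, 0, 0, 0],
                [1, 0, 1, 0, 0, 0],
                [1, 1, 0, 1, 0, 0],
                [0, 0, 1, 0, 1, 1],
                [0, 0, 0, 1, 0, 1],
                [0, 0, 0, 1, 1, 0]], dtype=float)
labels = np.array([0, 0, 0, 1, 1, 1])
features = np.random.default_rng(0).normal(size=(6, 4))
print(signed_propagate(features, adj, labels).round(3))
```

With all-positive edges, repeated propagation drives the two clusters toward a common representation; with the negative cross-cluster edge above, the clusters are pushed apart, which is the intuition behind structural balance preventing oversmoothing.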
Supplementary Material: zip
Primary Area: learning on graphs and other geometries & topologies
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5637