Enhancing Deep Graph Neural Networks via Improving Signal Propagation

23 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: signal propagation, deep graph convolutional networks, over-smoothing, initialization
Abstract: Graph neural networks (GNNs) suffer from the \textit{curse of depth}, a phenomenon where performance degrades significantly as network depth increases. In this work, we aim to provide a more principled analysis and solution through the lens of signal propagation. We identify three metrics for good signal propagation in graph neural nets: forward propagation, backward propagation, and graph embedding variation (GEV). We prove that traditional initialization methods, which degrade the performance of deep GNNs, fail to control the three metrics simultaneously. To tackle this issue, we develop a new GNN initialization method called \textbf{S}ignal \textbf{P}ropagation \textbf{o}n \textbf{G}raph (SPoGInit), which searches for weight variances that minimize the three metrics. Across various datasets, SPoGInit achieves notable performance enhancements in node classification tasks as GNNs grow deeper. For instance, we observe a 2.2\% gain in test accuracy on the OGBN-Arxiv dataset as the depth increases from 4 to 64.
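As a rough illustration of the search-based initialization described in the abstract, the sketch below measures three illustrative proxies for a stack of GCN-style layers (a forward signal-norm ratio, a backward signal-norm ratio, and the node-wise variance of the final embedding) and picks the weight variance with the smallest combined score. The metric definitions, the equal weighting, and the grid search are assumptions made purely for illustration; they are not the paper's actual SPoGInit formulation.

```python
# Illustrative sketch only: a variance search guided by three signal-propagation
# proxies, loosely following the idea described in the abstract.
import numpy as np

def normalized_adjacency(A):
    """Symmetric normalization A_hat = D^{-1/2} (A + I) D^{-1/2}."""
    A = A + np.eye(A.shape[0])
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A @ D_inv_sqrt

def propagation_metrics(A_hat, X, depth, width, sigma2, rng):
    """Return (forward, backward, GEV) scores for one random initialization
    with per-layer weight variance sigma2 (illustrative definitions)."""
    H = X
    weights = []
    for _ in range(depth):
        W = rng.normal(0.0, np.sqrt(sigma2), size=(H.shape[1], width))
        H = np.maximum(A_hat @ H @ W, 0.0)  # GCN-style layer with ReLU
        weights.append(W)
    # Forward proxy: log-ratio of output to input signal norm (0 is ideal).
    fwd = abs(np.log(np.linalg.norm(H) / (np.linalg.norm(X) + 1e-12) + 1e-12))
    # Backward proxy: norm of a random signal pushed back through the
    # linearized stack (ReLU masks are ignored in this sketch).
    G = rng.normal(size=H.shape)
    g0 = np.linalg.norm(G)
    for W in reversed(weights):
        G = A_hat.T @ G @ W.T
    bwd = abs(np.log(np.linalg.norm(G) / (g0 + 1e-12) + 1e-12))
    # Embedding-variation proxy: node-wise variance of the final embedding;
    # near-zero variance signals over-smoothing, so its inverse is penalized.
    gev = 1.0 / (np.var(H, axis=0).mean() + 1e-12)
    return fwd, bwd, gev

def spog_style_search(A, X, depth=32, width=64,
                      candidates=np.logspace(-2, 1, 20), seed=0):
    """Pick the candidate weight variance with the smallest combined score."""
    rng = np.random.default_rng(seed)
    A_hat = normalized_adjacency(A)
    best_sigma2, best_score = None, np.inf
    for sigma2 in candidates:
        fwd, bwd, gev = propagation_metrics(A_hat, X, depth, width, sigma2, rng)
        score = fwd + bwd + gev  # equal weighting is an arbitrary choice here
        if score < best_score:
            best_sigma2, best_score = sigma2, score
    return best_sigma2
```

Calling spog_style_search(A, X) on a small synthetic adjacency matrix A and feature matrix X returns the candidate variance whose combined proxy score is lowest; the selected value would then seed the weight initialization of each layer.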
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 7190