Rank Collapse Causes Over-Smoothing and Over-Correlation in Graph Neural Networks

Published: 18 Nov 2023, Last Modified: 29 Nov 2023, LoG 2023 Poster
Keywords: Graph-based Learning, graph neural networks, over-smoothing, over-correlation, expressivity, rank collapse
TL;DR: We show that a rank collapse of node representations is the fundamental issue behind over-smoothing and over-correlation.
Abstract: Our study reveals new theoretical insights into over-smoothing and feature over-correlation in graph neural networks. Specifically, we demonstrate that with increased depth, node representations become dominated by a low-dimensional subspace that depends on the aggregation function but not on the feature transformations. For all aggregation functions, the rank of the node representations collapses, which causes over-smoothing for particular aggregation functions. Our study thus emphasizes the importance of focusing future research on rank collapse rather than over-smoothing. Guided by our theory, we propose a sum of Kronecker products as a beneficial property that provably prevents over-smoothing, over-correlation, and rank collapse. We empirically demonstrate the shortcomings of existing models in fitting target functions of node classification tasks.
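To make the Kronecker-product view concrete: by the standard vectorization identity vec(AXW) = (W^T ⊗ A) vec(X), a common message-passing update X' = A X W corresponds to a single Kronecker product acting on the vectorized node features, whereas a sum of Kronecker products corresponds to X' = Σ_k A_k X W_k with several aggregation matrices A_k and feature transformations W_k. The sketch below illustrates such a layer in PyTorch; the class name, the choice of two aggregation terms (identity plus a symmetrically normalized adjacency), and the initialization are illustrative assumptions, not the paper's exact architecture.

```python
# Minimal sketch (PyTorch assumed) of a layer whose update is a sum of
# Kronecker products, X' = sum_k A_k X W_k, instead of the usual single
# product X' = A X W. Names and choices here are illustrative.
import torch
import torch.nn as nn


class SumKroneckerLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int, num_terms: int = 2):
        super().__init__()
        # One feature transformation W_k per aggregation matrix A_k.
        self.weights = nn.ParameterList(
            nn.Parameter(torch.empty(in_dim, out_dim)) for _ in range(num_terms)
        )
        for w in self.weights:
            nn.init.xavier_uniform_(w)

    def forward(self, x: torch.Tensor, aggs: list[torch.Tensor]) -> torch.Tensor:
        # x: node features of shape (n, in_dim); aggs: list of (n, n)
        # aggregation matrices A_k. The vectorized update
        # vec(X') = sum_k (W_k^T kron A_k) vec(X) is computed equivalently
        # as sum_k A_k @ X @ W_k, avoiding explicit Kronecker products.
        assert len(aggs) == len(self.weights)
        return sum(a @ x @ w for a, w in zip(aggs, self.weights))


def sym_norm_adj(adj: torch.Tensor) -> torch.Tensor:
    # Symmetrically normalized adjacency with self-loops:
    # D^{-1/2} (A + I) D^{-1/2}.
    a_hat = adj + torch.eye(adj.size(0))
    d_inv_sqrt = a_hat.sum(dim=1).rsqrt()
    return d_inv_sqrt[:, None] * a_hat * d_inv_sqrt[None, :]


if __name__ == "__main__":
    n, d = 5, 8
    adj = (torch.rand(n, n) < 0.4).float()
    adj = torch.triu(adj, 1)
    adj = adj + adj.T  # random undirected graph
    x = torch.randn(n, d)
    layer = SumKroneckerLayer(d, d)
    # Two aggregation terms: the identity (a skip term) and the
    # normalized adjacency, giving two distinct A_k.
    out = layer(x, [torch.eye(n), sym_norm_adj(adj)])
    print(out.shape)  # torch.Size([5, 8])
```

Using the identity as one of the aggregation terms acts like a residual connection and is one simple way to obtain two distinct A_k, which is what makes the update a genuine sum of Kronecker products rather than a single one.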
Supplementary Materials: zip
Submission Type: Full paper proceedings track submission (max 9 main pages).
Submission Number: 93