Not too little, not too much: a theoretical analysis of graph (over)smoothing

Published: 24 Nov 2022, Last Modified: 05 May 2023
LoG 2022 Oral
Keywords: random graph, oversmoothing, theory
TL;DR: We rigorously prove the coexistence of beneficial finite graph smoothing and oversmoothing on two representative examples.
Abstract: We analyze graph smoothing with \emph{mean aggregation}, where each node successively receives the average of the features of its neighbors. Indeed, it has been observed that Graph Neural Networks (GNNs), which generally follow some variant of Message-Passing (MP) with repeated aggregation, may be subject to the \emph{oversmoothing} phenomenon: by performing too many rounds of MP, the node features tend to converge to a non-informative limit. At the other end of the spectrum, it is intuitively obvious that \emph{some} MP rounds are necessary, but existing analyses do not exhibit both phenomena at once. In this paper, we consider simplified linear GNNs, and rigorously analyze two examples of random graphs for which a finite number of mean aggregation steps provably improves the learning performance, before oversmoothing kicks in. We identify two key phenomena: graph smoothing shrinks non-principal directions in the data faster than principal ones, which is useful for regression, and shrinks nodes within communities faster than they collapse together, which improves classification.
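To give a concrete picture of the operator being analyzed, below is a minimal NumPy sketch of mean aggregation, the simplified linear message passing described in the abstract: each round multiplies the node features by the row-stochastic matrix D^{-1}A, so a few rounds smooth features over the graph while too many rounds push all rows toward a common, non-informative limit. The function name `mean_aggregation`, the dense adjacency representation, and the toy random graph are illustrative assumptions, not artifacts from the paper.

```python
import numpy as np

def mean_aggregation(adj: np.ndarray, features: np.ndarray, num_steps: int) -> np.ndarray:
    """Replace each node's features by the average of its neighbors' features,
    repeated `num_steps` times. Assumes every node has at least one neighbor."""
    degrees = adj.sum(axis=1, keepdims=True)   # node degrees (assumed > 0)
    walk_matrix = adj / degrees                # row-stochastic D^{-1} A
    smoothed = features
    for _ in range(num_steps):
        smoothed = walk_matrix @ smoothed      # one round of mean aggregation
    return smoothed

# Toy example: a small symmetric random graph with random node features.
rng = np.random.default_rng(0)
adj = (rng.random((20, 20)) < 0.3).astype(float)
adj = np.triu(adj, 1)
adj = adj + adj.T                              # undirected, no self-loops
isolated = adj.sum(axis=1) == 0
adj[isolated, isolated] = 1.0                  # self-loop on isolated nodes so degrees stay positive
features = rng.normal(size=(20, 4))

# A few steps smooth the features; increasing num_steps makes the rows
# increasingly similar, illustrating the oversmoothing limit.
print(mean_aggregation(adj, features, num_steps=3).round(3))
```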
Type Of Submission: Extended abstract (max 4 main pages).
