Shedding Light on Random Dropping and Oversmoothing

Published: 28 Oct 2023, Last Modified: 21 Dec 2023
NeurIPS 2023 GLFrontiers Workshop Poster
Keywords: oversmoothing
Abstract: Graph Neural Networks (GNNs) are widely used in graph representation learning. *Random dropping* approaches, notably DropEdge and DropMessage, claim to alleviate the key issues of overfitting and oversmoothing by randomly removing elements of the graph representation. However, their effectiveness remains largely unverified. In this work, we show empirically that, because they are applied only at training time, these methods have a limited effect on oversmoothing at test time. We show that DropEdge in particular can be viewed as a form of training-data augmentation, and that its benefits to model generalization are not strictly related to oversmoothing, suggesting that in practice the link between oversmoothing and performance is more nuanced than previously thought. We address the limitations of current dropping methods by *learning* to drop via optimizing an information bottleneck, which enables dropping to be performed effectively at test time.
Submission Number: 45
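To make the training-time-exclusive nature of random dropping concrete, below is a minimal sketch of DropEdge-style edge dropping in plain PyTorch. The function name, signature, and the `[2, num_edges]` COO edge-index convention are illustrative assumptions (the convention used by common GNN libraries), not the authors' implementation; the point is that the `training` flag makes dropping a no-op at inference, so the full graph is always used at test time.

```python
import torch

def drop_edge(edge_index: torch.Tensor, p: float, training: bool) -> torch.Tensor:
    """Randomly drop a fraction p of edges (DropEdge-style sketch).

    Assumes `edge_index` is a [2, num_edges] COO tensor. Illustrative
    only; not the paper's exact implementation.
    """
    if not training or p == 0.0:
        # Training-time exclusive: at test time the full graph is kept,
        # which is why such dropping cannot reduce oversmoothing at inference.
        return edge_index
    # Keep each edge independently with probability 1 - p.
    keep_mask = torch.rand(edge_index.size(1)) >= p
    return edge_index[:, keep_mask]
```

In this sketch, dropping acts as stochastic data augmentation during training (each epoch sees a different subgraph), consistent with the paper's view of DropEdge as a form of training-data augmentation rather than a test-time remedy for oversmoothing.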