Hub-hub connections matter: Improving edge dropout to relieve over-smoothing in graph neural networks

Published: 01 Jan 2023, Last Modified: 11 Jan 2025 · Knowl. Based Syst. 2023 · CC BY-SA 4.0
Abstract: In recent years, graph neural networks (GNNs) have become the most widely used techniques for irregular data analysis. The core of GNNs lies in feature and/or label smoothing, which endows GNNs with promising expressive power. However, over-smoothing caused by excessive graph convolution inevitably results in performance degradation. While DropEdge has been shown to be an effective approach to this problem, it views over-smoothing as a macroscopic phenomenon and treats all edges equally. By examining smoothness at a microscopic level, we find that the smoothing processes in different regions of the graph are diverse. In particular, the first sign of over-smoothing generally emerges among connected hubs (i.e., nodes with high degrees). With this insight, we propose a novel edge dropout method that restrains early over-smoothing in graph learning, so as to relieve the global over-smoothing. Moreover, we introduce a siamese network architecture to alleviate the inconsistency problem inherent in edge dropout methods (e.g., DropEdge). Extensive experiments on multiple benchmark datasets show that the proposed method consistently improves performance and significantly outperforms standard DropEdge with enhanced robustness.
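As a rough illustration of the idea summarized above (not the authors' exact procedure), the sketch below biases edge dropout toward hub-hub connections: edges whose endpoints both exceed a degree threshold are dropped with a higher probability than other edges. The function name, hub quantile, and drop rates are assumptions made for exposition.

```python
import torch

def hub_aware_dropedge(edge_index, num_nodes, p_hub=0.5, p_other=0.1, hub_quantile=0.9):
    """Illustrative degree-biased edge dropout (hypothetical sketch).

    Edges whose two endpoints are both hubs (degree above the given quantile)
    are dropped with probability p_hub; all other edges with p_other.
    Returns the retained edge_index of shape [2, E_kept].
    """
    # Node degrees computed from the edge list.
    deg = torch.bincount(edge_index.reshape(-1), minlength=num_nodes).float()
    threshold = torch.quantile(deg, hub_quantile)
    is_hub = deg >= threshold

    src, dst = edge_index
    hub_hub = is_hub[src] & is_hub[dst]  # True for hub-hub connections

    # Per-edge drop probability: higher for hub-hub edges.
    drop_prob = torch.where(
        hub_hub,
        torch.full((edge_index.size(1),), p_hub),
        torch.full((edge_index.size(1),), p_other),
    )
    keep = torch.rand(edge_index.size(1)) >= drop_prob
    return edge_index[:, keep]

# Example usage on a toy graph with 4 nodes.
edge_index = torch.tensor([[0, 0, 1, 2, 3],
                           [1, 2, 2, 3, 0]])
print(hub_aware_dropedge(edge_index, num_nodes=4))
```

In a training loop, such a sampler would be reapplied at every epoch, analogously to DropEdge, so that different subgraphs are seen across iterations.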