Tackling Long-Tailed Distribution Issue in Graph Neural Networks via Normalization

Published: 01 Jan 2024 · Last Modified: 14 May 2025 · IEEE Trans. Knowl. Data Eng. 2024 · CC BY-SA 4.0
Abstract: Graph Neural Networks (GNNs) have attracted much attention due to their superior learning capability. Despite the successful applications of GNNs in many areas, their performance suffers heavily from the long-tailed node degree distribution. Most prior studies tackle this issue by devising sophisticated model architectures. In this article, we aim to improve the performance of tail nodes (low-degree or hard-to-classify nodes) via a generic and lightweight normalization method. Specifically, we propose a novel normalization method for GNNs, termed ResNorm, which Reshapes a long-tailed distribution into a normal-like distribution via Normalization. ResNorm consists of two operators. First, the scale operator reshapes the distribution of the node-wise standard deviation (NStd) to improve the accuracy of tail nodes. Second, our analysis of the standard shift operation shows that it acts as a preconditioner on the weight matrix, increasing the risk of over-smoothing. To address this issue, we design a new shift operator for ResNorm that simulates the degree-specific parameter strategy at low cost. Extensive experiments on various node classification benchmark datasets validate the effectiveness of ResNorm in improving both the performance of tail nodes and the overall performance.
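The abstract only describes ResNorm at a high level. The following minimal PyTorch-style sketch illustrates how a node-wise normalization with a distribution-reshaping scale and a degree-aware shift could be wired together; the power transform on the NStd, the log-degree shift, and all names (`ResNormSketch`, `power`, `deg`) are illustrative assumptions, not the paper's actual formulation.

```python
import torch
import torch.nn as nn


class ResNormSketch(nn.Module):
    """Illustrative node-wise normalization layer.

    NOTE: a hedged sketch, not the paper's reference implementation.
    The power-based reshaping of the node-wise standard deviation (NStd)
    and the degree-aware shift below are assumptions for illustration.
    """

    def __init__(self, power: float = 0.5, eps: float = 1e-6):
        super().__init__()
        self.power = power  # assumed hyperparameter: how strongly the NStd distribution is flattened
        self.eps = eps

    def forward(self, x: torch.Tensor, deg: torch.Tensor) -> torch.Tensor:
        # x:   node features, shape [num_nodes, num_features]
        # deg: node degrees,  shape [num_nodes]
        mean = x.mean(dim=1, keepdim=True)
        nstd = x.std(dim=1, keepdim=True) + self.eps  # node-wise standard deviation (NStd)

        # Scale: rescale each node so the spread of NStd values across nodes shrinks,
        # pushing the long-tailed NStd distribution toward a more normal-like one.
        target_std = nstd.pow(self.power)
        x_scaled = (x - mean) / nstd * target_std

        # Shift: a cheap degree-aware re-centering standing in for
        # degree-specific parameters (illustrative choice, not the paper's operator).
        shift = mean / torch.log(deg.float().clamp(min=1) + 1).unsqueeze(1)
        return x_scaled + shift


# Example usage: normalize hidden features of a small graph with skewed degrees.
x = torch.randn(5, 16)                     # 5 nodes, 16-dim features
deg = torch.tensor([50, 20, 3, 1, 1])      # long-tailed degree profile
out = ResNormSketch(power=0.5)(x, deg)
print(out.shape)                           # torch.Size([5, 16])
```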