Can Graph Neural Networks Go Deeper Without Over-Smoothing? Yes, With a Randomized Path Exploration!

15 May 2023, OpenReview Archive Direct Upload
Abstract: Graph Neural Networks (GNNs) have emerged as one of the most powerful approaches for learning on graph-structured data, yet they are mostly restricted to shallow architectures. This is because node features tend to become indistinguishable when multiple layers are stacked together, a phenomenon known as over-smoothing. This paper identifies two core properties of common aggregation approaches that may act as primary causes of over-smoothing: recursiveness and aggregation from higher- to lower-order neighborhoods. We therefore address the over-smoothing issue by proposing a novel aggregation strategy that is orthogonal to existing approaches. In essence, the proposed strategy combines features from lower- to higher-order neighborhoods in a non-recursive way by employing a randomized path exploration approach. The efficacy of our aggregation method is verified through an extensive comparative study against state-of-the-art techniques on benchmark datasets for semi-supervised and fully-supervised learning tasks.
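The abstract does not spell out the aggregation rule, but the idea of combining lower- to higher-order neighborhood features non-recursively via random walks can be illustrated with a minimal sketch. The walk count, walk length, and geometric hop weights below are illustrative assumptions, not the authors' exact method.

```python
# Minimal sketch (assumed, not the authors' exact method): non-recursive
# feature aggregation via randomized path exploration. For each node we
# sample a few random walks of length K and average the features seen at
# each hop, so order-1..K neighborhoods contribute directly instead of
# through recursive layer stacking.
import numpy as np

def random_path_aggregate(adj, features, num_walks=10, walk_len=3, rng=None):
    """adj: (N, N) binary adjacency matrix; features: (N, d) node features."""
    rng = np.random.default_rng(rng)
    n, d = features.shape
    neighbors = [np.flatnonzero(adj[v]) for v in range(n)]
    # Per-hop accumulators: hop 0 is the node's own feature.
    hop_sums = np.zeros((walk_len + 1, n, d))
    hop_counts = np.zeros((walk_len + 1, n, 1))
    for v in range(n):
        for _ in range(num_walks):
            cur = v
            hop_sums[0, v] += features[cur]
            hop_counts[0, v] += 1
            for k in range(1, walk_len + 1):
                if neighbors[cur].size == 0:
                    break  # dead end: terminate this walk early
                cur = rng.choice(neighbors[cur])
                hop_sums[k, v] += features[cur]
                hop_counts[k, v] += 1
    hop_means = hop_sums / np.maximum(hop_counts, 1)
    # Combine lower- to higher-order neighborhood summaries non-recursively,
    # here with geometrically decaying weights (an assumed design choice).
    weights = 0.5 ** np.arange(walk_len + 1)
    weights /= weights.sum()
    return np.tensordot(weights, hop_means, axes=1)  # (N, d) aggregated features
```

Because each node's output depends only on raw features sampled along its own walks, not on the aggregated outputs of its neighbors, the combination is non-recursive in the sense the abstract describes.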