A Fractional Graph Laplacian Approach to Oversmoothing

Published: 21 Sept 2023 · Last Modified: 02 Nov 2023 · NeurIPS 2023 poster
Keywords: Graph Neural Networks, Graph Neural ODE, Fractional Laplacian, Oversmoothing
TL;DR: We introduce the fractional graph Laplacian neural ODE, which alleviates oversmoothing and is well-suited for both directed and undirected graphs, as well as various levels of homophily.
Abstract: Graph neural networks (GNNs) have shown state-of-the-art performance in various applications. However, GNNs often struggle to capture long-range dependencies in graphs due to oversmoothing. In this paper, we generalize the concept of oversmoothing from undirected to directed graphs. To this end, we extend the notion of Dirichlet energy by considering a directed symmetrically normalized Laplacian. As vanilla graph convolutional networks are prone to oversmoothing, we adopt a neural graph ODE framework. Specifically, we propose fractional graph Laplacian neural ODEs, which describe non-local dynamics. We prove that our approach allows propagating information between distant nodes while maintaining a low probability of long-distance jumps. Moreover, we show that our method is more flexible with respect to the convergence of the graph’s Dirichlet energy, thereby mitigating oversmoothing. We conduct extensive experiments on synthetic and real-world graphs, both directed and undirected, demonstrating our method’s versatility across diverse graph homophily levels. Our code is available at https://github.com/RPaolino/fLode
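For intuition, the sketch below is a minimal NumPy illustration of the ingredients named in the abstract for the undirected case; it is not the paper's fLode implementation. The helper names sym_norm_laplacian, fractional_power, and dirichlet_energy, the 4-node path graph, the exponent alpha=0.5, and the step size 0.1 are illustrative assumptions. It forms the fractional power L^alpha of a symmetrically normalized Laplacian via eigendecomposition and integrates the linear ODE X' = -L^alpha X with explicit Euler while tracking the Dirichlet energy.

```python
import numpy as np

def sym_norm_laplacian(A):
    # Symmetrically normalized Laplacian L = I - D^{-1/2} A D^{-1/2} (undirected case).
    deg = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    d_inv_sqrt[deg > 0] = deg[deg > 0] ** -0.5
    return np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def fractional_power(L, alpha):
    # L^alpha via eigendecomposition; well defined because L is symmetric PSD.
    lam, U = np.linalg.eigh(L)
    return (U * np.clip(lam, 0.0, None) ** alpha) @ U.T

def dirichlet_energy(X, L):
    # E(X) = trace(X^T L X): small values mean the node features are smooth over the graph.
    return np.trace(X.T @ L @ X)

# Toy undirected path graph on 4 nodes with 2-dimensional node features.
A = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])
L = sym_norm_laplacian(A)
L_alpha = fractional_power(L, alpha=0.5)  # dense: directly couples non-adjacent nodes

# Explicit-Euler integration of the linear fractional heat equation X' = -L^alpha X.
X = np.random.default_rng(0).normal(size=(4, 2))
for step in range(10):
    X = X - 0.1 * (L_alpha @ X)
    print(f"step {step}: Dirichlet energy = {dirichlet_energy(X, L):.4f}")
```

Note that L^alpha is dense even when L is sparse, so a single step couples non-adjacent nodes, which is the non-local behaviour the abstract refers to; the directed setting studied in the paper relies on the authors' directed symmetrically normalized Laplacian rather than the undirected construction above.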
Supplementary Material: pdf
Submission Number: 7454