Dissecting the Diffusion Process in Linear Graph Convolutional Networks

21 May 2021, 20:43 (edited 08 Jan 2022) · NeurIPS 2021 Poster
  • Keywords: Graph Neural Networks, Graph Heat Equation, Linear Models
  • TL;DR: We address the fundamental limitations of linear GCNs from a continuous perspective and make linear GCNs competitive with state-of-the-art nonlinear GCNs.
  • Abstract: Graph Convolutional Networks (GCNs) have attracted increasing attention in recent years. A typical GCN layer consists of a linear feature propagation step and a nonlinear transformation step. Recent works show that a linear GCN can achieve performance comparable to the original nonlinear GCN while being much more computationally efficient. In this paper, we dissect the feature propagation steps of linear GCNs from the perspective of continuous graph diffusion, and analyze why linear GCNs fail to benefit from more propagation steps. Following that, we propose Decoupled Graph Convolution (DGC), which decouples the terminal time from the number of feature propagation steps, making it more flexible and capable of exploiting a very large number of propagation steps. Experiments demonstrate that DGC improves linear GCNs by a large margin and makes them competitive with many modern variants of nonlinear GCNs.
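The decoupling idea in the abstract can be sketched as numerical integration of the graph heat equation: instead of tying the diffusion time to the number of propagation steps (as in a standard linear GCN, where each layer advances diffusion by a fixed unit), a terminal time T is chosen independently and split into K small Euler steps of size T/K. The following is a minimal NumPy sketch under assumptions on my part (dense matrices, symmetric normalization with self-loops, plain Euler integration); the function name `dgc_propagate` and the default values are illustrative, not taken from the authors' code.

```python
import numpy as np

def dgc_propagate(X, A, T=5.0, K=250):
    """Sketch of decoupled diffusion-style propagation.

    Approximately integrates the graph heat equation dX/dt = -(I - S) X
    up to terminal time T using K explicit Euler steps of size T/K,
    where S is the symmetrically normalized adjacency (with self-loops).
    T and K are independent hyperparameters here, unlike a standard
    linear GCN where one propagation step equals one unit of time.
    """
    n = A.shape[0]
    A_hat = A + np.eye(n)                   # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # D^{-1/2}
    S = D_inv_sqrt @ A_hat @ D_inv_sqrt     # normalized adjacency
    dt = T / K                              # step size = terminal time / #steps
    for _ in range(K):
        # one Euler step: X <- X + dt * (S - I) X
        X = (1.0 - dt) * X + dt * (S @ X)
    return X
```

With K large relative to T, the iteration approximates the heat-kernel smoothing exp(-T L_sym) applied to the features, so T controls how much the features are diffused while K only controls the accuracy of the approximation.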
  • Supplementary Material: pdf
  • Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
  • Code: https://github.com/yifeiwang77/DGC