Analysis of Graph Convolutional Networks using Neural Tangent Kernels

23 Jun 2022 (modified: 05 May 2023) · ECML PKDD 2022 Workshop MLG Submission
Keywords: Graph Convolutional Networks, Neural Tangent Kernels, Semi-Supervised Learning
Abstract: Graph Convolutional Networks (GCNs) have emerged as powerful tools for learning on network-structured data. Although empirically successful, GCNs exhibit certain behaviour that has no rigorous explanation—for instance, the performance of GCNs degrades significantly with increasing network depth, whereas it improves marginally with depth when skip connections are used. This paper focuses on semi-supervised learning on graphs and explores the above observations through the lens of Neural Tangent Kernels (NTKs). To analyse the influence of depth, we derive the NTKs corresponding to infinitely wide GCNs with and without skip connections, allowing for a non-linear output layer. While the constancy property of the NTK is lost with a non-linear output layer, we show empirically that the resulting approximation behaves similarly to the linear output case. Using the newly derived NTKs, we analyse the influence of depth in GCNs and provide a comparison of different skip connections.
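To make the NTK viewpoint concrete, below is a minimal, hypothetical sketch (not the paper's code) of estimating the empirical NTK of a small GCN in JAX, with an optional input skip connection. The two-layer architecture, the symmetric normalisation of the adjacency, and the skip variant are illustrative assumptions; the paper derives the exact infinite-width kernels analytically.

```python
# Hypothetical sketch: empirical NTK of a toy two-layer GCN with an
# optional skip connection. Entries NTK_ij = <grad_theta f(x_i), grad_theta f(x_j)>.
import jax
import jax.numpy as jnp

def gcn_forward(params, S, X, skip=False):
    """Two-layer GCN: H = relu(S X W1); out = S H W2 (+ X W3 if skip)."""
    H = jax.nn.relu(S @ X @ params["W1"])
    out = S @ H @ params["W2"]
    if skip:
        out = out + X @ params["W3"]  # simple skip from the input features
    return out.squeeze(-1)            # one scalar output per node

def empirical_ntk(params, S, X, skip=False):
    """Sum J J^T over all parameter groups, where J is the Jacobian wrt theta."""
    jac = jax.jacobian(lambda p: gcn_forward(p, S, X, skip))(params)
    n = X.shape[0]
    # Flatten each Jacobian leaf to (n_nodes, n_params) and accumulate the Gram matrix.
    return sum(j.reshape(n, -1) @ j.reshape(n, -1).T
               for j in jax.tree_util.tree_leaves(jac))

key = jax.random.PRNGKey(0)
n, d, h = 5, 3, 16                     # nodes, input features, hidden width
A = jnp.ones((n, n))                   # toy dense graph (self-loops included)
deg = A.sum(1)
S = A / jnp.sqrt(deg[:, None] * deg[None, :])  # symmetrically normalised adjacency
X = jax.random.normal(key, (n, d))
k1, k2, k3 = jax.random.split(key, 3)
params = {                             # 1/sqrt(fan-in) scaling, as in NTK parameterisation
    "W1": jax.random.normal(k1, (d, h)) / jnp.sqrt(d),
    "W2": jax.random.normal(k2, (h, 1)) / jnp.sqrt(h),
    "W3": jax.random.normal(k3, (d, 1)) / jnp.sqrt(d),
}
print(empirical_ntk(params, S, X, skip=True))  # (n, n) kernel matrix
```

As the hidden width h grows, the empirical kernel of such a network concentrates around its infinite-width limit at initialisation, which is the regime in which the NTKs of the paper are derived; toggling `skip` gives a crude way to compare the kernels of the two architectures at finite width.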