Graph Neural Tangent Kernel and Graph Neural Network Gaussian Processes for Node Classification/Regression

23 Sept 2023 (modified: 11 Feb 2024). Submitted to ICLR 2024.
Primary Area: learning on graphs and other geometries & topologies
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: deep learning, graph neural networks, kernel methods, gaussian processes, neural tangent kernel, graph convolutional networks
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: The Graph Neural Tangent Kernel (GNTK) and Graph Neural Network Gaussian Process (GNNGP) are derived and evaluated for three architectures: the Graph Neural Network (GNN), the Skip-Concatenate GNN, and the Graph Attention Neural Network.
Abstract: This work analyzes Graph Neural Networks, a generalization of fully-connected deep neural networks to graph-structured data, as their width, that is, the number of nodes in each fully-connected layer, increases to infinity. Infinite-width neural networks connect deep learning to Gaussian Processes and kernels, both machine learning frameworks with long traditions and extensive theoretical foundations. Gaussian Processes and kernels have far fewer hyperparameters than neural networks and can be used for uncertainty estimation, making them more user-friendly for applications. This work extends the growing body of research connecting Gaussian Processes and kernels to neural networks. Closed forms of the kernel and the Gaussian Process are derived for a variety of architectures, namely the standard Graph Neural Network, the Graph Neural Network with Skip-Concatenate connections, and the Graph Attention Neural Network. All architectures are evaluated on a variety of datasets on the tasks of transductive node regression and classification. Extending the setting to inductive graph learning tasks is straightforward and is briefly discussed in Section 7.5.
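To illustrate the kind of closed form the abstract refers to, the following is a minimal sketch (not the paper's actual derivation) of a GNNGP-style kernel recursion for a GCN-like architecture with ReLU activations: each infinite-width layer alternates a graph aggregation step with the standard arc-cosine expectation over the ReLU nonlinearity. The function names `relu_expectation` and `gnngp_kernel`, the normalization of the adjacency, and the depth parameter are all illustrative assumptions, not identifiers from the paper.

```python
import numpy as np

def relu_expectation(K):
    """Arc-cosine kernel of degree 1: E[relu(u) relu(v)] for (u, v) ~ N(0, K).

    This is the standard closed form for an infinite-width ReLU layer.
    """
    d = np.sqrt(np.diag(K))
    outer = np.outer(d, d)
    # Guard against division by zero and numerical drift outside [-1, 1].
    c = np.clip(K / np.maximum(outer, 1e-12), -1.0, 1.0)
    theta = np.arccos(c)
    return outer * (np.sin(theta) + (np.pi - theta) * np.cos(theta)) / (2 * np.pi)

def gnngp_kernel(A_hat, X, depth=2):
    """Sketch of a GNNGP covariance for a depth-layer GCN-style network.

    A_hat : (n, n) normalized adjacency matrix (an assumption on the
            aggregation scheme; the paper may normalize differently).
    X     : (n, d) node feature matrix.
    """
    K = X @ X.T / X.shape[1]          # input covariance
    for _ in range(depth):
        K = A_hat @ K @ A_hat.T       # graph aggregation step
        K = relu_expectation(K)       # infinite-width ReLU layer
    return K
```

The returned matrix can be plugged into standard GP regression or kernel ridge regression for transductive node-level prediction, which is what makes the infinite-width view practical: no weight training, only a kernel computation followed by a linear solve.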
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 8114