GraphMix: Regularized Training of Graph Neural Networks for Semi-Supervised Learning

25 Sept 2019 (modified: 22 Oct 2023) · ICLR 2020 Conference Blind Submission · Readers: Everyone
Keywords: Regularization, Graph Neural Networks, Mixup, Manifold Mixup, Semi-Supervised Object Classification over Graph Data
TL;DR: Regularization techniques for training Graph Neural Networks. We show that with our simple method, state-of-the-art results can be achieved even with simpler Graph Neural Network architectures, at virtually no additional computational cost.
Abstract: We present GraphMix, a regularization technique for Graph Neural Network-based semi-supervised object classification, leveraging recent advances in the regularization of classical deep neural networks. Specifically, we propose a unified approach in which we train a fully-connected network jointly with the graph neural network via parameter sharing, interpolation-based regularization, and self-predicted targets. Our proposed method is architecture-agnostic in the sense that it can be applied to any variant of graph neural networks that applies a parametric transformation to the features of the graph nodes. Despite its simplicity, with GraphMix we can consistently improve results and achieve or closely match state-of-the-art performance using even simpler architectures such as Graph Convolutional Networks, across three established graph benchmarks: the Cora, Citeseer, and Pubmed citation network datasets, as well as three newly proposed datasets: Cora-Full, Co-author-CS, and Co-author-Physics.
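The interpolation-based regularization the abstract refers to is the mixup family: training on convex combinations of example pairs and their labels, with the mixing coefficient drawn from a Beta distribution (Manifold Mixup applies the same idea to hidden representations). The sketch below is a generic illustration of that interpolation step, not the authors' exact implementation; the function name and signature are hypothetical.

```python
import numpy as np

def mixup_pair(x_i, y_i, x_j, y_j, alpha=1.0, rng=None):
    """Generic mixup interpolation (a sketch, not GraphMix's exact code):
    combine a pair of feature vectors and their one-hot labels with a
    coefficient lam ~ Beta(alpha, alpha). In GraphMix this kind of
    interpolation regularizes the fully-connected network trained
    jointly with the GNN via parameter sharing."""
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)          # mixing coefficient in [0, 1]
    x_mix = lam * x_i + (1.0 - lam) * x_j  # interpolated features
    y_mix = lam * y_i + (1.0 - lam) * y_j  # interpolated (soft) labels
    return x_mix, y_mix, lam
```

Because the mixed label is a convex combination of one-hot vectors, it remains a valid probability distribution, which is what lets the mixed pair be trained on with an ordinary cross-entropy loss.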
Code: https://github.com/anon777000/GraphMix
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:1909.11715/code)