Neural Graph Machines: Learning Neural Networks Using Graphs

Submitted to ICLR 2017
Abstract: Label propagation is a powerful and flexible semi-supervised learning technique on graphs. Neural network architectures, on the other hand, have proven track records in many supervised learning tasks. In this work, we propose a training objective for neural networks, Neural Graph Machines, for combining the power of neural networks and label propagation. The new objective allows the neural networks to harness both labeled and unlabeled data by: (a) allowing the network to train using labeled data as in the supervised setting, (b) biasing the network to learn similar hidden representations for neighboring nodes on a graph, in the same vein as label propagation. Such architectures with the proposed objective can be trained efficiently using stochastic gradient descent and scaled to large graphs. The proposed method is experimentally validated on a wide range of tasks (multi-label classification on social graphs, news categorization and semantic intent classification) using different architectures (NNs, CNNs, and LSTM RNNs).
Conflicts: google.com, cam.ac.uk
Keywords: Semi-Supervised Learning, Natural Language Processing, Applications
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:1703.04818/code)
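
A minimal sketch (not the authors' released code) of the kind of objective the abstract describes: a supervised loss on labeled examples plus a graph term that biases neighboring nodes toward similar hidden representations. The `TwoLayerNet` model, the squared-Euclidean distance between hidden vectors, and the single weight `alpha` are illustrative assumptions; the paper distinguishes labeled-labeled, labeled-unlabeled, and unlabeled-unlabeled edges with separate weights, collapsed here for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoLayerNet(nn.Module):
    """Small feed-forward network; hidden() exposes the representation
    that the graph term regularizes."""
    def __init__(self, in_dim, hid_dim, n_classes):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hid_dim)
        self.fc2 = nn.Linear(hid_dim, n_classes)

    def hidden(self, x):
        return torch.relu(self.fc1(x))

    def forward(self, x):
        return self.fc2(self.hidden(x))

def ngm_loss(model, x_lab, y_lab, x_edge_u, x_edge_v, edge_w, alpha=0.1):
    """Neural Graph Machines-style objective (sketch).

    x_lab, y_lab     : features and labels of a minibatch of labeled nodes
    x_edge_u, x_edge_v: features of the two endpoints of a minibatch of graph edges
    edge_w           : per-edge weights w_uv
    alpha            : strength of the graph regularizer (illustrative value)
    """
    # (a) supervised term, exactly as in ordinary supervised training
    sup = F.cross_entropy(model(x_lab), y_lab)
    # (b) graph term: pull hidden representations of edge endpoints together,
    #     in the same vein as label propagation
    h_u = model.hidden(x_edge_u)
    h_v = model.hidden(x_edge_v)
    graph = (edge_w * ((h_u - h_v) ** 2).sum(dim=1)).mean()
    return sup + alpha * graph
```

Because both terms decompose over examples and edges, each SGD step can sample a minibatch of labeled nodes and a minibatch of edges independently, which is what makes the objective scalable to large graphs as the abstract claims.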