Graph-Based Neural Network Models with Multiple Self-Supervised Auxiliary Tasks

28 Sept 2020 (modified: 05 May 2023) · ICLR 2021 Conference Withdrawn Submission · Readers: Everyone
Keywords: Graph Neural Networks, Self-supervised learning, Multi-task learning, Graph Convolutional Networks, Semi-supervised learning
Abstract: Self-supervised learning has recently attracted considerable attention, as it allows neural networks to learn robust representations from large quantities of unlabeled data. In addition, multi-task learning can further improve representation learning by training networks simultaneously on related tasks, leading to significant performance gains. In this paper, we propose a general framework for improving graph-based neural network models by combining self-supervised auxiliary learning tasks in a multi-task fashion. Since Graph Convolutional Networks are among the most promising approaches for capturing relationships among structured data points, we use them as a building block and achieve competitive results on standard semi-supervised graph classification tasks.
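The core idea in the abstract, a shared GCN encoder trained jointly on a supervised objective and one or more self-supervised auxiliary objectives, can be illustrated with a minimal sketch. The code below is not the authors' implementation: the `MultiTaskGCN` class, the link-prediction auxiliary task, and the `aux_weight` parameter are illustrative assumptions (plain PyTorch, dense normalized adjacency), shown only to make the multi-task setup concrete.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GCNLayer(nn.Module):
    """Single graph convolution: H' = A_hat @ H @ W (Kipf & Welling style)."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj_norm):
        # adj_norm: symmetrically normalized adjacency with self-loops, shape [N, N].
        return self.linear(adj_norm @ x)


class MultiTaskGCN(nn.Module):
    """Shared GCN encoder with a supervised head and a self-supervised auxiliary loss.

    The auxiliary task used here (predicting real edges vs. sampled non-edges from
    node embeddings) is only one illustrative choice of self-supervised task.
    """

    def __init__(self, in_dim, hidden_dim, num_classes):
        super().__init__()
        self.gc1 = GCNLayer(in_dim, hidden_dim)
        self.gc2 = GCNLayer(hidden_dim, hidden_dim)
        self.cls_head = nn.Linear(hidden_dim, num_classes)

    def forward(self, x, adj_norm):
        h = F.relu(self.gc1(x, adj_norm))
        return self.gc2(h, adj_norm)

    def loss(self, x, adj_norm, labels, label_mask,
             pos_edges, neg_edges, aux_weight=0.5):
        h = self.forward(x, adj_norm)

        # Main semi-supervised objective: cross-entropy on the labelled nodes only.
        logits = self.cls_head(h)
        sup_loss = F.cross_entropy(logits[label_mask], labels[label_mask])

        # Auxiliary self-supervised objective: score node pairs by dot product
        # and classify observed edges against sampled non-edges.
        def edge_scores(edges):  # edges: LongTensor of shape [2, E]
            return (h[edges[0]] * h[edges[1]]).sum(dim=-1)

        pos = edge_scores(pos_edges)
        neg = edge_scores(neg_edges)
        aux_loss = F.binary_cross_entropy_with_logits(
            torch.cat([pos, neg]),
            torch.cat([torch.ones_like(pos), torch.zeros_like(neg)]),
        )

        # Multi-task combination of supervised and self-supervised losses.
        return sup_loss + aux_weight * aux_loss
```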
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Reviewed Version (pdf): https://openreview.net/references/pdf?id=VhrMu5Rmni