ScatterSample: Diversified Label Sampling for Data-Efficient Graph Neural Network Learning

Published: 24 Nov 2022, Last Modified: 05 May 2023
LoG 2022 Poster
Keywords: Graph Neural Network, Active Learning
Abstract: What target labels are most effective for graph neural network (GNN) training? In some applications where GNNs excel, such as drug design or fraud detection, labeling new instances is expensive. We develop a data-efficient active sampling framework, ScatterSample, to train GNNs under an active learning setting. ScatterSample employs a sampling module termed DiverseUncertainty to collect instances with large uncertainty from different regions of the sample space for labeling. To ensure diversification of the selected nodes, DiverseUncertainty clusters the high-uncertainty nodes and selects a representative node from each cluster. Our ScatterSample algorithm is further supported by rigorous theoretical analysis demonstrating its advantage over standard active sampling methods that simply maximize uncertainty without diversifying the samples. In particular, we show that ScatterSample efficiently reduces the model uncertainty over the whole sample space. Our experiments on five datasets show that ScatterSample significantly outperforms other GNN active learning baselines; in particular, it reduces the sampling cost by up to 50% while achieving the same test accuracy.
Type Of Submission: Full paper proceedings track submission (max 9 main pages).
TL;DR: Propose a new active learning algorithm for graph neural networks