No Change, No Gain: Empowering Graph Neural Networks with Expected Model Change Maximization for Active Learning

Published: 21 Sept 2023, Last Modified: 19 Jan 2024 · NeurIPS 2023 spotlight
Keywords: Graph Neural Networks, Expected Model Change Maximization
Abstract: Graph Neural Networks (GNNs) are crucial for machine learning applications with graph-structured data, but their success depends on sufficient labeled data. We present a novel active learning (AL) method for GNNs, extending the Expected Model Change Maximization (EMCM) principle to improve prediction performance on unlabeled data. By presenting a Bayesian interpretation of the node embeddings generated by GNNs in the semi-supervised setting, we efficiently compute a closed-form EMCM acquisition function as the selection criterion for AL, without re-training the model. Our method establishes a direct connection with expected prediction error minimization, offering theoretical guarantees for AL performance. Experiments demonstrate our method's effectiveness compared to existing approaches, in terms of both accuracy and efficiency.
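To make the EMCM selection criterion concrete, below is a minimal sketch of the general EMCM idea for node selection: score each unlabeled node by its expected gradient magnitude under the model's own label distribution. This is an illustrative simplification, not the paper's closed-form Bayesian acquisition function; it assumes a linear classification head on fixed GNN embeddings, for which the cross-entropy gradient with respect to the head's weights for candidate label `y` is `outer(p - y, x)`, whose Frobenius norm factorizes as `||p - y|| * ||x||`. The function names and shapes are hypothetical.

```python
import numpy as np

def emcm_scores(embeddings: np.ndarray, probs: np.ndarray) -> np.ndarray:
    """Illustrative EMCM-style acquisition scores.

    embeddings: (n, d) node embeddings from a trained GNN (assumed fixed)
    probs:      (n, C) predicted class probabilities for the same nodes
    Returns an (n,) array; higher score = larger expected model change.
    """
    n, num_classes = probs.shape
    scores = np.empty(n)
    for i in range(n):
        x, p = embeddings[i], probs[i]
        expected_change = 0.0
        for c in range(num_classes):
            y = np.zeros(num_classes)
            y[c] = 1.0  # candidate one-hot label
            # Gradient norm of cross-entropy w.r.t. a linear head's weights
            # factorizes as ||p - y|| * ||x||; weight it by P(y = c | x).
            expected_change += p[c] * np.linalg.norm(p - y) * np.linalg.norm(x)
        scores[i] = expected_change
    return scores

# Query the node with the largest expected model change.
def select_query(embeddings: np.ndarray, probs: np.ndarray) -> int:
    return int(np.argmax(emcm_scores(embeddings, probs)))
```

Note how the score behaves as intended: a node the model already classifies confidently (probabilities near one-hot) contributes almost no expected gradient and scores near zero, while an uncertain node with a large-norm embedding scores highly, which is exactly the "no change, no gain" intuition in the title.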
Supplementary Material: pdf
Submission Number: 10494