SubMix: Learning to Mix Graph Sampling Heuristics

Published: 08 May 2023, Last Modified: 26 Jun 2023
Venue: UAI 2023
Keywords: Graph, Subgraph Sampling, Graph Neural Network
TL;DR: We develop a subgraph sampling procedure that is end-to-end trainable and train it jointly with graph neural networks.
Abstract: Sampling subgraphs for training Graph Neural Networks (GNNs) is receiving much attention from the GNN community. While a variety of methods have been proposed, each method samples the graph according to its own heuristic. However, there has been little work on mixing these heuristics in an end-to-end trainable manner. In this work, we design a generative framework for graph sampling. Our method, SubMix, parameterizes graph sampling as a convex combination of heuristics. We show that a continuous relaxation of the discrete sampling process allows us to efficiently obtain analytical gradients for training the sampling parameters. Our experimental results illustrate the usefulness of learning graph sampling in three scenarios: (1) robust training of GNNs by automatically learning to discard noisy edge sources; (2) improving model performance by trainable and online edge subset selection; and (3) integrating our framework into state-of-the-art (SOTA) decoupled GNN models on homogeneous OGBN datasets. Our method raises the SOTA on the challenging ogbn-arxiv and ogbn-products benchmarks by over 4 and 0.5 percentage points, respectively.
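The core idea in the abstract (a learnable convex combination of sampling heuristics, made differentiable via a continuous relaxation) can be sketched as follows. This is an illustrative toy, not the authors' code: the heuristic distributions, the Gumbel-softmax relaxation, and all function names here are assumptions chosen to make the mechanism concrete.

```python
import torch

def mix_heuristics(heuristic_probs, logits):
    # heuristic_probs: (K, E) tensor; each row is one heuristic's
    # probability distribution over the E candidate edges.
    # logits: (K,) learnable parameters defining the mixture.
    weights = torch.softmax(logits, dim=0)   # convex combination weights
    return weights @ heuristic_probs         # (E,) mixed edge distribution

def relaxed_sample(edge_probs, tau=0.5):
    # Gumbel-softmax relaxation (an assumed choice of continuous
    # relaxation): returns soft edge-inclusion scores through which
    # gradients can flow back to the mixing logits.
    gumbel = -torch.log(-torch.log(torch.rand_like(edge_probs)))
    return torch.softmax((torch.log(edge_probs + 1e-9) + gumbel) / tau, dim=0)

# Two toy heuristics over 4 candidate edges, e.g. a degree-biased
# sampler and a uniform sampler (made-up numbers for illustration).
heuristics = torch.tensor([[0.70, 0.10, 0.10, 0.10],
                           [0.25, 0.25, 0.25, 0.25]])
logits = torch.zeros(2, requires_grad=True)

mixed = mix_heuristics(heuristics, logits)   # still a valid distribution
soft_mask = relaxed_sample(mixed)            # differentiable "sample"

# A toy downstream loss; in the paper this role is played by the GNN loss.
loss = (soft_mask * torch.tensor([1.0, 2.0, 3.0, 4.0])).sum()
loss.backward()                              # gradients reach the logits
```

In a real pipeline the soft mask would weight edges fed to the GNN, so the mixing weights are trained jointly with the model rather than tuned by hand.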
Supplementary Material: pdf