Learning on Random Balls is Sufficient for Estimating (Some) Graph Parameters

May 21, 2021 (edited Jan 12, 2022) · NeurIPS 2021 Poster
  • Keywords: graph parameters, Benjamini-Schramm convergence, random sampling, graph learning theory, graph classification, GNN
  • TL;DR: A graph parameter is estimable by GNNs with random sampling if and only if it is continuous in the randomized Benjamini-Schramm topology.
  • Abstract: Theoretical analyses of graph learning methods often assume complete observation of the input graph. In practice, such an assumption may not hold for graphs of arbitrary size due to scalability issues. In this work, we develop a theoretical framework for graph classification problems in the partial observation setting (i.e., subgraph sampling). Equipped with insights from graph limit theory, we propose a new graph classification model that works on a randomly sampled subgraph, along with a novel topology that characterizes the representability of the model. Our framework provides a theoretical validation of mini-batch learning on graphs and leads to new learning-theoretic results on generalization bounds as well as size-generalizability without assumptions on the input.
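The sampling primitive underlying Benjamini-Schramm-style approaches is the rooted r-ball: pick a uniformly random root and take its r-hop BFS neighborhood. A minimal sketch of this primitive, assuming a plain adjacency-dict graph representation (the function name `random_ball` and its signature are illustrative, not from the paper's code):

```python
import random
from collections import deque

def random_ball(adj, r, rng=random):
    """Sample a rooted r-ball: choose a uniformly random root and
    return it with the subgraph induced on all vertices within BFS
    distance r. `adj` maps each vertex to a list of its neighbors."""
    root = rng.choice(list(adj))
    dist = {root: 0}
    queue = deque([root])
    while queue:  # standard BFS, truncated at depth r
        u = queue.popleft()
        if dist[u] == r:
            continue
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    ball = set(dist)
    # keep only edges whose endpoints both lie inside the ball
    induced = {u: [v for v in adj[u] if v in ball] for u in ball}
    return root, induced

# Example: on the path graph 0-1-2-3-4, a 1-ball around root 2
# contains exactly the vertices {1, 2, 3}.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
```

A classifier in the paper's partial-observation setting would then be applied to such sampled balls rather than to the full graph.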
  • Supplementary Material: pdf
  • Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
  • Code: zip