On the Expressiveness and Generalization of Hypergraph Neural Networks

Published: 24 Nov 2022, Last Modified: 05 May 2023. LoG 2022 Poster.
Keywords: hypergraph neural networks, neural network expressiveness
Abstract: This extended abstract describes a framework for analyzing the expressiveness, learning, and (structural) generalization of hypergraph neural networks (HyperGNNs). Specifically, we focus on how HyperGNNs can learn from finite datasets and generalize structurally to graph reasoning problems of arbitrary input sizes. Our first contribution is a fine-grained analysis of the expressiveness of HyperGNNs, that is, the set of functions they can realize. Our result is a hierarchy of problems they can solve, defined in terms of hyperparameters such as depth and edge arity. Next, we analyze the learning properties of these neural networks, focusing in particular on how they can be trained on a finite set of small graphs and generalize to larger graphs, which we term structural generalization. Our theoretical results are further supported by empirical results.
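To make the object of the analysis concrete, the following is a minimal sketch of one round of hypergraph message passing over k-ary tuple representations. It is an illustrative assumption, not the architecture from the paper: the function name hypergnn_layer, the max aggregation, and the random matrices standing in for learned weights are all placeholders. Stacking such layers corresponds to the depth hyperparameter, and the tuple arity k to the edge arity, the two quantities the expressiveness hierarchy in the abstract is stated in terms of.

import numpy as np

def hypergnn_layer(tuple_feats, rng):
    # tuple_feats has shape (n, ..., n, d): one d-dimensional feature vector
    # for every ordered k-tuple of nodes, where k = tuple_feats.ndim - 1.
    k = tuple_feats.ndim - 1
    d = tuple_feats.shape[-1]
    # Random placeholder weights standing in for learned linear maps.
    W_self = rng.standard_normal((d, d)) / np.sqrt(d)
    W_agg = rng.standard_normal((k * d, d)) / np.sqrt(k * d)

    # For each tuple position i, pool (max) over all choices of node at i,
    # then broadcast the pooled message back to every tuple.
    pooled = [tuple_feats.max(axis=i) for i in range(k)]
    pooled = [np.broadcast_to(np.expand_dims(p, axis=i), tuple_feats.shape)
              for i, p in enumerate(pooled)]
    messages = np.concatenate(pooled, axis=-1)

    # Combine each tuple's own features with the aggregated messages (ReLU update).
    return np.maximum(tuple_feats @ W_self + messages @ W_agg, 0.0)

# Usage: arity-2 (pairwise) representations over a 5-node graph, depth 2.
rng = np.random.default_rng(0)
feats = rng.standard_normal((5, 5, 8))
for _ in range(2):
    feats = hypergnn_layer(feats, rng)
print(feats.shape)  # (5, 5, 8)

Note that because the update is permutation-invariant over nodes and defined for any number of nodes, the same layer can be applied to graphs larger than those seen during training, which is the setting the structural generalization results concern.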
Type Of Submission: Extended abstract (max 4 main pages).
TL;DR: We present a framework for analyzing the expressiveness and generalization of hypergraph neural networks.