Transferable Hypergraph Neural Networks via Spectral Similarity

Published: 18 Nov 2023, Last Modified: 29 Nov 2023 · LoG 2023 Poster
Keywords: hypergraphs, graph neural networks, graph signal processing, spectral graph theory, hypergraph Laplacian, graph diffusion
TL;DR: Hypergraph Neural Networks built from Graph Neural Networks are transferable if the graph representations are spectrally-similar.
Abstract: Hypergraphs model higher-order interactions in complex systems, e.g., chemicals reacting only in the presence of an enzyme or rumors spreading across groups, and encompass both the notion of an undirected graph and a simplicial complex. Nonetheless, due to computational complexity, machine learning on hypergraph-structured data is notoriously challenging. To address this challenge and enable the transfer of hypergraph neural network models, we extend results on the transferability of Graph Neural Networks (GNNs) to design a convolutional architecture for processing signals supported on hypergraphs via GNNs, which we call Hypergraph Expansion Neural Networks (HENNs). Exploiting multiple spectrally-similar graph representations of hypergraphs, we establish bounds on the transferability error. Experimental results illustrate the importance of considering multiple graph representations in HENNs, and show promise of superior performance when transferability is required.
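The abstract describes processing hypergraph signals by running GNNs on graph representations of the hypergraph. A minimal sketch of one standard such representation, the clique expansion, is below; the variable names and the simple diffusion step are illustrative assumptions, not the architecture from the paper.

```python
import numpy as np

# Toy hypergraph: 4 nodes, 2 hyperedges {0,1,2} and {2,3}.
# H is the node-by-hyperedge incidence matrix (hypothetical example data).
H = np.array([
    [1, 0],
    [1, 0],
    [1, 1],
    [0, 1],
], dtype=float)

# Clique expansion: connect every pair of nodes that share a hyperedge.
A_clique = H @ H.T
np.fill_diagonal(A_clique, 0.0)

# Combinatorial Laplacian of the clique-expansion graph.
L_clique = np.diag(A_clique.sum(axis=1)) - A_clique

# A GNN layer on this expansion acts as a polynomial graph filter;
# e.g., a single heat-diffusion step applied to a node signal x.
x = np.array([1.0, 0.0, 0.0, 0.0])
x_diffused = x - 0.1 * (L_clique @ x)
```

Other expansions (e.g., the star expansion) give different graphs from the same hypergraph; the paper's transferability bounds hinge on such representations being spectrally similar.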
Submission Type: Full paper proceedings track submission (max 9 main pages).
Software: https://github.com/MHayhoe/HGLearning
Submission Number: 192