Hypergraph Attention Isomorphism Network by Learning Line Graph Expansion

Published: 01 Jan 2020, Last Modified: 12 Sept 2023. IEEE BigData 2020.
Abstract: Graph neural networks (GNNs) achieve state-of-the-art performance for node representation and classification in networks. However, most existing GNNs apply only to simple graphs, where an edge connects exactly two nodes. Studies have shown that hypergraphs effectively model real-world relationships that are higher-order in nature. Recently, graph neural networks have been proposed for hypergraphs, but they implicitly use clique or star expansions to convert the hypergraph into a simple graph, or rely on the computationally expensive hypergraph Laplacian. In this work, we propose the Hypergraph Attention Isomorphism Network (HAIN), a novel hypergraph neural network for semi-supervised hypernode classification that operates directly on hypergraphs with varying hyperedge sizes. Each layer operates implicitly on the line graph of the given hypergraph, without ever constructing the line graph explicitly. It also employs a self-attention mechanism to learn the weights of these hyperedge relationships. Experimentally, HAIN improves on state-of-the-art hypernode classification performance on all datasets we use. We make the source code available to ease reproducibility of the results.
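
To make the idea concrete, the NumPy sketch below illustrates the general pattern the abstract describes: node features are pooled into hyperedge embeddings, hyperedges that share nodes (i.e., neighbors in the hypergraph's line graph) exchange attention-weighted messages, and the result is projected back to nodes. This is not the authors' HAIN layer; the incidence matrix H, feature matrix X, projection W, attention vector a, and the GAT-style additive attention form are all illustrative assumptions, and for clarity the sketch materializes the hyperedge co-incidence matrix H^T H, which the paper states its method avoids.

import numpy as np

def softmax_masked(logits, mask):
    """Row-wise softmax over entries where mask is True; zero elsewhere."""
    logits = np.where(mask, logits, -1e9)          # suppress non-neighbors
    logits = logits - logits.max(axis=1, keepdims=True)
    w = np.exp(logits) * mask
    return w / np.maximum(w.sum(axis=1, keepdims=True), 1e-12)

def hyperedge_attention_layer(H, X, W, a, slope=0.2):
    """One illustrative layer: nodes -> hyperedges -> attention among
    hyperedges sharing nodes -> back to nodes. (Illustrative sketch only.)

    H : (n, m) incidence matrix, H[v, e] = 1 if node v belongs to hyperedge e
    X : (n, d) node features
    W : (d, k) projection matrix
    a : (2*k,) attention vector scoring pairs of hyperedge embeddings
    """
    k = W.shape[1]
    Xp = X @ W                                        # project node features, (n, k)
    edge_deg = np.maximum(H.sum(axis=0), 1)[:, None]  # hyperedge sizes, (m, 1)
    E = (H.T @ Xp) / edge_deg                         # mean-pooled hyperedge embeddings, (m, k)

    # Two hyperedges are line-graph neighbors iff they share at least one node.
    adj = (H.T @ H) > 0
    np.fill_diagonal(adj, False)

    # GAT-style additive attention scores over adjacent hyperedge pairs.
    logits = (E @ a[:k])[:, None] + (E @ a[k:])[None, :]
    logits = np.where(logits > 0, logits, slope * logits)   # LeakyReLU
    alpha = softmax_masked(logits, adj)

    E_agg = alpha @ E + E                              # attention-weighted neighbors + self
    node_deg = np.maximum(H.sum(axis=1), 1)[:, None]   # node degrees, (n, 1)
    X_out = (H @ E_agg) / node_deg                     # pull hyperedge messages back to nodes
    return np.maximum(X_out, 0)                        # ReLU

# Toy usage: 4 nodes, 2 hyperedges that share node 1.
rng = np.random.default_rng(0)
H = np.array([[1, 0], [1, 1], [0, 1], [1, 0]], dtype=float)
X = rng.normal(size=(4, 8))
W = rng.normal(size=(8, 4))
a = rng.normal(size=(8,))
print(hyperedge_attention_layer(H, X, W, a).shape)  # (4, 4)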