EdgeMask-HGNN: Learning to Sparsify Hypergraphs for Scalable Node Classification in Hypergraph Neural Networks
Keywords: Hypergraph Neural Networks; Hypergraph Sparsification
Abstract: Hypergraph Neural Networks (HGNNs) have achieved remarkable performance in various learning tasks involving hypergraphs, a data model for higher-order relationships across diverse domains and applications. However, the scalability of HGNNs is limited by the computational and memory demands incurred by dense hypergraph structures. Existing unsupervised sparsifiers address the scalability issue but sacrifice downstream predictive performance. To bridge this gap, we propose **EdgeMask-HGNN**, a novel framework that introduces a learnable, task-aware sparsification mechanism that reduces hypergraph size while preserving predictive performance. EdgeMask-HGNN offers two distinct masking mechanisms, fine-grained node-hyperedge masking and coarse-grained hyperedge-level masking, both trained end-to-end using supervision from the downstream task. We provide a theoretical analysis showing that our approach (i) yields stable model outputs under stochastic masking, and (ii) ensures convergence of the retention probabilities under gradient descent.
Extensive experiments on multiple node classification benchmarks demonstrate that EdgeMask-HGNN reduces, or at worst matches, memory usage on both small- and large-scale hypergraphs without sacrificing accuracy, and in some cases even outperforms HGNNs trained on the full hypergraph. Moreover, EdgeMask-HGNN consistently surpasses unsupervised sparsification baselines such as random, degree-based, and spectral sparsification.
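The abstract does not spell out the masking mechanism beyond the description above, so the following is only a hedged illustration: a minimal PyTorch sketch of what a fine-grained, end-to-end-trainable node-hyperedge mask could look like. The class name `LearnableIncidenceMask`, the Gumbel-sigmoid relaxation, and the 0.5 retention threshold are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class LearnableIncidenceMask(nn.Module):
    """Illustrative fine-grained node-hyperedge mask (not the authors' code).

    Each entry of the |V| x |E| incidence matrix H gets a learnable logit.
    During training, a Gumbel-sigmoid sample gates the entry so the mask
    stays differentiable; at inference, entries whose learned retention
    probability falls below a threshold are dropped, sparsifying H.
    """

    def __init__(self, incidence: torch.Tensor, temperature: float = 0.5):
        super().__init__()
        self.register_buffer("H", incidence)              # dense 0/1 incidence matrix
        self.logits = nn.Parameter(torch.zeros_like(incidence))
        self.temperature = temperature

    def forward(self) -> torch.Tensor:
        if self.training:
            # Reparameterized Bernoulli: logistic noise plus sigmoid relaxation
            # keeps the stochastic mask differentiable for end-to-end training.
            u = torch.rand_like(self.logits).clamp(1e-6, 1 - 1e-6)
            noise = torch.log(u) - torch.log1p(-u)
            gate = torch.sigmoid((self.logits + noise) / self.temperature)
        else:
            # Hard threshold on the learned retention probabilities.
            gate = (torch.sigmoid(self.logits) > 0.5).float()
        return self.H * gate                              # masked (sparsified) incidence

# A coarse-grained hyperedge-level variant would instead keep one logit per
# hyperedge, with logits of shape (1, num_hyperedges) broadcast over columns.
mask = LearnableIncidenceMask(torch.bernoulli(torch.full((8, 5), 0.4)))
H_sparse = mask()  # feed into any incidence-based HGNN propagation step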
Supplementary Material: pdf
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 22002