Abstract: Hypergraph neural networks (HGNNs) effectively model multi-way interactions but suffer from severe scalability limitations, as their computational cost grows quadratically across multiple behavioral contexts. Existing pruning approaches reduce computation with fixed, hand-crafted heuristics, which fail to adapt to diverse graph structures and often distort representations by removing semantically related nodes or creating spurious similarities that degrade contrastive learning. We propose \textbf{TriPrune-HGNN}, an adaptive hypergraph pruning framework with learnable mechanisms that reduces manual hyperparameter tuning by over $80\%$ while achieving a superior accuracy--efficiency tradeoff. TriPrune-HGNN learns pruning schedules from graph statistics and training dynamics, adaptively mines informative contrastive pairs, and automatically balances competing learning objectives via meta-optimization. Extensive experiments on five benchmarks show that TriPrune-HGNN achieves state-of-the-art performance across all 15 evaluation metrics while reducing inference time by $72.3\%$ and memory usage by $81.1\%$ relative to unpruned models. Compared with efficient baselines of similar memory footprint, TriPrune-HGNN attains up to $5.6\%$ lower error, demonstrating the effectiveness of adaptive pruning for large-scale hypergraph learning.
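To make the idea of a pruning schedule learned from graph statistics and training dynamics concrete, here is a minimal PyTorch sketch. It is an illustrative assumption of how such a module could be wired, not the paper's actual architecture: the class name `AdaptivePruneSchedule`, the choice of statistics, and the top-$k$ selection rule are all hypothetical.

```python
import torch
import torch.nn as nn

class AdaptivePruneSchedule(nn.Module):
    """Hypothetical sketch of a learnable pruning schedule.

    A small MLP maps graph statistics plus training progress to a
    keep ratio in (0, 1); the top-scoring hyperedges are retained.
    Statistics and scoring rule are illustrative assumptions.
    """

    def __init__(self, num_stats: int = 3, hidden: int = 16):
        super().__init__()
        # Inputs: graph statistics + one scalar for training progress.
        self.mlp = nn.Sequential(
            nn.Linear(num_stats + 1, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
            nn.Sigmoid(),
        )

    def forward(self, stats: torch.Tensor, progress: float,
                edge_scores: torch.Tensor) -> torch.Tensor:
        # stats: (num_stats,), e.g. mean degree, density, size variance.
        # progress: fraction of training completed, in [0, 1].
        # edge_scores: (num_edges,) learned importance per hyperedge.
        x = torch.cat([stats, stats.new_tensor([progress])])
        keep_ratio = self.mlp(x).squeeze()
        k = max(1, int(keep_ratio.item() * edge_scores.numel()))
        # Return indices of the k most important hyperedges.
        return torch.topk(edge_scores, k).indices


# Usage: prune a toy hypergraph of 100 hyperedges halfway through training.
schedule = AdaptivePruneSchedule()
stats = torch.tensor([4.2, 0.01, 1.3])   # illustrative graph statistics
edge_scores = torch.rand(100)
kept = schedule(stats, progress=0.5, edge_scores=edge_scores)
print(f"kept {kept.numel()} of 100 hyperedges")
```

Because the keep ratio is produced by a differentiable module conditioned on both graph statistics and training progress, it can in principle be trained jointly with the downstream objective rather than hand-tuned, which is the kind of hyperparameter reduction the abstract claims.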
Submission Type: Long submission (more than 12 pages of main content)
Assigned Action Editor: ~Christopher_Morris1
Submission Number: 7346