Hypergraph Neural Networks through the Lens of Message Passing: A Common Perspective to Homophily and Architecture Design

TMLR Paper 3514 Authors

18 Oct 2024 (modified: 09 Nov 2024) · Under review for TMLR · CC BY 4.0
Abstract: Most current learning methodologies and benchmarking datasets in the hypergraph realm are obtained via \emph{lifting} procedures from their graph analogs, which overshadows characteristics specific to hypergraphs. This paper confronts some pending questions in that regard: (Q1) Can the concept of homophily play a crucial role in Hypergraph Neural Networks (HNNs)? (Q2) How do models that exploit unique characteristics of higher-order networks perform compared to lifted models? (Q3) Do well-established hypergraph datasets provide a meaningful benchmark for HNNs? To address these questions, we first introduce a novel conceptualization of homophily in higher-order networks based on a Message Passing (MP) scheme, unifying both the analytical examination and the modeling of higher-order networks. We then investigate natural strategies for processing higher-order structures within HNNs (such as maintaining hyperedge-dependent node representations or performing stochastic node/hyperedge sampling), leading us to the most general MP formulation to date -- MultiSet. Finally, we conduct an extensive set of experiments that contextualize our proposals.
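To make the abstract's notion of hypergraph message passing concrete, the following is a minimal, illustrative sketch of the generic two-stage template (node-to-hyperedge, then hyperedge-to-node aggregation) that HNN schemes build on. All function names here are hypothetical and the mean aggregator is a simplifying assumption; the paper's MultiSet formulation is strictly more general (e.g. it allows hyperedge-dependent node representations).

```python
# Hypothetical sketch of two-stage hypergraph message passing.
# Assumptions: mean aggregation at both stages; a single feature
# vector per node (unlike MultiSet's hyperedge-dependent states).

def mean(vectors):
    """Element-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def hypergraph_mp_step(node_feats, hyperedges):
    """One message-passing round on a hypergraph.

    node_feats: dict mapping node id -> feature vector
    hyperedges: list of sets of node ids (one set per hyperedge)
    Returns a dict of updated node features.
    """
    # Stage 1: each hyperedge aggregates the features of its member nodes.
    edge_feats = [mean([node_feats[v] for v in e]) for e in hyperedges]

    # Stage 2: each node aggregates the features of its incident hyperedges;
    # isolated nodes keep their previous features.
    new_feats = {}
    for v, x in node_feats.items():
        incident = [edge_feats[i] for i, e in enumerate(hyperedges) if v in e]
        new_feats[v] = mean(incident) if incident else x
    return new_feats

# Tiny example: 3 nodes and two overlapping hyperedges.
feats = {0: [1.0], 1: [0.0], 2: [1.0]}
edges = [{0, 1}, {1, 2}]
updated = hypergraph_mp_step(feats, edges)
```

Lifted (clique-expansion) models collapse each hyperedge into pairwise edges before aggregating, whereas this two-stage view keeps the hyperedge as a first-class aggregation unit, which is the distinction the paper's Q2 probes.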
Submission Length: Long submission (more than 12 pages of main content)
Assigned Action Editor: ~Xuming_He3
Submission Number: 3514