Hypergraph Neural Networks through the Lens of Message Passing: A Common Perspective to Homophily and Architecture Design
Primary Area: learning on graphs and other geometries & topologies
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Hypergraph Neural Network, Graph Neural Network, homophily, mini-batching, hypergraph modelling, higher-order interactions
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Most current hypergraph learning methodologies and benchmarking datasets are obtained by lifting procedures from their graph analogs, which overshadows the foundations of hypergraph networks in their own right. This paper confronts several pending questions in that regard: Can the concept of homophily play a crucial role in Hypergraph Neural Networks (HGNNs), similar to its significance in graph-based research? Is there room for improving current hypergraph architectures and methodologies (e.g., by carefully addressing the specific characteristics of higher-order networks)? Do existing datasets provide a meaningful benchmark for HGNNs? Diving into the details, this paper proposes a novel conceptualization of homophily in higher-order networks based on a message passing scheme; this approach harmonizes the analytical frameworks of datasets and architectures, offering a unified perspective for exploring and interpreting complex, higher-order network structures and dynamics. Further, we propose MultiSet, a novel message passing framework that redefines HGNNs by allowing hyperedge-dependent node representations, and introduce a novel architecture, MultiSetMixer, that leverages a new hyperedge sampling strategy. Finally, we provide an extensive set of experiments that contextualize our proposals and yield valuable insights into hypergraph representation learning.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6072