Abstract: In data science, hypergraphs are natural models for data exhibiting multi-way or group relationships, in contrast to graphs, which model only pairwise relationships. Nonetheless, many proposed hypergraph neural networks effectively reduce hypergraphs to undirected graphs via symmetrized matrix representations, potentially losing important multi-way or group information. We propose an alternative approach to hypergraph neural networks in which the hypergraph is represented as a non-reversible Markov chain. We use this Markov chain to construct a complex Hermitian Laplacian matrix, the magnetic Laplacian, which serves as the input to our proposed hypergraph neural network, $\textit{HyperMagNet}$. We study HyperMagNet on the task of node classification and demonstrate its effectiveness over graph-reduction-based hypergraph neural networks.
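As a rough illustration of the abstract's central object, the sketch below builds a magnetic Laplacian from an asymmetric weight matrix, used here as a stand-in for the non-reversible transition structure the paper derives from a hypergraph random walk. The choice of charge parameter `q` and the toy 3-node cycle are illustrative assumptions, not details taken from the submission; this is a minimal sketch of the standard magnetic Laplacian construction, not the paper's exact pipeline.

```python
import numpy as np

def magnetic_laplacian(A: np.ndarray, q: float = 0.25) -> np.ndarray:
    """Construct a complex Hermitian (magnetic) Laplacian from an
    asymmetric nonnegative weight matrix A.

    A stands in for the directed/non-reversible structure (e.g. a
    hypergraph-derived Markov chain); q is a charge parameter that
    controls how strongly directionality is encoded in the phase.
    """
    A_s = (A + A.T) / 2.0                 # symmetrized weights (magnitude)
    Theta = 2.0 * np.pi * q * (A - A.T)   # antisymmetric phase (direction)
    H = A_s * np.exp(1j * Theta)          # Hermitian "magnetic" adjacency
    D = np.diag(A_s.sum(axis=1))          # degree matrix of the symmetrization
    return D - H                          # Hermitian by construction

# Toy asymmetric weights: a directed 3-cycle.
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])
L = magnetic_laplacian(A)
assert np.allclose(L, L.conj().T)         # Hermitian, so eigenvalues are real
evals = np.linalg.eigvalsh(L)
```

Because `L` is Hermitian, its spectrum is real even though its entries are complex, which is what allows spectral convolution layers to be defined on it despite the underlying asymmetry.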
Submission Length: Regular submission (no more than 12 pages of main content)
Previous TMLR Submission Url: https://openreview.net/forum?id=PCikn5sKwq
Changes Since Last Submission: Incorporated changes suggested by the reviewers, including: a section on computational complexity, a discussion and interpretation of the degree-based EDVW in citation networks, a discussion of tf-idf vs. bag-of-words (BoW) representations of NLP data, and directions for future work and applications, among others.
Assigned Action Editor: ~Christopher_Morris1
Submission Number: 3725