Hypergraph Attention Networks

TrustCom 2020 (modified: 27 Oct 2022)
Abstract: Graph neural networks have recently achieved great success in representation learning on graph-structured data. However, these networks consider only pairwise connections between nodes and therefore cannot model the more complicated relations found in real-world data. This has motivated work on hypergraph modeling, and in recent years several hypergraph neural networks have been proposed to aggregate information over a hypergraph for representation learning. In this paper, we present hypergraph attention networks (HGATs) to encode the high-order data relations in a hypergraph. Specifically, the proposed HGATs consist of two modules: an attentive vertex aggregation module and an attentive hyperedge aggregation module. These modules implicitly assign different aggregation weights to the connected hyperedges/vertices to characterize the complex relations among data. We stack these modules to pass messages between hyperedges and vertices and thereby refine the vertex/hyperedge features. Experimental results on the ModelNet40 and NTU2012 datasets show that the proposed HGATs achieve superior performance on visual object recognition tasks. Furthermore, we employ our HGATs for multi-view representation learning and obtain better object classification results.
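The two-module message-passing scheme described in the abstract (attention-weighted vertex-to-hyperedge aggregation followed by attention-weighted hyperedge-to-vertex aggregation) can be sketched roughly as below. This is a minimal NumPy illustration, not the paper's exact formulation: the projection `W`, attention vector `a`, and scoring rule (scores from the sender's features only) are simplifying assumptions.

```python
import numpy as np

def masked_softmax(scores, mask):
    # Softmax over entries where mask is True; masked-out entries get weight 0.
    # Assumes every row of mask has at least one True entry.
    scores = np.where(mask, scores, -np.inf)
    scores = scores - scores.max(axis=-1, keepdims=True)
    w = np.exp(scores) * mask
    return w / w.sum(axis=-1, keepdims=True).clip(min=1e-12)

def hgat_layer(X, H, rng):
    """One vertex -> hyperedge -> vertex message-passing step with attention.

    X: (n, d) vertex features.
    H: (n, m) 0/1 incidence matrix (H[i, j] = 1 iff vertex i is in hyperedge j).
    Returns refined (n, d) vertex features.
    """
    n, d = X.shape
    m = H.shape[1]
    W = rng.standard_normal((d, d)) * 0.1  # shared projection (hypothetical choice)
    a = rng.standard_normal(d) * 0.1       # attention vector (hypothetical choice)
    Z = X @ W

    # Attentive vertex aggregation: each hyperedge attends over its vertices.
    s = Z @ a                                             # (n,) per-vertex score
    alpha = masked_softmax(np.broadcast_to(s, (m, n)),
                           H.T.astype(bool))              # (m, n) weights
    E = alpha @ Z                                         # (m, d) hyperedge features

    # Attentive hyperedge aggregation: each vertex attends over its hyperedges.
    t = E @ a                                             # (m,) per-hyperedge score
    beta = masked_softmax(np.broadcast_to(t, (n, m)),
                          H.astype(bool))                 # (n, m) weights
    return np.tanh(beta @ E)                              # refined vertex features
```

Stacking several such layers, as the paper proposes, alternates the two aggregation steps so that vertex and hyperedge features are iteratively refined.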