Abstract: Node classification is a prominent graph-based task, and a variety of graph neural network (GNN) models have been applied to it. In this paper, we introduce a novel GNN architecture for node classification called Graph Quaternion-Valued Attention Networks (GQAT), which enhances the original graph attention networks by replacing the real-valued vector multiplications in self-attention with quaternion multiplications. A primary advantage of GQAT is a significant reduction in model parameters: a quaternion-valued transformation requires only one quarter of the parameters of its real-valued counterpart, yielding a more lightweight model. Moreover, GQAT excels at capturing intricate relationships between nodes, owing to the expressive structure of quaternion operations. We conduct extensive experiments on the Cora, Citeseer, and Pubmed datasets for node classification. The results demonstrate that GQAT outperforms conventional graph attention networks in node classification accuracy while requiring fewer parameters.
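As a brief illustrative note on the parameter reduction (standard quaternion-layer algebra, not text from the original abstract): the Hamilton product of a quaternion-valued weight $W = W_r + W_i\,\mathbf{i} + W_j\,\mathbf{j} + W_k\,\mathbf{k}$ with an input $x = x_r + x_i\,\mathbf{i} + x_j\,\mathbf{j} + x_k\,\mathbf{k}$ can be written as a real block matrix whose sixteen blocks are all determined by the four components $W_r, W_i, W_j, W_k$:

$$
W \otimes x =
\begin{bmatrix}
W_r & -W_i & -W_j & -W_k \\
W_i &  W_r & -W_k &  W_j \\
W_j &  W_k &  W_r & -W_i \\
W_k & -W_j &  W_i &  W_r
\end{bmatrix}
\begin{bmatrix} x_r \\ x_i \\ x_j \\ x_k \end{bmatrix}.
$$

A dense real-valued map between the same $4d$-dimensional spaces would require $16d^2$ free parameters, whereas the quaternion form uses only $4d^2$, which is the source of the roughly four-fold parameter saving cited above.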