Low-rank and global-representation-key-based attention for graph transformer

Published: 01 Jan 2023, Last Modified: 26 Aug 2024 · Inf. Sci. 2023 · CC BY-SA 4.0
Abstract: Highlights

- We propose a Global-Representation-based attention mechanism for graph neural networks.
- The low-rank mechanism provides structural information on neighbor nodes.
- We address over-smoothing and heterophily issues through various compositions of these mechanisms.
- We justify the model's effectiveness and superiority on eight widely used tasks.
- The proposed model ranks first in a statistical analysis against seven algorithms.
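The highlights above combine two ideas: attention computed against a global graph representation used as the key, and a low-rank transform that injects neighbor-structure information. The sketch below is a minimal, hypothetical illustration of that combination, not the paper's actual implementation: the mean-pooled feature vector stands in for the global representation key, and a rank-`r` factorization `U @ V` stands in for the low-rank mechanism. All names and shapes here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def global_key_attention(X, Wq, Wk, Wv, U, V):
    """Hypothetical sketch of global-representation-key attention.

    X: (n, d) node features.
    Wq, Wk, Wv: (d, d) query/key/value projections.
    U: (d, r), V: (r, d) -- low-rank factors (r << d) standing in
    for the paper's low-rank structural mechanism.
    """
    n, d = X.shape
    g = X.mean(axis=0, keepdims=True)             # (1, d) global representation
    scores = (X @ Wq) @ (g @ Wk).T / np.sqrt(d)   # (n, 1) node vs. global key
    alpha = softmax(scores, axis=0)               # attention weights over nodes
    values = X @ Wv @ U @ V                       # (n, d) low-rank values
    context = alpha.T @ values                    # (1, d) graph-level context
    return values + context                       # broadcast context to nodes

n, d, r = 5, 8, 2
X = rng.standard_normal((n, d))
Wq = rng.standard_normal((d, d))
Wk = rng.standard_normal((d, d))
Wv = rng.standard_normal((d, d))
U = rng.standard_normal((d, r))
V = rng.standard_normal((r, d))
out = global_key_attention(X, Wq, Wk, Wv, U, V)
print(out.shape)  # (5, 8): one updated vector per node
```

Using a single global key makes the attention cost linear in the number of nodes, and the rank-`r` value transform costs O(ndr) instead of O(nd²), which is one plausible reading of how the two mechanisms keep the model efficient.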