Highlights
• We propose a Global-Representation-based attention mechanism for graph neural networks.
• The low-rank mechanism provides structural information about neighboring nodes.
• We address over-smoothing and heterophily issues through various compositions of the mechanism.
• We demonstrate its effectiveness and superiority on eight widely used tasks.
• The proposed model ranks first in a statistical analysis comparing it with seven algorithms.