Abstract: Graph convolutional networks (GCNs) have recently achieved remarkable learning ability on a variety of graph-structured data. However, GCNs generally have limited expressive power due to their shallow structure. In this paper, to improve the expressive power of GCNs, we propose two multi-scale GCN frameworks that incorporate a self-attention mechanism and multi-scale information into the design of GCNs. The self-attention mechanism allows the model to adaptively learn the local structure of a node's neighborhood and thereby achieve more accurate predictions. Extensive experiments on both node classification and graph classification demonstrate the effectiveness of our approaches over several state-of-the-art GCNs.