Abstract: Unsupervised graph representation learning with GNNs is critically important because graph labels are difficult to obtain in many real applications. Graph contrastive learning (GCL), a recently popular method for unsupervised learning on graphs, has achieved great success on many tasks. However, existing graph-level GCL models generally focus on comparing graph-level or node-level representations. The hierarchical structure property, which is ubiquitous in many real-world graphs such as social networks and molecular graphs, is largely ignored. To bridge this gap, this paper proposes a novel hierarchical graph contrastive learning model named HIGCL. HIGCL uses a multi-layered architecture and contains two contrastive objectives: inner-contrasting and hierarchical-contrasting. The former conducts inner-scale contrastive learning to learn the flat structural features within each layer, while the latter performs cross-scale contrastive learning to capture the hierarchical features across layers. Extensive experiments on graph-level tasks demonstrate the effectiveness of the proposed method.
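To make the two objectives concrete, the following is a minimal sketch (not the authors' HIGCL implementation) of how an inner-scale contrastive loss on two views of one layer's node embeddings could be combined with a cross-scale loss against a pooled, coarser-scale summary. The function names, the InfoNCE form, and the soft-assignment pooling are all assumptions for illustration.

```python
# Illustrative sketch only: assumed names and pooling, not the paper's HIGCL code.
import torch
import torch.nn.functional as F


def info_nce(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """InfoNCE loss treating the i-th rows of z1 and z2 as a positive pair."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / tau            # pairwise cosine similarities
    labels = torch.arange(z1.size(0))     # positives lie on the diagonal
    return F.cross_entropy(logits, labels)


def hierarchical_contrast_loss(z_fine: torch.Tensor,
                               z_fine_aug: torch.Tensor,
                               assign: torch.Tensor,
                               tau: float = 0.5) -> torch.Tensor:
    """Inner-scale loss on two augmented views plus a cross-scale loss that
    contrasts fine-level embeddings with pooled coarser-scale summaries."""
    inner = info_nce(z_fine, z_fine_aug, tau)

    # Assumed coarsening step: soft cluster assignments pool node embeddings
    # to a coarser scale, then map the summaries back to each node.
    z_coarse = assign.t() @ z_fine        # [n_clusters, d]
    z_expanded = assign @ z_coarse        # [n_nodes, d]
    cross = info_nce(z_fine, z_expanded, tau)

    return inner + cross


# Toy usage with random tensors standing in for GNN outputs.
n, d, k = 8, 16, 3
z = torch.randn(n, d)
z_aug = z + 0.1 * torch.randn(n, d)                 # a perturbed "view"
assign = torch.softmax(torch.randn(n, k), dim=-1)   # soft cluster assignments
loss = hierarchical_contrast_loss(z, z_aug, assign)
print(loss.item())
```

In such a setup, the inner-scale term aligns two views at the same granularity, while the cross-scale term ties each node to its coarser-scale summary, which is one plausible way to encode hierarchical structure into the learned representations.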