Abstract: Contrastive graph clustering (CGC) has become a prominent self-supervised representation learning approach that contrasts augmented pairs of graph data. However, the performance of CGC methods depends critically on the choice of data augmentation, which often limits network generalization. Moreover, most existing methods characterize positive and negative samples based on the nodes themselves, ignoring the influence of neighbors at different hop distances. In this study, we devise a novel self-cumulative contrastive graph clustering (SC-CGC) method that dynamically adjusts the influence of neighbors at different hops. Our intuition is that similar neighbors should lie closer and dissimilar ones farther apart in the feature space, so neighbor contrasting can be performed without data augmentation. Specifically, SC-CGC relies on two neural networks, an autoencoder (AE) and a graph autoencoder (GAE), to encode node attributes and graph structure, respectively. To make these two networks interact and learn from each other, a dynamic fusion mechanism transfers the knowledge learned by the AE to the corresponding GAE layer by layer. A self-cumulative contrastive loss function is then designed to characterize structural information by dynamically accumulating the influence of nodes at different hops. Finally, our approach jointly refines representation learning and clustering assignments in a self-supervised manner. Extensive experiments on eight real-world datasets demonstrate that SC-CGC consistently outperforms state-of-the-art techniques. The code is available at https://github.com/Xiaoqiang-Yan/JAS-SCCGC.