Abstract: Most previous contrastive learning methods rely on negative samples, but this reliance has significant drawbacks in graph contrastive learning. In a knowledge graph, the latent relationships between nodes are mainly encoded by the edges connecting them; by pushing positive and negative samples arbitrarily far apart, conventional contrastive methods destroy these latent relationships. Therefore, inspired by state-of-the-art contrastive learning methods that require no negative examples, we propose a None-Negative Graph Contrastive Learning method (NGCL) for generalized zero-shot learning. NGCL applies dropout-based graph augmentation so that each graph node has a corresponding positive sample, and the two augmented graphs are aligned through these positive pairs after passing through a graph convolution network and an MLP. Experimental results on real-world datasets without predefined attributes demonstrate the effectiveness of our method.
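The pipeline sketched in the abstract (edge dropout → two augmented views → graph convolution → positive-pair alignment with no negatives) can be illustrated with a minimal toy example. This is a hedged sketch, not the paper's implementation: the function names (`edge_dropout`, `gcn_layer`, `positive_pair_loss`), the single-layer mean-aggregation convolution, the omission of the MLP projector, and the cosine-alignment loss are all illustrative assumptions.

```python
# Hypothetical sketch of negative-free graph contrastive learning.
# Names and the exact loss/aggregation are assumptions, not the paper's code.
import random
import math

def edge_dropout(adj, p, rng):
    """Dropout augmentation: randomly remove each edge with probability p."""
    n = len(adj)
    out = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if adj[i][j] and rng.random() >= p:
                out[i][j] = adj[i][j]
    return out

def gcn_layer(adj, feats):
    """One propagation step: mean of self + surviving-neighbor features."""
    n, d = len(feats), len(feats[0])
    out = []
    for i in range(n):
        nbrs = [j for j in range(n) if adj[i][j]] + [i]  # include self-loop
        out.append([sum(feats[j][k] for j in nbrs) / len(nbrs)
                    for k in range(d)])
    return out

def cosine(u, v):
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    return sum(a * b for a, b in zip(u, v)) / (nu * nv + 1e-8)

def positive_pair_loss(z1, z2):
    """Align each node only with its counterpart in the other view;
    no negative samples are used, so node-node graph structure is not
    pushed apart."""
    return sum(1.0 - cosine(a, b) for a, b in zip(z1, z2)) / len(z1)

rng = random.Random(0)
adj = [[0, 1, 1, 0],
       [1, 0, 1, 0],
       [1, 1, 0, 1],
       [0, 0, 1, 0]]
feats = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]]

# Two independently dropped-out views of the same graph.
view1 = gcn_layer(edge_dropout(adj, 0.3, rng), feats)
view2 = gcn_layer(edge_dropout(adj, 0.3, rng), feats)
loss = positive_pair_loss(view1, view2)
print(0.0 <= loss <= 2.0)  # cosine lies in [-1, 1], so the loss is bounded
```

In a full training loop, the loss would be minimized over encoder parameters (with a predictor MLP and a stop-gradient or momentum branch, as in negative-free methods such as BYOL or SimSiam) to pull each node's two views together without repelling other nodes.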