None-Negative Graph Contrastive Learning for Knowledge-Driven Zero-Shot Learning

Peng Tang, Cheng Xie, Beibei Yu

Published: 01 Jan 2022 · Last Modified: 21 May 2025 · ICEBE 2022 · CC BY-SA 4.0
Abstract: Previous contrastive learning methods typically rely on negative samples, but this reliance introduces major defects in graph contrastive learning. In a knowledge graph, the potential relationships between nodes are chiefly encoded by the edges connecting them; by pushing positive and negative samples arbitrarily far apart, prior contrastive methods destroy these relationships. Therefore, inspired by state-of-the-art contrastive learning methods that require no negative examples, we propose a None-Negative Graph Contrastive Learning method (NGCL) for generalized zero-shot learning. NGCL applies dropout-based graph augmentation so that each graph node has a corresponding positive sample, and the two augmented graphs are compared against these positive samples after graph convolution and an MLP. Experimental results on real-world datasets without predefined attributes demonstrate the effectiveness of our method.
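The pipeline the abstract describes (dropout augmentation of the graph, graph convolution, an MLP projection head, and a positive-pair-only comparison) can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the feature dropout scheme, the single GCN layer, the random weights, and the cosine-similarity objective are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_aug(x, p=0.2):
    # Hypothetical augmentation: randomly zero node features to create one "view".
    mask = rng.random(x.shape) > p
    return x * mask

def gcn_layer(a_hat, x, w):
    # One graph-convolution step: normalized adjacency @ features @ weights, ReLU.
    return np.maximum(a_hat @ x @ w, 0.0)

def mlp(h, w):
    # A single linear projection stands in for the MLP head.
    return h @ w

def positive_pair_loss(z1, z2):
    # Negative-free contrastive objective: pull the two augmented views of each
    # node together via cosine similarity (no negative pairs are pushed apart).
    z1 = z1 / (np.linalg.norm(z1, axis=1, keepdims=True) + 1e-8)
    z2 = z2 / (np.linalg.norm(z2, axis=1, keepdims=True) + 1e-8)
    return float(np.mean(1.0 - np.sum(z1 * z2, axis=1)))

# Toy 4-node graph with self-loops and symmetric normalization.
a = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
a_tilde = a + np.eye(4)
d_inv_sqrt = np.diag(1.0 / np.sqrt(a_tilde.sum(axis=1)))
a_hat = d_inv_sqrt @ a_tilde @ d_inv_sqrt

x = rng.standard_normal((4, 8))       # node features (illustrative)
w_gcn = rng.standard_normal((8, 8))   # GCN weight (illustrative)
w_mlp = rng.standard_normal((8, 4))   # projection weight (illustrative)

# Two dropout-augmented views pass through GCN + MLP and are compared
# only against each other as positive samples.
z1 = mlp(gcn_layer(a_hat, dropout_aug(x), w_gcn), w_mlp)
z2 = mlp(gcn_layer(a_hat, dropout_aug(x), w_gcn), w_mlp)
loss = positive_pair_loss(z1, z2)
```

In this sketch the loss depends only on agreement between the two views of the same node, which is the defining property of negative-sample-free contrastive methods: no inter-node distances are expanded, so edge-encoded relationships between nodes are left intact.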