DropMix: Better Graph Contrastive Learning with Harder Negative Samples

Published: 01 Jan 2023 · Last Modified: 06 Aug 2024 · ICDM (Workshops) 2023 · CC BY-SA 4.0
Abstract: While generating better negative samples for contrastive learning has been widely studied in CV and NLP, very little work has focused on graph-structured data. Recently, Mixup has been introduced to synthesize hard negative samples in graph contrastive learning (GCL). However, because GCL is unsupervised and lacks soft labels, directly mixing sample representations can inadvertently discard information from the original hard negatives and thus degrade the quality of the newly generated harder negatives. To address this problem, we propose DropMix, a novel method for synthesizing harder negative samples that consists of two main steps. First, we select hard negative samples by measuring their hardness from both local and global views of the graph simultaneously. Then, we mix the selected hard negatives on only a subset of representation dimensions, generating harder negatives while reducing the information loss caused by Mixup. Extensive experiments on six benchmark datasets verify the effectiveness of DropMix and show that it leads to better GCL performance. Our data and code are publicly available at https://github.com/Mayueq/DropMix-Code.
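
To make the two-step idea concrete, here is a minimal PyTorch sketch of the abstract's description: pick hard negatives, then apply Mixup on only a random subset of representation dimensions. The cosine-similarity hardness proxy, the function names (select_hard_negatives, dropmix), and the parameters (drop_rate, alpha) are illustrative assumptions, not the paper's actual formulation, which scores hardness from both local and global views.

```python
import torch
import torch.nn.functional as F

def select_hard_negatives(z, anchor_idx, k):
    """Pick the k negatives most similar to the anchor.
    Cosine similarity is one plausible hardness proxy; the paper's
    combined local/global scoring is not reproduced here."""
    sim = F.cosine_similarity(z[anchor_idx].unsqueeze(0), z, dim=1)
    sim[anchor_idx] = -float('inf')  # exclude the anchor itself
    return sim.topk(k).indices

def dropmix(z, idx_a, idx_b, drop_rate=0.5, alpha=1.0):
    """Mix pairs of hard negatives on only a random subset of dimensions;
    unmixed dimensions keep the first negative's values, which is the
    step meant to limit Mixup's information loss."""
    lam = torch.distributions.Beta(alpha, alpha).sample()  # Mixup coefficient
    mask = (torch.rand(z.size(1)) > drop_rate).float()     # 1 = mix this dim
    return mask * (lam * z[idx_a] + (1 - lam) * z[idx_b]) + (1 - mask) * z[idx_a]

# Example: synthesize harder negatives for anchor node 0
z = torch.randn(100, 64)                      # node representations (N, d)
hard = select_hard_negatives(z, anchor_idx=0, k=8)
harder = dropmix(z, hard[:4], hard[4:])       # mix disjoint pairs of hard negatives
```

The synthesized rows in `harder` would then join the negative set in the contrastive loss; the pairing of negatives and the loss itself are left out of this sketch.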
