Full-Attention Driven Graph Contrastive Learning: with Effective Mutual Information Insight

Published: 23 Jan 2024, Last Modified: 23 May 2024 (TheWebConf24)
Keywords: Graph Representation Learning; Graph Contrastive Learning; Graph Transformer; Mutual Information.
Abstract: Graph contrastive learning has shown significant promise in unsupervised scenarios. Many techniques endeavor to maximize the mutual information between two perturbed graphs, but challenges arise when data augmentation alters the graph's informative attributes, leading to potentially noisy positive pairs. Recent approaches have tried to address this issue, but they either fail to guarantee effective data augmentation or incur high computational costs. Only a few studies attempt data augmentation in the encoder's latent space. Full-attention graph transformers yield a richer latent space in which to perform such augmentation, yet full attention also introduces problems of its own, such as noisy information. This paper introduces GACL (Graph Attention Contrastive Learning), a novel model that integrates a full-attention transformer with a message-passing-based graph neural network as the encoder. To suppress the noise introduced by full attention, GACL applies a noise modification to the attention mechanism. GACL thus addresses the challenges posed by full attention and offers an innovative data augmentation strategy. Finally, we establish the concept of effective mutual information and use it to validate the effectiveness of full-attention data augmentation. Empirical evaluations confirm GACL's superior performance, establishing it as a state-of-the-art (SOTA) solution for graph contrastive learning.
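The abstract describes an encoder that fuses full attention with message passing and a contrastive objective that maximizes mutual information between two latent views. The sketch below is a minimal PyTorch interpretation of such a pipeline, not the authors' implementation: the class names, the mean-aggregation message passing, the residual fusion of local and global features, and the InfoNCE-style loss are all illustrative assumptions.

```python
# Minimal sketch (assumed, not GACL's actual code): hybrid message-passing +
# full-attention encoder and an InfoNCE contrastive loss between two views.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HybridEncoder(nn.Module):
    """Message passing (local structure) followed by full attention (global context)."""

    def __init__(self, in_dim, hid_dim, num_heads=4):
        super().__init__()
        self.mp_lin = nn.Linear(in_dim, hid_dim)                 # message-passing projection
        self.attn = nn.MultiheadAttention(hid_dim, num_heads, batch_first=True)
        self.out = nn.Linear(hid_dim, hid_dim)

    def forward(self, x, adj):
        # x: [N, in_dim] node features; adj: [N, N] dense adjacency with self-loops
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        h = F.relu(self.mp_lin(adj @ x / deg))                   # mean-aggregation message passing
        a, _ = self.attn(h.unsqueeze(0), h.unsqueeze(0), h.unsqueeze(0))  # full node-pair attention
        return self.out(h + a.squeeze(0))                        # residual fusion of local + global


def info_nce(z1, z2, tau=0.2):
    """Contrastive loss: corresponding nodes in the two latent views are positives."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / tau                                   # [N, N] similarity matrix
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)


# Toy usage: the two views stand in for a (hypothetical) latent-space augmentation.
if __name__ == "__main__":
    N, D = 8, 16
    x = torch.randn(N, D)
    adj = (torch.rand(N, N) > 0.7).float()
    adj = ((adj + adj.t() + torch.eye(N)) > 0).float()           # symmetrize, add self-loops
    enc = HybridEncoder(D, 32)
    z = enc(x, adj)
    z1 = z + 0.05 * torch.randn_like(z)
    z2 = z + 0.05 * torch.randn_like(z)
    print(info_nce(z1, z2).item())
```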
Track: Graph Algorithms and Learning for the Web
Submission Guidelines Scope: Yes
Submission Guidelines Blind: Yes
Submission Guidelines Format: Yes
Submission Guidelines Limit: Yes
Submission Guidelines Authorship: Yes
Student Author: Yes
Submission Number: 2509