Abstract: Despite remarkable advances in graph contrastive learning, identifying interdependent relationships when maximizing cross-view mutual information remains challenging, primarily due to the complexity of graph topology. In this study, we formulate cross-view interdependence from the novel perspective of information flow. Accordingly, we propose IDEAL, a simple yet effective framework for interdependence-adaptive graph contrastive learning. Compared with existing methods, IDEAL concurrently addresses same-node and distinct-node interdependence, circumvents the reliance on additional distribution-mining techniques, and is augmentation-aware. Moreover, the objective of IDEAL combines the strengths of contrastive and generative learning objectives and is thus capable of learning a uniform embedding distribution while retaining essential semantic information. The effectiveness of IDEAL is validated by extensive empirical evidence: it consistently outperforms state-of-the-art self-supervised methods by considerable margins across seven benchmark datasets of diverse scales and properties, while also demonstrating promising training efficiency.
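To ground the notion of "maximizing cross-view mutual information" mentioned above, the sketch below shows a generic cross-view InfoNCE objective, the standard mutual-information lower bound used in graph contrastive learning. This is an illustrative baseline, not IDEAL's actual objective; the function name `info_nce`, the temperature value, and the toy data are assumptions for the example. Same-node pairs across the two views act as positives (the diagonal of the similarity matrix), and distinct-node pairs act as negatives.

```python
import numpy as np

def info_nce(z1, z2, tau=0.5):
    """Generic cross-view InfoNCE loss (illustrative; not IDEAL's objective).

    z1, z2: (N, d) node embeddings of the same N nodes under two views.
    """
    # L2-normalize so dot products become cosine similarities.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau  # (N, N): entry (i, j) compares node i in view 1 to node j in view 2
    # Row-wise log-softmax; the diagonal holds the same-node (positive) pairs.
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
loss_aligned = info_nce(z, z)                         # identical views: positives dominate
loss_random = info_nce(z, rng.normal(size=(8, 16)))   # unrelated views: no cross-view structure
print(loss_aligned, loss_random)
```

When the two views agree, the diagonal similarities are maximal and the loss is low; for unrelated views the loss approaches log N. IDEAL's contribution, per the abstract, is to go beyond this same-node-only picture by also modeling distinct-node interdependence.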