Graph Contrastive Learning Reimagined: Exploring Universality

Published: 23 Jan 2024, Last Modified: 23 May 2024 · TheWebConf 2024
Keywords: Graph Neural Network, Contrastive Learning for Web Graphs, Graph Representation Learning
Abstract: Graph Contrastive Learning (GCL) is a promising training paradigm for addressing the label scarcity problem on real-world graph data. Despite its strong performance on classical web network tasks such as link prediction, its generalizability to heterophilous networks, such as marriage networks, has yet to be thoroughly explored. The major factors constraining its generalizability are encoders and positive-sample collection strategies built on a strong homophily assumption, which conflicts with the nature of heterophilous graphs. A natural remedy would be to equip GCL with an encoder that has learnable propagation weights, or to generate a more homophilous counterpart of the input graph. However, we verify experimentally that the former is infeasible, and the latter is prohibitive in the self-supervised setting. We therefore identify the primary causes of failure as blind positive-sample collection and the cross-layer decay of pseudo-supervised information. To alleviate these shortcomings, we exploit a characteristic of homophilous graph structure: its adjacency matrix satisfies the block-diagonal property. Based on this, we propose a new graph contrastive learning framework with an inference module for block-diagonal graph structures, called gRaph cOntraStive Exploring uNiversality (ROSEN), which constructs such structures by learning the local subspace correlations between nodes and their neighbors. The inferred structure is applied to the optimization of the contrastive loss, to guide the selection of reliable positive samples from the neighborhood, and to the encoder, to ensure the generation of discriminative node representations. To make graph structure inference and contrastive optimization mutually beneficial, the two processes are updated alternately; theoretically, ROSEN thus follows the expectation-maximization algorithm. Extensive evaluations on real-world graphs, especially heterophilous ones, demonstrate the strong performance and robustness of ROSEN.
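Code Sketch: The abstract describes inferring a block-diagonal structure by learning local subspace correlations between each node and its neighbors, and then using that structure to pick reliable positive samples for the contrastive loss. A minimal, hypothetical sketch of that idea is given below; the function names, the ridge-regularized self-expressive step, and the thresholding rule are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn.functional as F

def local_subspace_affinity(z_anchor, z_neighbors, lam=1e-2):
    """Regress the anchor embedding on its neighbors' embeddings
    (a ridge-regularized self-expressive step) and return the absolute
    coefficients as affinities. Illustrative only."""
    # z_anchor: (d,), z_neighbors: (k, d)
    k = z_neighbors.shape[0]
    gram = z_neighbors @ z_neighbors.T + lam * torch.eye(k)
    coeff = torch.linalg.solve(gram, z_neighbors @ z_anchor)
    return coeff.abs()

def select_positives(z_all, adj_list, tau=0.5):
    """Keep only neighbors whose affinity exceeds a fraction of the
    maximum, approximating a block-diagonal neighborhood."""
    positives = []
    for i, nbrs in enumerate(adj_list):
        if len(nbrs) == 0:
            positives.append([])
            continue
        aff = local_subspace_affinity(z_all[i], z_all[nbrs])
        keep = [n for n, a in zip(nbrs, aff) if a >= tau * aff.max()]
        positives.append(keep)
    return positives

def contrastive_loss(z1, z2, positives, temperature=0.5):
    """InfoNCE-style loss in which, besides the cross-view anchor,
    the selected neighbors of each node also count as positives."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    sim = z1 @ z2.T / temperature
    loss = 0.0
    for i, pos in enumerate(positives):
        pos_idx = torch.tensor([i] + pos)
        loss += -torch.log(sim[i, pos_idx].exp().sum() / sim[i].exp().sum())
    return loss / z1.shape[0]
```

In the framework described by the abstract, this kind of structure inference and the contrastive optimization would be alternated rather than run once, matching the EM-style update the authors mention.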
Track: Graph Algorithms and Learning for the Web
Submission Guidelines Scope: Yes
Submission Guidelines Blind: Yes
Submission Guidelines Format: Yes
Submission Guidelines Limit: Yes
Submission Guidelines Authorship: Yes
Student Author: Yes
Submission Number: 1023