CoCoS: Enhancing Semi-Supervised Learning on Graphs with Unlabeled Data via Contrastive Context Sharing
Abstract: Graph Neural Networks (GNNs) have recently become a
popular framework for semi-supervised learning on graph-structured
data. However, typical GNN models rely heavily
on labeled data during training, while ignoring
or paying little attention to the unlabeled data that are also
available. To make full use of the available data, we propose a
generic framework, Contrastive Context Sharing (CoCoS), to
enhance the learning capacity of GNNs for semi-supervised
tasks. By sharing the contextual information among nodes estimated
to be in the same class, different nodes can be correlated
even if they are unlabeled and remote from each other in
the graph. Models can therefore learn different combinations
of contextual patterns, which improves the robustness of node
representations. Additionally, motivated by recent advances
in self-supervised learning, we augment the context sharing
strategy by integrating it with contrastive learning, which naturally
correlates intra-class and inter-class data. Such operations
utilize all available data for training and effectively improve
a model’s learning capacity. CoCoS can be easily extended
to a wide range of GNN-based models with little computational
overheads. Extensive experiments show that Co-
CoS considerably enhances typical GNN models, especially
when labeled data are sparse in a graph, and achieves stateof-
the-art or competitive results in real-world public datasets.
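The core idea described in the abstract — sharing context among nodes estimated (e.g., via pseudo-labels) to be in the same class, then contrasting each node with its context-shared counterpart — can be sketched as below. This is a hypothetical, simplified illustration, not the paper's implementation: the `share_context` permutation step, the InfoNCE-style loss, and all shapes are assumptions introduced here for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def share_context(h, pseudo_labels, rng):
    """Swap each node's context embedding with that of a random node
    carrying the same pseudo-label (a simplified stand-in for the
    context-sharing step: embeddings are only permuted within a class)."""
    h_shared = h.copy()
    for c in np.unique(pseudo_labels):
        idx = np.flatnonzero(pseudo_labels == c)
        h_shared[idx] = h[rng.permutation(idx)]
    return h_shared

def contrastive_loss(h, h_shared, tau=0.5):
    """InfoNCE-style objective: a node and its context-shared counterpart
    form the positive pair; all other nodes act as negatives."""
    z = h / np.linalg.norm(h, axis=1, keepdims=True)
    z_s = h_shared / np.linalg.norm(h_shared, axis=1, keepdims=True)
    sim = z @ z_s.T / tau                       # (N, N) similarity matrix
    sim -= sim.max(axis=1, keepdims=True)       # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))          # pull positives together

# Toy usage: 6 nodes, 4-dim embeddings, 2 pseudo-classes.
h = rng.normal(size=(6, 4))
labels = np.array([0, 0, 1, 1, 0, 1])
loss = contrastive_loss(h, share_context(h, labels, rng))
print(f"contrastive loss: {loss:.3f}")
```

In the full framework this loss would be combined with the supervised classification loss on the labeled nodes, so that unlabeled nodes still contribute gradient signal through their intra-class positive pairs.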