Abstract: Distributed mobile devices collect unlabeled graph data from their environments. Graph contrastive learning (GCL) methods can learn high-quality node representations from such unlabeled data. However, training a high-performance GCL model requires large-scale data, and the graph data collected by a single device is insufficient. Meanwhile, transmitting local data for centralized training incurs non-negligible privacy leakage and bandwidth consumption. Federated learning (FL), as a distributed learning paradigm, is commonly used to address such issues. Nevertheless, a direct combination of FL and GCL struggles to supplement global graph information. This absence results in missing neighbor information, causing local GCL to learn biased node representations. Moreover, the combination also risks gradient explosion owing to the lack of unified learning criteria. In this paper, we propose a federated GCL framework that complements missing structural information and provides unified learning criteria. The key idea is to achieve cross-client node alignment on the server through local graph structural importance, thereby reasoning about global graph information. We design a hierarchical structural importance scoring method to comprehensively evaluate structural importance, so that the server can perform effective cross-client aggregation while preserving local graph privacy. We demonstrate the security of the proposed framework and prove its bandwidth-reduction advantage. Extensive experiments on three datasets show the superior performance of our method.