Abstract: Contrastive learning shows significant potential for mitigating the performance degradation that data sparsity and knowledge graph (KG) noise cause in recommendation. However, existing methods face two key limitations: (1) heuristic-based structural perturbations for generating contrastive views often overlook the difference in knowledge distribution between head and tail nodes, leading to the loss of local semantic features; (2) in sparse data scenarios, capturing higher-order semantic associations from global paths remains challenging. To address these issues, we propose a novel model, Knowledge Transfer Contrastive learning in Global Path Networks (KTCG). Unlike traditional knowledge transfer paradigms, KTCG employs an unsupervised relation aggregation approach to enhance tail node representations without additional training phases, producing high-quality contrastive views. We design a fine-grained knowledge aggregation mechanism to mine detailed semantic information and collaborative signals from contrastive samples and interactions. Simultaneously, we integrate the augmented views with interaction data to construct a global collaborative KG and develop a global path graph neural network (GNN) to explore the complete graph topology. Finally, contrastive learning is applied across the local semantic and global topological spaces to filter noise and generate robust knowledge-aware representations. Extensive experiments on three public datasets demonstrate that KTCG outperforms advanced baseline models. Further analysis confirms KTCG's advantages in representation uniformity and robustness to data sparsity. The code for this study is publicly available at: https://github.com/use159/KTCG.
DOI: 10.1016/j.knosys.2025.113847
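To illustrate the final step described in the abstract, contrastive learning applied across the local semantic and global topological spaces, the sketch below shows a generic InfoNCE-style cross-view objective. This is a minimal illustration under assumptions, not the authors' exact formulation: the function name `cross_view_infonce`, the temperature value, and the symmetric form of the loss are illustrative choices, and the local/global embeddings are stand-ins for the outputs of the paper's local aggregation and global path GNN modules.

```python
# Minimal sketch (assumed formulation): InfoNCE contrastive loss that aligns each
# node's local semantic embedding with its global topological embedding, using the
# other nodes in the batch as negatives.
import torch
import torch.nn.functional as F


def cross_view_infonce(local_emb: torch.Tensor,
                       global_emb: torch.Tensor,
                       temperature: float = 0.2) -> torch.Tensor:
    """local_emb, global_emb: [batch, dim] embeddings of the same nodes
    produced by the local semantic view and the global path view."""
    z_local = F.normalize(local_emb, dim=-1)
    z_global = F.normalize(global_emb, dim=-1)
    logits = z_local @ z_global.t() / temperature            # pairwise similarities
    labels = torch.arange(logits.size(0), device=logits.device)  # positives on the diagonal
    # Symmetric objective: align local -> global and global -> local.
    return 0.5 * (F.cross_entropy(logits, labels) +
                  F.cross_entropy(logits.t(), labels))


if __name__ == "__main__":
    # In practice these would come from the local aggregation and global path GNN.
    local = torch.randn(64, 128)
    glob = torch.randn(64, 128)
    print(cross_view_infonce(local, glob).item())
```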