Bottlenecked Heterogeneous Graph Contrastive Learning for Robust Recommendation

Published: 2025 · Last Modified: 14 Jan 2026 · ACM Trans. Inf. Syst. 2025 · CC BY-SA 4.0
Abstract: In recommender systems, heterogeneous graph neural networks (HGNNs) have demonstrated remarkable efficacy due to their capacity to harness rich auxiliary information within heterogeneous information networks (HINs). However, existing HGNN-based recommenders face a severe noise-cascading challenge. Substantial data noise can adversely affect the robustness of a recommender, as graph structures are susceptible to noise and even unnoticed malicious perturbations. Moreover, this noise can propagate and accumulate through connected nodes, exerting a profound impact on target nodes within the graph structure. To tackle these noise challenges, we present Bottlenecked Heterogeneous Graph Contrastive Learning (BHGCL), which aims to enhance the robustness of recommender systems. BHGCL first separates fine-grained latent factors from complex self-supervision signals with a disentanglement-based encoder, leveraging the diverse semantic information carried by different meta-paths. Then, by employing the information bottleneck (IB) principle, BHGCL adaptively learns to reduce noise in the augmented graphs: IB retains the minimal sufficient information from the data features, which significantly improves performance in noisy environments. Experiments on multiple real-world datasets show that our approach surpasses state-of-the-art recommendation systems, verifying its effectiveness and robustness. To support reproducibility, our code is available at https://github.com/DuellingSword/BHGCL.
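To make the abstract's objective concrete, the sketch below combines a standard InfoNCE contrastive loss between two meta-path views with a Gaussian KL term of the kind commonly used to instantiate the information bottleneck. This is a minimal illustrative sketch, not the authors' implementation: the function names (`info_nce`, `ib_regularizer`, `bhgcl_style_loss`), the Gaussian-prior form of the IB term, and the weighting coefficient `beta` are all assumptions for illustration; the actual BHGCL model and its loss are defined in the paper and the linked repository.

```python
import numpy as np

def info_nce(z1, z2, tau=0.2):
    """InfoNCE loss between two views; row i of z1 and z2 is the same node."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau                        # pairwise cosine similarities
    sim = sim - sim.max(axis=1, keepdims=True)   # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))           # positives lie on the diagonal

def ib_regularizer(mu, logvar):
    """KL(N(mu, diag(exp(logvar))) || N(0, I)): penalizes information retained
    in the representation, encouraging a minimal sufficient encoding (IB)."""
    return 0.5 * np.mean(np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar, axis=1))

def bhgcl_style_loss(z1, z2, mu, logvar, beta=0.1):
    """Hypothetical combined objective: contrastive agreement across views,
    plus a bottleneck penalty weighted by beta (an assumed hyperparameter)."""
    return info_nce(z1, z2) + beta * ib_regularizer(mu, logvar)
```

A usage sketch: `z1` and `z2` would come from the disentanglement-based encoder applied to two augmented meta-path views, while `mu` and `logvar` parameterize a stochastic embedding whose KL term implements the bottleneck; the KL vanishes exactly when the posterior matches the standard-normal prior.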