Abstract: Knowledge-aware recommendation improves performance by using knowledge graphs as auxiliary information. Recently, researchers have introduced the contrastive learning paradigm into knowledge-aware recommendation to enhance representation learning. However, most contrastive learning methods rely on manually or randomly generated knowledge views, making it difficult to generalize across different data distributions and to mitigate the effects of knowledge noise. To address these issues, we propose a mask diffusion-based contrastive learning method for knowledge-aware recommendation. Specifically, we feed locally masked input to the diffusion model and use a mask prediction paradigm to adaptively generate views from both global and local perspectives, thereby enhancing the model’s generalization capability across different data distributions. Additionally, we propose a conditional inference process that leverages user intentions to provide reasonable denoising guidance. At the same time, we design a collaborative knowledge diffusion loss aimed at improving the consistency between generated data and user behavior patterns. In this way, we combine the diffusion model with contrastive learning for knowledge-aware recommendation, which improves the generalization ability of the model. Experimental results on four datasets demonstrate the effectiveness of our model.
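To make the abstract's idea concrete, the following is a minimal, hypothetical sketch of masked diffusion-based view generation combined with a contrastive objective. It is not the authors' implementation: the class and function names, the MLP denoiser, the linear noise schedule, the mask ratios, and the MSE reconstruction term standing in for the collaborative knowledge diffusion loss are all illustrative assumptions.

```python
# Illustrative sketch only (assumed names and architecture, not the paper's code):
# mask part of the KG-enhanced embeddings, diffuse them with Gaussian noise, and
# denoise conditioned on a user-intent vector; two such adaptive views feed an
# InfoNCE contrastive loss.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MaskedDiffusionViewGenerator(nn.Module):
    """Generates an augmented view of entity embeddings by masking a fraction of them,
    adding diffusion noise, and denoising with guidance from a user-intent vector."""

    def __init__(self, dim: int, mask_ratio: float = 0.3, num_steps: int = 50):
        super().__init__()
        self.mask_ratio = mask_ratio
        # Linear beta schedule for the forward diffusion process (an assumption).
        self.register_buffer("betas", torch.linspace(1e-4, 0.02, num_steps))
        self.register_buffer("alphas_cumprod", torch.cumprod(1.0 - self.betas, dim=0))
        # Simple MLP denoiser conditioned on the noised embedding and the intent vector.
        self.denoiser = nn.Sequential(
            nn.Linear(dim * 2, dim * 2), nn.GELU(), nn.Linear(dim * 2, dim)
        )

    def forward(self, emb: torch.Tensor, intent: torch.Tensor) -> torch.Tensor:
        # 1) Local masking: zero out a random subset of embeddings (mask-prediction target).
        mask = (torch.rand(emb.size(0), 1, device=emb.device) < self.mask_ratio).float()
        masked = emb * (1.0 - mask)
        # 2) Forward diffusion: corrupt the masked input with Gaussian noise at step t.
        t = torch.randint(0, len(self.betas), (emb.size(0),), device=emb.device)
        a_bar = self.alphas_cumprod[t].unsqueeze(-1)
        noised = a_bar.sqrt() * masked + (1.0 - a_bar).sqrt() * torch.randn_like(masked)
        # 3) Conditional denoising: reconstruct the embedding guided by user intent.
        return self.denoiser(torch.cat([noised, intent], dim=-1))


def info_nce(view_a: torch.Tensor, view_b: torch.Tensor, temperature: float = 0.2):
    """Standard InfoNCE contrastive loss between two generated views."""
    a = F.normalize(view_a, dim=-1)
    b = F.normalize(view_b, dim=-1)
    logits = a @ b.t() / temperature                   # pairwise similarities
    labels = torch.arange(a.size(0), device=a.device)  # positives on the diagonal
    return F.cross_entropy(logits, labels)


if __name__ == "__main__":
    dim, batch = 64, 128
    item_emb = torch.randn(batch, dim)     # stand-in for KG-enhanced item embeddings
    user_intent = torch.randn(batch, dim)  # stand-in for learned user-intent vectors

    # Different mask ratios play the role of the "local" and "global" perspectives.
    gen_local = MaskedDiffusionViewGenerator(dim, mask_ratio=0.2)
    gen_global = MaskedDiffusionViewGenerator(dim, mask_ratio=0.6)

    view_local = gen_local(item_emb, user_intent)
    view_global = gen_global(item_emb, user_intent)

    # Contrastive loss between the two views, plus a reconstruction term standing in
    # for a collaborative diffusion-style objective.
    loss = info_nce(view_local, view_global) + F.mse_loss(view_local, item_emb)
    print(float(loss))
```

In this sketch the two generators differ only in mask ratio; in practice the global and local views, the intent conditioning, and the diffusion loss would follow the paper's actual design.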
External IDs: dblp:journals/tkde/LiZLYZ25