CLCDR: Contrastive Learning for Cross-Domain Recommendation to Cold-Start Users

Published: 01 Jan 2022, Last Modified: 30 Sept 2024 · ICONIP (2) 2022 · CC BY-SA 4.0
Abstract: Recent advances in cross-domain recommendation have shown great potential for improving sample efficiency and coping with data sparsity by transferring knowledge from a source domain to a target domain. Previous cross-domain recommendation methods generally rely on extracting information from overlapping users, which limits their performance when overlapping users are scarce. In this paper, we propose a contrastive-learning-based cross-domain recommendation framework for cold-start users that simultaneously transfers knowledge about overlapping users and user-item interactions to optimize the user/item representations. To this end, two contrastive loss functions and two specific learning tasks are proposed. The proposed framework makes fuller use of the information in the source domain and reduces the demand for overlapping users, while maintaining or even enhancing recommendation performance. Experimental results on a real-world dataset demonstrate the effectiveness of our framework on the top-N cross-domain recommendation task.
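As a rough illustration of the kind of objective such a framework builds on, the sketch below shows a generic InfoNCE-style contrastive loss that pulls together the source- and target-domain embeddings of the same overlapping user and pushes apart those of different users in the batch. The function name, batch construction, and temperature are illustrative assumptions for this sketch, not the paper's actual loss functions.

```python
import torch
import torch.nn.functional as F

def overlap_user_contrastive_loss(src_emb: torch.Tensor,
                                  tgt_emb: torch.Tensor,
                                  temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style loss over a batch of overlapping users.

    src_emb: (B, d) embeddings of the users from the source-domain encoder.
    tgt_emb: (B, d) embeddings of the same users from the target-domain encoder.
    Row i of each tensor refers to the same user, so the positives lie on the diagonal.
    """
    # Normalize so dot products are cosine similarities.
    src = F.normalize(src_emb, dim=-1)
    tgt = F.normalize(tgt_emb, dim=-1)

    # (B, B) similarity matrix scaled by the temperature.
    logits = src @ tgt.t() / temperature
    labels = torch.arange(src.size(0), device=src.device)

    # Symmetrize over the two domains (source->target and target->source).
    return 0.5 * (F.cross_entropy(logits, labels) +
                  F.cross_entropy(logits.t(), labels))

# Usage: a batch of 32 overlapping users with 64-dimensional embeddings per domain.
src_emb = torch.randn(32, 64)
tgt_emb = torch.randn(32, 64)
loss = overlap_user_contrastive_loss(src_emb, tgt_emb)
```

In practice such an alignment term would be combined with domain-specific recommendation losses over user-item interactions; the exact combination used in the paper is described in the full text.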