Momentum Contrastive Learning for Sequential Recommendation

Published: 01 Jan 2023 · Last Modified: 03 Feb 2025 · CSCWD 2023 · CC BY-SA 4.0
Abstract: Contrastive self-supervised learning (SSL) based sequential recommendation (SR) models have recently achieved significant performance improvements by addressing the data sparsity problem, which hinders learning high-quality user representations. However, current contrastive SSL based models ignore the importance of consistency between sample pairs. Consistency refers to the degree of similarity between the feature representations of encoded sample pairs: the higher the consistency, the better the learned features. To investigate the benefits of consistency and exploit it effectively, Momentum Contrastive Learning for Sequential Recommendation (MCL4SRec) is designed. Extensive experiments on four public datasets demonstrate the superiority of MCL4SRec, which achieves state-of-the-art performance over existing baselines.
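To make the idea of a momentum contrastive objective for sequence representations concrete, below is a minimal sketch, not the authors' MCL4SRec implementation. It assumes a MoCo-style setup: a query encoder trained by gradients, a key encoder updated only by exponential moving average (EMA), and an in-batch InfoNCE loss where the similarity between two encoded views of the same user sequence plays the role of the "consistency" discussed in the abstract. All module and parameter names (`SequenceEncoder`, `MomentumContrast`, `m`, `tau`) are illustrative assumptions.

```python
# Hedged sketch (not the paper's code): MoCo-style momentum contrastive
# learning over user interaction sequences.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F


class SequenceEncoder(nn.Module):
    """Toy sequence encoder: item embeddings + mean pooling.
    A stand-in for the Transformer/GRU backbones used in SR models."""

    def __init__(self, num_items: int, dim: int = 64):
        super().__init__()
        self.item_emb = nn.Embedding(num_items, dim, padding_idx=0)

    def forward(self, seq: torch.Tensor) -> torch.Tensor:  # seq: (B, L)
        mask = (seq != 0).float().unsqueeze(-1)             # ignore padding
        emb = self.item_emb(seq) * mask
        return emb.sum(1) / mask.sum(1).clamp(min=1.0)      # (B, dim)


class MomentumContrast(nn.Module):
    def __init__(self, num_items: int, dim: int = 64,
                 m: float = 0.999, tau: float = 0.1):
        super().__init__()
        self.q_enc = SequenceEncoder(num_items, dim)        # updated by gradients
        self.k_enc = copy.deepcopy(self.q_enc)              # updated by EMA only
        for p in self.k_enc.parameters():
            p.requires_grad = False
        self.m, self.tau = m, tau

    @torch.no_grad()
    def _momentum_update(self):
        # EMA keeps the key encoder changing slowly, so representations of
        # augmented views stay consistent across training steps.
        for q, k in zip(self.q_enc.parameters(), self.k_enc.parameters()):
            k.data.mul_(self.m).add_(q.data, alpha=1.0 - self.m)

    def forward(self, view_a: torch.Tensor, view_b: torch.Tensor) -> torch.Tensor:
        q = F.normalize(self.q_enc(view_a), dim=-1)         # (B, dim)
        with torch.no_grad():
            self._momentum_update()
            k = F.normalize(self.k_enc(view_b), dim=-1)     # (B, dim)
        # In-batch InfoNCE: diagonal pairs (two views of the same sequence)
        # are positives; higher diagonal similarity = higher "consistency".
        logits = q @ k.t() / self.tau                        # (B, B)
        labels = torch.arange(q.size(0), device=q.device)
        return F.cross_entropy(logits, labels)


if __name__ == "__main__":
    model = MomentumContrast(num_items=1000)
    # Two augmented views (e.g. crop / mask / reorder) of the same sequences.
    view_a = torch.randint(1, 1000, (8, 20))
    view_b = torch.randint(1, 1000, (8, 20))
    loss = model(view_a, view_b)
    loss.backward()
    print(float(loss))
```

The EMA-updated key encoder is the standard way a momentum contrastive method keeps the two branches' representations from drifting apart; how MCL4SRec specifically measures and exploits pairwise consistency is described in the paper itself, not in this sketch.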