Concatenative Contrastive Sampling for Transformer-based Sequential Recommendation

TMLR Paper1751 Authors

28 Oct 2023 (modified: 17 Sept 2024) · Rejected by TMLR · CC BY 4.0
Abstract: Sequential recommendation is a significant research direction in recommender systems that analyzes users' sequential actions to forecast the next item or item sequence they are likely to engage with. This entails deploying machine learning models such as Markov chains (MCs), recurrent neural networks (RNNs), and transformers, chosen for their ability to process sequential data, to uncover the underlying patterns in user histories and generate recommendations. However, most prior approaches, while successfully leveraging user history attributes, are limited in capturing the interplay between user history and new items, as well as the contrastive signals between ground-truth and negative items. To overcome these limitations, we introduce an attention-based sequential recommendation model with a concatenate-then-split structure that explicitly integrates these interactions. Experimental results underscore the efficacy of modeling such interactions, with our model achieving state-of-the-art performance on widely used sequential recommendation benchmarks.
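
To make the concatenate-then-split idea concrete, below is a minimal, hypothetical PyTorch sketch of such a scoring step: user-history embeddings and candidate-item embeddings (the ground-truth item plus sampled negatives) are concatenated, passed jointly through a transformer encoder so candidates can attend to the history, and the candidate positions are then split off and scored. All names and hyperparameters here are illustrative assumptions, not the authors' implementation.

```python
# Sketch of a concatenate-then-split scoring step for sequential recommendation.
# Assumed/illustrative names: ConcatThenSplitScorer, hidden_dim, num_heads.
import torch
import torch.nn as nn

class ConcatThenSplitScorer(nn.Module):
    def __init__(self, num_items: int, hidden_dim: int = 64, num_heads: int = 2):
        super().__init__()
        self.item_emb = nn.Embedding(num_items + 1, hidden_dim, padding_idx=0)
        layer = nn.TransformerEncoderLayer(
            d_model=hidden_dim, nhead=num_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.score = nn.Linear(hidden_dim, 1)

    def forward(self, history: torch.Tensor, candidates: torch.Tensor) -> torch.Tensor:
        # history:    (batch, hist_len)  ids of the user's past interactions
        # candidates: (batch, num_cand)  ids of the positive + sampled negative items
        h = self.item_emb(history)        # (batch, hist_len, hidden_dim)
        c = self.item_emb(candidates)     # (batch, num_cand, hidden_dim)
        # Concatenate history and candidates so attention can model their interplay.
        joint = torch.cat([h, c], dim=1)  # (batch, hist_len + num_cand, hidden_dim)
        encoded = self.encoder(joint)
        # Split: keep only the candidate positions and score each one.
        cand_repr = encoded[:, history.size(1):, :]
        return self.score(cand_repr).squeeze(-1)   # (batch, num_cand)

# Usage: score one positive item against sampled negatives and train contrastively,
# e.g. with cross-entropy treating index 0 (the positive) as the target class.
model = ConcatThenSplitScorer(num_items=1000)
hist = torch.randint(1, 1001, (4, 20))   # 4 users, 20 past interactions each
cands = torch.randint(1, 1001, (4, 5))   # 1 positive + 4 negatives per user
logits = model(hist, cands)              # (4, 5)
loss = nn.functional.cross_entropy(logits, torch.zeros(4, dtype=torch.long))
```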
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Laurent_Charlin1
Submission Number: 1751