Multi-preference Sequence Recommendation Transformer

Published: 2025 · Last Modified: 21 Jan 2026 · WASA (2) 2025 · CC BY-SA 4.0
Abstract: Understanding inter-sequence correlations plays a crucial role in improving sequential recommendation systems. These correlations act as implicit supervision signals that improve preference representation. Yet existing models struggle to jointly capture preferences that users share across sequences and preferences unique to each sequence. To overcome this limitation, we propose a novel model named MRSSR (Multi-preference Representation Sequence Recommendation), which extends the bidirectional Transformer-based recommendation architecture BERT4Rec. Our key contribution is a preference preprocessing mechanism that operates in three stages: preference identification, preference separation, and preference recombination. Special tokens are incorporated into the input sequences to represent multiple user preferences, enabling the model to capture diverse interests within and across user behaviors. Furthermore, our framework employs regularization to ensure the diversity and generalizability of the extracted preferences. We evaluate MRSSR on two public benchmark datasets, Beauty and ML-100k, and the experimental results consistently show that our model outperforms state-of-the-art baselines across multiple ranking metrics. These findings validate the effectiveness of our preference preprocessing framework in enhancing recommendation quality and modeling nuanced user behavior in sequential contexts.
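The special-token scheme described above could be sketched roughly as follows. This is a minimal, hypothetical illustration only: the token ids, the number of preference slots `NUM_PREFS`, and the choice to prepend the tokens are all assumptions, not details taken from the MRSSR paper.

```python
# Hypothetical preprocessing sketch: reserve K special "preference" token ids
# and prepend them to each interaction sequence, so a BERT4Rec-style
# bidirectional Transformer has K dedicated slots to attend to.
PAD, MASK = 0, 1                                  # conventional special ids
NUM_PREFS = 3                                     # K preference slots (assumed)
PREF_TOKENS = [2 + k for k in range(NUM_PREFS)]   # ids 2..4 reserved for prefs
ITEM_OFFSET = 2 + NUM_PREFS                       # shift raw item ids past specials

def preprocess(items, max_len=10):
    """Prepend preference tokens, shift item ids, and left-pad to max_len."""
    seq = PREF_TOKENS + [i + ITEM_OFFSET for i in items]
    seq = seq[-max_len:]                           # keep the most recent tokens
    return [PAD] * (max_len - len(seq)) + seq      # left-pad, BERT4Rec-style

print(preprocess([10, 11], max_len=8))             # → [0, 0, 0, 2, 3, 4, 15, 16]
```

In this sketch the Transformer's hidden states at the preference-token positions would serve as the per-preference representations that the model's separation and recombination stages operate on.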