Abstract: Sequential Recommender Systems (SRS) have emerged as
a promising technique across various domains, excelling at
capturing complex user preferences. Current SRS employ transformer-based models for next-item prediction. However, their quadratic computational complexity
often leads to notable inefficiencies, posing a significant obstacle to real-time recommendation. Recently, Mamba
has demonstrated its exceptional effectiveness in time series
prediction, delivering substantial improvements in both efficiency and effectiveness. However, directly applying Mamba
to SRS poses certain challenges: its unidirectional structure
may impede capturing contextual information in
user-item interactions, while its instability in state estimation may hinder modeling short-term patterns in
interaction sequences. To address these issues, we propose
a novel framework called SelectIve Gated MAmba for Sequential Recommendation (SIGMA). By introducing the Partially Flipped Mamba (PF-Mamba), we construct a special
bi-directional structure to address the context modeling challenge. Then, to consolidate PF-Mamba’s performance, we
employ an input-dependent Dense Selective Gate (DS Gate)
to allocate weights between the two directions and further filter the sequential information. Moreover, for short-sequence
modeling, we devise a Feature Extract GRU (FE-GRU) to
capture the short-term dependencies. Experimental results
demonstrate that SIGMA significantly outperforms existing
baselines across five real-world datasets. Our implementation code is available at https://github.com/Applied-MachineLearning-Lab/SIMGA
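As a rough illustration only (not the paper's implementation), the flip-and-gate idea behind PF-Mamba and the DS Gate can be sketched in plain Python. Here `partially_flip`, `dense_selective_gate`, and `fuse` are hypothetical helpers; in the actual model these operations would act on Mamba hidden-state sequences rather than raw lists:

```python
import math

def partially_flip(seq, k):
    """Hypothetical helper: reverse only the last k items of the sequence,
    keeping the prefix in its original order (an assumption about what
    "partially flipped" means; the paper defines the exact scheme).
    Assumes 1 <= k <= len(seq)."""
    return seq[:-k] + seq[-k:][::-1]

def dense_selective_gate(x, w, b):
    """Input-dependent scalar gate in (0, 1): the sigmoid of a dense
    projection of a (pooled) input representation x."""
    z = sum(xi * wi for xi, wi in zip(x, w)) + b
    return 1.0 / (1.0 + math.exp(-z))

def fuse(forward_out, flipped_out, gate):
    """Weight the forward branch against the partially flipped branch,
    element-wise, using the input-dependent gate."""
    return [gate * f + (1.0 - gate) * r
            for f, r in zip(forward_out, flipped_out)]
```

For example, a sequence `[1, 2, 3, 4, 5]` partially flipped with `k=2` becomes `[1, 2, 3, 5, 4]`, and the gate then decides how much each direction contributes to the fused representation.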