Abstract: Classic recommender systems (RSs) often repeatedly recommend items similar to those in a user's historical profile or recent purchases. To address this, session-based RSs (SBRSs) have been extensively studied in recent years. However, current SBRSs usually assume a rigid-order sequence, which does not hold in many real-world cases. In fact, the next-item recommendation depends not only on the current session context but also on historical sessions, which are often neglected by current SBRSs. Accordingly, an SBRS over relaxed-order sequences that captures both intra- and inter-session context is more pragmatic. Inspired by successful experience in modern language modeling, we design an efficient neural architecture that models both intra- and inter-session context for next-item prediction.
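The abstract does not detail the proposed architecture, but as a minimal sketch of the idea, the hypothetical PyTorch model below encodes the current session order-insensitively (intra-context, via mean pooling rather than a rigid-order recurrence) and attends over summaries of the user's historical sessions (inter-context) to score the next item. All class names, pooling, and attention choices here are illustrative assumptions, not the authors' actual method.

```python
import torch
import torch.nn as nn

class IntraInterSessionRec(nn.Module):
    """Sketch: next-item scorer fusing intra- and inter-session context."""

    def __init__(self, n_items: int, dim: int = 64):
        super().__init__()
        self.item_emb = nn.Embedding(n_items, dim, padding_idx=0)
        # Intra-context: mean pooling over item embeddings gives a
        # relaxed-order summary of the ongoing session.
        self.intra_proj = nn.Linear(dim, dim)
        # Inter-context: the current-session summary attends over
        # summaries of the user's historical sessions.
        self.attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.out = nn.Linear(2 * dim, dim)

    def encode_session(self, items: torch.Tensor) -> torch.Tensor:
        # items: (batch, seq_len) item ids, 0 = padding
        emb = self.item_emb(items)                          # (B, T, D)
        mask = (items != 0).unsqueeze(-1).float()           # (B, T, 1)
        pooled = (emb * mask).sum(1) / mask.sum(1).clamp(min=1.0)
        return self.intra_proj(pooled)                      # (B, D)

    def forward(self, current: torch.Tensor, history: torch.Tensor) -> torch.Tensor:
        # current: (B, T) items of the ongoing session
        # history: (B, S, T) items of S past sessions per user
        intra = self.encode_session(current)                # (B, D)
        B, S, T = history.shape
        hist = self.encode_session(history.reshape(B * S, T)).reshape(B, S, -1)
        inter, _ = self.attn(intra.unsqueeze(1), hist, hist)
        ctx = self.out(torch.cat([intra, inter.squeeze(1)], dim=-1))
        # Score every candidate item against the fused context.
        return ctx @ self.item_emb.weight.T                 # (B, n_items)

# Usage with synthetic data:
model = IntraInterSessionRec(n_items=1000)
cur = torch.randint(1, 1000, (8, 10))       # 8 current sessions, 10 items each
hist = torch.randint(1, 1000, (8, 5, 10))   # 5 historical sessions per user
scores = model(cur, hist)                   # (8, 1000) next-item scores
```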