Abstract: RNNs have been shown to be excellent models for sequential data, and in particular for session-based user behavior. The use of RNNs provides impressive performance benefits over classical methods in session-based recommendations. In this work we introduce a novel ranking loss function tailored for RNNs in recommendation settings. The superior performance of this loss over alternatives, along with further tricks and improvements described in this work, allows us to achieve an overall improvement of up to 35% in terms of MRR and Recall@20 over previous session-based RNN solutions, and up to 51% over classical collaborative filtering approaches. Unlike data augmentation-based improvements, our method does not significantly increase training times.
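The ranking losses proposed for GRU4Rec weight each pairwise comparison between the target item and a sampled negative by a softmax over the negative scores, so that the highest-scoring (hardest) negatives dominate the gradient. Below is a minimal, self-contained sketch of a BPR-max-style loss in this spirit; the function name, the simplified score regularization term, and the single-example formulation are illustrative assumptions, not the paper's reference implementation.

```python
import math

def bpr_max_loss(target_score, negative_scores, reg=1.0):
    """Sketch of a BPR-max-style ranking loss for one positive item
    against a list of sampled negative item scores.

    Each pairwise BPR term sigmoid(r_target - r_j) is weighted by a
    softmax over the negative scores, focusing the loss on hard
    negatives. `reg` scales a simplified penalty on negative scores;
    this is an illustrative simplification, not the exact published form.
    """
    # Softmax weights over negative scores (numerically stabilized).
    m = max(negative_scores)
    exps = [math.exp(s - m) for s in negative_scores]
    z = sum(exps)
    weights = [e / z for e in exps]

    # Softmax-weighted sigmoid of the pairwise score differences.
    s = sum(w * (1.0 / (1.0 + math.exp(-(target_score - r))))
            for w, r in zip(weights, negative_scores))

    # Simplified regularization pushing negative scores toward zero.
    penalty = reg * sum(w * r * r for w, r in zip(weights, negative_scores))

    return -math.log(s + 1e-12) + penalty
```

Because the weights concentrate on high-scoring negatives, raising the target score relative to the hardest negatives lowers the loss, which is the property the paper's ranking-max family is designed to exploit.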
TL;DR: Improving session-based recommendations with RNNs (GRU4Rec) by up to 35% using newly designed loss functions and sampling.
Keywords: gru4rec, session-based recommendations, recommender systems, recurrent neural network
Code: [hidasib/GRU4Rec](https://github.com/hidasib/GRU4Rec) · [10 community implementations on Papers with Code](https://paperswithcode.com/paper/?openreview=ryCM8zWRb)
Community Implementations: [5 code implementations on CatalyzeX](https://www.catalyzex.com/paper/recurrent-neural-networks-with-top-k-gains/code)