LLM-BRec: Personalizing Session-based Social Recommendation with LLM-BERT Fusion Framework

Published: 31 May 2024, Last Modified: 18 Jun 2024 · Gen-IR@SIGIR24 · CC BY 4.0
Keywords: Recommendation Systems, Social Recommendations, Large Language Models (LLMs), BERT, Session-based Recommendation
TL;DR: This paper introduces a framework to personalize Session-based Social Recommendation (SSR) systems by leveraging BERT's transformer architecture for session modeling and LLMs for user-profile generation.
Abstract: Recommendation models enhance online user engagement by suggesting personalized content, boosting satisfaction and retention. Session-based Recommender systems (SR) have become a significant approach, focusing on capturing users' short-term preferences for more accurate recommendations. Recently, Session-based Social Recommendation (SSR) has emerged as a new paradigm that extends SR by incorporating users' social networks and historical sessions, aiming to offer more personalized recommendations. However, current SSR models have two significant limitations. First, they do not efficiently exploit users' personalized information, as they focus only on current-session information. Second, they rely on computationally heavy graph-based algorithms for session representations, which significantly hampers efficiency, especially during inference. To address these problems, this paper proposes a novel fusion framework, "LLM-BRec," which incorporates Large Language Models (LLMs) and Bidirectional Encoder Representations from Transformers (BERT) to personalize SSR. For session modeling, BERT's transformer architecture and self-attention mechanism are used to improve computational efficiency by emphasizing relevant contextual information. Additionally, we leverage an LLM for user-profile generation to further enrich representations at the inference stage. LLM-BRec significantly reduces SSR training and inference time and consistently outperforms state-of-the-art (SOTA) methods. Experiments on two social datasets and two non-social datasets demonstrate the effectiveness and efficiency of LLM-BRec.
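To make the described fusion concrete, here is a minimal toy sketch of the general idea: a self-attention-style pooling over session item embeddings (a stand-in for the paper's BERT session encoder) fused with an embedding of an LLM-generated user profile to score candidate items. All names, dimensions, and the random inputs below are illustrative assumptions, not the authors' actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
EMB_DIM = 8  # hypothetical embedding size

def encode_session(item_embs: np.ndarray) -> np.ndarray:
    """Toy attention pooling over session item embeddings
    (stand-in for the BERT transformer session encoder)."""
    scores = item_embs @ item_embs.mean(axis=0)   # relevance to session context
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                      # softmax attention weights
    return weights @ item_embs                    # attention-weighted session vector

def fuse(session_vec: np.ndarray, profile_vec: np.ndarray, w: np.ndarray) -> np.ndarray:
    """Fuse the session representation with the LLM-derived user-profile embedding."""
    return np.concatenate([session_vec, profile_vec]) @ w

# Hypothetical inputs: a 5-item session and a profile embedding that would,
# in the real system, come from embedding LLM-generated profile text.
session_items = rng.normal(size=(5, EMB_DIM))
user_profile = rng.normal(size=EMB_DIM)
w = rng.normal(size=(2 * EMB_DIM, EMB_DIM))       # learned fusion weights (random here)

user_repr = fuse(encode_session(session_items), user_profile, w)

# Score candidate items by dot product with the fused user representation.
candidates = rng.normal(size=(10, EMB_DIM))
scores = candidates @ user_repr
top_item = int(scores.argmax())
```

This sketch only illustrates why the fusion can be cheap at inference: the profile embedding is computed once per user, while the session vector is a single attention pass over the current session, with no graph propagation.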
Submission Number: 10