SPLiT: Popularity-Bias-Aware Online Prompt Optimization for LLM-based Recommendation

17 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: Recommender Systems, Large Language Models, Prompt Optimization
TL;DR: We propose SPLiT, an online prompt optimization algorithm that mitigates popularity bias in preference summaries, improving recommendation performance in LLM-based recommender systems.
Abstract: Large Language Model (LLM)-based recommender systems often rely on preference summaries to condense a user's interaction history and help the model better capture the user's interests. The quality of downstream recommendations depends heavily on how accurately these summaries align with true preferences. However, prior work has overlooked popularity bias in the summaries: they often over-represent popular items, which degrades recommendation quality. Moreover, the inherent randomness of LLMs produces summaries of varying fidelity and bias. To address this, we propose an online learning approach that identifies the most accurate and least biased preference summary. We formulate preference summary selection as a Contextual Bayesian Optimization with Constrained Set problem and introduce the Semantic Popularity Lift-based Preference Summary selecTion (SPLiT) framework. SPLiT incorporates a Semantic Popularity Lift penalty that quantifies how much a summary amplifies popularity bias. The penalty discourages selecting high-bias summaries and guides the choice toward those that better reflect the user's true preferences. SPLiT significantly improves recommendation performance by mitigating popularity bias, achieving 13.8% higher Normalized Discounted Cumulative Gain and 6.9% higher Hit Rate than the best baseline. This highlights the importance of popularity-bias-aware summary selection for debiasing prompt optimization, advancing fairness and accuracy in LLM-based recommender systems.
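To make the selection mechanism concrete, here is a minimal sketch of penalized summary selection in the spirit the abstract describes. It assumes per-item popularity counts and a semantic similarity score between each candidate summary and each item; the function names (`popularity_lift`, `select_summary`), the lift definition, and the trade-off weight `lam` are illustrative assumptions, not the paper's actual formulation.

```python
# Hypothetical sketch: score each candidate preference summary by an
# estimated reward minus a popularity-lift penalty, then pick the best.
# All names and weights here are illustrative, not from the paper.

def popularity_lift(summary_sims, item_popularity):
    """Similarity-weighted mean popularity of the items a summary
    emphasizes, divided by the unweighted mean popularity.
    A lift > 1 means the summary over-represents popular items."""
    total_sim = sum(summary_sims)
    weighted = sum(s * p for s, p in zip(summary_sims, item_popularity)) / total_sim
    baseline = sum(item_popularity) / len(item_popularity)
    return weighted / baseline

def select_summary(candidates, reward_estimates, lam=0.5):
    """Choose the summary maximizing estimated recommendation reward
    minus a popularity-lift penalty (lam is an assumed trade-off weight)."""
    best = max(candidates,
               key=lambda c: reward_estimates[c["id"]] - lam * c["lift"])
    return best["id"]

# Toy example: summary B is semantically tilted toward the popular item,
# so A is selected despite a slightly lower reward estimate.
pops = [100, 10, 5, 2]  # item popularity counts (assumed)
a = {"id": "A", "lift": popularity_lift([0.2, 0.9, 0.8, 0.7], pops)}
b = {"id": "B", "lift": popularity_lift([0.9, 0.1, 0.1, 0.1], pops)}
print(select_summary([a, b], {"A": 0.70, "B": 0.75}))  # prints "A"
```

In the paper, the reward estimates would come from the online (contextual Bayesian optimization) loop rather than being fixed numbers, and the lift would be computed semantically from the summary text; this sketch only illustrates how a lift penalty can flip the selection away from a biased summary.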
Primary Area: other topics in machine learning (i.e., none of the above)
Submission Number: 9689