Cold-Start Personalization via Training-Free Priors from Structured World Models

Published: 02 Mar 2026, Last Modified: 05 Mar 2026 · LLA 2026 Poster · CC BY 4.0
Keywords: Personalization, Proactive Preference Elicitation, Adaptive Questioning, Preference Elicitation, Cold-Start
TL;DR: Proactive preference elicitation for personalization in cold-start settings, for users with sparse and diverse preferences
Abstract: Cold-start personalization requires inferring preferences from minimal interaction when no user-specific historical data is available. The space of possible preferences is vast, yet users care about only a sparse subset and rarely articulate them upfront; combined with limited interaction budgets, this makes preference elicitation challenging. Our key insight is that preferences exhibit predictable structure across populations; e.g., users who want detailed explanations often also value worked examples. We propose PEP (Preference Elicitation with Priors), a principled system decomposition framework for cold-start personalization: learning a structured world model of preference correlations offline using latent variables, then performing Bayesian inference online without retraining. Even simple belief model instantiations (e.g., linear regression) substantially outperform end-to-end RL. Across medical, mathematical, social, and commonsense reasoning, PEP achieves 80.8% alignment with ground-truth user preferences versus 68.5% for RL, requires 3-5× fewer interactions, and adapts twice as often. Our contribution is a principled decomposition of cold-start personalization that makes Bayesian preference elicitation practical at scale for LLM systems.
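The abstract describes a two-phase decomposition: learn a structured prior over correlated preference dimensions offline from population data, then perform closed-form Bayesian inference online with no retraining. The paper itself does not give implementation details, so the sketch below is purely illustrative: it assumes a linear-Gaussian belief model (consistent with the abstract's remark that even simple instantiations such as linear regression work well), with a hypothetical Kalman-style update and an uncertainty-driven question selector. All variable names and numbers are invented for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Offline: estimate a structured prior from population data. ---------
# Rows = users, columns = preference dimensions (e.g. "wants detailed
# explanations", "values worked examples"). The correlation between the
# first two columns models the "predictable structure across populations"
# the abstract mentions.
population = rng.normal(size=(500, 3))
population[:, 1] = 0.8 * population[:, 0] + 0.2 * rng.normal(size=500)

prior_mean = population.mean(axis=0)
prior_cov = np.cov(population, rowvar=False)

# --- Online: closed-form Bayesian update after each elicited answer. ----
def update(mean, cov, question, answer, noise_var=0.25):
    """Condition the Gaussian belief on a noisy scalar answer to a
    question, represented as a weight vector over preference dimensions
    (a rank-1 linear-Gaussian / Kalman update)."""
    q = np.asarray(question, dtype=float)
    s = q @ cov @ q + noise_var            # predictive variance of the answer
    gain = cov @ q / s                     # Kalman gain
    new_mean = mean + gain * (answer - q @ mean)
    new_cov = cov - np.outer(gain, q @ cov)
    return new_mean, new_cov

def next_question(cov, candidates):
    """Pick the candidate question the current belief is least sure
    about (largest predictive variance)."""
    return max(candidates, key=lambda q: np.asarray(q) @ cov @ np.asarray(q))

# Simulated user with a sparse, fixed preference vector.
true_pref = np.array([1.0, 0.8, -0.5])
candidates = [np.eye(3)[i] for i in range(3)]

mean, cov = prior_mean, prior_cov
for _ in range(2):
    q = next_question(cov, candidates)
    answer = float(q @ true_pref)          # noiseless user for the demo
    mean, cov = update(mean, cov, q, answer)
```

Because the prior covariance was learned offline, a single answer about one dimension also shifts the belief over its correlated partner, which is what lets this style of elicitation get by with fewer questions than uncorrelated or end-to-end approaches.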
Submission Number: 189