Abstract: Personalized text summarization requires models to adapt summaries to users whose interests evolve over time. Existing approaches often compress user history into static personas, rely on long-context prompting that can fail over extended interaction histories, or encode user behavior without explicitly distinguishing actions such as clicks, skips, and summary requests. We propose **IMPerSumm**, an information-modulated user preference encoder for personalized text summarization. IMPerSumm represents user histories as trajectories in a User-Interaction Graph (UIG), where behavior triples encode temporally ordered document, summary, and action transitions. The model dynamically modulates information flow between past interactions and new action-induced signals using KDE-based mutual information estimation, action-specific gates, and multi-scale memory kernels that capture short-term reactivity, long-term drift, and rare-event reinforcement. The resulting preference representation is used to predict the next behavior embedding, extract and contextualize a latent personalized summary representation, and generate the final personalized summary through a frozen pretrained summarization decoder. Experiments on PENS, OpenAI-Reddit, and PersonalSum show that IMPerSumm improves personalized summarization quality over prior personalized summarizers and strong LLM baselines under PerSEval-based evaluation. The learned encoder also transfers to news recommendation on MIND, indicating that action-aware preference modeling can support both generative and predictive personalization.
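The information modulation step described above can be illustrated with a minimal sketch. The snippet below is a hypothetical simplification, not the paper's implementation: it estimates mutual information between two 1-D interaction signals with a plug-in Gaussian KDE estimator (Scott-style bandwidths), then uses a sigmoid of that estimate as a scalar gate blending the past preference state with a new action-induced signal. The function names, the bandwidth rule, and the sigmoid gating form are all assumptions for illustration.

```python
import numpy as np

def _kde_density(data, points, bw):
    """Product-Gaussian KDE. data: (d, n) samples; points: (d, m) query
    points; bw: (d,) per-dimension bandwidths. Returns densities (m,)."""
    d, n = data.shape
    diff = (points[:, :, None] - data[:, None, :]) / bw[:, None, None]  # (d, m, n)
    kernels = np.exp(-0.5 * np.sum(diff ** 2, axis=0))                  # (m, n)
    norm = n * np.prod(bw) * (2.0 * np.pi) ** (d / 2.0)
    return kernels.sum(axis=1) / norm

def kde_mutual_information(x, y):
    """Plug-in estimate I(X;Y) ~ E[log p(x,y) - log p(x) - log p(y)],
    averaged over the samples themselves (hypothetical estimator form)."""
    n = len(x)
    joint = np.vstack([x, y])
    bw_j = joint.std(axis=1) * n ** (-1.0 / 6.0)   # Scott's rule, d = 2
    bw_m = n ** (-1.0 / 5.0)                       # Scott's rule, d = 1
    eps = 1e-12
    log_pj = np.log(_kde_density(joint, joint, bw_j) + eps)
    log_px = np.log(_kde_density(x[None], x[None], np.array([x.std() * bw_m])) + eps)
    log_py = np.log(_kde_density(y[None], y[None], np.array([y.std() * bw_m])) + eps)
    return float(np.mean(log_pj - log_px - log_py))

def modulated_update(past_state, new_signal, mi):
    """Blend the past preference state with a new action-induced signal,
    gated by a sigmoid of the MI estimate (illustrative gating form)."""
    gate = 1.0 / (1.0 + np.exp(-mi))
    return gate * new_signal + (1.0 - gate) * past_state
```

Under this sketch, a strongly correlated pair of signals yields a larger MI estimate, and hence a gate closer to 1, letting the new signal dominate the update; near-independent signals leave the past state mostly intact.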