Step-Back Profiling: Distilling User Interactions for Personalized Scientific Writing

IJCAI 2024 Workshop AI4Research, Submission 9

Published: 03 Jun 2024, Last Modified: 05 Jun 2024, AI4Research 2024, CC BY 4.0
Keywords: Personalization, Scientific generation, LLMs
Abstract: Large language models (LLMs) excel at a variety of natural language processing tasks, yet they struggle to generate personalized content for individuals, particularly in real-world settings like scientific writing. Addressing this challenge, we introduce step-back profiling that personalizes LLMs by abstracting user interactions into concise profiles. Our approach effectively condenses user interaction history, distilling it into profiles that encapsulate essential traits and preferences of users, thus facilitating personalization that is both effective and user-specific. Importantly, step-back profiling is a low-cost and easy-to-implement technique that does not require additional fine-tuning. Through evaluation of the LaMP benchmark, which encompasses a spectrum of language tasks requiring personalization, our approach outperformed the baseline, showing improvements of up to 3.6 points. We curated the Personalized Scientific Writing (PSW) dataset to further study multi-user personalization in challenging real-world scenarios. This dataset requires the models to write scientific papers given specialized author groups with diverse academic backgrounds. On PSW, we demonstrate the value of capturing collective user characteristics via step-back profiling for collaborative writing. Extensive experiments and analysis validate our method's state-of-the-art performance and broader applicability -- an advance that paves the way for more user-tailored scientific applications with LLMs.
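The abstract describes a two-stage idea: first distill raw user interactions into a concise profile, then condition generation on that profile rather than on the full history. A minimal sketch of that pipeline follows; the function names, prompt wording, and the stub `call_llm` are illustrative assumptions, not the authors' actual implementation, and a real system would replace the stub with an LLM API call.

```python
def call_llm(prompt: str) -> str:
    """Stand-in for a real LLM API call; returns canned responses here."""
    if "Summarize" in prompt:
        return "Profile: prefers concise, formal academic prose."
    return "Generated text conditioned on the user profile."


def step_back_profile(interaction_history: list[str]) -> str:
    """Distill raw user interactions into a concise profile (the 'step back')."""
    history = "\n".join(interaction_history)
    prompt = (
        "Summarize the following user interactions into a short profile "
        f"of the user's traits and preferences:\n{history}"
    )
    return call_llm(prompt)


def personalized_generate(task: str, profile: str) -> str:
    """Condition generation on the distilled profile, not the raw history."""
    prompt = f"User profile: {profile}\nTask: {task}\nMatch the user's style."
    return call_llm(prompt)


history = ["Edited abstract for brevity.", "Requested a formal tone in reviews."]
profile = step_back_profile(history)
print(personalized_generate("Draft a paper introduction.", profile))
```

Because the profile is short, it fits in the prompt of any off-the-shelf model, which is why the abstract can claim the method needs no fine-tuning; for the multi-author PSW setting one would presumably build one profile per author and concatenate them.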
Submission Number: 9