References Indeed Matter? Reference-Free Preference Optimization for Conversational Query Reformulation

ACL ARR 2025 February Submission754 Authors

11 Feb 2025 (modified: 09 May 2025) · ACL ARR 2025 February Submission · CC BY 4.0
Abstract: Conversational query reformulation (CQR) has become indispensable for improving retrieval in dialogue-based applications. However, existing approaches typically rely on reference passages for optimization, which are **impractical** to acquire in real-world scenarios. To address this limitation, we introduce a novel **reference-free** preference optimization framework _**DualReform**_ that generates **pseudo** reference passages from **commonly-encountered** conversational datasets containing only queries and responses. _**DualReform**_ attains this goal through two key innovations: (1) **response-based inference**, where responses serve as proxies to infer pseudo reference passages, and (2) **response refinement via the dual-role of CQR**, where a CQR model refines responses based on the shared objectives between response refinement and CQR. Despite not relying on reference passages, _**DualReform**_ achieves 96.9–99.5% of the retrieval accuracy attainable only with reference passages and surpasses the state-of-the-art method by up to 30.5%.
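The abstract's two innovations can be pictured as a simple loop: responses stand in for missing reference passages, and the CQR model itself helps refine those responses. The sketch below is a minimal illustration of that flow, assuming toy stand-in functions (`infer_pseudo_reference`, `refine_response`, `toy_cqr`); these names and the string-level "refinement" are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of the two-step DualReform idea from the abstract.
# All function names and the toy data below are illustrative assumptions.

def infer_pseudo_reference(response: str) -> str:
    """Step 1 (response-based inference): treat the dialogue response as a
    proxy from which a pseudo reference passage is derived. Here the
    'inference' is simply keeping the response text itself."""
    return response.strip()

def refine_response(cqr_model, query: str, response: str) -> str:
    """Step 2 (dual-role refinement): the CQR model rewrites the query, and
    the rewrite is used to refine the response toward the shared retrieval
    objective. The concatenation below is a toy stand-in for a generator."""
    rewritten = cqr_model(query)
    return f"{rewritten}: {response.strip()}"

def dualreform_iteration(cqr_model, dialogue):
    """One pass over (query, response) pairs, producing pseudo reference
    passages that could later supervise preference optimization."""
    pseudo_refs = []
    for query, response in dialogue:
        refined = refine_response(cqr_model, query, response)
        pseudo_refs.append(infer_pseudo_reference(refined))
    return pseudo_refs

# Toy CQR model: resolves a coreference by substituting the entity (assumption).
toy_cqr = lambda q: q.replace("it", "the movie Inception")

dialogue = [
    ("who directed it", "Christopher Nolan directed the film."),
]
print(dualreform_iteration(toy_cqr, dialogue))
```

In the paper's setting the refined responses would feed a preference-optimization objective over candidate reformulations; the sketch stops at pseudo-reference generation, which is the part the abstract specifies.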
Paper Type: Long
Research Area: Information Retrieval and Text Mining
Research Area Keywords: Information Retrieval and Text Mining, Machine Learning for NLP
Contribution Types: Approaches to low-resource settings, Data resources
Languages Studied: English
Submission Number: 754