Dense Query Representations as an Alternative to Text Queries for Dense Conversational Search

ACL ARR 2026 January Submission9399 Authors

06 Jan 2026 (modified: 20 Mar 2026) · CC BY 4.0
Keywords: Conversational Search, Representation Learning
Abstract: Query rewriting in conversational search aims to transform context-dependent user utterances, often containing pronouns or ellipses, into well-formed queries suitable for retrieval. Existing approaches predominantly rely on generative language models trained with human-written queries, optimizing a language modeling objective that is inherently misaligned with the ranking objective of retrievers. As a result, improvements in query rewriting do not necessarily translate into retrieval performance gains, and feedback from retrievers cannot directly influence the rewriter. In this work, we address this objective mismatch by proposing a simple yet effective framework, DenseQR, that replaces text queries with dense representations. Instead of generating text, our model directly produces dense embeddings that are consumed by a dense retriever, enabling end-to-end optimization under the retrieval objective. This design eliminates the reliance on costly human-written rewrites and allows the representation model to be trained solely from retrieval supervision. Experiments on widely used conversational search benchmarks demonstrate that our approach consistently outperforms state-of-the-art generative query rewriting methods when paired with dense retrievers, achieving superior retrieval effectiveness while reducing training complexity.
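The abstract describes training the representation model purely from retrieval supervision: the dense query embedding is scored against passage embeddings, and a ranking loss pulls it toward the relevant passage. As an illustration only (the paper does not specify its loss; `info_nce_loss` and the temperature value are assumptions), a contrastive objective of this kind can be sketched as:

```python
import math

def dot(u, v):
    """Dot-product similarity between two dense vectors."""
    return sum(a * b for a, b in zip(u, v))

def info_nce_loss(query, pos_doc, neg_docs, temperature=0.05):
    """Hypothetical retrieval-supervision loss: the predicted query
    embedding is scored against one relevant passage and K negatives;
    minimizing the loss ranks the relevant passage first."""
    # Scores of the query against the positive and negative passages.
    scores = [dot(query, d) / temperature for d in [pos_doc] + list(neg_docs)]
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    # Negative log-probability of the positive passage (index 0).
    return -math.log(exps[0] / sum(exps))

# A query embedding aligned with the relevant passage yields a near-zero loss.
q = [1.0, 0.0, 0.0, 0.0]
pos = [1.0, 0.0, 0.0, 0.0]
negs = [[0.0, 1.0, 0.0, 0.0], [0.0, 0.0, 1.0, 0.0]]
loss = info_nce_loss(q, pos, negs)
```

Because this loss is differentiable with respect to the query embedding, gradients flow from the retriever's ranking objective directly into the representation model, which is the objective alignment the abstract argues for.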
Paper Type: Long
Research Area: Information Extraction and Retrieval
Research Area Keywords: Machine Learning for NLP
Contribution Types: NLP engineering experiment, Approaches low compute settings-efficiency
Languages Studied: English
Submission Number: 9399