Abstract: Conversational retrieval is an emerging research direction in information retrieval, gaining popularity because it aligns with users' natural search habits. Existing conversational retrieval methods supplement the current query's context by concatenating historical queries. Although simple and effective, this technique struggles with topic shifts as conversations deepen in complex scenarios. We therefore propose a zero-shot response-aware expansion method for conversational retrieval, called ZeRA. This method expands queries at three levels. At the term level, it enriches query semantics by acquiring word-level expansions of the current query based on proximity principles. At the sentence level, it establishes the thematic scope of the current query by using the first query of the current round as an extension. At the passage level, since adjacent queries often share similar semantics, it introduces the response document from the previous query to supplement the implicit information in the current query. Finally, we use ColBERT as the retriever without any conversational training data, making this a zero-shot conversational retrieval approach. To test the effectiveness of the proposed method, we conducted extensive experiments on the CAsT-19 and CAsT-20 datasets. Results across multiple evaluation metrics show that, compared to similar zero-shot methods, ZeRA significantly improves conversational retrieval effectiveness.
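The three expansion levels described above might be combined roughly as follows. This is a minimal sketch assuming simple text concatenation; the function name, parameters, and the way expansions are merged are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of ZeRA-style three-level query expansion.
# How term-level expansions are obtained (the "proximity principles")
# is not shown here; we assume they are given as a list of terms.

def expand_query(current_query, first_query, prev_response, term_expansions):
    """Build an expanded query from three levels of conversational context."""
    parts = [current_query]
    # Term level: word-level expansions enrich the current query's semantics.
    parts.extend(term_expansions)
    # Sentence level: the first query of the round anchors the thematic scope.
    parts.append(first_query)
    # Passage level: the previous turn's response document supplies
    # implicit information the current query leaves unstated.
    if prev_response:
        parts.append(prev_response)
    return " ".join(parts)
```

The expanded string would then be passed to the retriever (ColBERT in the paper) in place of the raw query, requiring no conversational training data.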
External IDs: dblp:conf/webi/WangCHZWS24