Reduce Human Labor On Evaluating Conversational Information Retrieval System: A Human-Machine Collaboration Approach

Published: 07 Oct 2023, Last Modified: 01 Dec 2023
Venue: EMNLP 2023 Main
Submission Type: Regular Long Paper
Submission Track: Resources and Evaluation
Keywords: Interactive Evaluation, Human-Machine Collaboration, Conversational Information Retrieval
Abstract: Evaluating conversational information retrieval (CIR) systems is challenging because it requires a large amount of human annotation labor, so more labor-efficient evaluation methods are needed. To address this challenge, we take the first step in applying active testing to CIR evaluation and propose a novel method called HomCoE. It strategically selects a small amount of data for human annotation and then calibrates the evaluation results to eliminate evaluation bias. In this way, it evaluates a CIR system accurately with little human labor. Experiments show that it consumes less than 1\% of the human labor while achieving a 95\%-99\% consistency rate with human evaluation results, demonstrating its superiority over other baselines.
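To make the abstract's two-stage idea concrete, below is a minimal, hypothetical Python sketch of a human-machine collaborative evaluation loop: a small, strategically chosen subset of items is sent for human annotation, and the remaining automatic scores are calibrated with a simple bias correction. The function names (`evaluate_with_human_calibration`, `human_annotate`), the uncertainty heuristic, and the mean-shift calibration are illustrative assumptions, not the authors' HomCoE algorithm.

```python
import numpy as np

def evaluate_with_human_calibration(auto_scores, annotate_budget, human_annotate):
    """Hypothetical sketch of human-machine collaborative evaluation.

    auto_scores     : per-item scores from an automatic evaluator (in [0, 1]).
    annotate_budget : number of items the human annotator will label.
    human_annotate  : callable mapping an item index to a human score.
    """
    auto_scores = np.asarray(auto_scores, dtype=float)

    # Select the items the automatic evaluator is least certain about
    # (here, scores closest to 0.5) for human annotation.
    uncertainty = -np.abs(auto_scores - 0.5)
    picked = np.argsort(uncertainty)[-annotate_budget:]

    # Collect human labels on the selected subset.
    human_scores = np.array([human_annotate(i) for i in picked], dtype=float)

    # Estimate the evaluation bias of the automatic scorer on that subset
    # and use it to calibrate the remaining automatic scores.
    bias = float(np.mean(human_scores - auto_scores[picked]))
    calibrated = auto_scores + bias
    calibrated[picked] = human_scores  # keep exact human labels where available

    # Return the calibrated system-level evaluation score.
    return float(np.mean(calibrated))
```

Usage would pair this with any automatic CIR evaluator: score all conversations automatically, label only `annotate_budget` of them (e.g., under 1\% of the data), and report the calibrated mean as the system-level result.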
Submission Number: 3159