Benchmarking Clarifying Questions for Effective Collaboration in Grounded Instruction-Based Interactions

Anonymous

16 Dec 2023 · ACL ARR 2023 December Blind Submission
TL;DR: We explore when to ask and what to ask as clarifying questions in a Minecraft environment, so that agents can better understand and execute human instructions, and we provide tools and datasets for further research.
Abstract: Motivated by the adaptability of human intelligence across tasks and multi-modal environments, the research community is actively developing interactive agents that can converse naturally with humans and assist them in real-world tasks. Such agents need the ability to request feedback, in the form of situated clarifying questions, when communication breaks down or instructions are unclear. This paper presents an extensive investigation of the production of clarifying questions in human-centered, instruction-based interaction, using a Minecraft environment as a grounding framework. This scenario poses unique challenges: the agent must navigate and complete tasks in a complex virtual environment, relying on natural language instructions and action states. We make the following contributions: 1) a crowd-sourcing tool for collecting grounded language instructions, together with clarifying questions issued when instructions are unclear, at scale and at low cost; 2) a substantial dataset of grounded language instructions accompanied by clarifying questions; and 3) several state-of-the-art baselines for requesting feedback when instructions are unclear. These contributions provide a foundation for further research.
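To make the "when to ask" part of the baseline task concrete, below is a minimal sketch of a binary classifier that flags instructions needing clarification. The tiny inline dataset, the TF-IDF features, and the logistic-regression model are illustrative assumptions only; they are not the paper's actual data or baselines.

```python
# A minimal sketch of a "when to ask" baseline: binary classification of
# whether an instruction is clear enough to execute or needs a clarifying
# question. The toy examples and model choice are assumptions for
# illustration, not the paper's dataset or reported systems.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy instructions: label 1 = ambiguous, the agent should ask a question.
instructions = [
    "Build a 3x3 stone wall to the north of the blue block.",
    "Put it over there.",
    "Place five red blocks in a row starting at the origin.",
    "Make the thing bigger.",
]
needs_clarification = [0, 1, 0, 1]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(instructions, needs_clarification)

# Predict on a new instruction; 1 means "ask a clarifying question".
print(clf.predict(["Move that one somewhere else."]))
```

Stronger baselines would replace the bag-of-words features with a pretrained language model and condition on the action state as well as the instruction text, but the underlying decision remains this binary classification.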
Paper Type: long
Research Area: Dialogue and Interactive Systems
Contribution Types: Publicly available software and/or pre-trained models, Data resources
Languages Studied: English