Enhancing Knowledge Selection with Data Processing Based on Multiple Turns of Dialog in Knowledge-Grounded Open-Domain Conversations
Abstract: In natural language generation (NLG), recent research on Knowledge-Grounded Text Generation (KGTG) aims to improve the specificity and naturalness of generated sentences. When generating text, it is important to consider multiple turns within a conversation, as this allows models to produce sentences that reflect the conversational context. For open-domain conversations, however, the optimal amount of conversation history to use when training knowledge selection models has not been explored. This study aims to improve KGTG models so that they handle complex utterances effectively by progressively incorporating more turns of dialogue into training. Our findings offer a foundational direction for improving knowledge selection models in text generation.
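As a rough illustration of the kind of data processing the abstract describes, the following minimal Python sketch builds knowledge-selection training examples with progressively longer history windows. The function names, separator token, and data layout are assumptions for illustration, not the paper's actual implementation.

```python
# Hypothetical sketch: build one training example per history length
# k = 1..max_turns, pairing the last k dialogue turns with the gold
# knowledge sentence. Names and format are assumptions, not the paper's.

from typing import Dict, List


def build_examples(dialog: List[str], knowledge_label: str,
                   max_turns: int = 4) -> List[Dict[str, str]]:
    """Create training examples that progressively widen the
    dialogue-history window used as context."""
    examples = []
    for k in range(1, min(max_turns, len(dialog)) + 1):
        # Join the last k turns (oldest first) into one context string.
        context = " [SEP] ".join(dialog[-k:])
        examples.append({"context": context, "knowledge": knowledge_label})
    return examples


if __name__ == "__main__":
    turns = [
        "Do you like jazz?",
        "Yes, especially bebop.",
        "Who is your favorite musician?",
    ]
    for ex in build_examples(turns, "Charlie Parker was a bebop pioneer."):
        print(ex)
```

Under this setup, comparing models trained on different values of max_turns would be one way to probe how much conversation history helps knowledge selection.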