Optimizing Retrieval-Augmented Generation for E-Commerce How-To Assistance

Published: 18 Apr 2026, Last Modified: 22 Apr 2026 · ACL 2026 Industry Track (Oral) · CC BY 4.0
Keywords: Retrieval-Augmented Generation (RAG), Conversational AI, LLM-based Evaluation
TL;DR: This paper provides deployment-grounded evidence on how document chunking, query reformulation, and LLM-based evaluation interact to determine the quality of RAG systems in real-world support settings.
Abstract: Conversational AI is increasingly used at eBay to deliver personalized customer support. We present a production RAG-based How-To Assistant that answers support and how-to queries by grounding responses in a proprietary knowledge base. We study three factors that drive quality: (1) document chunking and contextualization for indexing, (2) query refinement methods, and (3) automatic LLM-based evaluation for rapid iteration and reliable measurement. We also describe the end-to-end system workflow, from offline indexing to real-time serving, and report deployment metrics, offering practical guidance for building scalable, high-precision RAG assistants in commercial support settings.
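To make the indexing-and-retrieval loop the abstract refers to concrete, here is a minimal sketch: documents are split into chunks, each chunk is contextualized with its document title, and retrieval ranks chunks against a (possibly reformulated) query. All names, parameters, and the toy word-overlap scorer below are illustrative assumptions, not the paper's implementation; the production system uses a proprietary knowledge base and its own retrieval stack.

```python
# Toy sketch of chunking with contextualization plus retrieval.
# Assumptions: word-bounded chunks, title-prefixed context, and a
# simple word-overlap score standing in for a real retriever.

def chunk_document(title: str, text: str, max_words: int = 50) -> list[str]:
    """Split text into word-bounded chunks, prefixing each with the
    document title so chunks remain interpretable out of context."""
    words = text.split()
    chunks = []
    for i in range(0, len(words), max_words):
        body = " ".join(words[i:i + max_words])
        chunks.append(f"[{title}] {body}")
    return chunks

def retrieve(query: str, chunks: list[str], top_k: int = 2) -> list[str]:
    """Rank chunks by word overlap with the query (a stand-in for
    embedding-based similarity) and return the top_k matches."""
    q = set(query.lower().split())
    score = lambda c: len(q & set(c.lower().split()))
    return sorted(chunks, key=score, reverse=True)[:top_k]

if __name__ == "__main__":
    doc = ("To open a return, go to your purchase history and select the "
           "item. Choose a reason for the return and confirm. A shipping "
           "label is then emailed to you.")
    chunks = chunk_document("Returns", doc, max_words=20)
    print(retrieve("how do I return an item", chunks, top_k=1))
```

In a real pipeline, query refinement (e.g. rewriting a terse user message into a self-contained question) would run before `retrieve`, and an LLM-based judge would score the grounded answer for rapid offline iteration.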
Submission Type: Deployed
Copyright Form: pdf
Submission Number: 106