Keywords: Online shopping, large language models
TL;DR: We developed a comprehensive e-commerce guide (EC-Guide) for instruction tuning and quantization of LLMs, achieving 2nd place in Track 2 and 5th place in Track 5 at the Amazon KDD Cup'24.
Abstract: Large language models (LLMs) have attracted considerable attention in various fields for their cost-effective solutions to diverse challenges, especially with advancements in instruction tuning and quantization. E-commerce, with its complex tasks and extensive product-user interactions, presents a promising application area for LLMs. However, the domain-specific concepts and knowledge inherent in e-commerce pose significant challenges for adapting general LLMs. To address this issue, our team (ZJU-AI4H) developed EC-Guide\footnote{\url{https://github.com/fzp0424/EC-Guide-KDDUP-2024}}, a comprehensive e-commerce guide for instruction tuning and quantization of LLMs. We also heuristically integrated Chain-of-Thought (CoT) prompting during inference to enhance arithmetic performance. Our approach achieved 2nd place in Track 2 and 5th place in Track 5 at the Amazon KDD Cup'24. Additionally, our solution is model-agnostic and scales readily to larger systems.
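The abstract mentions heuristically integrating CoT prompting at inference time for arithmetic-heavy questions. A minimal sketch of what such a heuristic trigger could look like is shown below; the keyword list, the `build_prompt` helper, and the prompt wording are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch: prepend a Chain-of-Thought cue only when a question
# looks arithmetic, leaving other e-commerce questions untouched.
# Keywords and prompt wording are illustrative assumptions.
ARITHMETIC_HINTS = ("how many", "total price", "sum", "percent", "discount", "+", "-", "*")

def build_prompt(question: str) -> str:
    """Return the inference prompt, adding a CoT cue for arithmetic-looking questions."""
    if any(hint in question.lower() for hint in ARITHMETIC_HINTS):
        return f"{question}\nLet's think step by step, then give the final answer."
    return question

if __name__ == "__main__":
    # Arithmetic-style question gets the CoT cue; a factual one does not.
    print(build_prompt("If a shirt costs $20 with a 15% discount, what is the total price?"))
    print(build_prompt("What material is this phone case made of?"))
```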
Submission Number: 2