Vietnamese Elementary Math Reasoning Using Large Language Model with Refined Translation and Dense-Retrieved Chain-of-Thought

Published: 01 Jan 2024, Last Modified: 05 Aug 2025 · JSAI-isAI 2024 · CC BY-SA 4.0
Abstract: State-of-the-art large language models (LLMs) have succeeded in various tasks but still show limitations in solving math reasoning problems. Although this problem is actively studied for English, little work has explored LLMs for math reasoning in low-resource languages. Recent advances show that LLMs can acquire cross-lingual knowledge; however, a systematic approach to bridging the language gap and applying these LLMs to math reasoning in low-resource languages has yet to be studied. This study proposes a pipeline for solving math problems in Vietnamese by integrating the chain-of-thought technique with high-quality in-context learning exemplars obtained through multilingual dense retrieval. The pipeline is model-agnostic and can be adapted to any language without fine-tuning. Empirical results show that the proposed pipeline achieves remarkable performance gains over competitive baseline LLMs, paving the way for future research on employing English-focused LLMs to solve complex reasoning tasks in low-resource languages.
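To make the described pipeline concrete, the sketch below shows one way to pair multilingual dense retrieval of chain-of-thought exemplars with few-shot prompting, in the spirit of the abstract. The encoder name, the exemplar pool, and the `build_cot_prompt` / `call_llm` helpers are illustrative assumptions, not the paper's exact components.

```python
# Minimal sketch of a retrieve-then-prompt pipeline: embed the Vietnamese question with a
# multilingual encoder, retrieve the most similar exemplars (question + worked rationale),
# and assemble a few-shot chain-of-thought prompt for an LLM.
from sentence_transformers import SentenceTransformer, util

# Hypothetical exemplar pool of (question, chain-of-thought solution) pairs.
EXEMPLARS = [
    ("Tom has 3 apples and buys 4 more. How many apples does he have?",
     "Tom starts with 3 apples. He buys 4 more, so 3 + 4 = 7. The answer is 7."),
    # ... more (question, rationale) pairs
]

# Assumed multilingual sentence encoder; the paper does not specify this exact model.
encoder = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")
exemplar_embs = encoder.encode([q for q, _ in EXEMPLARS], convert_to_tensor=True)

def build_cot_prompt(vi_question: str, k: int = 4) -> str:
    """Retrieve the k most similar exemplars and assemble a few-shot CoT prompt."""
    query_emb = encoder.encode(vi_question, convert_to_tensor=True)
    scores = util.cos_sim(query_emb, exemplar_embs)[0]  # dense similarity to each exemplar
    top_idx = scores.topk(min(k, len(EXEMPLARS))).indices.tolist()
    demos = "\n\n".join(
        f"Question: {EXEMPLARS[i][0]}\nAnswer: {EXEMPLARS[i][1]}" for i in top_idx
    )
    return f"{demos}\n\nQuestion: {vi_question}\nAnswer: Let's think step by step."

# Example usage (call_llm is a placeholder for any instruction-tuned LLM API):
# prompt = build_cot_prompt("Lan có 5 quyển vở, mẹ mua thêm 3 quyển. Hỏi Lan có bao nhiêu quyển vở?")
# response = call_llm(prompt)
```

Because the retriever and prompt template are decoupled from the generator, this kind of pipeline is model-agnostic and requires no fine-tuning, consistent with the abstract's claim.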