Pre-Calc: Learning to Use the Calculator Improves Numeracy in Language Models

Published: 13 Jun 2024 · Last Modified: 06 Jul 2024 · ICML 2024 Workshop AI4MATH Poster · CC BY 4.0
Keywords: MathNLI, Mathematical Reasoning
TL;DR: We propose Pre-Calc, a method to teach smaller language models how to use calculators. By pre-finetuning BERT, RoBERTa, and Flan-T5 on calculator-use tasks, we improve these models' performance on tasks requiring numerical understanding.
Abstract: Quantitative and numerical comprehension in language is important in many fields such as education and finance, but it remains challenging for language models. While tool and calculator use has been shown to improve mathematical reasoning in large pretrained decoder-only language models, it remains unexplored for smaller language models with encoders. In this paper, we propose Pre-Calc, a simple pre-finetuning objective of learning to use the calculator for both encoder-only and encoder-decoder architectures, formulated as a discriminative and a generative task respectively. We pre-train BERT and RoBERTa for discriminative calculator use and Flan-T5 for generative calculator use on the MAWPS, SVAMP, and AsDiv-A datasets, which improves performance on downstream tasks that require numerical understanding.
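To make the generative calculator-use formulation concrete, below is a minimal sketch of the idea for an encoder-decoder model: rather than producing the final number itself, the model emits an arithmetic expression that is evaluated externally. The base checkpoint `google/flan-t5-base`, the prompt wording, and the `a <op> b` output format are illustrative assumptions, not the paper's actual Pre-Calc training setup or released code.

```python
# Sketch only: generative calculator use with a seq2seq model.
# Assumptions (not from the paper): the checkpoint, the prompt, and the
# "a <op> b" expression format the model is expected to produce.
import operator
import re

from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_NAME = "google/flan-t5-base"  # stand-in; Pre-Calc would use its own fine-tuned checkpoint

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul, "/": operator.truediv}


def generate_expression(problem: str) -> str:
    """Ask the model for an operator/operand expression instead of a final answer."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)
    prompt = f"Write the arithmetic expression that solves this problem: {problem}"
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=32)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


def evaluate_expression(expr: str) -> float:
    """Offload the arithmetic to a 'calculator': parse 'a <op> b' and compute it."""
    match = re.match(r"\s*(-?\d+\.?\d*)\s*([+\-*/])\s*(-?\d+\.?\d*)\s*$", expr)
    if match is None:
        raise ValueError(f"Unparseable expression: {expr!r}")
    a, op, b = match.groups()
    return OPS[op](float(a), float(b))


if __name__ == "__main__":
    problem = "Sara has 31 red balloons and 15 green balloons. How many balloons does she have in total?"
    expr = generate_expression(problem)
    print("expression:", expr)
    print("answer:", evaluate_expression(expr))
```

For encoder-only models such as BERT or RoBERTa, the analogous discriminative formulation would instead classify which operation to apply and which numbers in the text are its operands, leaving the computation to the calculator in the same way.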
Submission Number: 20