MAMUT: A Novel Framework for Modifying Mathematical Formulas for the Generation of Specialized Datasets for Language Model Training
Abstract: Mathematical formulas are a fundamental and widely used component in various scientific fields, serving as a universal language for expressing complex concepts and relationships. While state-of-the-art transformer models excel at processing and understanding natural language, they encounter challenges with mathematical notation, which involves a complex structure and diverse representations. This study focuses on the development of specialized training datasets to enhance the encoding of mathematical content. We introduce Math Mutator (MAMUT), a framework capable of generating equivalent and falsified versions of a given mathematical formula in LaTeX notation, effectively capturing the notational variety with which the same mathematical concept can be expressed. Based on MAMUT, we have generated four large mathematical datasets containing diverse notation, which can be used to train language models with enhanced mathematical embeddings. Experiments show that models trained on these datasets achieve new state-of-the-art performance on mathematical retrieval tasks. We publish our code, generated datasets, and pretrained mathematical models: https://github.com/aieng-lab/math-mutator.
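The abstract's core idea of equivalent versus falsified formula versions can be illustrated with a toy sketch. This is not MAMUT's actual implementation (its mutation rules are far richer); the two functions below are hypothetical, minimal examples of a meaning-preserving rewrite (swapping the sides of an equation) and a meaning-breaking one (flipping an operator) applied to a LaTeX string.

```python
def equivalent(formula: str) -> str:
    """Meaning-preserving mutation: swap the two sides of an equation.
    A toy stand-in for MAMUT-style equivalent versions."""
    lhs, rhs = formula.split("=")
    return f"{rhs.strip()} = {lhs.strip()}"

def falsified(formula: str) -> str:
    """Meaning-breaking mutation: flip the first '+' to '-'.
    A toy stand-in for MAMUT-style falsified versions."""
    return formula.replace("+", "-", 1)

print(equivalent(r"a^2 + b^2 = c^2"))  # c^2 = a^2 + b^2
print(falsified(r"a^2 + b^2 = c^2"))   # a^2 - b^2 = c^2
```

Pairs of such variants (original vs. equivalent, original vs. falsified) are the kind of supervision signal the generated datasets provide for training mathematical embeddings.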
Submission Length: Long submission (more than 12 pages of main content)
Changes Since Last Submission: Apart from adding author information, only minor changes were made that do not affect the content (e.g., correcting grammar, brief rephrasing for clarity, and improving figure placement). This second camera-ready version also includes an updated Acknowledgments section. We apologize for any inconvenience.
Code: https://github.com/aieng-lab/math-mutator
Supplementary Material: zip
Assigned Action Editor: ~Hongsheng_Li3
Submission Number: 4350