MegaMath: Pushing the Limits of Open Math Corpora

Published: 08 Jul 2025 · Last Modified: 26 Aug 2025 · COLM 2025 · CC BY 4.0
Keywords: Pre-training Data, Mathematical Reasoning, Synthetic Data
TL;DR: MegaMath is an open dataset of over 300B tokens from web documents, math-related code, and synthetic sources, designed to enhance language models' mathematical reasoning capabilities.
Abstract: Mathematical reasoning is a cornerstone of human intelligence, driving problem-solving and innovation, and thus serves as a key indicator of the advanced capabilities of large language models (LLMs). However, the research community still lacks an open, adequately scaled, high-quality mathematical corpus that matches the data requirements of top-grade LLMs. We present MegaMath, an open dataset curated from diverse, mathematics-focused sources, designed to enhance LLMs' proficiency in mathematical reasoning. Specifically, MegaMath is curated via the following practices: (1) Revisiting web data: We re-extract all mathematical documents with math-oriented HTML optimizations, fastText-based filtering, and deduplication, all aimed at acquiring higher-quality data for the mathematical domain on the Internet. (2) Recalling math-related code data: We identify high-quality math-related code from a large code training corpus, Stack-V2, further enhancing data diversity. (3) Exploring synthetic data: We conduct various data synthesis practices, yielding a massive dataset that includes both synthetic text, such as QA-style data, and code. By integrating these strategies and validating their practicality via extensive ablations, MegaMath delivers 371B tokens, offering the largest quantity and top quality among existing open math pre-training datasets.
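The abstract mentions fastText-based filtering of web documents for the mathematical domain. Below is a minimal, hypothetical sketch of what such a filtering step could look like; the model path `math_classifier.bin`, the labels, and the score threshold are illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch of fastText-based math-document filtering.
# Assumes a binary classifier trained separately, with labels
# "__label__math" and "__label__other" (assumption, not the authors' setup).
import fasttext

MATH_LABEL = "__label__math"
THRESHOLD = 0.5  # illustrative score cutoff

def is_math_document(model, text: str) -> bool:
    # fastText's predict() expects a single line, so collapse whitespace first.
    line = " ".join(text.split())
    labels, probs = model.predict(line, k=1)
    return labels[0] == MATH_LABEL and probs[0] >= THRESHOLD

if __name__ == "__main__":
    model = fasttext.load_model("math_classifier.bin")  # hypothetical model file
    docs = [
        "We prove that the sum of the first n odd integers equals n^2.",
        "Top 10 travel destinations for summer vacations.",
    ]
    kept = [d for d in docs if is_math_document(model, d)]
    print(f"kept {len(kept)} of {len(docs)} documents")
```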
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the COLM Code of Ethics on https://colmweb.org/CoE.html
Author Guide: I certify that this submission complies with the submission instructions as described on https://colmweb.org/AuthorGuide.html
Submission Number: 388