Position: Significant impact of numerical precision in scientific machine learning

21 Jan 2025 (modified: 18 Jun 2025) · Submitted to ICML 2025 Position Paper Track · CC BY 4.0
TL;DR: This position paper not only highlights the precision-related issues but also recommends reporting comparisons between FP32 and FP64 results, while also encouraging the release of FP64 models.
Abstract: The machine learning community has focused on improving computational efficiency, often through the use of reduced-precision formats below the standard FP32. In contrast, little attention has been given to higher-precision formats such as FP64, despite their critical role in scientific domains like materials science, where even small numerical differences can lead to significant inaccuracies in physicochemical properties. This need for high precision extends to the emerging field of \textit{machine learning for scientific tasks}, yet it has not been thoroughly investigated. According to several studies and our toy experiment, models trained with FP32 exhibit insufficient accuracy compared to FP64, suggesting that higher-precision models may be necessary in certain scientific applications. Despite the potential of scientific machine learning, this precision issue often limits its adoption as a replacement for traditional scientific computing in practical research. This position paper not only highlights these precision-related issues but also recommends reporting comparisons between FP32 and FP64 results, while encouraging the release of FP64 models. We believe these efforts can enable machine learning to contribute meaningfully to the natural sciences, ensuring both scientific reliability and practical applicability.
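As a minimal illustration of the abstract's core claim (this is a generic toy demonstration, not the paper's actual experiment), the sketch below accumulates the value 0.1 repeatedly in FP32 and FP64. Because 0.1 is not exactly representable in binary and FP32 carries only about 7 decimal digits, rounding error grows visibly in the FP32 accumulator while remaining negligible in FP64:

```python
import numpy as np

# Accumulate 0.1 one hundred thousand times; the exact answer is 10000.
N = 100_000
acc32 = np.float32(0.0)
acc64 = np.float64(0.0)
for _ in range(N):
    acc32 += np.float32(0.1)  # each addition rounds to ~7 decimal digits
    acc64 += np.float64(0.1)  # each addition rounds to ~16 decimal digits

err32 = abs(float(acc32) - 10_000.0)
err64 = abs(float(acc64) - 10_000.0)
print(f"FP32 error: {err32:.6f}")  # on the order of 1
print(f"FP64 error: {err64:.2e}")  # many orders of magnitude smaller
```

The gap of several orders of magnitude between the two errors hints at why fields that care about small energy or property differences may find FP32-only models insufficient.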
Primary Area: Research Priorities, Methodology, and Evaluation
Keywords: floating point representation, numerical precision, materials science, scientific application, machine learning potential, physics-informed neural network, density functional theory, finite-difference time-domain
Submission Number: 107