Keywords: AI for Science, Inference-time Scaling, Deep learning, Curse of dimensionality
TL;DR: We introduce SCaSML, a framework that improves pre-trained PDE solvers at inference time without retraining by deriving and efficiently solving a new PDE that governs the model's error, provably accelerating convergence.
Abstract: Solving high-dimensional partial differential equations (PDEs) is a critical challenge where modern data-driven solvers often lack reliability and rigorous error guarantees. We introduce Simulation-Calibrated Scientific Machine Learning (SCaSML), a framework that systematically improves pre-trained PDE solvers at inference time without any retraining. Our core idea is to derive a new PDE, which we term the Law of Defect, that precisely governs the error of a given surrogate model. Because this defect PDE retains the structure of the original problem, we can solve it efficiently with traditional stochastic simulators, yielding a targeted correction to the initial machine-learned solution. We prove that SCaSML achieves a faster convergence rate, with a final error bounded by the product of the surrogate and simulation errors. On challenging PDEs in up to 160 dimensions, SCaSML reduces the error of various surrogate models, including PINNs and Gaussian Processes, by 20-80%. SCaSML provides a principled method to fuse the speed of machine learning with the rigor of numerical simulation, enhancing the trustworthiness of AI for scientific discovery.
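To make the defect-correction idea concrete, here is a minimal, self-contained sketch (not the authors' implementation) on a toy backward heat equation with a hypothetical surrogate `u_hat`. The defect e = u - u_hat satisfies the same PDE driven by the surrogate's residual, so a plain Feynman-Kac Monte Carlo simulation of e corrects the surrogate. The function names, the toy surrogate, and the choice of estimator are all illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only: Feynman-Kac Monte Carlo correction of an imperfect
# surrogate for the backward heat equation
#   du/dt + 0.5 * Laplacian(u) = 0,  u(T, x) = g(x),
# whose exact solution is u(t, x) = ||x||^2 + d * (T - t) when g(x) = ||x||^2.
import numpy as np

d, T = 10, 1.0                          # dimension and terminal time
g = lambda x: np.sum(x**2, axis=-1)     # terminal condition

# Hypothetical surrogate: misses the d * (T - t) term of the true solution.
u_hat = lambda t, x: np.sum(x**2, axis=-1)
# Residual of the surrogate, r = du_hat/dt + 0.5 * Laplacian(u_hat) = d.
# (Closed form here for the toy surrogate; in general use autodiff.)
residual = lambda t, x: np.full(x.shape[:-1], float(d))

def defect_correction(t, x, n_paths=20_000, n_steps=50, seed=0):
    """Monte Carlo estimate of the defect e(t, x) = u(t, x) - u_hat(t, x)."""
    rng = np.random.default_rng(seed)
    dt = (T - t) / n_steps
    X = np.tile(x, (n_paths, 1))                   # Brownian paths from x
    integral = np.zeros(n_paths)
    s = t
    for _ in range(n_steps):
        integral += residual(s, X) * dt            # accumulate source term
        X += np.sqrt(dt) * rng.standard_normal(X.shape)
        s += dt
    terminal_defect = g(X) - u_hat(T, X)           # mismatch at time T
    return np.mean(terminal_defect + integral)

t0, x0 = 0.0, np.zeros(d)
exact = g(x0[None])[0] + d * (T - t0)
surrogate_val = u_hat(t0, x0[None])[0]
corrected = surrogate_val + defect_correction(t0, x0)
print(f"surrogate error : {abs(surrogate_val - exact):.3f}")
print(f"corrected error : {abs(corrected - exact):.3f}")
```

In this toy setting the surrogate is off by d * (T - t) = 10 at (t, x) = (0, 0), and the simulated defect removes essentially all of that error; the abstract's point is that such a correction inherits rigorous error bounds from the stochastic simulator rather than from the learned model.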
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 4617