TL;DR: Calibrated uncertainty quantification for neural-PDE solvers using physics residual errors as a nonconformity score for conformal prediction.
Abstract: Simulating complex physical systems is crucial for understanding and predicting phenomena across diverse fields, such as fluid dynamics, heat transfer, plasma physics, and structural mechanics. Traditional approaches rely on solving partial differential equations (PDEs) using numerical methods, which are computationally expensive and often prohibitively slow for real-time applications or large-scale simulations. Neural PDEs have emerged as efficient alternatives to these costly numerical solvers, offering significant computational speed-ups. However, their lack of robust uncertainty quantification (UQ) limits deployment in critical applications. We introduce a model-agnostic, physics-informed conformal prediction (CP) framework that provides guaranteed uncertainty estimates without requiring labelled data. By taking a physics-based approach, we quantify and calibrate the model's inconsistencies with the governing physics rather than the uncertainty arising from the data. Our method uses convolutional layers as finite-difference stencils and leverages physics residual errors as nonconformity scores, enabling data-free UQ with marginal and joint coverage guarantees across prediction domains for a range of complex PDEs. We further validate the efficacy of our method on neural PDE models for plasma modelling and shot design in fusion reactors.
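To make the core idea concrete, here is a minimal sketch of the two ingredients the abstract describes: a finite-difference stencil (implemented as a discrete convolution over the predicted field) that yields a physics residual, and a split-conformal calibration step that turns residual magnitudes into a nonconformity threshold. This is an illustrative reconstruction, not the authors' CP-PRE implementation; the 1D heat equation, the function names, and the max-residual score are assumptions chosen for brevity.

```python
import numpy as np

def pde_residual(u, dx, dt, nu):
    """Physics residual of a predicted field for the 1D heat equation
    u_t = nu * u_xx, via finite-difference stencils (discrete convolutions).
    u has shape (time, space); the residual covers interior points only."""
    u_t = (u[1:, 1:-1] - u[:-1, 1:-1]) / dt                       # forward difference in time
    u_xx = (u[:-1, 2:] - 2 * u[:-1, 1:-1] + u[:-1, :-2]) / dx**2  # central difference in space
    return u_t - nu * u_xx

def calibrate_threshold(residual_fields, alpha=0.1):
    """Split-conformal calibration: the nonconformity score of each
    calibration prediction is the max absolute physics residual; return
    the finite-sample-corrected (1 - alpha) quantile of the scores."""
    scores = np.array([np.abs(r).max() for r in residual_fields])
    n = len(scores)
    q = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    return np.quantile(scores, q, method="higher")

# Illustrative usage with synthetic "predictions" (random fields):
rng = np.random.default_rng(0)
predictions = [rng.normal(size=(20, 30)) for _ in range(50)]
residuals = [pde_residual(u, dx=0.1, dt=0.01, nu=0.05) for u in predictions]
tau = calibrate_threshold(residuals, alpha=0.1)
# A new prediction is flagged as untrustworthy if its residual score exceeds tau.
```

Because the score depends only on how well a prediction satisfies the PDE, no labelled solution data is needed at calibration time, which is what makes the UQ data-free.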
Lay Summary: This paper addresses a critical problem in using AI for scientific simulations: while neural networks can predict physical systems like weather or plasma behaviour 1000x faster than traditional methods, we don't know when to trust their predictions. We developed CP-PRE (Conformal Prediction with Physics Residual Error), a method that adds reliable "confidence scores" to AI predictions by checking how well they obey fundamental physics laws (like conservation of energy) rather than requiring expensive validation data. When tested on applications ranging from fluid dynamics to fusion reactor modelling, the method successfully identifies which predictions are trustworthy, with statistical guarantees: essentially a physics-based "confidence meter" for AI models. This work is a significant step toward safely deploying AI in critical applications like nuclear fusion or aerospace engineering, where both speed and reliability are needed, potentially accelerating scientific discovery while maintaining the safety standards these fields require.
Link To Code: https://github.com/gitvicky/CP-PRE
Primary Area: Applications->Chemistry, Physics, and Earth Sciences
Keywords: Surrogate Models, Uncertainty Quantification, Neural-PDE, Physics-Informed, Conformal Prediction, PDE Residuals, Nuclear Fusion
Submission Number: 11215