Keywords: Partial Differential Equations, neural operators, solution operators, interpretable models, out of distribution, dataset shift, physical models
TL;DR: We introduce DeepFDM, a scientific-computing benchmark method for PDE operator learning, together with a benchmark method for generating OOD datasets for PDEs.
Abstract: Solving Partial Differential Equations (PDEs) has long been a critical challenge in many scientific and engineering domains. Recently, neural networks have shown great promise in solving PDEs by learning solution operators from data, offering a flexible and adaptive alternative to traditional numerical solvers. Despite these advancements, there is still a need for systematic benchmarking of neural operator methods against conventional approaches, and for datasets representing diverse distributions that enable robust evaluation.
In this paper, we introduce DeepFDM, a benchmark method for learning PDE solution operators based on numerical PDE solvers.
DeepFDM leverages the structure of the PDE to achieve better accuracy and generalization than neural solvers. It is designed as a solver for a specific class of PDEs, not as a replacement for neural solvers. Moreover, because DeepFDM learns the coefficients of the PDE, it is inherently interpretable. We also introduce a principled method for generating training and test data for PDE solutions, which yields a quantifiable measure of distribution shift. This method provides a structured approach to evaluating the out-of-distribution (OOD) performance of neural PDE operators.
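To make the coefficient-learning idea concrete, here is a minimal, hypothetical sketch (not the paper's actual DeepFDM code): we recover the diffusion coefficient `c` of the 1D heat equation `u_t = c * u_xx` from input/output pairs produced by a finite-difference solver. Because the explicit step is linear in `c`, a least-squares fit suffices in this toy setting; a DeepFDM-style model with spatially varying coefficients would instead be trained by gradient descent through the differentiable solver. All names and parameters below are illustrative assumptions.

```python
import numpy as np

def laplacian(u, dx):
    """Centered second difference with periodic boundary conditions."""
    return (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2

def fdm_step(u, c, dx, dt):
    """One explicit (forward-Euler) finite-difference step of u_t = c * u_xx."""
    return u + dt * c * laplacian(u, dx)

n, dx, dt = 64, 1.0 / 64, 1e-4
x = np.linspace(0.0, 1.0, n, endpoint=False)
c_true = 0.5  # ground-truth diffusion coefficient (illustrative)

# Synthetic (u_t, u_{t+dt}) training pairs generated by the true solver.
inputs = [np.sin(2 * np.pi * k * x) for k in (1, 2, 3)]
targets = [fdm_step(u, c_true, dx, dt) for u in inputs]

# The residual (v - u) = c * dt * laplacian(u) is linear in c,
# so the coefficient is recovered by a one-dimensional least squares.
num = sum(np.dot(v - u, dt * laplacian(u, dx)) for u, v in zip(inputs, targets))
den = sum(np.dot(dt * laplacian(u, dx), dt * laplacian(u, dx)) for u in inputs)
c_hat = num / den
print(round(c_hat, 3))  # recovers c_true = 0.5
```

The recovered coefficient is directly interpretable as a physical quantity, which is the sense in which structure-aware solvers offer interpretability that generic neural operators lack.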
Our work sets a foundation for future comparisons of neural operator methods with traditional scientific computing approaches, providing a rigorous framework for performance benchmarking at the level of both the data and the neural solver.
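One way to read "a quantifiable measure of distribution shift" is that PDE coefficients are sampled from parametric distributions whose divergence can be computed in closed form. The paper's actual generation scheme may differ; the sketch below is a hypothetical illustration in which train and test diffusion coefficients are drawn from two Gaussians and the shift is reported as their KL divergence.

```python
import math

def gaussian_kl(mu_p, sig_p, mu_q, sig_q):
    """Closed-form KL(N(mu_p, sig_p^2) || N(mu_q, sig_q^2))."""
    return (math.log(sig_q / sig_p)
            + (sig_p**2 + (mu_p - mu_q)**2) / (2.0 * sig_q**2) - 0.5)

# Illustrative coefficient distributions: in-distribution training vs.
# a mean-shifted test distribution for the diffusion coefficient.
train = (1.0, 0.2)  # (mean, std) at train time
test = (1.5, 0.2)   # shifted mean at test time

print(round(gaussian_kl(*test, *train), 3))  # → 3.125
```

Because the shift is a single scalar, OOD robustness can then be reported as error versus divergence, rather than as a binary in/out-of-distribution label.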
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 8080