Preconditioning for Physics-Informed Neural Networks

24 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: physics-informed neural network, partial differential equation, condition number, application
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Physics-informed neural networks (PINNs) have shown promise in solving complex partial differential equations (PDEs). However, certain training pathologies have emerged, compromising both convergence and prediction accuracy in practical applications. In this paper, we propose the condition number as a novel metric to diagnose and rectify these pathologies in PINNs. Inspired by classical numerical analysis, where the condition number measures sensitivity and stability, we highlight its pivotal role in the training dynamics of PINNs. We develop a theory that elucidates the relationship between reduced condition numbers and improved error control, as well as faster convergence. We then present an algorithm that leverages preconditioning to lower the condition number. Evaluations on 16 PDE problems demonstrate the superior performance of our method. Notably, in 7 of these problems, our method reduces errors by an order of magnitude. Furthermore, in 2 distinct cases, our approach is the first to yield an accurate solution, cutting relative errors from roughly $100\%$ to below $6\%$ and $21\%$, respectively.
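To make the abstract's appeal to classical numerical analysis concrete: for a linear system $Au = f$, the condition number $\kappa(A) = \|A\|\,\|A^{-1}\|$ bounds how strongly perturbations in $f$ are amplified in the solution $u$, and a preconditioner $P$ is chosen so that $\kappa(PA) \ll \kappa(A)$. Below is a minimal sketch of that classical idea only; it is not the paper's PINN algorithm, and the grid size and coefficient $a(x)$ are illustrative assumptions. It shows symmetric Jacobi (diagonal) preconditioning shrinking the condition number of a variable-coefficient 1D diffusion operator.

```python
import numpy as np

# Illustrative sketch (not the paper's PINN method): symmetric Jacobi
# preconditioning of the finite-difference discretization of -(a(x) u')'.
# With a strongly varying coefficient a(x), diagonal scaling removes
# roughly a factor of max(a)/min(a) from the condition number.
n = 200
h = 1.0 / (n + 1)
x_half = (np.arange(n + 1) + 0.5) * h      # half-grid points for a(x)
a = np.exp(8.0 * x_half)                   # coefficient varying by ~e^8 (assumed)

main = (a[:-1] + a[1:]) / h**2             # tridiagonal stencil of -(a u')'
off = -a[1:-1] / h**2
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

d = 1.0 / np.sqrt(main)                    # D^{-1/2} for symmetric scaling
A_pre = d[:, None] * A * d[None, :]        # D^{-1/2} A D^{-1/2}, unit diagonal

print(f"cond(A)               = {np.linalg.cond(A):.3e}")
print(f"cond(preconditioned A) = {np.linalg.cond(A_pre):.3e}")
```

Running this prints a condition number reduced by roughly the coefficient contrast $e^{8}$; the symmetric form $D^{-1/2} A D^{-1/2}$ is used so the preconditioned operator stays symmetric positive definite.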
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 8699