Keywords: Generalization, Physics Informed Neural Network, Continuous Dependence, Ordinary Differential Equations
Abstract: When applying deep-learning-based solvers to differential equations, a key challenge is improving their generalization ability, so that pre-trained models can be easily adapted to new scenarios of interest. In this paper, inspired by the well-known mathematical results on the continuous dependence of solutions of ordinary differential equations on initial values and parameters, we make a non-trivial extension of physics-informed neural networks by incorporating additional information on the continuous dependence of solutions (abbreviated as cd-PINN). Our cd-PINN integrates the advantages of neural operators and Meta-PINN, requiring only a few labeled data points while solving ordinary differential equations with new initial values and parameters quickly and accurately, without fine-tuning. As demonstrated on examples including the Logistic model, the Lotka-Volterra model, damped harmonic oscillators, and a multiscale model for p53 activation, the accuracy of cd-PINN under untrained conditions is typically 1-3 orders of magnitude higher than that of PINN, while the GPU training time of the two approaches is comparable. We therefore expect cd-PINN to be particularly useful for improving the efficiency and accuracy of deep-learning-based solvers for differential equations.
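The abstract does not spell out the cd-PINN loss, but one natural reading of "continuous dependence on initial values and parameters" is to train a network taking (t, u0, r) as inputs and to penalize, in addition to the usual ODE residual, the residual of the variational (sensitivity) equation for ∂u/∂u0. The sketch below illustrates this idea for the Logistic model du/dt = r·u·(1 − u); the architecture, sampling ranges, and the specific cd penalty are illustrative assumptions, not the authors' implementation.

```python
import torch

torch.manual_seed(0)

# Small network mapping (t, u0, r) -> u(t; u0, r); size is arbitrary.
net = torch.nn.Sequential(
    torch.nn.Linear(3, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def u_hat(t, u0, r):
    # Hard-constrain the initial condition: u(0) = u0 exactly.
    return u0 + t * net(torch.cat([t, u0, r], dim=1))

def cd_pinn_loss(n=64):
    t = torch.rand(n, 1, requires_grad=True)
    u0 = (0.1 + 0.8 * torch.rand(n, 1)).requires_grad_(True)
    r = 0.5 + torch.rand(n, 1)

    u = u_hat(t, u0, r)
    du_dt = torch.autograd.grad(u.sum(), t, create_graph=True)[0]
    ode_res = du_dt - r * u * (1.0 - u)          # standard PINN residual

    # Sensitivity s = du/du0. Continuous dependence implies s solves the
    # variational equation ds/dt = r*(1 - 2u)*s with s(0) = 1 (satisfied
    # automatically by the hard constraint above).
    s = torch.autograd.grad(u.sum(), u0, create_graph=True)[0]
    ds_dt = torch.autograd.grad(s.sum(), t, create_graph=True)[0]
    cd_res = ds_dt - r * (1.0 - 2.0 * u) * s     # hypothetical cd penalty

    return (ode_res ** 2).mean() + (cd_res ** 2).mean()

loss = cd_pinn_loss()
loss.backward()  # gradients flow to the network parameters
```

Because initial value and parameter are network inputs, evaluating the trained model at a new (u0, r) requires only a forward pass, which is consistent with the abstract's claim of generalization without fine-tuning.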
Supplementary Material: pdf
Primary Area: learning on time series and dynamical systems
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3376