Physics-Informed Neural Networks for Derivative-Constrained PDEs

Published: 17 Jun 2024, Last Modified: 17 Jul 2024 · ICML 2024 AI4Science Workshop Poster · CC BY 4.0
Keywords: Physics-Informed Neural Networks, Multi-objective Learning, Partial Differential Equations, Derivative-Constrained, Machine Learning
TL;DR: A self-adaptive extended PINN framework for solving PDEs with inequality constraints on the derivatives.
Abstract: Physics-Informed Neural Networks (PINNs) have emerged as a promising approach for solving partial differential equations (PDEs) using deep learning. However, standard PINNs do not address constrained PDEs, where the solution must satisfy additional equality or inequality constraints beyond the governing equations. In this paper, we introduce Derivative-Constrained PINNs (DC-PINNs), a novel framework that seamlessly incorporates constraint information into the PINN training process. DC-PINNs employ a constraint-aware loss function that penalizes constraint violations while simultaneously minimizing the PDE residual. Key components include self-adaptive loss balancing techniques that automatically tune the relative weighting of each term, enhancing training stability, and the use of automatic differentiation to compute exact derivatives efficiently. This study demonstrates the effectiveness of DC-PINNs on several benchmark problems related to quantitative finance: heat diffusion, Black-Scholes pricing, and local volatility surface calibration. The results show improvements over baseline PINN methods in producing solutions that satisfy the constraints. The DC-PINNs framework opens up new possibilities for solving constrained PDEs in multi-objective optimization.
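
To make the loss construction described above concrete, here is a minimal, hypothetical PyTorch sketch of a constraint-aware PINN loss with self-adaptive weighting. The abstract does not specify the exact formulation, so the heat equation u_t = k·u_xx, the illustrative monotonicity constraint u_x ≥ 0, the one-sided hinge penalty, and the uncertainty-style learnable weights below are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of a derivative-constrained PINN loss (not the paper's code).
import torch
import torch.nn as nn

class PINN(nn.Module):
    """Small fully connected network mapping (x, t) -> u(x, t)."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, t):
        return self.net(torch.cat([x, t], dim=-1))

def dc_pinn_loss(model, x, t, log_sigma_pde, log_sigma_con, k=1.0):
    """PDE residual + derivative-constraint penalty with self-adaptive weights.

    All derivatives are exact, computed via automatic differentiation.
    """
    x = x.clone().requires_grad_(True)
    t = t.clone().requires_grad_(True)
    u = model(x, t)

    grad = lambda y, v: torch.autograd.grad(y, v, torch.ones_like(y), create_graph=True)[0]
    u_t = grad(u, t)
    u_x = grad(u, x)
    u_xx = grad(u_x, x)

    # PDE residual for the (assumed) heat equation u_t - k * u_xx = 0.
    pde_residual = (u_t - k * u_xx).pow(2).mean()

    # Inequality constraint on a derivative, e.g. u_x >= 0 (illustrative);
    # violations are penalized with a one-sided hinge term.
    constraint_violation = torch.relu(-u_x).pow(2).mean()

    # Self-adaptive balancing: learnable log-variances trade off the two terms
    # (homoscedastic-uncertainty-style weighting; one possible choice).
    return (torch.exp(-log_sigma_pde) * pde_residual + log_sigma_pde
            + torch.exp(-log_sigma_con) * constraint_violation + log_sigma_con)

# Usage: jointly optimize the network and the adaptive weights on collocation points.
model = PINN()
log_sigma_pde = torch.zeros(1, requires_grad=True)
log_sigma_con = torch.zeros(1, requires_grad=True)
opt = torch.optim.Adam(list(model.parameters()) + [log_sigma_pde, log_sigma_con], lr=1e-3)

x, t = torch.rand(256, 1), torch.rand(256, 1)
for _ in range(1000):
    opt.zero_grad()
    loss = dc_pinn_loss(model, x, t, log_sigma_pde, log_sigma_con)
    loss.backward()
    opt.step()
```

The key design choice in this sketch is that the trade-off between fitting the PDE and respecting the derivative constraint is learned rather than hand-tuned, which is one way to realize the self-adaptive balancing the abstract describes; boundary/initial-condition terms would be added analogously.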
Submission Number: 40