Keywords: scientific machine learning, physics-informed neural networks, automatic differentiation, finite element method
TL;DR: We show how numerical differentiation improves physics-informed models on complex problems compared with automatic differentiation.
Abstract: Automatic differentiation (AD) is the default tool for computing physical derivatives in physics-informed models, but it faces significant limitations when applied to general frameworks, restricting their effectiveness on real-world problems. To overcome these challenges, we propose a hybrid approach that integrates traditional numerical solvers, such as the finite element method, within physics-informed deep learning. This framework enables the seamless, exact imposition of Dirichlet boundary conditions and efficiently handles complex, non-analytic problems. The proposed approach is versatile, making it suitable for integration into any physics-informed model. Crucially, our hybrid gradient computation is up to two orders of magnitude faster than AD, as its computational cost is unaffected by the underlying model's complexity. We validate the method on representative two- and three-dimensional numerical examples and analyze the training dynamics of the hybrid framework.
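To make the abstract's idea concrete, here is a minimal, hypothetical sketch (not the authors' implementation) of how a precomputed finite-element operator can supply physical derivatives for a physics-informed residual instead of automatic differentiation, on a 1D Poisson problem. The function `model`, the polynomial parameterization, and the mesh size are illustrative assumptions; the key points shown are that the residual's cost depends only on the mesh (not on the model's depth) and that multiplying by the distance factor `x(1-x)` imposes the homogeneous Dirichlet conditions exactly.

```python
import numpy as np

# Toy problem: -u'' = f on (0, 1) with u(0) = u(1) = 0,
# f(x) = pi^2 sin(pi x), exact solution u(x) = sin(pi x).
n = 11                                  # mesh nodes (assumed size)
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]

# Standard 1D linear-FEM stiffness matrix (tridiagonal), assembled once.
K = (np.diag(2.0 * np.ones(n))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h

# Lumped load vector for f.
f = np.pi**2 * np.sin(np.pi * x) * h

def model(theta, x):
    """Toy stand-in for a neural network (hypothetical).

    The factor x*(1-x) makes u(0) = u(1) = 0 hold exactly for any
    parameters theta, i.e. exact Dirichlet imposition by construction.
    """
    return x * (1.0 - x) * np.polyval(theta, x)

def residual(theta):
    """Physics residual via the FEM operator, not AD.

    Evaluating K @ u costs O(n) for a tridiagonal K, independent of
    how complex the model producing u is.
    """
    u = model(theta, x)
    return K @ u - f

# Sanity check: the FEM residual of the exact solution is small
# at interior nodes (boundary rows carry no BC treatment here).
u_exact = np.sin(np.pi * x)
interior = (K @ u_exact - f)[1:-1]
print(np.max(np.abs(interior)))
```

In a full training loop, `residual(theta)` (or its squared norm) would be minimized over the model parameters; backpropagation then only needs to differentiate the sparse matrix-vector product, which is what makes the hybrid gradient cheap relative to nested AD through the network.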
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 18645