LeIn-PINN: Learned Initialization to Alleviate Convergence Failures in Physics-Informed Neural Networks
Keywords: Physics Informed Machine Learning
Abstract: Physics-informed neural networks (PINNs) have had a broad research impact in modeling domains governed by partial differential equations (PDEs). However, PINNs often perform suboptimally or even converge to trivial solutions in challenging scenarios, such as stiff PDE domains or when generalizing to unseen but related experimental contexts. Previous approaches to alleviating catastrophic PINN failures include curriculum-based training techniques and dynamic resampling of hard collocation points. These methods face pitfalls: designing a curriculum is ambiguous in multi-parameter PDEs, and dynamic resampling still fails in complex settings. Recent works also suggest that conflicting gradients during PINN training are a major cause of such failures.
We argue that weight initialization plays a crucial role in the emergence of catastrophic failures. To this end, we propose a novel training methodology based on Learned Initialization (LeIn) to address PINN failures; we call our variant LeIn-PINN. Through rigorous experiments on 1D and 2D PDEs, including challenging 2D fluid dynamics contexts, we show that LeIn-PINN outperforms state-of-the-art methods specifically designed to mitigate PINN failures, achieving an average performance improvement of 87% over baselines. We also provide a detailed analysis explaining the improved training dynamics of LeIn-PINN and the convergence failures of traditional PINNs by studying their loss landscapes. Finally, we demonstrate that LeIn-PINN significantly reduces spectral bias compared to traditional PINNs, even in challenging PDE domains.
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 22225