Momentum-Accelerated Structured Preconditioning for Physics-Informed Neural Networks

Published: 01 Mar 2026, Last Modified: 06 Mar 2026
Venue: AI&PDE Poster
License: CC BY 4.0
Keywords: Momentum acceleration, PINNs optimization, preconditioning
Abstract: Physics-Informed Neural Networks (PINNs) solve PDEs by training neural surrogates that minimize governing-equation residuals alongside boundary and initial conditions. Despite their flexibility, PINNs are notoriously difficult to optimize due to stiffness, loss imbalance, and gradient noise induced by stochastic collocation. In this work, we study a structured preconditioned optimizer inspired by SOAP and propose a lightweight extension based on a momentum-accelerated method, MoQ, which uses gradient extrapolation to approximate Nesterov-style acceleration. We conduct preliminary studies comparing Adam, SOAP, and the proposed SOAP+MoQ on three PDE problems: the viscous Burgers' equation, the 2D Poisson equation, and steady Navier-Stokes (Kovasznay flow). Across multiple random seeds, the proposed SOAP+MoQ consistently improves accuracy over both Adam and SOAP, with potentially strong gains on Poisson and Navier-Stokes. We further probe the role of gradient noise via batch reuse and observe that MoQ gains can increase in reduced-noise regimes. These preliminary results suggest MoQ is an effective stabilization mechanism for structured preconditioners in PINN optimization.
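The abstract describes MoQ as gradient extrapolation that approximates Nesterov-style acceleration. A minimal sketch of that idea on a toy problem is below; the update rule, coefficient `beta`, and the stiff quadratic objective are illustrative assumptions, not the paper's actual SOAP+MoQ implementation.

```python
import numpy as np

def extrapolated_grad(g_t, g_prev, beta=0.5):
    # Nesterov-style gradient extrapolation (hypothetical MoQ form):
    # step along the current gradient plus a fraction of its recent change.
    return g_t + beta * (g_t - g_prev)

# Toy demo: minimize f(x) = 0.5 * x^T A x on an ill-conditioned quadratic,
# loosely mimicking the stiffness seen in PINN losses.
A = np.diag([1.0, 50.0])
x = np.array([1.0, 1.0])
g_prev = A @ x        # initialize so the first extrapolation term is zero
lr = 0.015
for _ in range(200):
    g = A @ x
    x = x - lr * extrapolated_grad(g, g_prev)
    g_prev = g
print(np.linalg.norm(x))  # residual norm shrinks toward zero
```

In a full optimizer the extrapolated gradient would be fed into the structured preconditioner (e.g. SOAP's rotated second-moment update) rather than applied directly, but the extrapolation step itself is this simple.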
Journal Opt In: Yes, I want to participate in the IOP focus collection submission
Journal Corresponding Email: indra.ipd@ibm.com
Submission Number: 143