Keywords: Physics-Informed Machine Learning, Operator Learning, DeepONets, Partial Differential Equations
TL;DR: We introduce a new physics-informed neural operator, the Solver-In-The-Loop Deep Operator Network (SITL-DeepONet), which does not require PDE residuals for training.
Abstract: We propose the solver-in-the-loop deep operator network (SITL-DeepONet), a new kind of physics-informed neural operator for PDEs based on enforcing consistency with a conventional PDE solver. SITL-DeepONet combines a DeepONet with a solver-in-the-loop training strategy, in which a conventional solver update replaces the PDE residuals computed by automatic differentiation. At training time, the solver advances the network prediction at time $t_n$ by one time step, and the network learns to match this solution at $t_{n+1}$. We present experimental results in which SITL-DeepONet learns the time-evolution operator of the one-dimensional viscous Burgers equation for diverse initial conditions and multiple viscosity values. SITL-DeepONet attains a mean relative $L^2$ error of $5.88\%$ at 0.051 s per training iteration, compared with $1.38\%$ at 5.263 s/it for the Physics-Informed DeepONet (PI-DeepONet) algorithm, demonstrating that the solver-in-the-loop paradigm remains competitive in accuracy while being two orders of magnitude faster to train.
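The solver-in-the-loop loss described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes an explicit finite-difference update for the 1D viscous Burgers equation with periodic boundary conditions, and the function names (`burgers_step`, `sitl_loss`) are hypothetical.

```python
import numpy as np

def burgers_step(u, dx, dt, nu):
    """One explicit finite-difference step of the 1D viscous Burgers equation
    u_t + u u_x = nu * u_xx, on a periodic grid (an assumed solver choice)."""
    u_x = (np.roll(u, -1) - np.roll(u, 1)) / (2.0 * dx)          # central first derivative
    u_xx = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2    # central second derivative
    return u + dt * (-u * u_x + nu * u_xx)

def sitl_loss(u_pred_n, u_pred_np1, dx, dt, nu):
    """Solver-in-the-loop consistency loss: instead of a PDE residual from
    automatic differentiation, penalize the mismatch between the network's
    prediction at t_{n+1} and the solver-advanced prediction from t_n."""
    target = burgers_step(u_pred_n, dx, dt, nu)
    return np.mean((u_pred_np1 - target) ** 2)
```

In an actual training loop, `u_pred_n` and `u_pred_np1` would be DeepONet outputs evaluated at consecutive time steps, and the loss would be minimized by gradient descent; the solver update itself needs to be differentiable (or treated as a fixed target) for gradients to flow.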
Submission Number: 115