Keywords: PINNs, Deep Learning, Differential Equations, Large-Batch Training
TL;DR: Through a series of numerical experiments, we demonstrate that large batch sizes are always beneficial to training PINNs
Abstract: Physics-Informed Neural Networks (PINNs) have demonstrated remarkable success in learning complex physical processes such as shocks and turbulence, but their applicability has been limited by long training times. In this work, we explore the potential of large-batch training to reduce training time and improve final accuracy in PINNs. We show that conclusions about the generalization gap drawn from large-batch training on image classification tasks may not carry over to PINNs. We conclude that larger batch sizes are always beneficial to training PINNs.
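To make the setting concrete, the following is a minimal, hypothetical sketch (not the paper's actual method) of PINN training where the batch size controls how many collocation points enter each loss evaluation. It fits u'(x) = -u(x) with u(0) = 1 (exact solution exp(-x)) using a one-hidden-layer network; the network's input derivative is computed analytically, and parameter gradients are approximated by finite differences purely to keep the example NumPy-only, whereas a real PINN would use automatic differentiation.

```python
import numpy as np

rng = np.random.default_rng(0)
H = 16                                         # hidden width (illustrative choice)
params = 0.1 * rng.standard_normal(3 * H + 1)  # [w1 (H), b1 (H), w2 (H), b2]

def unpack(p):
    return p[:H], p[H:2 * H], p[2 * H:3 * H], p[3 * H]

def u(p, x):
    # network prediction u(x) = w2 . tanh(w1 * x + b1) + b2
    w1, b1, w2, b2 = unpack(p)
    return np.tanh(np.outer(x, w1) + b1) @ w2 + b2

def du_dx(p, x):
    # analytic derivative of the network with respect to its input x
    w1, b1, w2, b2 = unpack(p)
    return (1.0 - np.tanh(np.outer(x, w1) + b1) ** 2) @ (w1 * w2)

def loss(p, x):
    residual = du_dx(p, x) + u(p, x)           # ODE residual of u' = -u
    bc = u(p, np.array([0.0]))[0] - 1.0        # boundary condition u(0) = 1
    return np.mean(residual ** 2) + bc ** 2

def grad(p, x, eps=1e-6):
    # central-difference gradient, used only to avoid an autodiff dependency
    g = np.zeros_like(p)
    for i in range(p.size):
        dp = np.zeros_like(p)
        dp[i] = eps
        g[i] = (loss(p + dp, x) - loss(p - dp, x)) / (2 * eps)
    return g

batch_size = 1024                              # "large" batch of collocation points
lr = 0.02
losses = []
for step in range(300):
    x = rng.uniform(0.0, 1.0, batch_size)      # resample collocation points each step
    losses.append(loss(params, x))
    params -= lr * grad(params, x)

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Larger batches average the residual over more collocation points, giving a lower-variance loss estimate per step; the tension the abstract refers to is whether this helps or hurts generalization, as it can in image classification.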