Good Lattice Accelerates Physics-Informed Neural Networks

Published: 28 Jul 2023, Last Modified: 28 Jul 2023, SynS & ML @ ICML 2023
Keywords: physics-informed neural networks, numerical analysis, Korobov space, AI for physics
TL;DR: The proposed good lattice training method significantly accelerates the training of physics-informed neural networks by using number-theoretic methods to select an optimal set of collocation points.
Abstract: Physics-informed neural networks (PINNs) can solve partial differential equations (PDEs) by minimizing the physics-informed loss, which ensures that the neural network satisfies the PDE at a given set of points. However, the solution to a PDE lives in an infinite-dimensional function space, and the physics-informed loss is a finite approximation to a certain integral over the domain. This indicates that selecting appropriate collocation points is essential. This paper proposes "good lattice training" (GLT), a technique inspired by number-theoretic methods. GLT provides an optimal set of collocation points and can train PINNs to competitive performance at a smaller computational cost.
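Although no implementation accompanies this abstract, the kind of collocation points GLT relies on can be illustrated with a rank-1 lattice. The sketch below is a minimal example under stated assumptions, not the authors' code: the function `rank1_lattice` is a hypothetical helper, and the 2-D Fibonacci generating vector (1, 89) with n = 144 points is a classical illustrative choice; the paper's method selects generating vectors via number-theoretic (Korobov-space) criteria.

```python
import numpy as np

def rank1_lattice(n_points: int, gen_vector: tuple) -> np.ndarray:
    """Rank-1 lattice points x_i = frac(i * z / n) on the unit hypercube.

    `gen_vector` is the generating vector z; "good" choices come from
    number-theoretic criteria (e.g., Korobov-type constructions).
    """
    i = np.arange(n_points)[:, None]      # shape (n, 1)
    z = np.asarray(gen_vector)[None, :]   # shape (1, d)
    return (i * z / n_points) % 1.0       # shape (n, d), points in [0, 1)^d

# Example: a 2-D Fibonacci lattice with n = F_12 = 144 and z = (1, F_11 = 89),
# a classical good lattice in two dimensions. The resulting points can be
# rescaled to the PDE domain and used as collocation points when evaluating
# the physics-informed loss.
pts = rank1_lattice(144, (1, 89))
```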
Submission Number: 45