Separable PINN: Mitigating the Curse of Dimensionality in Physics-Informed Neural Networks

Published: 21 Oct 2022, Last Modified: 22 Oct 2023 · DLDE 2022 Poster
Abstract: Physics-informed neural networks (PINNs) have emerged as data-driven PDE solvers for both forward and inverse problems. While promising, the expensive computational cost of obtaining solutions often restricts their broader applicability. We demonstrate that the computation in automatic differentiation (AD) can be significantly reduced by leveraging forward-mode AD when training PINNs. However, naively applying forward-mode AD to conventional PINNs increases the computation instead, losing its practical benefit. We therefore propose a network architecture, called separable PINN (SPINN), that facilitates forward-mode AD for more efficient computation. SPINN operates on a per-axis basis, rather than the point-wise processing of conventional PINNs, decreasing the number of network forward passes. Moreover, while the computation and memory costs of standard PINNs grow exponentially with the grid resolution, those of our model are far less susceptible, mitigating the curse of dimensionality. We demonstrate the effectiveness of our model on various high-dimensional PDE systems. Given the same number of training points, we reduce the computational cost by $1,195\times$ in FLOPs and achieve a $57\times$ speed-up in wall-clock training time on commodity GPUs, while attaining higher accuracy.
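To illustrate the separable construction described above, here is a minimal JAX sketch (not the authors' released code) of a 2D solution represented as a sum of products of per-axis features, $u(x, y) = \sum_{j=1}^{r} f_j(x)\, g_j(y)$, with a forward-mode derivative computed via `jax.jvp`. The function names (`init_mlp`, `solution`, `features_and_dx`) and the layer sizes are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

def init_mlp(key, widths):
    # Small MLP: one scalar coordinate in, r features out (illustrative sizes).
    params = []
    for d_in, d_out in zip(widths[:-1], widths[1:]):
        key, sub = jax.random.split(key)
        params.append((jax.random.normal(sub, (d_in, d_out)) * jnp.sqrt(2.0 / d_in),
                       jnp.zeros(d_out)))
    return params

def mlp(params, x):
    h = x
    for w, b in params[:-1]:
        h = jnp.tanh(h @ w + b)
    w, b = params[-1]
    return h @ w + b                      # (n, r) per-axis features

def solution(px, py, x, y):
    # u on the full nx*ny product grid from only nx + ny forward passes:
    # u(x_i, y_j) = sum_r fx[i, r] * fy[j, r]  (outer-product merge).
    fx = mlp(px, x[:, None])              # (nx, r)
    fy = mlp(py, y[:, None])              # (ny, r)
    return jnp.einsum('ir,jr->ij', fx, fy)

def features_and_dx(params, x):
    # Forward-mode AD: each per-axis net has a single scalar input, so one
    # jvp call with a tangent of ones returns d(features)/dx at every point.
    f = lambda t: mlp(params, t[:, None])
    return jax.jvp(f, (x,), (jnp.ones_like(x),))  # primals, tangents: (n, r)

key = jax.random.PRNGKey(0)
kx, ky = jax.random.split(key)
widths = [1, 32, 32, 16]                  # r = 16 features per axis (assumed)
px, py = init_mlp(kx, widths), init_mlp(ky, widths)
x = jnp.linspace(0.0, 1.0, 64)
y = jnp.linspace(0.0, 1.0, 64)

u = solution(px, py, x, y)                # (64, 64) grid from 128 forward passes
fx, dfx = features_and_dx(px, x)
u_x = jnp.einsum('ir,jr->ij', dfx, mlp(py, y[:, None]))  # du/dx on the grid
```

Note how a derivative along one axis reuses the other axis's cached features, which is what makes forward-mode AD pay off here; the same construction extends to more axes and, e.g., to second derivatives via nested `jvp` calls.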
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:2211.08761/code)