Keywords: PINN; forward propagation
TL;DR: A novel PINN architecture that improves training efficiency.
Abstract: Physics-Informed Neural Networks (PINNs) solve partial differential equations by embedding physical laws into their training process. A computational bottleneck, however, limits conventional PINNs: they rely on multiple backward passes to compute derivatives sequentially, a process that is memory-intensive and fails to leverage the parallelism of modern GPUs. To address this, we introduce Forward PINN, a framework that breaks from this computational architecture. Instead of relying on the backward pass, we redesign the forward pass itself to perform differentiation. By exploiting the mathematical properties of specific activation functions, we unify the computation: the network's output and all its necessary partial derivatives—first, second, and higher-order—are calculated concurrently within a single forward propagation. This approach eliminates the need for multiple backward passes for derivative computation. We validated this architecture on two benchmark PDE problems: the two-dimensional heat equation and the anisotropic wave equation. The experimental results show that Forward PINN achieves accuracy comparable to its conventional counterparts while delivering speedups of 1.55× and 1.8× on the respective benchmarks.
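The abstract's core idea—obtaining the PDE residual's derivatives during forward propagation rather than via repeated backward passes—can be illustrated with standard forward-mode autodiff. The sketch below is NOT the paper's method (which exploits specific activation-function properties); it is a generic JAX analogue using `jax.jvp`, where each directional derivative, including nested second derivatives, is computed in a forward sweep. The network `mlp`, `init_params`, and the heat-equation residual are hypothetical stand-ins for illustration.

```python
import jax
import jax.numpy as jnp

def mlp(params, xyt):
    # Hypothetical tiny tanh MLP standing in for the PINN; tanh is smooth,
    # so higher-order forward derivatives are well-defined.
    h = xyt
    for W, b in params[:-1]:
        h = jnp.tanh(W @ h + b)
    W, b = params[-1]
    return (W @ h + b)[0]  # scalar u(x, y, t)

def init_params(key, sizes=(3, 16, 16, 1)):
    keys = jax.random.split(key, len(sizes) - 1)
    return [(jax.random.normal(k, (m, n)) * 0.1, jnp.zeros(m))
            for k, n, m in zip(keys, sizes[:-1], sizes[1:])]

def heat_residual(params, xyt, alpha=1.0):
    # Residual of the 2D heat equation u_t = alpha * (u_xx + u_yy).
    # Each jax.jvp call is a single forward sweep that yields the network
    # output together with a directional derivative -- no backward pass.
    e_x = jnp.array([1.0, 0.0, 0.0])
    e_y = jnp.array([0.0, 1.0, 0.0])
    e_t = jnp.array([0.0, 0.0, 1.0])
    f = lambda p: mlp(params, p)
    _, u_t = jax.jvp(f, (xyt,), (e_t,))
    # Second derivatives: nest one forward-mode pass inside another.
    du_x = lambda p: jax.jvp(f, (p,), (e_x,))[1]
    du_y = lambda p: jax.jvp(f, (p,), (e_y,))[1]
    _, u_xx = jax.jvp(du_x, (xyt,), (e_x,))
    _, u_yy = jax.jvp(du_y, (xyt,), (e_y,))
    return u_t - alpha * (u_xx + u_yy)
```

Under this framing, the speedup claim corresponds to replacing several reverse-mode graph traversals per collocation point with forward sweeps that parallelize naturally over batched inputs (e.g. via `jax.vmap`).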
Supplementary Material: zip
Primary Area: neurosymbolic & hybrid AI systems (physics-informed, logic & formal reasoning, etc.)
Submission Number: 5139