Random Neural Network Expressivity for Non-Linear Partial Differential Equations

18 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: random neural networks, partial differential equations, approximation error, expressivity
Abstract: Neural networks with randomly generated hidden weights (RaNNs) have been extensively studied, both as a standalone learning method and as an initialization for fully trainable deep learning methods. In this work, we study the expressivity of RaNNs for learning solutions to non-linear partial differential equations (PDEs). To this end, we derive approximation error bounds for time-dependent Sobolev functions and obtain a dimension-free approximation rate of $\frac{1}{2}$. Our results imply that RaNNs can efficiently approximate solutions to complex non-linear PDEs. When applied to Physics-Informed Neural Networks (PINNs), our bounds imply that, with high probability, the physics-informed training error converges to $0$ at a rate free from the curse of dimensionality. Our theoretical analysis is supported by numerical experiments on two benchmark PDEs, which confirm the predicted convergence rate.
Supplementary Material: zip
Primary Area: learning theory
Submission Number: 11372