Keywords: Physics-Informed Neural Networks, Partial Differential Equations, Radial Basis Functions, Variational Pretraining, Adaptive Models, Data-Efficient Learning, Neural Network Initialization
TL;DR: RBFs with generative initialization enable fast, adaptive PINNs for solving multi-query PDE problems.
Abstract: Physics-Informed Neural Networks (PINNs) approximate solutions to partial differential equations (PDEs) in a data-free setting. This work replaces the MLP commonly used in PINNs with Radial Basis Functions (RBFs), leveraging their explicit structure and analytic derivatives to improve training efficiency and solution accuracy.
Building on this RBF representation, a pretrained generative initialization model based on variational inference is introduced to further enhance adaptability. Conditioned on PDE attributes, it produces informative RBF kernel parameters that serve as strong starting points for PINN training, enabling rapid adaptation to new PDE conditions with minimal fine-tuning and consistently accelerating convergence relative to standard initialization.
Experiments on canonical 1D and 2D PDEs demonstrate (1) that RBF-based PINNs outperform standard MLP-based PINNs and serve as adaptive models, and (2) that variational pretraining can provide effective initialization to enhance training performance. Together, these results validate inference-through-adaptation as a promising direction for scalable, data-efficient, and adaptable PINNs.
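To make the core idea concrete, the sketch below shows an RBF ansatz with analytic derivatives applied to a 1D Poisson problem. It is an illustrative toy, not the paper's method: it solves the collocation system by linear least squares rather than the gradient-based PINN training and variational initialization described in the abstract, and the Gaussian kernel, shape parameter, and center/collocation counts are all assumed for the example.

```python
import numpy as np

# Gaussian RBF and its analytic second derivative (eps is an assumed shape parameter)
def rbf(r, eps):
    return np.exp(-(eps * r) ** 2)

def rbf_xx(r, eps):
    # d^2/dx^2 exp(-eps^2 r^2) = (4 eps^4 r^2 - 2 eps^2) exp(-eps^2 r^2)
    return (4 * eps**4 * r**2 - 2 * eps**2) * rbf(r, eps)

eps = 5.0                                    # illustrative shape parameter
centers = np.linspace(0.0, 1.0, 25)          # RBF kernel centers
x_int = np.linspace(0.0, 1.0, 60)[1:-1]      # interior collocation points
x_bnd = np.array([0.0, 1.0])                 # Dirichlet boundary points

# Toy PDE: -u''(x) = pi^2 sin(pi x),  u(0) = u(1) = 0  ->  exact u(x) = sin(pi x)
f = np.pi**2 * np.sin(np.pi * x_int)

# Assemble the physics-residual system in the RBF weights w:
# rows enforce the PDE at interior points and the boundary conditions at the ends
A_pde = -rbf_xx(x_int[:, None] - centers[None, :], eps)
A_bc = rbf(x_bnd[:, None] - centers[None, :], eps)
A = np.vstack([A_pde, A_bc])
b = np.concatenate([f, np.zeros(2)])

w, *_ = np.linalg.lstsq(A, b, rcond=None)

# Evaluate the fitted RBF ansatz against the exact solution
x_test = np.linspace(0.0, 1.0, 200)
u_hat = rbf(x_test[:, None] - centers[None, :], eps) @ w
err = np.max(np.abs(u_hat - np.sin(np.pi * x_test)))
print(f"max abs error: {err:.2e}")
```

Because the RBF derivatives are closed-form, the PDE residual enters the system exactly; in the paper's setting the same structure is trained with a PINN loss, with the pretrained model supplying the initial kernel parameters.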
Email Sharing: We authorize the sharing of all author emails with Program Chairs.
Data Release: We authorize the release of our submission and author names to the public in the event of acceptance.
Submission Number: 52