Fast Physics-Informed Learning via Diffusion Hypernetworks

18 Sept 2025 (modified: 14 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: AI for Physics, Hypernetwork, Generative Model, Diffusion Model, PINNs
TL;DR: A generative hyper-diffusion model that produces PINN weights conditioned on task information, accelerating fine-tuning while achieving comparable or superior accuracy.
Abstract: Physics-Informed Neural Networks (PINNs) have emerged as a powerful tool for solving partial differential equations (PDEs), and they have become a key workhorse in many AI-for-science applications. However, PINNs remain highly sensitive to factors such as initial conditions, domain geometries, and physical parameters, and they typically require full retraining when these PDE-defining parameters change. In this work, we propose a diffusion-based hypernetwork that distills knowledge from training data to substantially accelerate PINN training. Our approach leverages a denoising diffusion probabilistic framework to generate PINN weights conditioned on PDE parameters. Once trained, the hypernetwork can directly produce PINNs for a family of parametric PDEs without requiring additional optimization. For more complex problems, using the generated weights as initializations reduces training time by approximately 46% on the Burgers1D-complex dataset and 60% on the Wave2D dataset. Furthermore, the model demonstrates robustness to out-of-distribution PDE parameters, extending its applicability beyond the training distribution.
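
Illustrative sketch (not from the paper): the abstract describes a denoising diffusion probabilistic model that generates PINN weight vectors conditioned on PDE parameters. A minimal PyTorch rendering of that idea might look as follows; every name here (WeightDenoiser, ddpm_loss, weight_dim, cond_dim), the MLP denoiser, and the linear beta schedule are assumptions, since this page does not specify the actual architecture or hyperparameters.

    # Hypothetical sketch of a conditional diffusion hypernetwork over PINN weights.
    # All names, dimensions, and hyperparameters are assumptions, not the paper's.
    import torch
    import torch.nn as nn

    class WeightDenoiser(nn.Module):
        """Predicts the noise added to a flattened PINN weight vector,
        conditioned on the PDE parameters and the diffusion timestep."""
        def __init__(self, weight_dim, cond_dim, hidden=512, n_steps=1000):
            super().__init__()
            self.t_embed = nn.Embedding(n_steps, hidden)   # timestep embedding
            self.c_embed = nn.Linear(cond_dim, hidden)     # PDE-parameter embedding
            self.net = nn.Sequential(
                nn.Linear(weight_dim + 2 * hidden, hidden), nn.SiLU(),
                nn.Linear(hidden, hidden), nn.SiLU(),
                nn.Linear(hidden, weight_dim),
            )

        def forward(self, w_t, t, cond):
            h = torch.cat([w_t, self.t_embed(t), self.c_embed(cond)], dim=-1)
            return self.net(h)

    # Standard DDPM noise schedule (Ho et al., 2020) and one training step on a
    # dataset of (pretrained PINN weights, PDE parameters) pairs.
    n_steps = 1000
    betas = torch.linspace(1e-4, 0.02, n_steps)
    alphas_bar = torch.cumprod(1.0 - betas, dim=0)

    def ddpm_loss(model, w0, cond):
        t = torch.randint(0, n_steps, (w0.shape[0],))
        eps = torch.randn_like(w0)
        a = alphas_bar[t].unsqueeze(-1)
        w_t = a.sqrt() * w0 + (1 - a).sqrt() * eps        # forward noising
        return nn.functional.mse_loss(model(w_t, t, cond), eps)

At inference time, one would run the standard DDPM reverse process conditioned on new PDE parameters and load the sampled vector into a PINN, either using it directly or as an initialization for brief fine-tuning, matching the two usage modes described in the abstract.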
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 14267