PDE-PFN: Prior-Data Fitted Neural PDE Solver

ICLR 2026 Conference Submission 19234 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: in-context learning, scientific foundation model, zero-shot, prior-data fitted network, PINN-prior
Abstract: Despite recent progress in scientific machine learning (SciML), existing approaches remain impractical: they often require explicit governing equations, impose rigid input structures, and generalize poorly across PDEs. Motivated by the broad generalizability of large language models (LLMs) and their robustness to noisy or unreliable pre-training data, we seek to bring similar capabilities to PDE solvers. Inspired by the Bayesian inference mechanism of prior-data fitted networks (PFNs), we propose PDE-PFN, a prior-data fitted neural solver that directly approximates the posterior predictive distribution (PPD) of PDE solutions via in-context Bayesian inference. PDE-PFN builds on a PFN architecture with the self- and cross-attention mechanisms of a Transformer and is pre-trained on low-cost approximate solutions generated by physics-informed neural networks, which serve as diverse but not necessarily exact priors. Through experiments on a range of two-dimensional PDEs, we demonstrate that PDE-PFN achieves strong generalization across heterogeneous equations, robustness under noisy priors, and zero-shot inference. Our approach not only outperforms task-specific baselines but also establishes a flexible and robust paradigm for advancing SciML.
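To make the abstract's architecture description concrete, here is a minimal sketch of a PFN-style forward pass: self-attention over a context set of observed solution samples, cross-attention from query coordinates to that context, and a Gaussian head approximating the posterior predictive distribution per query. All weight names, dimensions, and the single-head/random-weight setup are illustrative assumptions, not the submission's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def attention(q, k, v):
    """Scaled dot-product attention with a numerically stable softmax."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v

d = 16
# Random stand-ins for learned parameters (hypothetical names/shapes).
W_ctx = rng.normal(size=(3, d)) * 0.1   # embeds (x, y, u) context triples
W_qry = rng.normal(size=(2, d)) * 0.1   # embeds (x, y) query coordinates
W_out = rng.normal(size=(d, 2)) * 0.1   # Gaussian head: (mean, log-variance)

def pde_pfn_forward(context_xyu, query_xy):
    """One PFN-style pass: context self-attention, then query-to-context
    cross-attention, then a per-query predictive distribution."""
    h_ctx = context_xyu @ W_ctx
    h_ctx = h_ctx + attention(h_ctx, h_ctx, h_ctx)  # self-attention + residual
    h_qry = query_xy @ W_qry
    h_qry = h_qry + attention(h_qry, h_ctx, h_ctx)  # cross-attention + residual
    out = h_qry @ W_out
    mean, log_var = out[:, 0], out[:, 1]
    return mean, np.exp(log_var)

ctx = rng.normal(size=(32, 3))   # 32 observed (x, y, u) samples as context
qry = rng.normal(size=(8, 2))    # 8 query coordinates
mean, var = pde_pfn_forward(ctx, qry)
print(mean.shape, var.shape)  # (8,) (8,)
```

In a real PFN these weights are pre-trained across many tasks (here, across PINN-generated approximate solutions), so inference on a new PDE amounts to a single forward pass over the context set, i.e., zero-shot in-context prediction.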
Supplementary Material: zip
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 19234