Towards Universal Neural Operators through Multiphysics Pretraining

16 Sept 2025 (modified: 03 Dec 2025) | ICLR 2026 Conference Withdrawn Submission | CC BY 4.0
Keywords: Neural Operators, Pretraining, Multiphysics, Adapters, Foundation models
TL;DR: Neural operators can be pretrained on physical data and then reused to solve a variety of downstream problems.
Abstract: Although neural operators are in common use for contemporary data-driven simulation of physical systems, their training remains computationally expensive and time-consuming. Some progress has been made through the study of downstream problems, where a model is trained on a simpler problem and later fine-tuned on a more challenging one to achieve better quality at a lower overall time cost. In this work we examine the capabilities of transformer-based neural operator architectures, previously applied only to individual problems, in more general transfer learning. We evaluate transformer- and state-space-model-based neural operators on a wide range of downstream PDE simulation problems, including extrapolation to parameter values outside the pretraining distribution, the addition of new variables to the dynamics, and the transfer of operators pretrained on datasets composed of solutions of several differential equations. The results indicate that neural operators with these advanced architectures can transfer knowledge between problems involving partial differential equations.
Primary Area: neurosymbolic & hybrid AI systems (physics-informed, logic & formal reasoning, etc.)
Submission Number: 7672
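
To make the pretrain-then-transfer setup described in the abstract concrete, below is a minimal sketch (not the authors' code) of adapter-based fine-tuning of a frozen, pretrained transformer-style neural operator on a new PDE task. The backbone, adapter design, shapes, and training data here are illustrative assumptions in plain PyTorch; the paper's actual architectures, datasets, and hyperparameters are not reproduced.

import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter with a residual connection, inserted after a frozen block."""
    def __init__(self, dim, bottleneck=32):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)
        self.act = nn.GELU()

    def forward(self, x):
        return x + self.up(self.act(self.down(x)))

class ToyNeuralOperator(nn.Module):
    """Stand-in for a transformer-based neural operator: maps a discretized
    field u(x) at time t to u(x) at time t + dt (hypothetical setup)."""
    def __init__(self, dim=128, depth=4, use_adapters=False):
        super().__init__()
        self.embed = nn.Linear(1, dim)   # lift pointwise field values
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4,
                                           dim_feedforward=256,
                                           batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, num_layers=depth)
        self.adapters = nn.ModuleList(
            [Adapter(dim) for _ in range(depth)]) if use_adapters else None
        self.head = nn.Linear(dim, 1)    # project back to the field

    def forward(self, u):                # u: (batch, n_points, 1)
        h = self.embed(u)
        if self.adapters is None:
            h = self.backbone(h)
        else:
            for blk, ad in zip(self.backbone.layers, self.adapters):
                h = ad(blk(h))
        return self.head(h)

# Pretraining would fit the backbone on a large multiphysics dataset;
# here only the downstream fine-tuning step is illustrated.
model = ToyNeuralOperator(use_adapters=True)
for p in model.backbone.parameters():    # freeze pretrained weights
    p.requires_grad_(False)

opt = torch.optim.Adam(
    [p for p in model.parameters() if p.requires_grad], lr=1e-3)
loss_fn = nn.MSELoss()

# Synthetic stand-in for a downstream PDE dataset (input/target field pairs).
u_in = torch.randn(16, 64, 1)
u_out = torch.randn(16, 64, 1)

for step in range(10):                   # short fine-tuning loop
    opt.zero_grad()
    loss = loss_fn(model(u_in), u_out)
    loss.backward()
    opt.step()

Only the adapter and head parameters are updated in this sketch, which is one common way to reuse a pretrained operator; whether the submission uses exactly this scheme or full fine-tuning is not specified here.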