Physics-informed fine-tuning of foundation models for partial differential equations

Published: 01 Mar 2026, Last Modified: 06 Mar 2026 · AI&PDE Poster · CC BY 4.0
Keywords: Foundation models, Partial differential equations, Physics-informed machine learning, Fine-tuning, Scientific machine learning, Operator learning
TL;DR: We show that incorporating physics constraints (PDE residuals and boundary conditions) into fine-tuning enables PDE foundation models to adapt to new downstream tasks with minimal data while maintaining physical consistency.
Abstract: Foundation models for partial differential equations (PDEs) have emerged as powerful surrogates pre-trained on diverse physical systems, but adapting them to new downstream tasks remains challenging due to limited task-specific data and distribution shifts. While fine-tuning has proven transformative in natural language processing, best practices for adapting PDE foundation models remain underexplored. Although physics-informed training has successfully produced accurate solvers across a wide range of PDE problems, its potential for fine-tuning data-driven foundation models has not been systematically studied. In this work, we introduce a physics-informed fine-tuning framework that adapts pre-trained PDE foundation models by incorporating physical constraints (PDE residuals and boundary conditions) directly into the fine-tuning objective. This enables effective adaptation in data-scarce regimes while promoting physical consistency. We evaluate our method on a downstream task composed of an unseen PDE class and compare it with data-driven fine-tuning counterparts. Our results demonstrate that physics-informed fine-tuning achieves competitive accuracy without requiring PDE solutions for training. Furthermore, a hybrid fine-tuning strategy yields superior generalization to out-of-distribution scenarios when only minimal training data is available. These findings establish physics-informed fine-tuning as a scalable and data-efficient paradigm, providing a physically interpretable pathway for adapting foundation models in scientific machine learning.
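The objective described in the abstract combines a data misfit with PDE-residual and boundary-condition penalties. A minimal NumPy sketch of such a hybrid loss is given below, assuming (for illustration only) a 1D heat equation u_t = α·u_xx on a space-time grid with homogeneous Dirichlet boundaries; the function name, the weight λ, and the finite-difference residual are illustrative choices, not the paper's actual implementation (a collocation-based setup would typically use automatic differentiation instead).

```python
import numpy as np

def physics_informed_loss(u_pred, u_data, dx, dt, alpha=1.0, lam=1.0):
    """Hybrid fine-tuning objective: data misfit + PDE residual + BC penalty.

    u_pred : (nt, nx) field predicted by the (fine-tuned) surrogate.
    u_data : (nt, nx) reference solutions, or None in the data-free regime.
    """
    # Data term: used only when labeled PDE solutions are available.
    data_loss = 0.0 if u_data is None else np.mean((u_pred - u_data) ** 2)

    # PDE residual of u_t = alpha * u_xx via finite differences
    # (forward difference in time, central difference in space).
    u_t = (u_pred[1:, 1:-1] - u_pred[:-1, 1:-1]) / dt
    u_xx = (u_pred[:-1, 2:] - 2.0 * u_pred[:-1, 1:-1] + u_pred[:-1, :-2]) / dx**2
    residual_loss = np.mean((u_t - alpha * u_xx) ** 2)

    # Boundary term: homogeneous Dirichlet condition u = 0 at both ends (assumed).
    bc_loss = np.mean(u_pred[:, [0, -1]] ** 2)

    return data_loss + lam * (residual_loss + bc_loss)
```

Passing `u_data=None` recovers the purely physics-informed regime evaluated in the paper (no PDE solutions required for training), while supplying a few labeled solutions corresponds to the hybrid strategy.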
Journal Opt In: Yes, I want to participate in the IOP focus collection submission
Journal Corresponding Email: vlad.medvedev@iisb.fraunhofer.de
Journal Notes: Time-dependent PDE; Comparison of different PDE foundation models; Hyperparameter tuning; Sensitivity analysis
Submission Number: 95