Towards Physics-guided Generative Foundation Models

Majid Farhadloo, Arun Sharma, Mingzhou Yang, Bharat Jayaprakash, William Northrop, Shashi Shekhar

Published: 03 Nov 2025, Last Modified: 04 Dec 2025. License: CC BY-SA 4.0
Abstract: A generative model is a statistical model of the joint probability distribution over a given observable variable and a target variable, which can be used to "generate" random instances (outcomes) of an observation. When trained on broad and diverse data, it serves as a foundation model that reduces the training resources (e.g., time, energy, labeled samples) required for a wide range of downstream tasks. Following recent successes such as the generative pre-trained transformer (GPT), there is ongoing excitement about foundation models. However, these models often struggle with out-of-distribution generalization and may yield unrealistic or physically invalid outputs. To address these limitations, we envision physics-guided generative foundation models (PgGenFMs), which integrate broad physical knowledge into foundation models to improve reliability and applicability. We also present a taxonomy comparing PgGenFMs to traditional models and outline strategies for embedding physical knowledge into data, training, and model design.
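The abstract's opening definition can be made concrete with a minimal sketch. The toy model below is an illustrative assumption, not from the paper: it represents the joint distribution p(x, y) over an observable x and a target y, factored as p(y) p(x | y), and sampling it "generates" random (x, y) instances as the definition describes. All names and parameter values here are hypothetical.

```python
import random

# Hypothetical toy joint distribution p(x, y), factored as p(y) * p(x | y):
# p(y) is a categorical prior over class labels, and p(x | y) is a Gaussian
# whose mean and standard deviation depend on the class.
CLASS_PRIOR = {0: 0.4, 1: 0.6}                   # p(y)
CLASS_PARAMS = {0: (-1.0, 0.5), 1: (2.0, 1.0)}   # p(x | y): (mean, std) per class

def sample_joint(rng: random.Random):
    """Draw one (x, y) pair from the joint distribution p(x, y)."""
    # Sample the target y from its prior, then the observable x given y.
    y = rng.choices(list(CLASS_PRIOR), weights=list(CLASS_PRIOR.values()))[0]
    mean, std = CLASS_PARAMS[y]
    x = rng.gauss(mean, std)
    return x, y

rng = random.Random(0)
samples = [sample_joint(rng) for _ in range(1000)]
```

Because the model captures the full joint distribution rather than only a conditional p(y | x), it can generate new observations outright, which is the property the abstract's physics-guided extension seeks to constrain toward physically valid outputs.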