Towards Enforcing Hard Physics Constraints in Operator Learning Frameworks

Published: 17 Jun 2024, Last Modified: 26 Jul 2024 · ICML 2024 AI4Science Poster · CC BY 4.0
Keywords: operator learning, physics-informed machine learning, physics constraints
Abstract: Enforcing physics constraints in surrogate models for PDE evolution operators can improve the physical plausibility of their predictions as well as their convergence and generalization properties. Imposing these differential constraints softly, as training loss terms, suffers from various optimization challenges and does not guarantee faithfulness to the constraints at inference time, calling for stronger ways to impose them. In this paper, we strongly enforce physics constraints in operator learning by projecting the output of any given operator surrogate model onto the space of functions satisfying a specified differential constraint, performing that projection in a suitable transformed space. Compared to prior works, our method is efficient, compatible with any existing operator learning architecture (either during or after training), and ensures that the physics constraint holds at all points in the spatiotemporal domain. While it remains unclear how to perform the projection efficiently for nonlinear differential constraints, we show that our approach works remarkably well for linear differential constraints, where the projection can be performed very efficiently in Fourier space. As an example, we enforce the divergence-free condition of the incompressible Navier-Stokes equations, where our projection operator enforces the constraint without sacrificing faithfulness to the data, and does so at a small additional cost.
Submission Number: 223