Mutual Transfer Learning across Physical and Architectural Priors for Operator Learning

ICLR 2026 Conference Submission 15974 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: transfer learning, operator learning, physics-informed machine learning, mutual learning
Abstract: Recently, the development of foundation models has garnered attention in scientific computing, with the goal of creating general-purpose simulators that can rapidly adapt to novel physical systems. This work introduces a mutual transfer learning framework for operator learning that leverages diversity in both model architectures and physical data. First, we introduce Semi-Supervised Mutual Learning for Operators (SSMO) and demonstrate that mutual learning between architecturally diverse models yields significant improvements in accuracy. Second, we show that pre-training an operator on a wide range of physical dynamics enables substantially more data-efficient and rapid adaptation to new tasks. Our findings reveal that cross-architecture mutual learning and cross-physics pre-training are distinct, effective strategies for developing more robust and efficient scientific foundation models. We believe that integrating these two strategies presents a promising pathway toward foundation models for scientific computing.
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 15974
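The abstract describes mutual learning between architecturally diverse operator models trained semi-supervised on labeled and unlabeled physical data. The sketch below illustrates that general idea only; it is not the authors' SSMO implementation. The model classes, the MSE-based consistency coupling, and all shapes and hyperparameters are assumptions chosen for illustration.

```python
# Minimal sketch (assumed, not the paper's code): two architecturally different
# operator surrogates fit labeled PDE input/solution pairs while a consistency
# term couples their predictions on unlabeled inputs.
import torch
import torch.nn as nn

class SpectralLikeOperator(nn.Module):
    """Hypothetical stand-in for a spectral/FNO-style operator."""
    def __init__(self, n_points=64, width=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n_points, width), nn.GELU(),
                                 nn.Linear(width, n_points))
    def forward(self, u0):
        return self.net(u0)

class PointwiseOperator(nn.Module):
    """Hypothetical stand-in for a DeepONet/pointwise-style operator."""
    def __init__(self, n_points=64, width=256):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n_points, width), nn.Tanh(),
                                 nn.Linear(width, width), nn.Tanh(),
                                 nn.Linear(width, n_points))
    def forward(self, u0):
        return self.net(u0)

def mutual_learning_step(model_a, model_b, opt, u_lab, y_lab, u_unlab, lam=0.5):
    """One optimization step: supervised loss on labeled pairs plus an assumed
    MSE mutual-consistency loss between the two models on unlabeled inputs."""
    opt.zero_grad()
    supervised = (nn.functional.mse_loss(model_a(u_lab), y_lab)
                  + nn.functional.mse_loss(model_b(u_lab), y_lab))
    consistency = nn.functional.mse_loss(model_a(u_unlab), model_b(u_unlab))
    loss = supervised + lam * consistency
    loss.backward()
    opt.step()
    return loss.item()

if __name__ == "__main__":
    n = 64
    model_a, model_b = SpectralLikeOperator(n), PointwiseOperator(n)
    opt = torch.optim.Adam(list(model_a.parameters()) + list(model_b.parameters()), lr=1e-3)
    u_lab, y_lab = torch.randn(8, n), torch.randn(8, n)   # labeled (input, solution) pairs
    u_unlab = torch.randn(32, n)                           # unlabeled inputs
    for step in range(3):
        print(mutual_learning_step(model_a, model_b, opt, u_lab, y_lab, u_unlab))
```

Under this reading, the unlabeled inputs let each model act as a regularizer for the other, which is one plausible way architectural diversity could translate into the accuracy gains the abstract reports.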