A Multi-Fidelity Mixture-of-Experts Framework Integrating PDE Solvers and Neural Operators for Computational Fluid Dynamics
Keywords: Computational Fluid Dynamics, Mixture of Experts, Scientific Machine Learning
Abstract: Solving the Navier-Stokes equations is central to computational fluid dynamics. While recent neural operators provide significant speed-ups, they often struggle to generalize to out-of-distribution scenarios. Conversely, hybrid models that integrate neural networks with conventional numerical solvers generalize better but incur high computational costs. To address this trade-off between efficiency and generalization, we propose the Multi-Fidelity Mixture-of-Experts (MF-MoE) framework, which combines a pure neural operator with multiple solver-based hybrid models of varying fidelity, treating each as an expert. A physics-aware gating network dynamically selects the most appropriate expert for each input, balancing computational cost and predictive accuracy. This design enables fast inference on in-distribution inputs while retaining strong generalization on out-of-distribution cases. Extensive experiments on fluid flows governed by the incompressible Navier-Stokes equations show that MF-MoE consistently outperforms baseline approaches, offering an efficient solution for PDE surrogate modeling.
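For concreteness, the routing idea in the abstract can be sketched as below. This is a minimal illustration under stated assumptions, not the authors' implementation: the gating architecture (`GatingNetwork`), the hard argmax routing, and the expert interface are all hypothetical, since the abstract does not specify how the physics-aware gate is constructed or how experts are invoked.

```python
import torch
import torch.nn as nn


class GatingNetwork(nn.Module):
    """Maps input flow-field features to one score per expert (assumed architecture)."""

    def __init__(self, in_channels: int, num_experts: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1),
            nn.GELU(),
            nn.AdaptiveAvgPool2d(1),  # global pooling -> per-sample feature vector
            nn.Flatten(),
            nn.Linear(16, num_experts),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)  # unnormalized expert scores, shape (batch, num_experts)


class MultiFidelityMoE(nn.Module):
    """Routes each input to a single expert, e.g. a pure neural operator
    or a solver-based hybrid model of some fidelity (hypothetical interface)."""

    def __init__(self, experts: list[nn.Module], gate: GatingNetwork):
        super().__init__()
        self.experts = nn.ModuleList(experts)
        self.gate = gate

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        scores = self.gate(x)            # (batch, num_experts)
        choice = scores.argmax(dim=-1)   # hard routing: one expert per sample
        outputs = []
        for i, xi in enumerate(x):
            expert = self.experts[choice[i]]
            outputs.append(expert(xi.unsqueeze(0)))
        return torch.cat(outputs, dim=0)
```

In this sketch, cheap neural-operator experts would be selected for in-distribution inputs and costlier solver-based hybrids for out-of-distribution ones; how the gate is trained to make that distinction is not detailed in the abstract.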
Supplementary Material: zip
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 22602