GeoCMON: Operator Learning on Deformable Domains via Disentangled Geometric Conditioning

ICLR 2026 Conference Submission 20724 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · Everyone · Revisions · BibTeX · CC BY 4.0
Keywords: physics
Abstract: Partial differential equations (PDEs) defined on non-rigid, parametrically varying domains with heterogeneous boundary conditions pose significant computational challenges for classical numerical solvers due to repeated discretizations and costly simulations. Operator learning frameworks offer promising surrogate modeling approaches that approximate solution operators mapping domain and boundary parameters directly to solution fields without explicit meshing. However, existing neural operator architectures struggle with input modality entanglement, training instability, and limited generalization on highly deformable geometries with complex boundary inputs. To address these limitations, we introduce GeoCMON, a Geometric-Conditioned Multi-Branch Operator Network that explicitly disentangles geometric and boundary features via specialized encoding branches, which are fused with a spatial trunk network through element-wise multiplication and Einstein summation to enable expressive conditioning. We augment the architecture with conditional residual connections within branches to enhance gradient flow and training stability, and adopt a weighted mean squared error loss emphasizing regions of physically significant solution magnitude to prioritize challenging prediction regimes. Comprehensive empirical evaluations on parametric PDE datasets derived from 2D Laplace problems demonstrate that GeoCMON substantially outperforms baseline multi-branch methods, achieving superior accuracy across stratified difficulty bins, improved training dynamics evidenced by higher synchronization scores and reduced activation variance, and enhanced feature orthogonality indicative of robust representation. Gradient noise analyses confirm that these gains do not compromise optimization stability. Collectively, our contributions advance scalable and interpretable operator learning for PDEs on complex, deformable domains, providing a principled architectural and methodological framework for surrogate modeling in scientific computing.
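As an illustration of the conditioning mechanism described in the abstract, the sketch below implements a minimal DeepONet-style multi-branch operator with element-wise (Hadamard) fusion of the geometry and boundary branches, an Einstein-summation readout against the trunk, and a magnitude-weighted MSE loss. The module names, layer sizes, and the exact weighting rule are illustrative assumptions and are not taken from the paper.

```python
import torch
import torch.nn as nn

def mlp(in_dim, hidden, out_dim):
    return nn.Sequential(
        nn.Linear(in_dim, hidden), nn.GELU(),
        nn.Linear(hidden, hidden), nn.GELU(),
        nn.Linear(hidden, out_dim),
    )

class MultiBranchOperator(nn.Module):
    """Hypothetical multi-branch operator in the spirit of GeoCMON."""
    def __init__(self, geo_dim, bc_dim, coord_dim=2, latent=64):
        super().__init__()
        self.geo_branch = mlp(geo_dim, 128, latent)   # encodes geometry / domain parameters
        self.bc_branch = mlp(bc_dim, 128, latent)     # encodes boundary-condition parameters
        self.trunk = mlp(coord_dim, 128, latent)      # encodes spatial query coordinates

    def forward(self, geo, bc, coords):
        # geo: (B, geo_dim), bc: (B, bc_dim), coords: (B, N, coord_dim)
        g = self.geo_branch(geo)                      # (B, latent)
        b = self.bc_branch(bc)                        # (B, latent)
        fused = g * b                                 # element-wise fusion of the two branches
        t = self.trunk(coords)                        # (B, N, latent)
        # Einstein summation over the latent dimension yields the predicted field
        return torch.einsum('bl,bnl->bn', fused, t)   # (B, N)

def weighted_mse(pred, target, eps=1e-3):
    # Weight residuals by target magnitude so regions with physically
    # significant solution values dominate the loss; the paper's exact
    # weighting scheme may differ.
    w = target.abs() + eps
    return (w * (pred - target) ** 2).mean()
```

The conditional residual connections mentioned in the abstract would sit inside each branch MLP; they are omitted here to keep the sketch minimal.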
Supplementary Material: zip
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 20724