Multiscale Neural Operator: Learning Fast and Grid-independent PDE Solvers

Published: 01 Feb 2023, Last Modified: 25 Nov 2024. Submitted to ICLR 2023.
Keywords: physics-informed machine learning, pinns, scientific machine learning, neural ODEs, neural operators, machine learning, neural networks, Matryoshka, multiphysics, multiscale, parametrizations, closure, subgrid, superstructures, partial differential equations, PDEs, differential equations, numerical solvers, physics, hpc, surrogate, reduced order modeling, model reduction, uncertainty quantification, climate, fluid dynamics, physics, computational physics
TL;DR: We are the first to embed grid-independent neural operators as closure models or parametrizations in physical simulations -- in doing so, we create a fast and accurate surrogate of multiscale PDEs.
Abstract: Numerical simulations in climate, chemistry, or astrophysics are computationally too expensive for uncertainty quantification or parameter exploration at high resolution. Reduced-order or surrogate models are multiple orders of magnitude faster, but traditional surrogates are inflexible or inaccurate, and pure machine-learning (ML) surrogates are too data-hungry. We propose a hybrid, flexible surrogate model that exploits known physics for simulating the large-scale dynamics and limits learning to the hard-to-model term, called a parametrization or closure, which captures the effect of fine-scale onto large-scale dynamics. Leveraging neural operators, we are the first to learn grid-independent, non-local, and flexible parametrizations. Our \textit{multiscale neural operator} is motivated by a rich literature in multiscale modeling, has quasilinear runtime complexity, is more accurate or flexible than state-of-the-art parametrizations, and is demonstrated on the chaotic multiscale Lorenz96 equation.
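The hybrid setup described in the abstract -- known physics for the large-scale dynamics, with a learned closure standing in for the subgrid coupling -- can be sketched on the large-scale part of the multiscale Lorenz96 system. This is an illustrative sketch, not the paper's implementation: the function names (`lorenz96_tendency`, `step_rk4`), the forcing `F=10`, and the zero-closure placeholder are all assumptions; in the paper the closure would be a trained neural operator.

```python
import numpy as np

# Large-scale Lorenz96 tendency with a pluggable closure term.
# The true subgrid coupling -(h*c/b) * sum_j Y_{j,k} is replaced by
# `closure(X)`, which in the paper's setting would be a learned neural
# operator; here a zero closure is used as an illustrative placeholder.
def lorenz96_tendency(X, F=10.0, closure=lambda X: np.zeros_like(X)):
    # Advection term with cyclic boundary conditions:
    # dX_k/dt = (X_{k+1} - X_{k-2}) X_{k-1} - X_k + F + closure
    adv = -np.roll(X, 1) * (np.roll(X, 2) - np.roll(X, -1))
    return adv - X + F + closure(X)

def step_rk4(X, dt=0.005, **kw):
    # Classical 4th-order Runge-Kutta time step
    k1 = lorenz96_tendency(X, **kw)
    k2 = lorenz96_tendency(X + 0.5 * dt * k1, **kw)
    k3 = lorenz96_tendency(X + 0.5 * dt * k2, **kw)
    k4 = lorenz96_tendency(X + dt * k3, **kw)
    return X + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Integrate K=36 large-scale variables for a short trajectory
X = 10.0 * np.ones(36)
X[0] += 0.01  # small perturbation to trigger chaotic dynamics
for _ in range(100):
    X = step_rk4(X)
```

Swapping the zero closure for a trained model (e.g., `closure=neural_operator`) is the only change needed to run the hybrid surrogate, which is the flexibility the abstract refers to.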
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: Machine Learning for Sciences (eg biology, physics, health sciences, social sciences, climate/sustainability )
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/multiscale-neural-operator-learning-fast-and/code)