Multiscale Neural Operator: Learning Fast and Grid-independent PDE Solvers

Published: 15 Jun 2022, Last Modified: 05 May 2023
ICML-AI4Science Poster
Readers: Everyone
Keywords: physics-informed machine learning, pinns, scientific machine learning, neural ODEs, neural operators, machine learning, neural networks, Matryoshka, multiphysics, multiscale, parametrizations, closure, subgrid, superstructures, partial differential equations, PDEs, differential equations, numerical solvers, physics, hpc, surrogate, reduced order modeling, model reduction, uncertainty quantification, climate, fluid dynamics, computational physics
TL;DR: We create a fast and grid-independent surrogate model of multiscale PDEs by combining neural operators with coarse-grained simulations.
Abstract: Numerical simulations in climate, chemistry, or astrophysics are computationally too expensive for uncertainty quantification or parameter exploration at high resolution. Reduced-order or surrogate models are multiple orders of magnitude faster, but traditional surrogates are inflexible or inaccurate, and purely machine-learning (ML)-based surrogates are too data-hungry. We propose a hybrid, flexible surrogate model that exploits known physics for simulating large-scale dynamics and limits learning to the hard-to-model term, called the parametrization or closure, which captures the effect of fine-scale on large-scale dynamics. Leveraging neural operators, we are the first to learn grid-independent, non-local, and flexible parametrizations. Our $\textit{multiscale neural operator}$ is motivated by a rich literature in multiscale modeling, has quasilinear runtime complexity, is more accurate or more flexible than state-of-the-art parametrizations, and is demonstrated on the chaotic multiscale Lorenz96 equation.
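To make the hybrid idea concrete, below is a minimal, illustrative Python sketch (not the paper's implementation): the resolved large-scale Lorenz96 dynamics are integrated with known physics, while the unresolved fine-to-large-scale coupling is supplied by a learned closure term. The function learned_closure is a placeholder where a trained neural operator would be plugged in; its linear-damping form, the forcing value, the state size, and all names are assumptions for illustration only.

import numpy as np

def coarse_tendency(X, F=10.0):
    # Resolved (large-scale) Lorenz96 dynamics for the slow variables X_k:
    # dX_k/dt = X_{k-1} (X_{k+1} - X_{k-2}) - X_k + F
    return np.roll(X, 1) * (np.roll(X, -1) - np.roll(X, 2)) - X + F

def learned_closure(X):
    # Placeholder for the learned parametrization: in the paper's setting this
    # would be a trained neural operator mapping the resolved state to the
    # subgrid forcing. A stand-in linear damping is used here purely so the
    # sketch runs end to end (assumption, not the paper's closure).
    return -0.5 * X

def step(X, dt=0.005):
    # One explicit RK4 step of the hybrid model:
    # known coarse physics plus the learned closure term.
    f = lambda x: coarse_tendency(x) + learned_closure(x)
    k1 = f(X)
    k2 = f(X + 0.5 * dt * k1)
    k3 = f(X + 0.5 * dt * k2)
    k4 = f(X + dt * k3)
    return X + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Roll out a short trajectory from a slightly perturbed state.
X = 10.0 * np.ones(36)
X[0] += 0.01
for _ in range(1000):
    X = step(X)

In this decomposition only learned_closure must be fit from data; the coarse tendency and time integrator remain the standard, grid-independent physics of the resolved scales.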
Track: Original Research Track