Zebra: a continuous generative transformer for solving parametric PDEs

Published: 03 Mar 2024 · Last Modified: 04 May 2024 · AI4DiffEqtnsInSci @ ICLR 2024 Poster · CC BY 4.0
Keywords: PDE, Deep Learning, Foundation Model, Transformer
TL;DR: We introduce Zebra, a generative continuous transformer for solving parametric PDEs. Once pretrained on a family of PDEs, Zebra outperforms existing neural solvers.
Abstract: Foundation models have revolutionized deep learning, moving beyond task-specific architectures to versatile models pre-trained with self-supervised learning on extensive datasets. These models have set new benchmarks across domains, including natural language processing, computer vision, and biology, owing to their adaptability and state-of-the-art performance on downstream tasks. Yet, for solving PDEs or modeling physical dynamics, the potential of foundation models remains untapped due to the limited scale of existing datasets. This study presents Zebra, a novel generative model that adapts language-model techniques to the continuous domain of PDE solutions. Pre-trained on specific PDE families, Zebra excels at dynamics forecasting, surpassing existing neural operators and solvers, and points toward foundation models that, pre-trained extensively on varied PDE scenarios, can tackle PDE problems in data-scarce settings.
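To make the "language-model techniques on continuous PDE solutions" idea concrete, here is a minimal, purely illustrative sketch of autoregressive pretraining of a causal transformer on PDE trajectories: each timestep of the solution is embedded as a continuous token and the model is trained to predict the next state. The architecture, sizes, and MSE objective below are assumptions for illustration only, not Zebra's actual design.

```python
# Illustrative sketch: next-state prediction with a causal transformer over
# continuous PDE states, in the spirit of language-model pretraining.
# All design choices here are assumptions, not the paper's architecture.
import torch
import torch.nn as nn

class AutoregressivePDETransformer(nn.Module):
    def __init__(self, grid_size=64, d_model=128, n_heads=4, n_layers=4, max_steps=32):
        super().__init__()
        self.embed = nn.Linear(grid_size, d_model)                 # continuous state -> token embedding
        self.pos = nn.Parameter(torch.zeros(max_steps, d_model))   # learned temporal positions
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, grid_size)                  # predict the next state

    def forward(self, states):                                     # states: (batch, time, grid_size)
        T = states.size(1)
        h = self.embed(states) + self.pos[:T]
        mask = nn.Transformer.generate_square_subsequent_mask(T).to(states.device)
        h = self.backbone(h, mask=mask)                            # causal attention over timesteps
        return self.head(h)                                        # (batch, time, grid_size)

# Toy pretraining loop on random trajectories (a stand-in for a PDE-family dataset).
model = AutoregressivePDETransformer()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
trajectories = torch.randn(8, 16, 64)                              # (batch, timesteps, grid points)
for step in range(10):
    pred = model(trajectories[:, :-1])                             # predict u_{t+1} from u_{<=t}
    loss = nn.functional.mse_loss(pred, trajectories[:, 1:])
    opt.zero_grad(); loss.backward(); opt.step()
```

At inference time such a model would be rolled out autoregressively from an observed prefix of the trajectory; how Zebra actually discretizes, conditions on PDE parameters, and decodes is described in the paper, not here.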
Submission Number: 50