Latent Diffusion Transformer with Local Neural Field as PDE Surrogate Model

Published: 03 Mar 2024 · Last Modified: 04 May 2024 · AI4DiffEqtnsInSci @ ICLR 2024 Poster · CC BY 4.0
Keywords: Dynamical Systems, PDE, Deep Learning, Diffusion Transformer, Neural Fields
TL;DR: AROMA employs a novel diffusion transformer and discretization-free approach to improve the modeling and simulation of complex dynamical systems.
Abstract: We introduce a diffusion transformer architecture, AROMA (Attentive Reduced Order Model with Attention), for modeling the dynamics of complex systems. By employing a discretization-free encoder and a local neural field decoder, we construct a latent space that accurately captures spatial structure without requiring a fixed spatial discretization. The diffusion transformer models the dynamics in this latent space, conditioned on the previous state. It iteratively refines its predictions, providing enhanced stability compared to traditional transformers and thereby enabling longer rollouts. AROMA outperforms existing neural field methods in simulating 1D and 2D equations, demonstrating the effectiveness of our approach in capturing complex dynamical behaviors.
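The encode → latent diffusion rollout → decode pipeline described in the abstract can be sketched at a high level. This is a minimal illustrative sketch, not the paper's implementation: all function bodies, names, and dimensions below are placeholder assumptions standing in for the learned encoder, diffusion transformer, and local neural field decoder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the paper): 16 latent tokens of dimension 32.
NUM_TOKENS, LATENT_DIM = 16, 32

def encode(u):
    """Stand-in for the discretization-free encoder: maps observations at an
    arbitrary number of points to a fixed-size set of latent tokens."""
    w = rng.standard_normal((u.shape[0], NUM_TOKENS * LATENT_DIM)) / u.shape[0]
    return (u @ w).reshape(NUM_TOKENS, LATENT_DIM)

def diffusion_step(z_prev, num_refinements=4):
    """Stand-in for the latent diffusion transformer: starting from noise,
    iteratively refine the next latent state conditioned on the previous one."""
    z = rng.standard_normal(z_prev.shape)
    for _ in range(num_refinements):
        z = 0.5 * z + 0.5 * z_prev  # placeholder denoising update
    return z

def decode(z, x_query):
    """Stand-in for the local neural field decoder: evaluates the latent state
    at arbitrary query coordinates, so no fixed output grid is required."""
    w = rng.standard_normal((LATENT_DIM, 1))
    feats = z.mean(axis=0)  # pool tokens (placeholder for local attention)
    return np.sin(x_query) * float(feats @ w)

# Rollout: encode once, step autoregressively in latent space, then decode
# at any resolution -- here 100 input points but 256 query points.
u0 = rng.standard_normal(100)
z = encode(u0)
for _ in range(5):
    z = diffusion_step(z)
u_pred = decode(z, np.linspace(0, 2 * np.pi, 256))
```

The key structural point the sketch illustrates is that the dynamics loop operates entirely on the fixed-size latent tokens, while the encoder and decoder handle arbitrary point sets, which is what makes the approach discretization-free.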
Submission Number: 40