In-Context Neural PDE: Learning to Adapt a Neural Solver to Different Physics

27 Sept 2024 (modified: 05 Feb 2025) · Submitted to ICLR 2025 · CC BY 4.0
Keywords: Spatio-temporal prediction, PDEs, in-context learning, neural solvers
TL;DR: We propose "in-context neural PDE," which learns from previous states the parameters to feed into a neural solver in order to predict the next state of a PDE.
Abstract: We address the problem of predicting the next state of a dynamical system governed by *unknown* temporal partial differential equations (PDEs) using limited time-lapse data. While transformers offer a natural solution to this task through in-context learning, the inductive bias of temporal PDEs suggests a more tailored and effective approach. Specifically, when the underlying temporal PDE is fully known, classical numerical solvers can evolve the state with only a few parameters. Building on this observation, we introduce a large transformer-based hypernetwork that processes successive states to generate parameters for a much smaller neural ODE-like solver, which then predicts the next state through time integration. This framework, termed *in-context neural PDE*, decouples parameter estimation from state prediction, aligning more closely with classical numerical methods for improved interpretability while preserving the in-context learning capabilities of transformers. Numerical experiments on diverse physical datasets demonstrate that our method outperforms standard transformer-based models, reducing sample complexity and improving generalization; this makes it an efficient and scalable approach for spatiotemporal prediction in complex physical systems.
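
To make the two-network design in the abstract concrete, here is a minimal PyTorch sketch of one plausible reading: a transformer encoder consumes a window of successive states and emits a parameter vector, which is unpacked into the weights of a small MLP defining du/dt and integrated with explicit Euler steps. All names (`ContextHypernetwork`, `NeuralSolver`), dimensions, and the choice of integrator are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of an in-context neural PDE pipeline: a transformer
# hypernetwork estimates solver parameters from past states, and a small
# ODE-like solver integrates the state forward. Explicit Euler and all
# sizes below are assumptions for illustration only.
import torch
import torch.nn as nn


class ContextHypernetwork(nn.Module):
    """Transformer mapping a sequence of past states to solver parameters."""

    def __init__(self, state_dim: int, n_params: int, d_model: int = 128):
        super().__init__()
        self.embed = nn.Linear(state_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=4)
        self.head = nn.Linear(d_model, n_params)

    def forward(self, states: torch.Tensor) -> torch.Tensor:
        # states: (batch, time, state_dim) -> theta: (batch, n_params)
        h = self.encoder(self.embed(states))
        return self.head(h.mean(dim=1))  # pool over the context window


class NeuralSolver(nn.Module):
    """Small solver du/dt = f_theta(u), integrated with explicit Euler."""

    def __init__(self, state_dim: int, hidden: int = 32):
        super().__init__()
        self.state_dim, self.hidden = state_dim, hidden
        # f_theta is a one-hidden-layer MLP whose weights come from theta.
        self.n_params = (state_dim * hidden + hidden) + (hidden * state_dim + state_dim)

    def forward(self, u: torch.Tensor, theta: torch.Tensor,
                dt: float = 0.1, n_steps: int = 10) -> torch.Tensor:
        s, h = self.state_dim, self.hidden
        # Unpack theta into per-sample MLP weights.
        i = 0
        w1 = theta[:, i:i + s * h].view(-1, h, s); i += s * h
        b1 = theta[:, i:i + h]; i += h
        w2 = theta[:, i:i + h * s].view(-1, s, h); i += h * s
        b2 = theta[:, i:i + s]
        for _ in range(n_steps):  # explicit Euler time integration
            hid = torch.tanh(torch.einsum("bhs,bs->bh", w1, u) + b1)
            du = torch.einsum("bsh,bh->bs", w2, hid) + b2
            u = u + dt * du
        return u


# Usage: estimate parameters in-context, then step the last state forward.
state_dim = 64
solver = NeuralSolver(state_dim)
hyper = ContextHypernetwork(state_dim, solver.n_params)
context = torch.randn(8, 5, state_dim)   # 5 successive observed states
theta = hyper(context)                   # in-context parameter estimation
u_next = solver(context[:, -1], theta)   # predict the next state
```

The key design point this sketch illustrates is the decoupling the abstract describes: the large transformer only estimates parameters, while state evolution is delegated to a much smaller integrator, mirroring how classical solvers separate PDE coefficients from the time-stepping scheme.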
Supplementary Material: zip
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 12097