AI-Hamilton: Leveraging In-Context Learning for Modeling Hamiltonian Systems

ICLR 2026 Conference Submission 16464 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Structure preserving dynamics, Hamiltonian systems, In-context learning, neural operators
TL;DR: Leveraging symplectic structure to enhance in-context learning for forecasting Hamiltonian system dynamics.
Abstract: We present a novel approach to learning Hamiltonian systems from observational data, combining the strengths of in-context learning (ICL) and hypernetworks with the rigorous guarantees of structure-preserving numerical methods. ICL, a unique and powerful capability exhibited by large language models (LLMs), enables pre-trained LLMs to adapt their predictions based on auxiliary information known as "context". While a few studies have explored applying ICL to neural operator learning, most of these approaches treat operators as "black-boxes," offering no guarantees of physical consistency in their predictions. To address this limitation, we propose ICL-based neural operators explicitly designed to preserve the symplectic structure inherent to Hamiltonian dynamical systems. Through extensive experiments on a range of Hamiltonian systems, we demonstrate the proposed model's ability to maintain structural fidelity while achieving improved prediction accuracy compared to black-box ICL-based operators.
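The core idea of structure preservation can be illustrated outside the paper's setting. The sketch below (illustrative only, not the authors' model) contrasts a symplectic integrator with a non-symplectic one on a harmonic oscillator with Hamiltonian H(q, p) = p²/2 + q²/2: the symplectic scheme keeps the energy error bounded over long horizons, while plain explicit Euler drifts. The function names and step counts here are assumptions chosen for the demonstration.

```python
# Illustrative sketch (not the paper's method): why symplectic structure matters.
# Harmonic oscillator H(q, p) = p^2/2 + q^2/2, so dq/dt = p and dp/dt = -q.

def symplectic_euler(q, p, dt, steps):
    """Semi-implicit (symplectic) Euler: update p first, then q with the new p.
    This preserves the symplectic form, so the energy error stays bounded."""
    for _ in range(steps):
        p = p - dt * q      # dp/dt = -dH/dq = -q
        q = q + dt * p      # dq/dt =  dH/dp =  p  (uses the updated p)
    return q, p

def explicit_euler(q, p, dt, steps):
    """Standard explicit Euler: not symplectic, so energy drifts over time."""
    for _ in range(steps):
        q, p = q + dt * p, p - dt * q
    return q, p

def energy(q, p):
    """Total energy of the harmonic oscillator."""
    return 0.5 * (q**2 + p**2)

q0, p0 = 1.0, 0.0
e0 = energy(q0, p0)
qs, ps = symplectic_euler(q0, p0, 0.01, 10000)
qe, pe = explicit_euler(q0, p0, 0.01, 10000)
print("symplectic energy error:", abs(energy(qs, ps) - e0))  # stays small
print("explicit   energy error:", abs(energy(qe, pe) - e0))  # grows with time
```

A neural operator with this kind of built-in guarantee is what distinguishes the proposed approach from "black-box" ICL-based operators, which may produce trajectories that violate energy conservation over long rollouts.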
Supplementary Material: pdf
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 16464