Poisson-Algebraic Parallel Scan: A fast symplectic framework for neural Hamiltonians

15 Sept 2025 (modified: 22 Nov 2025), ICLR 2026 Conference Withdrawn Submission, CC BY 4.0
Keywords: Lie-Poisson algebra, Hamiltonian geometry, Lie group, AI4Science
Abstract: Learning Hamiltonian neural networks (HNNs) that respect the intrinsic symplectic structure of physical systems has emerged as a foundational framework for robust long-term prediction in scientific machine learning. Nevertheless, existing HNN methods face two critical limitations: (i) inherently sequential integration prevents parallel computation, causing significant computational bottlenecks, and (ii) unconstrained neural architectures become unstable when extrapolating dynamics beyond the training regime. To address these challenges, we introduce $\textit{Poisson-Algebraic Parallel Scan (PAPS)}$, a novel framework that leverages a carefully constructed Poisson-algebraic decomposition of the learned Hamiltonian. By embedding polynomial generators explicitly closed under Poisson brackets, PAPS induces an associative Lie-group structure that naturally admits parallel-scan (prefix-sum) computation. Our method achieves exact symplectic integration with up to three orders of magnitude ($1000\times$) speedup at $10^3$ integration steps, significantly outperforming existing HNN approaches. Moreover, the structured algebraic representation inherent in PAPS ensures intrinsic physical consistency, delivering stable and reliable extrapolation far beyond the training distribution. Extensive theoretical analyses and rigorous numerical experiments validate the superior computational scalability of our approach, highlighting PAPS as a powerful new direction for scalable and physically consistent neural Hamiltonian modeling.
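The abstract's central computational idea, composing symplectic update maps under an associative operation so that all prefixes can be computed with a parallel scan, can be illustrated on the simplest case: a linear (harmonic-oscillator) Hamiltonian, where each integrator step is a $2\times 2$ symplectic matrix and composition is matrix multiplication. The sketch below is illustrative only and is not the paper's PAPS construction; the function names `leapfrog_step` and `inclusive_scan` are ours, and the Hillis-Steele scan stands in for whatever scan primitive PAPS actually uses.

```python
import numpy as np

def leapfrog_step(dt, omega=1.0):
    # One leapfrog (Stormer-Verlet) step for H = (p^2 + omega^2 q^2) / 2,
    # written as a 2x2 matrix acting on the state (q, p).
    # Each factor has determinant 1, so the step is exactly symplectic.
    half_kick = np.array([[1.0, 0.0], [-0.5 * omega**2 * dt, 1.0]])
    drift = np.array([[1.0, dt], [0.0, 1.0]])
    return half_kick @ drift @ half_kick

def inclusive_scan(mats):
    # Hillis-Steele inclusive scan under matrix multiplication.
    # Because composition of symplectic maps is associative, the loop body
    # at each round consists of independent products, giving O(log n)
    # parallel depth on suitable hardware; here it runs sequentially.
    # pref[i] ends up equal to mats[i] @ mats[i-1] @ ... @ mats[0],
    # i.e. the flow map after i+1 integration steps.
    pref = [m.copy() for m in mats]
    n, d = len(pref), 1
    while d < n:
        nxt = [p.copy() for p in pref]
        for i in range(d, n):
            nxt[i] = pref[i] @ pref[i - d]
        pref, d = nxt, 2 * d
    return pref
```

For nonlinear Hamiltonians the maps are no longer matrices, which is why PAPS restricts to polynomial generators closed under Poisson brackets: that closure is what supplies an associative (Lie-group) composition rule the scan can exploit.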
Primary Area: other topics in machine learning (i.e., none of the above)
Submission Number: 5437