Prediction-Consistent Koopman Autoencoders

22 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Scientific machine learning, nonlinear dynamics, time-series forecasting, long-term predictions, Koopman operator, Koopman autoencoders, prediction consistency, consistency regularization, limited training data, noisy training data
TL;DR: We propose the prediction-consistent Koopman autoencoder (pcKAE) for accurate long-term forecasting of nonlinear dynamical systems from limited and noisy training data.
Abstract: Data-driven modeling of high-dimensional spatio-temporal dynamical systems, which are often governed by nonlinear partial differential equations (PDEs), poses a serious challenge in the absence of sufficient or high-quality training data. Recently developed Koopman autoencoders (KAEs) leverage the expressivity of deep neural networks (DNNs) and the spectral structure of the Koopman operator to learn a reduced-order feature space that exhibits simpler, linear dynamics. However, limited and noisy training datasets remain a significant roadblock, resulting in a lack of generalizability due to inconsistency in the training data. In this paper, we propose the prediction-consistent Koopman autoencoder (pcKAE), which is capable of generating accurate long-term predictions even with limited and noisy training data. We introduce a consistency regularization term that enforces consistency among predictions made at different time-steps, making pcKAE more robust and generalizable than its counterparts. We present an analytical justification for this consistency regularization using Koopman spectral theory. Experimentally, we demonstrate that with limited training data, pcKAE outperforms existing state-of-the-art KAE models on several test cases, ranging from a simple pendulum to kinetic plasmas, fluid flows, and sea surface temperature data.
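To make the idea concrete, below is a minimal sketch of how such a prediction-consistency regularizer could look for a Koopman autoencoder. It is not the authors' implementation: the encoder `phi`, decoder `psi`, Koopman matrix `K`, and the names `KoopmanAE`, `advance`, `prediction_consistency_loss`, and `horizon` are all illustrative assumptions. The sketch penalizes disagreement between predictions of the same future latent state reached from different starting time-steps.

```python
# Minimal sketch (illustrative, not the paper's code) of a
# prediction-consistency regularizer for a Koopman autoencoder.
import torch
import torch.nn as nn


class KoopmanAE(nn.Module):
    def __init__(self, state_dim: int, latent_dim: int):
        super().__init__()
        self.phi = nn.Sequential(nn.Linear(state_dim, 64), nn.Tanh(),
                                 nn.Linear(64, latent_dim))       # encoder
        self.psi = nn.Sequential(nn.Linear(latent_dim, 64), nn.Tanh(),
                                 nn.Linear(64, state_dim))        # decoder
        self.K = nn.Linear(latent_dim, latent_dim, bias=False)    # linear Koopman operator

    def advance(self, z: torch.Tensor, steps: int) -> torch.Tensor:
        # Apply the Koopman operator `steps` times: z_{t+steps} = K^steps z_t.
        for _ in range(steps):
            z = self.K(z)
        return z


def prediction_consistency_loss(model: KoopmanAE, traj: torch.Tensor,
                                horizon: int) -> torch.Tensor:
    # traj: (T, state_dim) snapshots of one trajectory.
    # For a target index t + horizon, the latent prediction K^horizon phi(x_t)
    # should agree with the shorter-horizon prediction K^{horizon-j} phi(x_{t+j})
    # for every intermediate offset 0 < j < horizon.
    z = model.phi(traj)                              # encode all snapshots
    loss, count = traj.new_zeros(()), 0
    T = traj.shape[0]
    for t in range(T - horizon):
        target = model.advance(z[t], horizon)        # long-horizon prediction
        for j in range(1, horizon):
            alt = model.advance(z[t + j], horizon - j)  # shorter-horizon route
            loss = loss + (target - alt).pow(2).mean()
            count += 1
    return loss / max(count, 1)
```

In a training loop, a term like this would typically be added, with a tunable weight, to the usual KAE reconstruction and prediction losses; the weight and the choice of comparing in latent versus state space are further assumptions not specified by the abstract.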
Supplementary Material: pdf
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 4553