A Variational Condition for Minimal-Residual Latent Representations

01 Mar 2023 (modified: 25 May 2023). Submitted to Tiny Papers @ ICLR 2023.
Keywords: Variational Principles, Partial Differential Equations, Autoencoders, Physics-Informed Machine Learning
TL;DR: We derive a variational condition for obtaining reconstructed data that better satisfy physics equations using autoencoders.
Abstract: Autoencoders are an unsupervised-learning architecture that can be used to build surrogate models of systems governed by partial differential equations, enabling a more cost-effective route to studying complex phenomena across science and engineering. In this article, we address two key questions underpinning this procedure: whether the reconstructed output satisfies the partial differential equation, and whether other latent vectors, not corresponding to the encoding of any training data, satisfy the same equation. Our results spell out some relevant conditions and clarify the distinct impacts of three main design decisions (architecture, training criterion, and choice of training solutions) on the final result.
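To make the setup described in the abstract concrete, the following is a minimal sketch, not the paper's actual formulation: all names, the toy PDE, and the penalty weight are our own assumptions. It trains a small PyTorch autoencoder on discretized solutions of the one-dimensional equation u'' = 0, optionally penalizes the finite-difference PDE residual of the decoder output, and then reports that residual both for reconstructions of training data and for decodings of latent vectors that were never obtained by encoding any data, mirroring the two questions raised above.

```python
# Minimal sketch (assumptions: toy PDE u'' = 0, PyTorch, finite-difference residual).
# Illustrates the two questions from the abstract:
# (1) does the reconstructed output satisfy the PDE?
# (2) do decoded latent vectors that encode no training datum satisfy it?
import torch
import torch.nn as nn

torch.manual_seed(0)
n_x = 64
x = torch.linspace(0.0, 1.0, n_x)
dx = x[1] - x[0]

# Training data: discretized solutions of u'' = 0, i.e. u(x) = a*x + b.
a = torch.rand(512, 1) * 2 - 1
b = torch.rand(512, 1) * 2 - 1
U = a * x + b                              # (512, n_x) snapshot matrix

def pde_residual(u):
    """Finite-difference residual of u'' = 0 on the interior grid points."""
    return (u[:, 2:] - 2 * u[:, 1:-1] + u[:, :-2]) / dx**2

class AE(nn.Module):
    def __init__(self, n_x, n_latent=2):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(n_x, 32), nn.Tanh(), nn.Linear(32, n_latent))
        self.dec = nn.Sequential(nn.Linear(n_latent, 32), nn.Tanh(), nn.Linear(32, n_x))
    def forward(self, u):
        return self.dec(self.enc(u))

model = AE(n_x)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
lam = 1e-6                                 # hypothetical weight of the residual penalty

for step in range(2000):
    opt.zero_grad()
    U_hat = model(U)
    loss = ((U_hat - U) ** 2).mean() + lam * (pde_residual(U_hat) ** 2).mean()
    loss.backward()
    opt.step()

# Diagnostics: residual of reconstructions vs. residual of decoded unseen latents.
with torch.no_grad():
    rec_res = (pde_residual(model(U)) ** 2).mean()
    z_new = torch.randn(512, 2)            # latent vectors not produced by the encoder
    new_res = (pde_residual(model.dec(z_new)) ** 2).mean()
print(f"reconstruction residual: {rec_res:.3e}  decoded-latent residual: {new_res:.3e}")
```

Whether the second diagnostic stays small depends on the design choices the paper analyzes (architecture, training criterion, and choice of training solutions); the sketch only shows where those choices enter the pipeline.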