Abstract: We present a new approach for incorporating hard physical constraints into reservoir computing (RC). The goal of this work is to increase the reliability, trustworthiness, and generalizability of RC by guaranteeing adherence to known physical laws, particularly when simulating high-dimensional systems such as spatiotemporal fluid flows. A reservoir is commonly implemented as a single-layer recurrent neural network in which only the linear output layer is trained and all other parameters are randomly initialized and fixed. Training a reservoir therefore reduces to solving a least-squares problem for the weights of the final layer, which presents an excellent opportunity to enforce hard constraints analytically. We show that physical constraints, such as conservation laws and boundary conditions, can be imposed in the training procedure and guaranteed to hold during forecasting. We introduce physics-enforced reservoir computing (PERC), in which the RC training procedure is augmented with a linear homogeneous constraint defined by a linear operator. We then demonstrate this method by enforcing conserved quantities in the Kuramoto-Sivashinsky system and zero-divergence constraints (mass conservation) in the Kolmogorov flow. In both cases, the constraints are satisfied to near machine precision. We provide our code online here: https://github.com/dtretiak/PhysicsEnforcedReservoirComputing/tree/main.
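To illustrate the core idea described in the abstract, the following is a minimal sketch (not the paper's implementation) of how a linear homogeneous constraint can be imposed exactly on the least-squares output layer of a reservoir. All variable names and dimensions are hypothetical; the sketch assumes the constraint takes the form G @ W_out = 0 for a known linear operator G, in which case projecting the ridge-regression solution onto the null space of G yields the constrained minimizer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions: reservoir states H (n_res x T), targets Y (n_out x T)
n_res, n_out, T = 50, 4, 200
H = rng.standard_normal((n_res, T))
Y = rng.standard_normal((n_out, T))

# Assumed linear homogeneous constraint: G @ W_out @ h = 0 for every reservoir
# state h, which holds iff G @ W_out = 0. G could encode, e.g., a conserved
# quantity or a discrete divergence operator.
G = rng.standard_normal((1, n_out))

# Unconstrained ridge-regression solution for the output weights
lam = 1e-6
W_unc = Y @ H.T @ np.linalg.inv(H @ H.T + lam * np.eye(n_res))

# Orthogonal projector onto null(G): P = I - G^T (G G^T)^{-1} G.
# Because the constraint acts on W_out from the left, P @ W_unc is also the
# exact minimizer of the ridge objective subject to G @ W_out = 0.
P = np.eye(n_out) - G.T @ np.linalg.inv(G @ G.T) @ G
W_perc = P @ W_unc

# The constraint now holds to near machine precision for any reservoir state
residual = np.abs(G @ W_perc @ H).max()
print(residual)
```

The key point is that the constraint is satisfied by construction, for arbitrary inputs at forecast time, rather than being encouraged through a soft penalty term.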