Structure preserving neural networks based on ODEs

Published: 21 Oct 2022, Last Modified: 05 May 2023, DLDE 2022 Poster
Keywords: Constrained neural networks, Structure preserving deep learning
TL;DR: This manuscript describes a general and systematic way to impose desired mathematical structures on neural networks, thanks to ODE models.
Abstract: Neural networks have gained much interest because of their effectiveness in many applications. However, their mathematical properties are generally not well understood. In the presence of some underlying geometric structure in the data or in the function to approximate, it is often desirable to account for this structure in the design of the neural network. In this work, we start with a non-autonomous ODE and build neural networks using a suitable, structure-preserving, numerical time-discretisation. The structure of the neural network is then inferred from the properties of the ODE vector field. To demonstrate the flexibility of the approach, we go through the derivation of volume-preserving, mass-preserving and Lipschitz-constrained neural networks. Finally, a mass-preserving network is applied to the problem of approximating the dynamics of a conservative dynamical system, and a Lipschitz-constrained network is shown to provide improved adversarial robustness for a CIFAR-10 classifier.
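The abstract only summarises the construction, so the following is a minimal, illustrative sketch (not the authors' architecture) of how a structure-preserving layer can be obtained by discretising a split, divergence-free ODE. It assumes PyTorch; the names `VolumePreservingBlock`, `g1`, `g2`, `step_size` and `hidden` are hypothetical and chosen for illustration only.

```python
import torch
import torch.nn as nn

class VolumePreservingBlock(nn.Module):
    """One time step of a split non-autonomous ODE
        x' = g1(y),   y' = g2(x),
    integrated with the splitting scheme
        x_{k+1} = x_k + h * g1(y_k),   y_{k+1} = y_k + h * g2(x_{k+1}).
    Each partial update is a shear whose Jacobian has unit determinant,
    so the composed map is volume preserving by construction.
    """
    def __init__(self, dim, hidden, step_size=0.1):
        super().__init__()
        assert dim % 2 == 0, "state is split into two halves"
        half = dim // 2
        self.h = step_size
        self.g1 = nn.Sequential(nn.Linear(half, hidden), nn.Tanh(), nn.Linear(hidden, half))
        self.g2 = nn.Sequential(nn.Linear(half, hidden), nn.Tanh(), nn.Linear(hidden, half))

    def forward(self, z):
        x, y = z.chunk(2, dim=-1)
        x = x + self.h * self.g1(y)   # shear acting on the x-components
        y = y + self.h * self.g2(x)   # shear acting on the y-components
        return torch.cat([x, y], dim=-1)

# Stacking such blocks yields a volume-preserving network: each layer is a
# composition of unit-determinant shears, so det(Jacobian) = 1 overall.
net = nn.Sequential(*[VolumePreservingBlock(dim=4, hidden=32) for _ in range(3)])
out = net(torch.randn(8, 4))  # same shape; the map preserves volume
```

The same template carries the other structures mentioned in the abstract: choosing a different vector field (e.g. a gradient field with bounded spectral norm) and a compatible discretisation yields, for instance, a Lipschitz-constrained layer instead of a volume-preserving one.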