Expressive Power of Randomized Signature

Published: 17 Oct 2021, Last Modified: 05 May 2023
DLDE Workshop -- NeurIPS 2021 Poster
Abstract: We consider the question of whether the time evolution of controlled differential equations on general state spaces can be approximated arbitrarily well by (regularized) regressions on features that are themselves generated by randomly chosen dynamical systems of moderately high dimension. This is motivated on the one hand by paradigms of reservoir computing, and on the other hand by ideas from rough path theory and compressed sensing. Appropriately interpreted, this yields provable approximation and generalization results for generic dynamical systems, which are usually approximated by recurrent or LSTM networks, via regressions on the states of random, otherwise untrained dynamical systems. The results have important implications for transfer learning and for the energy efficiency of training. We apply methods from rough path theory, convenient analysis, non-commutative algebra and the Johnson-Lindenstrauss lemma to prove the approximation results.
Publication Status: This work is unpublished.
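The reservoir-computing idea summarized in the abstract, features produced by a fixed random dynamical system driven by the control path, followed by a regularized linear readout, can be illustrated with a minimal sketch. Everything below is an assumption for illustration (the dimensions, the `tanh` activation, the Euler discretization, and the ridge penalty `lam`), not the authors' exact randomized-signature construction.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 2      # dimension of the control path u
k = 64     # dimension of the random reservoir state ("moderately high")
T = 200    # number of time steps
dt = 1.0 / T

# Fixed random vector fields A_i, b_i defining the untrained reservoir
# dynamics dR_t = sum_i tanh(A_i R_t + b_i) du^i_t (Euler scheme below).
A = rng.normal(scale=1.0 / np.sqrt(k), size=(d, k, k))
b = rng.normal(size=(d, k))

def reservoir_states(du):
    """Evolve the random dynamical system along path increments du of shape (T, d)."""
    R = np.ones(k)
    states = []
    for t in range(du.shape[0]):
        R = R + sum(np.tanh(A[i] @ R + b[i]) * du[t, i] for i in range(d))
        states.append(R.copy())
    return np.array(states)

# Example control path u_t = (t, sin(2*pi*t)) and a target path functional.
t_grid = np.linspace(0.0, 1.0, T + 1)
u = np.stack([t_grid, np.sin(2 * np.pi * t_grid)], axis=1)
du = np.diff(u, axis=0)

X = reservoir_states(du)          # (T, k) random features; never trained
y = np.cumsum(u[1:, 1] * dt)      # target: running integral of the sin component

# Regularized (ridge) regression readout -- the only trained component.
lam = 1e-6
W = np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ y)
y_hat = X @ W
```

Only the linear readout `W` is fitted; the reservoir itself stays random, which is the point of the transfer-learning and training-efficiency claims above.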