Framing RNN as a kernel method: A neural ODE approach

Published: 09 Nov 2021, Last Modified: 22 Oct 2023
NeurIPS 2021 Oral
Keywords: recurrent neural networks, neural ODE, kernel method, theory of deep learning, generalization bounds
Abstract: Building on the interpretation of a recurrent neural network (RNN) as a continuous-time neural differential equation, we show, under appropriate conditions, that the solution of an RNN can be viewed as a linear function of a specific feature set of the input sequence, known as the signature. This connection allows us to frame an RNN as a kernel method in a suitable reproducing kernel Hilbert space. As a consequence, we obtain theoretical guarantees on generalization and stability for a large class of recurrent networks. Our results are illustrated on simulated datasets.
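To give a feel for the "linear functional of the signature" view described in the abstract, here is a minimal, self-contained sketch: it computes a depth-2 truncated signature of each input path with NumPy (via Chen's identity, segment by segment) and fits a ridge regression on those features. This is not the authors' implementation (see the linked repository for that); the synthetic random-walk data, the Lévy-area target, and the depth-2 truncation are illustrative assumptions. Since the Lévy area is exactly a linear functional of the level-2 signature, the linear model should fit it almost perfectly, which is the point of the example.

```python
# Sketch of the signature-as-features view from the paper's abstract.
# Not the authors' code (see https://github.com/afermanian/rnn-kernel);
# the synthetic data, target, and depth-2 truncation are assumptions.
import numpy as np
from sklearn.linear_model import Ridge

def signature_depth2(path):
    """Depth-2 truncated signature of a piecewise-linear path.

    path: array of shape (length, d). Returns a flat vector of size d + d*d.
    """
    increments = np.diff(path, axis=0)   # segment increments, shape (length-1, d)
    d = path.shape[1]
    s1 = np.zeros(d)                     # level-1 term (total increment so far)
    s2 = np.zeros((d, d))                # level-2 term (iterated integrals)
    for delta in increments:             # Chen's identity, one linear segment at a time
        s2 += np.outer(s1, delta) + 0.5 * np.outer(delta, delta)
        s1 += delta
    return np.concatenate([s1, s2.ravel()])

# Toy dataset: 2-d random walks; target is the Lévy area, which is a
# linear functional of the level-2 signature, so ridge can fit it exactly.
rng = np.random.default_rng(0)
paths = rng.normal(size=(200, 50, 2)).cumsum(axis=1)
p = paths - paths[:, :1, :]              # start each path at the origin
dp = np.diff(p, axis=1)
y = 0.5 * np.sum(p[:, :-1, 0] * dp[:, :, 1] - p[:, :-1, 1] * dp[:, :, 0], axis=1)

features = np.stack([signature_depth2(path) for path in paths])
model = Ridge(alpha=1.0).fit(features[:150], y[:150])
print("held-out R^2:", model.score(features[150:], y[150:]))
```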
Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
TL;DR: Via a neural ODE approach, we frame an RNN as a kernel method and derive theoretical guarantees on generalization and stability.
Supplementary Material: pdf
Code: https://github.com/afermanian/rnn-kernel
Community Implementations: 5 code implementations (https://www.catalyzex.com/paper/arxiv:2106.01202/code)