Framing RNN as a kernel method: A neural ODE approach

21 May 2021, 20:45 (edited 14 Jan 2022), NeurIPS 2021 Oral
  • Keywords: recurrent neural networks, neural ODE, kernel method, theory of deep learning, generalization bounds
  • TL;DR: Via a neural ODE approach, we frame RNN as a kernel method and derive theoretical guarantees on generalization and stability.
  • Abstract: Building on the interpretation of a recurrent neural network (RNN) as a continuous-time neural differential equation, we show, under appropriate conditions, that the solution of an RNN can be viewed as a linear function of a specific feature set of the input sequence, known as the signature. This connection allows us to frame an RNN as a kernel method in a suitable reproducing kernel Hilbert space. As a consequence, we obtain theoretical guarantees on generalization and stability for a large class of recurrent networks. Our results are illustrated on simulated datasets.
  • Supplementary Material: pdf
  • Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
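The signature feature set mentioned in the abstract is the sequence of iterated integrals of the input path. As a hedged illustration (not the authors' code), the following sketch computes the depth-2 truncated signature of a piecewise-linear path via Chen's identity; in the paper's framing, an RNN's solution is, under suitable conditions, a linear readout of features of this kind. The function name and the depth-2 truncation are choices made here for illustration.

```python
# Hypothetical sketch: depth-2 truncated signature of a piecewise-linear
# d-dimensional path. Level 1 collects total increments S^(i); level 2
# collects the iterated integrals S^(i,j).
def signature_depth2(path):
    """path: list of d-dimensional points (tuples/lists of floats).

    Returns (level1, level2), where level1[i] = S^(i) and
    level2[i][j] = S^(i,j), computed exactly for the piecewise-linear
    interpolation of the points.
    """
    d = len(path[0])
    level1 = [0.0] * d
    level2 = [[0.0] * d for _ in range(d)]
    for t in range(1, len(path)):
        dx = [path[t][k] - path[t - 1][k] for k in range(d)]
        # Chen's identity for appending one linear segment:
        # S2 <- S2 + S1 (tensor) dx + (1/2) dx (tensor) dx
        for i in range(d):
            for j in range(d):
                level2[i][j] += level1[i] * dx[j] + 0.5 * dx[i] * dx[j]
        for i in range(d):
            level1[i] += dx[i]
    return level1, level2


# Example: an L-shaped path in the plane; the antisymmetric part of
# level 2 recovers the Levy area of the path.
l1, l2 = signature_depth2([(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)])
# l1 == [1.0, 1.0]; l2 == [[0.5, 1.0], [0.0, 0.5]]
```

In the kernel view described in the abstract, such signature coordinates play the role of the feature map, and the learned network corresponds to a linear functional on them.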