Stateful ODE-Nets using Basis Function Expansions

May 21, 2021 (edited Nov 04, 2021) · NeurIPS 2021 Poster
  • Keywords: Neural ODEs, Dynamical Systems, Differential Equations
  • TL;DR: We take advantage of basis transformations to introduce stateful normalization layers as well as a methodology for compressing ODE-Nets.
  • Abstract: The recently introduced class of ordinary differential equation networks (ODE-Nets) establishes a fruitful connection between deep learning and dynamical systems. In this work, we reconsider formulations of the weights as continuous-in-depth functions using linear combinations of basis functions, which enables us to leverage parameter transformations such as function projections. In turn, this view allows us to formulate a novel stateful ODE-Block that handles stateful layers. The benefits of this new ODE-Block are twofold: first, it enables incorporating meaningful continuous-in-depth batch normalization layers to achieve state-of-the-art performance; second, it enables compressing the weights through a change of basis, without retraining, while maintaining near state-of-the-art performance and reducing both inference time and memory footprint. Performance is demonstrated by applying our stateful ODE-Block to (a) image classification tasks using convolutional units and (b) sentence-tagging tasks using transformer encoder units.
  • Supplementary Material: pdf
  • Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
  • Code: https://github.com/afqueiruga/StatefulOdeNets
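The core idea from the abstract — parameterizing a weight as a continuous-in-depth linear combination of basis functions and compressing it by a change of basis without retraining — can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the Legendre basis, the basis sizes, and the least-squares projection onto a depth grid are all assumptions made for the example.

```python
# Hypothetical sketch: a weight is a continuous function of depth t,
#   theta(t) = sum_i c_i * phi_i(t),
# over a fixed basis {phi_i}. Compression is a projection of theta onto
# a smaller basis (a change of basis), with no retraining involved.
# Basis choice (Legendre) and sizes are illustrative assumptions.
import numpy as np

def basis(t, n):
    """Evaluate the first n Legendre polynomials at depths t in [-1, 1]."""
    return np.polynomial.legendre.legvander(t, n - 1)  # shape (len(t), n)

rng = np.random.default_rng(0)
n_big, n_small = 8, 4
coeffs = rng.standard_normal(n_big)  # stands in for trained coefficients c_i

# Sample the weight function on a depth grid, then least-squares project
# it onto the smaller basis to obtain the compressed coefficients.
t = np.linspace(-1.0, 1.0, 64)
theta = basis(t, n_big) @ coeffs
small_coeffs, *_ = np.linalg.lstsq(basis(t, n_small), theta, rcond=None)

# The compressed weight function uses half the coefficients; the gap
# between theta and its projection is the compression error.
theta_compressed = basis(t, n_small) @ small_coeffs
err = np.max(np.abs(theta - theta_compressed))
```

Because the projection operates on the coefficient representation alone, the same network can be evaluated at several compression levels after a single training run, trading accuracy for inference time and memory.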