Taylor-Mode Automatic Differentiation for Higher-Order Derivatives in JAX

22 Sep 2019 · NeurIPS 2019 Workshop Program Transformations Submission
  • Keywords: higher-order differentiation, Taylor derivatives, automatic differentiation, neural ordinary differential equations
  • TL;DR: Generalizes forward-mode AD to higher-order derivatives, implemented in JAX, with applications to Neural ODEs.
  • Abstract: One way to achieve higher-order automatic differentiation (AD) is to implement first-order AD and apply it repeatedly. This nested approach works, but can result in combinatorial amounts of redundant work. This paper describes a more efficient method, already known but with a new presentation, and its implementation in JAX. We also study its application to neural ordinary differential equations, and in particular discuss some additional algorithmic improvements for higher-order AD of differential equations.
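JAX exposes Taylor-mode AD as `jax.experimental.jet`. A minimal sketch of the idea from the abstract, assuming the documented `jet` convention in which the input and output series hold derivative coefficients (so a single call propagates all truncated Taylor coefficients at once, instead of nesting first-order AD and redoing work):

```python
import jax
from jax.experimental.jet import jet

# f(x) = x**3, whose derivatives at x = 2 are f'(2)=12, f''(2)=12, f'''(2)=6.
f = lambda x: x ** 3

# Input path x(t) = 2 + t, encoded by the derivatives (1, 0, 0) of x at t = 0.
# One jet call propagates all three Taylor coefficients through f at once.
primal_out, coeffs = jet(f, (2.0,), [(1.0, 0.0, 0.0)])
print(primal_out)  # f(2)  = 8.0
print(coeffs)      # [f'(2), f''(2), f'''(2)] = [12.0, 12.0, 6.0]

# The nested alternative recovers only the third derivative and repeats
# lower-order work inside each grad:
third = jax.grad(jax.grad(jax.grad(f)))(2.0)
print(third)       # 6.0
```

The nested call above is the baseline the abstract calls combinatorially redundant; `jet` is the more efficient single-pass method the paper presents.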