Taylor-Mode Automatic Differentiation for Higher-Order Derivatives in JAX

Sep 22, 2019 Submission
  • Keywords: higher-order differentiation, Taylor derivatives, automatic differentiation, neural ordinary differential equations
  • TL;DR: Generalizes forward-mode AD to higher-order derivatives, implemented in JAX, with applications to Neural ODEs.
  • Abstract: One way to achieve higher-order automatic differentiation (AD) is to implement first-order AD and apply it repeatedly. This nested approach works, but can result in combinatorial amounts of redundant work. This paper describes a more efficient method, already known but with a new presentation, and its implementation in JAX. We also study its application to neural ordinary differential equations, and in particular discuss some additional algorithmic improvements for higher-order AD of differential equations.
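As a minimal sketch of the idea in the abstract: JAX exposes Taylor-mode AD as `jax.experimental.jet`, which propagates a truncated Taylor series through a function in one pass, rather than nesting first-order AD. Seeding the input series with the path t ↦ x0 + t (first coefficient 1, the rest 0) yields the higher derivatives of f at x0. The function `f` and point `x0` below are illustrative choices, not taken from the paper.

```python
import jax.numpy as jnp
from jax import grad
from jax.experimental.jet import jet

def f(x):
    # An arbitrary smooth scalar function for illustration.
    return jnp.sin(jnp.cos(x))

x0 = 0.7

# Taylor mode: seed with the path t -> x0 + t, i.e. input series (1, 0, 0).
# One call propagates the whole truncated series, so the output series holds
# f'(x0), f''(x0), f'''(x0) without nesting first-order AD.
primal_out, taylor_derivs = jet(f, (x0,), ((1.0, 0.0, 0.0),))

# The nested approach computes the same values, but redundant work grows
# combinatorially with the derivative order.
nested_derivs = [grad(f)(x0), grad(grad(f))(x0), grad(grad(grad(f)))(x0)]
```

Both approaches agree numerically; the point of Taylor mode is that it avoids the redundant recomputation that repeated application of first-order AD performs at higher orders.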