Taylor-Mode Automatic Differentiation for Higher-Order Derivatives in JAX

Published: 07 Oct 2019, Last Modified: 05 May 2023. Program Transformations @ NeurIPS 2019 (Oral).
Keywords: higher-order differentiation, Taylor derivatives, automatic differentiation, neural ordinary differential equations
TL;DR: Generalizes forward-mode AD to higher-order derivatives, implemented in JAX, with applications to Neural ODEs.
Abstract: One way to achieve higher-order automatic differentiation (AD) is to implement first-order AD and apply it repeatedly. This nested approach works, but can result in a combinatorial amount of redundant work. This paper describes a more efficient method, one that is already known but is given a new presentation here, along with its implementation in JAX. We also study its application to neural ordinary differential equations, and in particular discuss some additional algorithmic improvements for higher-order AD of differential equations.
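As a concrete illustration of the idea, JAX exposes Taylor-mode AD through `jax.experimental.jet`: rather than nesting first-order `grad` calls, `jet` propagates a truncated Taylor series through a function in a single pass. The sketch below (not from the paper itself; a minimal usage example assuming the `jet` API) recovers the first three derivatives of `sin` at a point by pushing the input series x(t) = x0 + t through the function:

```python
import jax.numpy as jnp
from jax.experimental.jet import jet

def f(x):
    return jnp.sin(x)

x0 = 0.1
# Propagate the input Taylor series x(t) = x0 + 1*t (higher input
# coefficients zero) through f in one Taylor-mode pass.  The output
# series coefficients are then the successive derivatives of f at x0:
#   y1 = f'(x0), y2 = f''(x0), y3 = f'''(x0)
y0, (y1, y2, y3) = jet(f, (x0,), ((1.0, 0.0, 0.0),))
# y0 = sin(x0), y1 = cos(x0), y2 = -sin(x0), y3 = -cos(x0)
```

Compared with nesting `jax.grad` three times, this computes all coefficients together and avoids recomputing shared lower-order terms, which is the redundancy the abstract refers to.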