A Short Review of Automatic Differentiation Pitfalls in Scientific Computing
Keywords: Automatic differentiation, pitfalls, review
TL;DR: We review situations in which automatic differentiation can produce unintuitive or wrong gradients.
Abstract: Automatic differentiation, also known as backpropagation, AD, autodiff, or algorithmic differentiation, is a popular technique for computing derivatives of computer programs. While AD has been successfully used in countless engineering, science, and machine learning applications, it can nevertheless sometimes produce surprising results. In this paper, we categorize problematic usages of AD and illustrate each category with examples involving chaos, time averages, discretizations, fixed-point loops, lookup tables, linear solvers, and probabilistic programs, in the hope that readers may more easily avoid or detect such pitfalls.
Submission Number: 22
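
One of the pitfall categories named in the abstract, lookup tables, can be sketched in a few lines. The snippet below is an illustrative example, not code from the paper: it uses a minimal hand-rolled forward-mode AD (dual numbers) to show how replacing a computation with a precomputed table silently yields a zero gradient, because the table index discards the dependence on the input.

```python
# Minimal forward-mode AD via dual numbers (a sketch for illustration,
# not any particular AD library), showing the lookup-table pitfall.

class Dual:
    """Carries a value and its derivative (tangent) together."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

# f(x) = x^2 evaluated two ways: directly, and through a lookup table
# of precomputed squares (a hypothetical example, not from the paper).
TABLE = [i * i for i in range(10)]  # TABLE[i] = i**2

def f_direct(x):
    return x * x

def f_table(x):
    # Integer indexing discards the derivative: AD sees a
    # piecewise-constant function, so the tangent is zero.
    i = int(x.val if isinstance(x, Dual) else x)
    return Dual(TABLE[i], 0.0)

x = Dual(3.0, 1.0)       # seed dx/dx = 1
print(f_direct(x).dot)   # 6.0: true derivative of x^2 at x = 3
print(f_table(x).dot)    # 0.0: the table hides the dependence on x
```

Both versions return the correct value 9.0 at x = 3, so the bug is invisible in the primal computation; only the gradient is wrong, which is what makes this class of pitfall easy to miss.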