Some highlights on Source-to-Source Adjoint AD

Laurent Hascoët

Oct 19, 2017 · NIPS 2017 Workshop Autodiff Submission
  • Abstract: Algorithmic Differentiation (AD) provides the analytic derivatives of functions given as programs. Adjoint AD, which computes gradients, is similar to Back Propagation in Machine Learning. AD researchers study strategies to overcome the difficulties of adjoint AD and bring it closer to its theoretical efficiency. To promote fruitful exchanges between Back Propagation and adjoint AD, we present three of these strategies and give our view of their interest and current status (a minimal sketch of the adjoint mode follows this list).
  • TL;DR: Survey of Source-Transformation AD classic techniques possibly useful to Machine-Learning Back Propagation
  • Keywords: Algorithmic Differentiation, Adjoint mode, Reverse mode, Data-Flow analysis, Source-to-Source transformation
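
To make the link between adjoint AD and Back Propagation concrete, here is a minimal tape-based sketch of the adjoint (reverse) mode in Python. This is an illustration under stated assumptions, not the paper's method: the submission concerns source-to-source transformation rather than taping, and all names here (Var, grad, sin) are hypothetical. The gradient computed is the same in either implementation style.

```python
# Minimal tape-based adjoint (reverse-mode) AD sketch.
# Illustrative only; the paper discusses source-to-source adjoint AD.
import math

class Var:
    """A scalar recorded with its parents so adjoints can be propagated."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # pairs (parent Var, local partial derivative)
        self.adjoint = 0.0

    def __mul__(self, other):
        return Var(self.value * other.value,
                   parents=((self, other.value), (other, self.value)))

    def __add__(self, other):
        return Var(self.value + other.value,
                   parents=((self, 1.0), (other, 1.0)))

def sin(x):
    return Var(math.sin(x.value), parents=((x, math.cos(x.value)),))

def grad(output):
    """Sweep the computation in reverse order, accumulating adjoints."""
    order, seen = [], set()
    def topo(v):
        if id(v) not in seen:
            seen.add(id(v))
            for p, _ in v.parents:
                topo(p)
            order.append(v)
    topo(output)
    output.adjoint = 1.0  # seed d(output)/d(output) = 1
    for v in reversed(order):
        for p, d in v.parents:
            p.adjoint += v.adjoint * d  # chain rule, reverse sweep

# Usage: gradient of f(x, y) = sin(x) * y at (x, y) = (0.5, 2.0)
x, y = Var(0.5), Var(2.0)
f = sin(x) * y
grad(f)
print(x.adjoint, y.adjoint)  # df/dx = 2*cos(0.5), df/dy = sin(0.5)
```

The reverse sweep visits operations in the opposite order of the original computation, which is exactly the structure that makes adjoint AD efficient for gradients (one sweep yields all partial derivatives) and also the source of its memory difficulties, since intermediate values must be preserved for the sweep.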
