Abstract: Algorithmic Differentiation (AD) provides the analytic derivatives of functions given as programs. Adjoint AD, which computes gradients, is essentially the same technique as Back Propagation in Machine Learning. AD researchers study strategies to overcome the difficulties of adjoint AD and to bring its cost closer to its theoretical efficiency. To promote fruitful exchanges between Back Propagation and adjoint AD, we present three of these strategies and give our view of their relevance and current status.
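To make the connection between adjoint AD and Back Propagation concrete, here is a minimal sketch of reverse-mode (adjoint) differentiation on scalar expressions; the `Var` class and its operator set are illustrative assumptions, not the paper's implementation:

```python
import math

class Var:
    """Scalar node in a computation graph, recording local partial derivatives."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # pairs (parent node, local gradient)
        self.grad = 0.0

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def sin(self):
        return Var(math.sin(self.value), [(self, math.cos(self.value))])

def backward(output):
    """Adjoint sweep: propagate derivatives in reverse topological order."""
    order, seen = [], set()
    def topo(v):
        if id(v) not in seen:
            seen.add(id(v))
            for p, _ in v.parents:
                topo(p)
            order.append(v)
    topo(output)
    output.grad = 1.0
    for v in reversed(order):
        for p, local in v.parents:
            p.grad += local * v.grad

# f(x, y) = x*y + sin(x); df/dx = y + cos(x), df/dy = x
x, y = Var(2.0), Var(3.0)
f = x * y + x.sin()
backward(f)
```

One forward evaluation records the graph; one reverse sweep then yields the full gradient, which is the source of adjoint AD's theoretical efficiency mentioned in the abstract.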
TL;DR: Survey of classic source-transformation AD techniques potentially useful for Machine-Learning Back Propagation
Keywords: Algorithmic Differentiation, Adjoint mode, Reverse mode, Data-Flow analysis, Source-to-Source transformation