Some highlights on Source-to-Source Adjoint AD
Oct 19, 2017 (modified: Oct 19, 2017) · NIPS 2017 Workshop Autodiff Submission
Abstract: Algorithmic Differentiation (AD) provides the analytic derivatives of functions given as programs. Adjoint AD, which computes gradients, is similar to Back Propagation in Machine Learning. AD researchers study strategies to overcome the difficulties of adjoint AD and get closer to its theoretical efficiency. To promote fruitful exchanges between Back Propagation and adjoint AD, we present three of these strategies and give our view of their interest and current status.
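To make the connection concrete, here is a minimal, hand-written sketch of what a source-to-source adjoint transformation produces (an illustrative example, not code from the paper): the generated adjoint code runs a forward sweep that evaluates the primal and stores intermediates, then a reverse sweep that propagates adjoints through each primal statement in reverse order. The function `f` and all variable names are hypothetical.

```python
import math

def f(x1, x2):
    # Primal function: f(x1, x2) = sin(x1)*x2 + x1**2
    return math.sin(x1) * x2 + x1 ** 2

def f_adjoint(x1, x2, y_b=1.0):
    """Adjoint code for f, as a source transformation would generate it.

    Returns the primal value y and the gradient (x1_b, x2_b),
    i.e. the adjoints of the inputs seeded with the output adjoint y_b.
    """
    # Forward sweep: evaluate the primal, storing intermediates.
    v1 = math.sin(x1)
    v2 = v1 * x2
    v3 = x1 * x1
    y = v2 + v3
    # Reverse sweep: one adjoint statement per primal statement,
    # in reverse order of the primal.
    v2_b = y_b                    # from y  = v2 + v3
    v3_b = y_b
    x1_b = 2.0 * x1 * v3_b        # from v3 = x1 * x1
    v1_b = x2 * v2_b              # from v2 = v1 * x2
    x2_b = v1 * v2_b
    x1_b += math.cos(x1) * v1_b   # from v1 = sin(x1)
    return y, (x1_b, x2_b)
```

One reverse sweep yields the full gradient at roughly the cost of a few primal evaluations, which is the theoretical efficiency of adjoint AD (and of Back Propagation); the strategies surveyed in the paper address the cost of storing or recomputing the intermediates needed by the reverse sweep.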
TL;DR: Survey of classic source-transformation AD techniques possibly useful for Machine-Learning Back Propagation