Comparison of two gradient computation methods in Python

Sri Hari Krishna Narayanan, Paul Hovland, Kshitij Kulshreshtha, Devashri Nagarkar, Kaitlyn MacIntyre, Riley Wagner, Deqing Fu

Oct 28, 2017 · NIPS 2017 Workshop Autodiff Submission
  • Abstract: Gradient-based optimization and machine learning applications require the computation of derivatives. For example, artificial neural networks (ANNs), a widely used class of learning systems, use backpropagation to calculate the error contribution of each neuron after a batch of data is processed. Languages such as Python and R are popular for machine learning, so in recent years there has been growing interest in building tools that compute derivatives in these languages.
  • TL;DR: Compares the performance and features of two automatic differentiation tools for Python, autograd and ADOL-C (a minimal usage sketch of the two tools appears after this list).
  • Keywords: Automatic Differentiation, Python, autograd, adolc, performance comparison
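
To illustrate what the comparison exercises, the sketch below computes the gradient of a small scalar test function with autograd's grad transform; the test function and tape number are illustrative choices, not taken from the paper. The corresponding ADOL-C taping steps are indicated in comments via the pyadolc bindings, whose exact entry-point names are an assumption and may differ by version.

    import autograd.numpy as np   # thinly wrapped NumPy that records operations
    from autograd import grad

    def f(x):
        # Illustrative scalar test function; not taken from the paper.
        return np.sum(np.sin(x) ** 2)

    grad_f = grad(f)                  # returns a function computing df/dx
    x = np.array([1.0, 2.0, 3.0])
    print(grad_f(x))                  # gradient of f evaluated at x

    # ADOL-C instead records a tape once and reuses it for repeated
    # gradient evaluations. A hedged sketch, assuming the usual pyadolc
    # entry points:
    #   import adolc
    #   adolc.trace_on(0)             # start recording on tape 0
    #   ax = adolc.adouble(x)         # active (traced) variables
    #   adolc.independent(ax)
    #   ay = f(ax)
    #   adolc.dependent(ay)
    #   adolc.trace_off()
    #   g = adolc.gradient(0, x)      # evaluate the gradient from the tape

The design difference this highlights is that autograd retraces the computation on every call, while ADOL-C pays a one-time taping cost and then replays the tape, which is one of the trade-offs a performance comparison of the two tools would measure.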
