- Abstract: Automatic differentiation is an increasingly important component of machine learning packages. For evaluating the gradient, the first-order reverse mode, also known as back-propagation, is optimal and widely used. However, support for evaluating second- and higher-order derivatives is limited. One reason is that higher-order derivatives have traditionally been evaluated by composing the first-order forward and reverse modes. Here we describe an algorithm that directly implements the higher-order reverse mode. It is more efficient than previous methods, especially when evaluating sparse derivatives.