Reproducibility and Ablation Study of “Augmented Neural ODEs”

29 Dec 2019 (modified: 05 May 2023) · NeurIPS 2019 Reproducibility Challenge Blind Report
Abstract: The recent publication "Augmented Neural ODEs" (ANODE) by Dupont et al. describes a variation of Neural Ordinary Differential Equation networks that addresses a central limitation of treating a neural network as an ordinary differential equation (ODE): solution trajectories of such a function cannot intersect. The authors address this issue by augmenting the dimension of the space in which the ODE is solved, which simplifies the learned trajectories. Here, we report on the reproducibility of the results presented in Dupont et al. and perform ablation and robustness experiments. Most results presented in the original study are reproducible given the authors' implementation. Small variations in hyper-parameters did not cause drastic changes in model performance. To test the theoretical implications of dimensionality augmentation, we replaced the adaptive step-size solvers with fixed step-size solvers. We find that fixed step-size methods achieve higher accuracy as more dimensions are used for augmentation, supporting the conjecture that ANODEs simplify flows. We also demonstrate that ANODEs become unstable when adaptive step-size methods are replaced with fixed step-size methods on the MNIST data set: ANODEs were unable to achieve stable loss trends, underlining the need for adaptive step-size methods when ANODEs are trained on large data sets.
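To make the augmentation and the solver ablation concrete, the following is a minimal sketch of an augmented ODE block, assuming the torchdiffeq library. The class names, the aug_dim parameter, and the small network used for the dynamics are illustrative choices of ours, not the authors' implementation.

# Minimal sketch of an augmented ODE block (assumes torchdiffeq is installed).
# Names such as ODEFunc, AugmentedODEBlock, and aug_dim are illustrative,
# not taken from the authors' code.
import torch
import torch.nn as nn
from torchdiffeq import odeint


class ODEFunc(nn.Module):
    """Parameterises the dynamics f(h, t) of the ODE dh/dt = f(h, t)."""

    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 64), nn.Tanh(), nn.Linear(64, dim))

    def forward(self, t, h):
        return self.net(h)


class AugmentedODEBlock(nn.Module):
    """Concatenates aug_dim zero channels to the input and solves the ODE
    in the higher-dimensional space, as proposed by Dupont et al."""

    def __init__(self, data_dim, aug_dim, method="dopri5", step_size=None):
        super().__init__()
        self.aug_dim = aug_dim
        self.func = ODEFunc(data_dim + aug_dim)
        self.method = method
        # Fixed step-size solvers (e.g. 'rk4', 'euler') need an explicit step size;
        # adaptive solvers such as 'dopri5' ignore this option.
        self.options = {"step_size": step_size} if step_size is not None else None

    def forward(self, x):
        aug = torch.zeros(x.shape[0], self.aug_dim, device=x.device)
        h0 = torch.cat([x, aug], dim=1)           # augment the state with zeros
        t = torch.tensor([0.0, 1.0], device=x.device)
        h = odeint(self.func, h0, t, method=self.method, options=self.options)
        return h[-1]                              # state at the final time


# The ablation studied in the report: swap the adaptive solver for a fixed
# step-size one while varying aug_dim.
adaptive_block = AugmentedODEBlock(data_dim=2, aug_dim=5, method="dopri5")
fixed_block = AugmentedODEBlock(data_dim=2, aug_dim=5, method="rk4", step_size=0.1)

With a fixed step size the solver cannot refine difficult regions of the flow, so its accuracy serves as a proxy for how much the augmentation simplifies the trajectories.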
Track: Ablation
NeurIPS Paper Id: https://openreview.net/forum?id=BylEPErxUS