Storchastic: A Framework for General Stochastic Automatic Differentiation

21 May 2021, 20:49 (edited 26 Oct 2021) · NeurIPS 2021 Poster · Readers: Everyone
  • Keywords: gradient estimation, automatic differentiation, optimization, stochastic computation graphs
  • TL;DR: We present a framework for gradient estimation in stochastic computation graphs that incorporates many estimators and extends to any-order differentiation.
  • Abstract: Modelers use automatic differentiation (AD) of computation graphs to implement complex Deep Learning models without defining gradient computations. Stochastic AD extends AD to stochastic computation graphs with sampling steps, which arise when modelers handle the intractable expectations common in Reinforcement Learning and Variational Inference. However, current methods for stochastic AD are limited: They are either only applicable to continuous random variables and differentiable functions, or can only use simple but high variance score-function estimators. To overcome these limitations, we introduce Storchastic, a new framework for AD of stochastic computation graphs. Storchastic allows the modeler to choose from a wide variety of gradient estimation methods at each sampling step, to optimally reduce the variance of the gradient estimates. Furthermore, Storchastic is provably unbiased for estimation of any-order gradients, and generalizes variance reduction techniques to higher-order gradient estimates. Finally, we implement Storchastic as a PyTorch library at github.com/HEmile/storchastic.
  • Supplementary Material: pdf
  • Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
  • Code: https://github.com/HEmile/storchastic
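
The abstract contrasts Storchastic with the "simple but high variance score-function estimators" that many stochastic AD systems are restricted to. As a point of reference, here is a minimal sketch of that baseline technique in plain PyTorch: it estimates the gradient of an expectation through a non-differentiable sampling step via the score function (REINFORCE). This is an illustration of the general estimator, not Storchastic's own API; the names `theta` and `f` are illustrative only.

```python
# Score-function (REINFORCE) gradient estimate of d/dtheta E_{z ~ N(theta, 1)}[f(z)],
# shown in plain PyTorch as a baseline; Storchastic lets the modeler swap in
# lower-variance estimators at each such sampling step.
import torch
from torch.distributions import Normal

theta = torch.tensor(1.0, requires_grad=True)  # parameter of the sampling distribution

def f(z):
    # f need not be differentiable w.r.t. theta for this estimator to apply.
    return (z - 2.0) ** 2

n_samples = 1000
dist = Normal(loc=theta, scale=1.0)
z = dist.sample((n_samples,))  # sampling step: gradients do not flow through here

# Surrogate objective whose gradient is the score-function estimate:
# E[f(z) * d/dtheta log p_theta(z)]  ==  d/dtheta E[f(z)]
surrogate = (f(z).detach() * dist.log_prob(z)).mean()
surrogate.backward()

print(theta.grad)  # Monte Carlo estimate; the exact gradient here is 2*(theta - 2) = -2
```

The estimate is unbiased but its variance grows with the spread of `f(z)`, which is the limitation the paper's framework addresses by letting each sampling step use a different, lower-variance gradient estimation method.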