Differentiable Sampling of Categorical Distributions Using the CatLog-Derivative Trick

Published: 20 Jun 2023, Last Modified: 17 Sept 2023
Venue: Differentiable Almost Everything
Keywords: gradient estimation, categorical random variables, probability theory, discrete distributions
TL;DR: Differentiable sampling with Rao-Blackwellised multivariate categorical distributions.
Abstract: Categorical random variables can faithfully represent the discrete and uncertain aspects of data as part of a discrete latent variable model. Learning in such models necessitates taking gradients with respect to the parameters of the categorical probability distributions, which is often intractable due to their combinatorial nature. A popular technique to estimate these otherwise intractable gradients is the Log-Derivative trick. This trick forms the basis of the well-known REINFORCE gradient estimator and its many extensions. While the Log-Derivative trick allows us to differentiate through samples drawn from categorical distributions, it does not take into account the discrete nature of the distribution itself. Our first contribution addresses this shortcoming by introducing the CatLog-Derivative trick, a variation of the Log-Derivative trick tailored to categorical distributions. Secondly, we use the CatLog-Derivative trick to introduce IndeCateR, a novel and unbiased gradient estimator for the important case of products of independent categorical distributions with provably lower variance than REINFORCE. Thirdly, we empirically show that IndeCateR can be efficiently implemented and that its gradient estimates have significantly lower bias and variance than the state of the art for the same number of samples.
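The following is a minimal NumPy sketch, not the authors' implementation: the function names reinforce_grad and indecater_grad and the toy objective are illustrative assumptions. It contrasts the plain Log-Derivative (REINFORCE) estimator with a Rao-Blackwellised estimator in the spirit of IndeCateR. For a product of D independent K-way categoricals, the expectation of f is linear in each row theta[d, :], so the gradient with respect to theta[d, k] equals the expectation of f with x_d clamped to category k; each variable's categories can therefore be enumerated exactly while only the remaining variables are sampled.

import numpy as np

rng = np.random.default_rng(0)

def sample(theta):
    # Draw one joint sample from D independent categoricals with
    # probability table theta of shape (D, K).
    D, K = theta.shape
    return np.array([rng.choice(K, p=theta[d]) for d in range(D)])

def reinforce_grad(theta, f, n_samples):
    # Log-Derivative trick: grad E[f] = E[f(x) * grad log p(x)],
    # where d/dtheta[d, k] log p(x) = 1[x_d = k] / theta[d, k].
    D, K = theta.shape
    grad = np.zeros_like(theta)
    for _ in range(n_samples):
        x = sample(theta)
        grad[np.arange(D), x] += f(x) / theta[np.arange(D), x]
    return grad / n_samples

def indecater_grad(theta, f, n_samples):
    # Rao-Blackwellised variant: since E[f] is linear in theta[d, :],
    # d/dtheta[d, k] E[f] = E_{x_{-d}}[f(x with x_d = k)]. Enumerate k
    # exactly for each variable d and sample only the other variables.
    D, K = theta.shape
    grad = np.zeros_like(theta)
    for _ in range(n_samples):
        x = sample(theta)
        for d in range(D):
            for k in range(K):
                x_dk = x.copy()
                x_dk[d] = k
                grad[d, k] += f(x_dk)
    return grad / n_samples

# Toy check: both estimators target the same gradient, but the
# Rao-Blackwellised one is far less noisy per sample drawn.
theta = np.full((3, 4), 0.25)
f = lambda x: float(x.sum())
print(reinforce_grad(theta, f, 4000))
print(indecater_grad(theta, f, 100))

Note that the Rao-Blackwellised estimator spends D * K evaluations of f per joint sample, which is the cost it trades for lower variance; the abstract's claim that IndeCateR can be implemented efficiently concerns exactly this trade-off.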
Submission Number: 29