Learning Group Importance using the Differentiable Hypergeometric Distribution

Published: 20 Jun 2023, Last Modified: 18 Jul 2023, AABI 2023 - Fast Track
Keywords: hypergeometric distribution, weakly-supervised learning, reparameterization trick, group importance, variational clustering, gumbel softmax
TL;DR: We propose the differentiable hypergeometric distribution and show the advantage of explicitly learning subset sizes.
Abstract: Partitioning a set of elements into subsets of a priori unknown sizes is essential in many applications. These subset sizes are rarely learned explicitly, whether they are the cluster sizes in clustering applications or the number of shared versus independent generative latent factors in weakly-supervised learning. Probability distributions over valid combinations of subset sizes are non-differentiable due to hard constraints, which prohibits gradient-based optimization. In this work, we propose the differentiable hypergeometric distribution. The hypergeometric distribution models the probability of different group sizes based on their relative importance. We introduce reparameterizable gradients to learn the importance of groups and highlight the advantage of explicitly learning subset sizes in two typical applications: weakly-supervised learning and clustering. In both applications, we outperform previous approaches, which rely on suboptimal heuristics to model the unknown sizes of groups.
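The abstract only states that reparameterizable gradients for the hypergeometric distribution exist. As a rough illustration of one way such a relaxation can be set up, the sketch below samples per-group counts that sum to a fixed total by decomposing the draw group by group and relaxing each conditional choice of count with a straight-through Gumbel-softmax step. This is a hypothetical PyTorch sketch, not the authors' implementation: the function name `relaxed_hypergeometric_counts`, the sequential decomposition, and the Fisher-style importance weighting `k * log w_i` are all assumptions made here for illustration.

```python
# Hypothetical sketch (not the paper's reference code): relaxed sampling of
# per-group counts (k_1, ..., k_G) with sum k_i = n, using a sequential
# conditional decomposition and Gumbel-softmax over feasible counts per group.
import torch
import torch.nn.functional as F


def log_binom(a, b):
    # log of the binomial coefficient C(a, b) via lgamma
    return torch.lgamma(a + 1) - torch.lgamma(b + 1) - torch.lgamma(a - b + 1)


def relaxed_hypergeometric_counts(group_sizes, log_w, n, tau=0.5):
    """group_sizes: LongTensor [G], elements per group.
    log_w: Tensor [G], learnable log-importance per group.
    n: int total draws (assumed n <= group_sizes.sum()).
    tau: Gumbel-softmax temperature."""
    G = group_sizes.numel()
    remaining = n
    # capacity_after[i] = number of elements in groups i, i+1, ..., G-1
    capacity_after = torch.flip(torch.cumsum(torch.flip(group_sizes, [0]), 0), [0])
    counts = []
    for i in range(G):
        m_i = int(group_sizes[i])
        cap_rest = int(capacity_after[i + 1]) if i + 1 < G else 0
        ks = torch.arange(0, m_i + 1, dtype=torch.float32)
        # feasible counts: k <= remaining draws and the rest can still be filled
        feasible = (ks <= remaining) & (remaining - ks <= cap_rest)
        # unnormalized conditional log-probs, importance-weighted (Fisher-style)
        logits = log_binom(torch.full_like(ks, float(m_i)), ks) + ks * log_w[i]
        logits = logits.masked_fill(~feasible, float("-inf"))
        # relaxed one-hot over counts; straight-through gives hard integer counts
        y_soft = F.gumbel_softmax(logits, tau=tau, hard=False)
        y_hard = F.one_hot(y_soft.argmax(), num_classes=ks.numel()).float()
        y = y_hard + y_soft - y_soft.detach()
        k_i = (y * ks).sum()
        counts.append(k_i)
        # simplification: no gradient is propagated through the remaining budget
        remaining = remaining - int(k_i.detach().round())
    return torch.stack(counts)
```

For example, `relaxed_hypergeometric_counts(torch.tensor([5, 3, 4]), torch.zeros(3, requires_grad=True), n=6)` returns three counts summing to 6, with gradients flowing back to the log-importance parameters through the relaxed samples.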
Publication Venue: ICLR 2023
Submission Number: 7