The Convolution-Closed Hurdle Motif With an Application to Tensor Decomposition

Published: 17 Jun 2024, Last Modified: 20 Jul 2024 · 2nd SPIGM @ ICML Poster · CC BY 4.0
Keywords: Bayesian, Tucker, tensor decomposition, sparse probabilistic modeling
Abstract: This paper introduces a novel inference scheme for a class of hurdle priors that exploits sparsity to scale inference in potentially high-dimensional models with convolution-closed non-negative likelihoods, such as the Poisson. We apply an instance of this class, the hurdle gamma prior, to a probabilistic non-negative Tucker decomposition and derive an inference scheme whose cost scales only with the nonzero latent parameters in the core tensor. This scheme avoids the exponential blowup in computational cost typical of Tucker decomposition, efficiently mapping the data to a high-dimensional latent space. We derive and implement a closed-form Gibbs sampler for full posterior inference and fit our model to longitudinal microbiome data. Using our inference motif to fit the model quickly, we reveal interpretable qualitative structure and obtain encouraging classification results.
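To make the hurdle gamma prior concrete, below is a minimal sketch of drawing a sparse core tensor from such a prior, assuming the standard hurdle parameterization: a point mass at zero mixed with a gamma slab. The function name, the inclusion probability `pi`, and the gamma hyperparameters are illustrative, not the paper's actual settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_hurdle_gamma(pi, shape, rate, size):
    """Draw from a hurdle gamma prior: exact zeros with probability 1 - pi,
    otherwise a Gamma(shape, rate) variate. (Illustrative sketch.)"""
    active = rng.random(size) < pi            # which entries clear the hurdle
    slab = rng.gamma(shape, 1.0 / rate, size) # gamma slab for active entries
    return np.where(active, slab, 0.0)

# A small 4x4x4 core tensor; with pi = 0.05 most entries are exactly zero,
# so downstream inference need only touch the few nonzero parameters.
core = sample_hurdle_gamma(pi=0.05, shape=2.0, rate=1.0, size=(4, 4, 4))
nonzero = np.count_nonzero(core)
```

The exact zeros (rather than merely small values) are what let inference skip inactive core entries entirely.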
Submission Number: 58