AdaCat: Adaptive Categorical Discretization for Autoregressive Models

Published: 20 May 2022, Last Modified: 22 Oct 2023
UAI 2022 Poster
Readers: Everyone
Keywords: density estimation, generative model, autoregressive model, reinforcement learning, audio, offline RL, discretization
TL;DR: Adaptively discretize 1-D conditional distributions for more expressive autoregressive generative models
Abstract: Autoregressive generative models can estimate complex continuous data distributions, such as trajectory rollouts in an RL environment, image intensities, and audio. Most state-of-the-art models discretize continuous data into several bins and use categorical distributions over the bins to approximate the continuous data distribution. The advantage is that the categorical distribution can easily express multiple modes and is straightforward to optimize. However, such an approximation cannot express sharp changes in density without using significantly more bins, which makes it parameter-inefficient. We propose an efficient, expressive, multimodal parameterization called Adaptive Categorical Discretization (AdaCat). AdaCat discretizes each dimension of an autoregressive model adaptively, which allows the model to allocate density to fine intervals of interest, improving parameter efficiency. AdaCat generalizes both categoricals and quantile-based regression. AdaCat is a simple add-on to any discretization-based distribution estimator. In experiments, AdaCat improves density estimation for real-world tabular data, images, audio, and trajectories, and improves planning in model-based offline RL.
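
As an illustrative sketch only (not code taken from the paper or its supplementary material), the snippet below shows one way a 1-D AdaCat-style distribution on [0, 1] could be parameterized: both the bin widths and the bin masses come from softmaxes over learned logits, so the bins adapt to cover fine intervals of interest and the density inside each bin is its mass divided by its width. The function name `adacat_log_prob` and its arguments are hypothetical names chosen for this sketch.

```python
import numpy as np

def adacat_log_prob(x, width_logits, mass_logits, eps=1e-12):
    """Log-density of a 1-D AdaCat-style distribution on [0, 1].

    Bin widths and bin masses are both softmax-normalized, so the k bins
    adaptively tile the unit interval and the density in bin i is
    mass_i / width_i (piecewise constant).
    """
    widths = np.exp(width_logits - width_logits.max())
    widths /= widths.sum()                 # adaptive bin widths, sum to 1
    masses = np.exp(mass_logits - mass_logits.max())
    masses /= masses.sum()                 # bin probabilities, sum to 1

    edges = np.concatenate([[0.0], np.cumsum(widths)])   # bin boundaries
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1,
                  0, len(widths) - 1)      # index of the bin containing x
    return np.log(masses[idx] + eps) - np.log(widths[idx] + eps)

# Uniform logits recover a standard uniform-width categorical discretization,
# i.e. density 1 everywhere on [0, 1], so the log-density is ~0.
print(adacat_log_prob(0.37, np.zeros(4), np.zeros(4)))
```

With non-uniform width logits, the same number of bins can concentrate around sharp density changes, which is the parameter-efficiency argument made in the abstract.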
Supplementary Material: zip
Community Implementations: [3 code implementations](https://www.catalyzex.com/paper/arxiv:2208.02246/code)