PixelSNAIL: An Improved Autoregressive Generative Model

12 Feb 2018 (modified: 05 May 2023) · ICLR 2018 Workshop Submission
Abstract: Autoregressive generative models achieve the best results in density estimation tasks involving high dimensional data, such as images or audio. They pose density estimation as a sequence modeling task, where a recurrent neural network (RNN) models the conditional distribution over the next element conditioned on all previous elements. In this paradigm, the bottleneck is the extent to which the RNN can model long-range dependencies, and the most successful approaches rely on causal convolutions. Taking inspiration from recent work in meta reinforcement learning, where dealing with long-range dependencies is also essential, we introduce a new generative model architecture that combines causal convolutions with self attention. In this paper, we describe the resulting model and present state-of-the-art log-likelihood results on heavily benchmarked datasets: CIFAR-10 (2.85 bits per dim), $32 \times 32$ ImageNet (3.80 bits per dim) and $64 \times 64$ ImageNet (3.52 bits per dim). Our implementation is publicly available at \url{https://github.com/neocxi/pixelsnail-public}.
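The abstract's central idea, combining causal convolutions with causal self-attention so that the prediction at each step depends only on previous elements, can be illustrated with a minimal sketch. This is not the authors' implementation (which is at the linked repository); it is a simplified NumPy illustration of the two causal primitives, with hypothetical shapes and helper names chosen for clarity.

```python
import numpy as np

def causal_conv1d(x, w):
    """Causal 1D convolution: output at step t uses only x[<= t].

    x: (T, C) sequence, w: (K, C, C_out) kernel.
    Left-padding with K-1 zeros enforces causality.
    """
    T, C = x.shape
    K, _, C_out = w.shape
    xp = np.concatenate([np.zeros((K - 1, C)), x], axis=0)
    out = np.zeros((T, C_out))
    for t in range(T):
        # window xp[t:t+K] covers inputs x[t-K+1 .. t] only
        out[t] = np.einsum('kc,kco->o', xp[t:t + K], w)
    return out

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head self-attention with a causal (lower-triangular) mask."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    d = q.shape[-1]
    scores = (q @ k.T) / np.sqrt(d)
    future = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores[future] = -np.inf          # forbid attending to future positions
    a = np.exp(scores - scores.max(axis=-1, keepdims=True))
    a /= a.sum(axis=-1, keepdims=True)
    return a @ v
```

A quick sanity check of causality: perturbing the final input element leaves the outputs at all earlier timesteps unchanged for both operations, which is the property that makes the model a valid autoregressive density estimator.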
Keywords: autoregressive
TL;DR: New Neural Autoregressive Generative Model