Improved Autoregressive Modeling with Distribution Smoothing

28 Sep 2020 (modified: 25 Jan 2021) · ICLR 2021 Oral · Readers: Everyone
  • Keywords: generative models, autoregressive models
  • Abstract: While autoregressive models excel at image compression, their sample quality is often lacking. Inspired by randomized smoothing for adversarial defense, we incorporate randomized smoothing techniques into autoregressive generative modeling. We first model a smoothed version of the data distribution and then recover the data distribution by learning to reverse the smoothing process. We demonstrate empirically on a 1-d dataset that, by appropriately choosing the smoothing level, each stage of the proposed process is easier to model than directly learning a data distribution with a high Lipschitz constant. Since autoregressive generative modeling consists of a sequence of 1-d density estimation problems, we believe the same arguments generalize to autoregressive models. This seemingly simple procedure drastically improves the sample quality of existing autoregressive models on several synthetic and real-world datasets while obtaining competitive likelihoods on synthetic datasets.
  • Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
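The two-stage procedure described in the abstract can be sketched in a few lines of numpy. This is a minimal illustration, not the paper's method: the sharp two-peak mixture, the smoothing level `sigma`, and the binned-mean denoiser are all hypothetical choices. Stage 1 smooths a hard-to-model 1-d distribution by adding Gaussian noise; stage 2 learns to reverse the smoothing, here by estimating the MSE-optimal denoiser E[x | x̃] with binned conditional means (the paper instead trains autoregressive models for both stages).

```python
import numpy as np

rng = np.random.default_rng(0)

# A sharp 1-d data distribution (two narrow peaks): directly modeling it
# is hard because its density has a high Lipschitz constant.
n = 20000
x = np.where(rng.random(n) < 0.5,
             rng.normal(-1.0, 0.02, n),
             rng.normal(+1.0, 0.02, n))

# Stage 1: smooth the data by adding Gaussian noise (randomized smoothing).
sigma = 0.3  # hypothetical smoothing level
x_tilde = x + rng.normal(0.0, sigma, n)

# Stage 2: learn to reverse the smoothing. The MSE-optimal denoiser is
# E[x | x_tilde]; estimate it here with binned conditional means of x.
bins = np.linspace(x_tilde.min(), x_tilde.max(), 101)
idx = np.clip(np.digitize(x_tilde, bins) - 1, 0, len(bins) - 2)
cond_mean = np.array([x[idx == b].mean() if np.any(idx == b) else 0.0
                      for b in range(len(bins) - 1)])

def denoise(z):
    b = np.clip(np.digitize(z, bins) - 1, 0, len(bins) - 2)
    return cond_mean[b]

# Denoising the smoothed samples concentrates them back near the two
# sharp modes at +/-1, recovering the original data distribution.
recovered = denoise(x_tilde)
```

In this toy setting the smoothed distribution `x_tilde` is a wide, easy-to-fit bimodal density, and the denoiser only has to learn a near-piecewise-constant map back to the modes; this is the sense in which each stage is easier than modeling the sharp distribution directly.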