Beyond In-Place Corruption: Insertion and Deletion In Denoising Probabilistic Models

Published: 15 Jun 2021, Last Modified: 05 May 2023 · INNF+ 2021 poster
Keywords: Diffusion, Masking, Sequence, Structure, Machine Learning
TL;DR: We extend diffusion generative models for sequences by including insert and delete operations.
Abstract: Denoising diffusion probabilistic models (DDPMs) have shown impressive results on sequence generation by iteratively corrupting each example and then learning to map corrupted versions back to the original. However, previous work has largely focused on in-place corruption, adding noise to each pixel or token individually while keeping their locations the same. In this work, we consider a broader class of corruption processes and denoising models over sequence data that can insert and delete elements, while still being efficient to train and sample from. We demonstrate that these models outperform standard in-place models on an arithmetic sequence task, and that when trained on the text8 dataset they can be used to fix spelling errors without any fine-tuning.
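To make the idea of a corruption process that goes beyond in-place noise more concrete, here is a minimal illustrative sketch of a single forward-noising step over a token sequence that can replace tokens in place, delete them, or insert new ones. This is an assumption-laden toy, not the paper's actual corruption process: the vocabulary, probabilities, and the `corrupt_step` function are hypothetical choices made purely for illustration.

```python
# Minimal sketch (NOT the paper's algorithm): one forward-corruption step over a
# sequence that allows in-place replacement, deletion, and insertion of tokens.
# VOCAB, the probabilities, and corrupt_step are hypothetical illustration-only names.
import random

VOCAB = list("abcdefgh")  # hypothetical token vocabulary


def corrupt_step(tokens, p_replace=0.1, p_delete=0.05, p_insert=0.05, rng=random):
    """Apply one noising step: each token may be replaced or deleted,
    and a new random token may be inserted after it."""
    out = []
    for tok in tokens:
        r = rng.random()
        if r < p_delete:
            pass                            # drop the token (deletion)
        elif r < p_delete + p_replace:
            out.append(rng.choice(VOCAB))   # in-place replacement
        else:
            out.append(tok)                 # keep unchanged
        if rng.random() < p_insert:
            out.append(rng.choice(VOCAB))   # insert a random token
    return out


if __name__ == "__main__":
    x = list("abcdef")
    for t in range(3):
        x = corrupt_step(x)
        print(t, "".join(x))
```

Repeatedly applying such a step changes both the content and the length of the sequence, which is what distinguishes this family of corruptions from purely in-place noise; the denoising model must then learn to undo replacements as well as insertions and deletions.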