Neurosymbolic Deep Generative Models for Sequence Data with Relational Constraints

Published: 31 Oct 2022, Last Modified: 05 Jan 2023
NeurIPS 2022 Accept
Keywords: neurosymbolic, sequence, program synthesis, generative, constraint, music, poetry
TL;DR: We use program synthesis to extract a distribution of constraints over sequence data, then synthesize data adhering to that distribution of constraints, with added controllability.
Abstract: There has been significant recent progress in designing deep generative models that generate realistic sequence data such as text or music. Nevertheless, it remains difficult to incorporate high-level structure to guide the generative process, and many such models perform well on local coherence but less so on global coherence. We propose a novel approach for incorporating global structure in the form of relational constraints between different subcomponents of an example (e.g., lines of a poem or measures of music). Our generative model has two parts: (i) one model to generate a realistic set of relational constraints, and (ii) a second model to generate realistic data satisfying these constraints. For model (i), we propose a constrained optimization algorithm that infers the relational constraints present in the training data, and we then learn a generative model over the resulting constraint data. In our experiments, we show that our approach significantly improves over the state of the art in terms of capturing high-level structure in the data, while performing comparably or better in terms of low-level structure. We also show that using constrained optimization for model (ii) as well leads to increased controllability with little decrease in quality compared to purely learning-based models.
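
To make the two-part design concrete, the following is a minimal, hypothetical Python sketch, not the authors' implementation. Stage (i) is reduced to mining simple repeat-relations between subcomponents and fitting an empirical distribution over the mined constraint sets (standing in for the paper's program-synthesis and constrained-optimization step); stage (ii) enforces a sampled constraint set on draws from a toy base sampler (standing in for the learned sequence model). All function names and the repeat-relation constraint class are illustrative assumptions.

```python
# Hypothetical sketch of the two-stage pipeline described in the abstract.
# Stage (i): infer relational constraints from training examples and fit a
# distribution over constraint sets. Stage (ii): sample a constraint set and
# generate data satisfying it. The base sampler is a toy stand-in for a
# learned sequence model.

import random
from collections import Counter

def infer_constraints(example):
    """Extract repeat-relations between subcomponents (e.g., lines of a poem)."""
    constraints = set()
    for i in range(len(example)):
        for j in range(i + 1, len(example)):
            if example[i] == example[j]:   # relational constraint: j repeats i
                constraints.add((i, j))
    return frozenset(constraints)

def fit_constraint_model(dataset):
    """Stage (i): empirical distribution over constraint sets in the data."""
    return Counter(infer_constraints(ex) for ex in dataset)

def sample_constraints(model):
    """Draw one constraint set in proportion to how often it was observed."""
    sets, weights = zip(*model.items())
    return random.choices(sets, weights=weights, k=1)[0]

def generate(constraints, length, base_sampler):
    """Stage (ii): sample subcomponents, then project onto the constraints."""
    seq = [base_sampler() for _ in range(length)]
    for i, j in sorted(constraints):
        seq[j] = seq[i]                    # enforce the repeat relation
    return seq

# Toy usage: 4-line "poems" with ABAB-style repeat structure.
data = [["a", "b", "a", "b"], ["c", "d", "c", "d"], ["x", "y", "x", "y"]]
model = fit_constraint_model(data)
cs = sample_constraints(model)
print(generate(cs, 4, base_sampler=lambda: random.choice("pqrs")))
```

Here every training example yields the constraint set {(0, 2), (1, 3)}, so generation always produces an ABAB pattern over fresh symbols; controllability in this sketch amounts to choosing or editing the sampled constraint set before calling generate.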