- TL;DR: Flexible compositional image generation with energy-based models, with applications to continual learning and generalization.
- Abstract: Humans are able both to learn quickly and to rapidly adapt their knowledge. One major component is the ability to incrementally combine many simple concepts to accelerate learning. We show that energy-based models are a promising class of models for exhibiting these properties, because they allow probability distributions to be combined directly. This lets us compose an arbitrary number of different distributions in a globally coherent manner. We show that this compositionality property allows us to define three basic operators on concepts (logical conjunction, disjunction, and negation) and use them to generate plausible naturalistic images. Furthermore, by applying these operators, we show that we are able to extrapolate concept combinations, continually compose previously learned concepts, and infer concept properties in a compositional manner.
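- The three logical operators described above have a natural reading in terms of energy functions: conjunction corresponds to a product of distributions (sum of energies), disjunction to a sum of distributions (a log-sum-exp "softmin" over energies), and negation to inverting an energy landscape. Below is a minimal illustrative sketch using hand-crafted 1-D Gaussian energies rather than learned EBMs; the function names (`conjunction`, `disjunction`, `negation`) and the `alpha` scaling for negation are illustrative assumptions, not the paper's API.

  ```python
  import numpy as np

  def gaussian_energy(mu, sigma):
      # Energy of an (unnormalized) 1-D Gaussian concept:
      # E(x) = (x - mu)^2 / (2 sigma^2), low energy = high probability.
      return lambda x: (x - mu) ** 2 / (2 * sigma ** 2)

  def conjunction(energies):
      # AND: product of distributions -> sum of energies.
      return lambda x: sum(E(x) for E in energies)

  def disjunction(energies):
      # OR: sum of distributions -> -logsumexp over negated energies.
      return lambda x: -np.logaddexp.reduce([-E(x) for E in energies])

  def negation(E, alpha=1.0):
      # NOT: invert the energy landscape (alpha is a sharpness knob,
      # an assumed hyperparameter in this sketch).
      return lambda x: -alpha * E(x)

  # Two toy "concepts" centered at 0 and 2.
  E1 = gaussian_energy(0.0, 1.0)
  E2 = gaussian_energy(2.0, 1.0)

  E_and = conjunction([E1, E2])  # low energy near x = 1 (both concepts hold)
  E_or = disjunction([E1, E2])   # low energy wherever either concept holds
  E_not = negation(E1)           # low energy far from x = 0
  ```

  In the full method, each `E_i` would be a learned neural energy function over images, and samples from the combined distribution would be drawn with Langevin dynamics on the composed energy.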
- Keywords: Compositional Generation, Energy Based Model, Compositionality, Generative Models
- Code: https://drive.google.com/file/d/138w7Oj8rQl_e40_RfZJq2WKWb41NgKn3