SPREAD DIVERGENCE

25 Sept 2019 (modified: 05 May 2023). ICLR 2020 Conference Blind Submission.
TL;DR: A new family of divergences that handles distributions with different supports, for training implicit generative models.
Abstract: For distributions $p$ and $q$ with different supports, the divergence $\mathrm{D}(p\|q)$ may not exist. We define a spread divergence $\tilde{\mathrm{D}}(p\|q)$ on modified $p$ and $q$ and describe sufficient conditions for the existence of such a divergence. We demonstrate how to maximize the discriminatory power of a given divergence by parameterizing and learning the spread. We also give examples of using a spread divergence to train and improve implicit generative models, including linear models (Independent Components Analysis) and non-linear models (Deep Generative Networks).
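A minimal sketch of the idea in the abstract (illustrative only, not the authors' code): for delta distributions $p(x) = \delta(x - a)$ and $q(x) = \delta(x - b)$ with $a \neq b$, the KL divergence is undefined because the supports are disjoint. "Spreading" both distributions with the same Gaussian noise kernel turns them into $\mathcal{N}(a, \sigma^2)$ and $\mathcal{N}(b, \sigma^2)$, whose KL divergence exists in closed form.

```python
# Sketch: KL divergence between two delta distributions after spreading both
# with Gaussian noise of variance sigma^2. The deltas become N(a, sigma^2)
# and N(b, sigma^2), and KL(N(a, s^2) || N(b, s^2)) = (a - b)^2 / (2 s^2).
# The function name is a hypothetical helper, not from the paper's code.

def spread_kl_deltas(a: float, b: float, sigma: float) -> float:
    """KL divergence of delta(x-a) vs delta(x-b) after a Gaussian spread."""
    return (a - b) ** 2 / (2.0 * sigma ** 2)

print(spread_kl_deltas(0.0, 1.0, 1.0))  # 0.5
```

Note that the spread divergence is zero exactly when $a = b$, i.e. when the original distributions coincide, which is the discriminatory property the abstract refers to.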
Code: https://drive.google.com/file/d/1p6l7J1HpcNTV1RrF12wwCza-98m1J8di/view?usp=sharing
Keywords: divergence minimization, generative model, variational inference
