Towards Principled Objectives for Contrastive Disentanglement

25 Sept 2019 (modified: 05 May 2023) · ICLR 2020 Conference Blind Submission · Readers: Everyone
Keywords: Disentanglement, Contrastive
Abstract: Unsupervised learning is an important tool that has received a significant amount of attention for decades. Its goal is 'unsupervised recovery,' i.e., extracting salient factors/properties from unlabeled data. Because defining salient properties is challenging, 'contrastive disentanglement' has recently gained popularity: it aims to discover the additional variations that are enhanced in one dataset relative to another. Existing formulations have devised a variety of losses for this task. However, all present-day methods exhibit two major shortcomings: (1) encodings of data that do not exhibit the salient factors are not pushed to carry no signal; and (2) the introduced losses are often hard to estimate and require additional trainable parameters. We present a new formulation for contrastive disentanglement that avoids both shortcomings by carefully formulating a probabilistic model and by using non-parametric yet easily computable metrics. We show on four challenging datasets that the proposed approach better disentangles the salient factors.
Code: https://drive.google.com/file/d/1IlBRrf2Zm4YOOb67IQOrwXlaXlO6-4Q4/view?usp=sharing
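To make the setup concrete, below is a minimal PyTorch sketch of a contrastive-disentanglement model of the kind the abstract describes, not the authors' actual method: a shared latent z models variation present in both datasets, a salient latent s models variation enhanced in the target dataset, and the salient encoding of background samples is explicitly pushed toward zero so that it carries no signal. All names, latent sizes, and the penalty weight gamma are illustrative assumptions.

# Hypothetical contrastive-VAE sketch; not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GaussianEncoder(nn.Module):
    """Maps an input to the mean and log-variance of a diagonal Gaussian."""

    def __init__(self, in_dim, latent_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, latent_dim)
        self.logvar = nn.Linear(hidden, latent_dim)

    def forward(self, x):
        h = self.net(x)
        return self.mu(h), self.logvar(h)


def reparameterize(mu, logvar):
    # Standard reparameterization trick: z = mu + sigma * eps.
    return mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)


def kl_to_standard_normal(mu, logvar):
    # KL( N(mu, sigma^2) || N(0, I) ), summed over latent dims, averaged over the batch.
    return 0.5 * torch.mean(torch.sum(mu.pow(2) + logvar.exp() - logvar - 1.0, dim=1))


class ContrastiveVAE(nn.Module):
    def __init__(self, in_dim, z_dim=10, s_dim=10, hidden=128):
        super().__init__()
        self.enc_z = GaussianEncoder(in_dim, z_dim, hidden)  # shared factors
        self.enc_s = GaussianEncoder(in_dim, s_dim, hidden)  # salient factors
        self.dec = nn.Sequential(
            nn.Linear(z_dim + s_dim, hidden), nn.ReLU(), nn.Linear(hidden, in_dim)
        )

    def forward(self, x):
        mu_z, lv_z = self.enc_z(x)
        mu_s, lv_s = self.enc_s(x)
        z = reparameterize(mu_z, lv_z)
        s = reparameterize(mu_s, lv_s)
        return self.dec(torch.cat([z, s], dim=1)), (mu_z, lv_z), (mu_s, lv_s)


def contrastive_loss(model, x_target, x_background, gamma=10.0):
    """Reconstruct both datasets; push background salient codes toward zero."""
    # Target data: use both the shared and the salient latents.
    recon_t, (mu_zt, lv_zt), (mu_st, lv_st) = model(x_target)
    loss_t = (F.mse_loss(recon_t, x_target)
              + kl_to_standard_normal(mu_zt, lv_zt)
              + kl_to_standard_normal(mu_st, lv_st))

    # Background data: decode with the salient latent clamped to zero and
    # penalize any signal left in its salient encoding (one simple way to
    # make "no salient factors" correspond to "no signal").
    mu_zb, lv_zb = model.enc_z(x_background)
    mu_sb, _ = model.enc_s(x_background)
    zb = reparameterize(mu_zb, lv_zb)
    recon_b = model.dec(torch.cat([zb, torch.zeros_like(mu_sb)], dim=1))
    loss_b = (F.mse_loss(recon_b, x_background)
              + kl_to_standard_normal(mu_zb, lv_zb)
              + gamma * mu_sb.pow(2).mean())

    return loss_t + loss_b

The squared-norm penalty on the background salient code is just one easily computable, parameter-free choice; the paper's own non-parametric metrics would replace it.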