Hierarchical-Latent Generative Models are Robust View Generators for Contrastive Representation Learning

21 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Generative Models, GANs, Contrastive Learning, Representation Learning, Hierarchical Models
Abstract: A growing area of research exploits pre-trained generative models as a data source for contrastive representation learning, generating anchors and their associated positive views through perturbations of the latent codes. In this study, we advance this field by formalizing the properties of a specific category of generative models, which we term Hierarchical-Latent. We show how the intrinsic properties of these models can be used to create robust views for contrastive learning, not only outperforming previous methods but also surpassing classic approaches trained on genuine real data. The proposed framework is evaluated across different generators and contrastive learning techniques, and we additionally investigate the effect of employing a discriminator to filter out low-quality images. Finally, we test continuous sampling, where the generator dynamically samples new synthetic data during contrastive training of the encoder; this yields training times competitive with or faster than a real-data approach while providing a virtually unlimited training set.
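To make the view-generation scheme described in the abstract concrete, below is a minimal sketch assuming a StyleGAN-like generator with a mapping network and a SimCLR-style InfoNCE objective. All names here (`mapper`, `generator`, `encoder`, `sigma`) are hypothetical placeholders, not the authors' actual API; the abstract does not specify which perturbation or contrastive loss is used.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def generate_views(mapper, generator, batch_size, z_dim=512, sigma=0.1, device="cpu"):
    """Sample an anchor batch and positives by perturbing intermediate latent codes.

    Hypothetical scheme: a small Gaussian perturbation of the hierarchical
    latent code w yields a semantically close image, used as the positive view.
    """
    z = torch.randn(batch_size, z_dim, device=device)
    w = mapper(z)                             # intermediate (hierarchical) latent code
    w_pos = w + sigma * torch.randn_like(w)   # small perturbation -> positive view
    return generator(w), generator(w_pos)

def info_nce_loss(h_anchor, h_pos, temperature=0.2):
    """Standard NT-Xent / InfoNCE loss over paired embeddings of shape (B, D)."""
    h_anchor = F.normalize(h_anchor, dim=1)
    h_pos = F.normalize(h_pos, dim=1)
    logits = h_anchor @ h_pos.t() / temperature      # (B, B) cosine-similarity logits
    labels = torch.arange(h_anchor.size(0), device=h_anchor.device)
    return F.cross_entropy(logits, labels)           # diagonal entries are positives

def train_step(encoder, optimizer, mapper, generator, batch_size=256):
    """Continuous sampling: fresh synthetic pairs are drawn at every step,
    so the encoder never revisits an image and the training set is unbounded."""
    x_anchor, x_pos = generate_views(mapper, generator, batch_size)
    loss = info_nce_loss(encoder(x_anchor), encoder(x_pos))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Wrapping the sampler in `torch.no_grad()` reflects that, under continuous sampling, the generator is frozen and only the encoder receives gradients, which is what keeps drawing a fresh batch per step cheap relative to loading real data.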
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3563