On the Theoretical Analysis of Dense Contrastive Learning

Submitted to ICLR 2024. Posted: 21 Sept 2023 (modified: 11 Feb 2024).
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Contrastive Learning, Dense Contrastive Learning
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Contrastive learning has achieved outstanding performance in self-supervised learning. However, the canonical image-level matching pretext task is unsuitable for multi-object dense prediction tasks such as segmentation and detection. Recently, numerous studies have focused on dense contrastive learning (DCL), which adopts patch-level contrast to learn representations aware of local information. Although empirical evidence has validated its superiority, to date there has been no theoretical work that formally explains and guarantees the effectiveness of DCL methods, which hinders their principled development. To bridge this gap, using the language of spectral graph theory, we establish the first theoretical framework for modeling and analyzing DCL by dissecting the corresponding patch-level positive-pair graph. Specifically, by decoupling the image-level and patch-level supervision, we theoretically characterize how different positive-pair selection strategies affect the performance of DCL, and verify these insights on both synthetic and real-world datasets. Furthermore, drawing inspiration from the theory, we design two unsupervised metrics to guide the selection of positive pairs.
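To make the patch-level contrast concrete, the following is a minimal sketch (not the paper's implementation) of a patch-level InfoNCE loss, where each spatial patch embedding in one augmented view is treated as a positive for the corresponding patch in the other view; the function name, shapes, and temperature value are illustrative assumptions:

```python
import numpy as np

def patch_infonce(z1, z2, tau=0.1):
    """Patch-level InfoNCE sketch (hypothetical, not the paper's code).

    z1, z2: (num_patches, dim) embeddings of two augmented views of one
    image; patch i in z1 is the positive for patch i in z2, and all
    other patches in z2 serve as negatives.
    """
    # L2-normalize so dot products are cosine similarities
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau                      # (P, P) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # positives sit on the diagonal (same spatial location across views)
    return -np.mean(np.diag(log_prob))
```

Under the paper's framework, the choice of which patches count as positives (here, fixed same-location matching) defines the edges of the patch-level positive-pair graph being analyzed.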
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3157