Multitask Contrastive Learning

22 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: representation learning, contrastive learning, generalization, multi-task learning
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Multi-task learning and contrastive learning both aim to improve the robustness of learned embeddings, but combining the two presents challenges. Supervised contrastive learning pulls together examples of the same class while pushing apart examples of different classes, which is intuitive in single-task settings. It becomes less intuitive with multiple tasks, each of which may require a different notion of similarity. In this work, we introduce a novel method, Multi-Task Contrastive Loss (MTCon), that improves the generalization of learned embeddings by incorporating supervision from multiple similarity metrics simultaneously. MTCon learns task weightings that account for the uncertainty associated with each task, reducing the influence of uncertain tasks. In a series of experiments, we show that these learned weightings enhance out-of-domain generalization to novel tasks. Across three distinct multi-task datasets, we find that networks trained with MTCon consistently outperform networks trained with weighted multi-task cross-entropy in both in-domain and out-of-domain multi-task learning scenarios. Code will be made available upon publication.
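The abstract describes two ingredients: a supervised contrastive objective applied per task (each task supplying its own labels, i.e., its own notion of similarity) and learned, uncertainty-aware task weightings. The paper's exact formulation is not given here, so the following is only a minimal sketch of one plausible instantiation, assuming a SupCon-style per-task loss and Kendall-style learned log-variance weighting; the class name, `log_vars`, and all hyperparameters are hypothetical.

```python
import torch

class MultiTaskSupConLoss(torch.nn.Module):
    """Sketch: per-task supervised contrastive losses combined with
    learned, uncertainty-based task weights (names are illustrative)."""

    def __init__(self, num_tasks: int, temperature: float = 0.1):
        super().__init__()
        self.temperature = temperature
        # One learnable log-variance per task; higher uncertainty -> smaller weight.
        self.log_vars = torch.nn.Parameter(torch.zeros(num_tasks))

    def supcon(self, z: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # z: (N, D) L2-normalized embeddings; labels: (N,) class ids for one task.
        sim = z @ z.T / self.temperature                       # pairwise similarities
        eye = torch.eye(len(z), dtype=torch.bool, device=z.device)
        pos = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~eye  # positives share a label
        logits = sim.masked_fill(eye, -1e9)                    # drop self-pairs from the softmax
        log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
        pos_counts = pos.sum(1).clamp(min=1)
        # Mean log-probability over positives, averaged across anchors.
        return -(log_prob.masked_fill(~pos, 0.0).sum(1) / pos_counts).mean()

    def forward(self, z: torch.Tensor, task_labels: list[torch.Tensor]) -> torch.Tensor:
        # task_labels: one (N,) label tensor per task, each defining its own similarity.
        total = torch.zeros((), device=z.device)
        for t, labels in enumerate(task_labels):
            precision = torch.exp(-self.log_vars[t])           # uncertain tasks get down-weighted
            total = total + precision * self.supcon(z, labels) + self.log_vars[t]
        return total
```

In this reading, the per-task weight is learned jointly with the encoder: tasks whose contrastive loss stays noisy or high drive their log-variance up, shrinking their contribution, while the additive `log_vars[t]` term keeps the weights from collapsing to zero.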
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6334