Probabilistic Contrastive Learning with Explicit Concentration on the Hypersphere

26 Sept 2024 (modified: 02 Oct 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: representation learning, von Mises–Fisher distribution, uncertainty, probabilistic contrastive learning
TL;DR: We propose an unnormalized and regularized form of the von Mises–Fisher distribution for probabilistic contrastive learning.
Abstract: Contrastive learning is predominantly deterministic, limiting its effectiveness in noisy and uncertain environments. We propose a probabilistic approach inspired by the von Mises–Fisher (vMF) distribution, embedding representations on the hypersphere. To address numerical instability, we introduce an unnormalized and regularized vMF distribution that preserves the distribution's essential properties with theoretical guarantees. The concentration parameter, $\kappa$, serves as an interpretable measure of aleatoric uncertainty. Empirical evaluations show a strong correlation between the estimated $\kappa$ and the severity of unseen data corruption, enabling effective failure analysis and enhancing out-of-distribution detection without modeling epistemic uncertainty. Our approach offers a fresh perspective, introducing a flexible alignment mechanism for improved uncertainty estimation in high-dimensional spaces while remaining compatible with existing contrastive learning frameworks.
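
The page does not give the paper's exact objective, but the abstract suggests a vMF log-density of the form $\kappa\,\mu^{\top}x$ with a regularizer standing in for the numerically unstable normalizing constant. Below is a minimal PyTorch sketch under that assumption; the function name vmf_contrastive_loss, the regularizer weight lam, and the InfoNCE-style denominator are illustrative guesses, not the authors' implementation:

import torch
import torch.nn.functional as F

def vmf_contrastive_loss(mu_a, mu_b, kappa, lam=0.1):
    """Sketch of a vMF-inspired contrastive loss (assumed form, not the paper's code).

    mu_a, mu_b: (N, d) unit-norm embeddings of two views of the same batch.
    kappa:      (N,) positive concentration estimates for view A.
    The unnormalized vMF log-density of x under (mu, kappa) is kappa * <mu, x>;
    a penalty on kappa stands in for the intractable normalizing constant.
    """
    # Unnormalized log-likelihood of each positive pair.
    pos = kappa * (mu_a * mu_b).sum(dim=-1)                                # (N,)
    # InfoNCE-style denominator over all candidates, scaled by each sample's kappa.
    log_denom = torch.logsumexp(kappa[:, None] * (mu_a @ mu_b.T), dim=-1)  # (N,)
    # Regularizer replacing the normalizer: keeps kappa from growing unboundedly.
    return (log_denom - pos).mean() + lam * kappa.mean()

# Usage: kappa would be predicted per sample, e.g. via a softplus head.
z_a = F.normalize(torch.randn(8, 128), dim=-1)
z_b = F.normalize(torch.randn(8, 128), dim=-1)
kappa = F.softplus(torch.randn(8)) + 1e-4
loss = vmf_contrastive_loss(z_a, z_b, kappa)

In this sketch a low predicted kappa flattens the per-sample logits, which is one plausible way the estimated concentration could track corruption severity as the abstract reports.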
Supplementary Material: zip
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Resubmission: Yes
Student Author: No
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Large Language Models: Yes, at the sentence level (e.g., fixing grammar, re-wording sentences)
Submission Number: 7406