Improving Contrastive Learning of Sentence Embeddings with Focal InfoNCE

Published: 07 Oct 2023, Last Modified: 01 Dec 2023, EMNLP 2023 Findings
Submission Type: Regular Short Paper
Submission Track: Semantics: Lexical, Sentence level, Document Level, Textual Inference, etc.
Submission Track 2: Machine Learning for NLP
Keywords: Contrastive Learning, Semantic Textual Similarity, Sentence Embedding, Negative Sample Reweighting
Abstract: The recent success of SimCSE has greatly advanced state-of-the-art sentence representations. However, the original formulation of SimCSE does not fully exploit the potential of hard negative samples in contrastive learning. This study introduces an unsupervised contrastive learning framework that combines SimCSE with hard negative mining, aiming to enhance the quality of sentence embeddings. The proposed focal-InfoNCE function introduces self-paced modulation terms into the contrastive objective, downweighting the loss associated with easy negatives and encouraging the model to focus on hard negatives. Experiments on various STS benchmarks show that our method improves sentence embeddings in terms of Spearman's correlation and representation alignment and uniformity.
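As a rough illustration of the idea described in the abstract (not the paper's exact objective), the sketch below reweights the InfoNCE denominator so that hard in-batch negatives, i.e. those the model currently scores highly, contribute more than easy ones. The function name, the `temperature` and `gamma` hyperparameters, and the specific modulation form are hypothetical choices for this sketch; the paper's actual focal-InfoNCE terms may differ.

```python
# Minimal sketch of a focal-style reweighted InfoNCE loss (illustrative only).
import torch
import torch.nn.functional as F


def focal_infonce_loss(z1: torch.Tensor, z2: torch.Tensor,
                       temperature: float = 0.05, gamma: float = 2.0) -> torch.Tensor:
    """z1, z2: (batch, dim) embeddings of two views of the same sentences
    (e.g., two dropout-perturbed forward passes, as in unsupervised SimCSE)."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    sim = z1 @ z2.T / temperature              # (batch, batch) scaled cosine similarities
    probs = sim.softmax(dim=-1)                # per-anchor probability over all candidates

    batch = z1.size(0)
    labels = torch.arange(batch, device=z1.device)
    pos_mask = F.one_hot(labels, batch).bool() # diagonal entries are the positives

    # Focal-style modulation (assumption): negatives the model rates as similar
    # (high softmax probability) get extra weight; easy negatives stay near the
    # standard weight of 1. Weights are detached so they act as fixed coefficients.
    neg_weights = probs.detach() ** gamma
    weights = torch.where(pos_mask, torch.ones_like(neg_weights), 1.0 + neg_weights)

    exp_sim = torch.exp(sim) * weights
    loss = -torch.log(exp_sim[pos_mask] / exp_sim.sum(dim=-1))
    return loss.mean()
```

In a SimCSE-style setup, `z1` and `z2` would come from encoding the same batch of sentences twice with dropout active, and this loss would replace the standard InfoNCE term during training.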
Submission Number: 802