SupCL-GSS: Supervised Contrastive Learning with Guided Sample Selection

ICLR 2026 Conference Submission 25054 Authors

20 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Supervised Contrastive Learning, Hard Negatives, Model Calibration
TL;DR: SupCL-GSS guides supervised contrastive learning with data-map–based difficulty to form hard positives/negatives by label and difficulty, improving accuracy and calibration (lower ECE) across diverse in-/out-of-domain NLP tasks.
Abstract: We present Supervised Contrastive Learning with Guided Sample Selection (SupCL-GSS), which leverages data maps to construct "hard" positives and "hard" negatives for text classification with pre-trained language models. In our method, we first measure training dynamics to identify the learning difficulty of each training sample with respect to a model---whether samples are easy-to-learn or ambiguous. We then construct positive and negative sets for supervised contrastive learning that allow guided sample selection based on both the samples' learning difficulty and their class labels. We empirically validate our proposed method on various NLP tasks, including sentence-pair classification (e.g., natural language inference, paraphrase detection, commonsense reasoning) and single-sentence classification (e.g., sentiment analysis, opinion mining), in both in-domain and out-of-domain settings. Our method achieves better performance and yields lower expected calibration errors than competitive baselines.
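To make the two stages described in the abstract concrete, below is a minimal PyTorch sketch: (1) computing data-map statistics (mean confidence and variability of the gold-label probability across epochs) to split examples into easy-to-learn vs. ambiguous, and (2) a supervised contrastive loss whose positives share the anchor's label, with cross-difficulty pairs treated as "hard" and up-weighted. The thresholds, the cross-difficulty selection rule, the hard-negative weight, and all function names are illustrative assumptions for exposition, not the paper's exact formulation.

```python
import torch

def data_map_stats(gold_probs):
    """gold_probs: [num_epochs, num_examples] tensor of the probability assigned
    to the gold label at each epoch (training dynamics). Returns per-example
    confidence (mean over epochs) and variability (std over epochs)."""
    return gold_probs.mean(dim=0), gold_probs.std(dim=0)

def difficulty_labels(confidence, variability, conf_thresh=0.75, var_thresh=0.25):
    """Coarse split into easy-to-learn (0) vs. ambiguous (1); thresholds are illustrative."""
    easy = (confidence >= conf_thresh) & (variability < var_thresh)
    return (~easy).long()

def guided_supcon_loss(z, labels, difficulty, temperature=0.1, hard_neg_weight=2.0):
    """Supervised contrastive loss over a batch of L2-normalized embeddings z.
    Positives: same class label as the anchor. Hard negatives: different label
    AND different difficulty category (one plausible guided-selection rule),
    up-weighted in the denominator (assumed weighting)."""
    sim = z @ z.t() / temperature                                  # [B, B] similarities
    B = z.size(0)
    eye = torch.eye(B, dtype=torch.bool, device=z.device)

    same_label = labels.unsqueeze(0) == labels.unsqueeze(1)
    diff_difficulty = difficulty.unsqueeze(0) != difficulty.unsqueeze(1)

    pos_mask = same_label & ~eye                                   # same-label positives
    hard_neg = (~same_label) & diff_difficulty                     # guided hard negatives

    neg_weight = torch.ones_like(sim)
    neg_weight[hard_neg] = hard_neg_weight

    exp_sim = torch.exp(sim).masked_fill(eye, 0.0)
    denom = (exp_sim * neg_weight).sum(dim=1, keepdim=True)
    log_prob = sim - torch.log(denom + 1e-12)

    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    loss = -(log_prob * pos_mask).sum(dim=1) / pos_count
    return loss.mean()
```

In a typical setup, the difficulty labels would be precomputed once from the dynamics of an initial fine-tuning run, gathered per batch, and the contrastive term would be combined with the standard cross-entropy objective; the exact combination used by SupCL-GSS is not specified in the abstract.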
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 25054