Synergizing Dynamic Score Aggregation with Contrastive Regularization for Open-Set Semi-Supervised Out-of-Distribution Detection

18 Sept 2025 (modified: 13 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: Out-of-distribution detection, Semi-supervised learning, Machine learning
TL;DR: We further explore the potential of optimal transport theory and contrastive learning for the out-of-distribution detection task.
Abstract: Semi-supervised learning (SSL) has achieved remarkable progress by leveraging both limited labeled data and abundant unlabeled data. However, unlabeled datasets often contain out-of-distribution (OOD) samples from unknown classes, which can lead to performance degradation in open-set SSL scenarios. Current approaches primarily address this issue by identifying outliers through OOD detection. Yet, methods relying solely on neural networks are constrained by the absence of labeled OOD samples for supervision. To overcome this limitation, we propose a novel open-set OOD detection framework named **SDM**, which **S**ynergizes **D**ynamic Score Aggregation (DSA) and **M**atrix Contrastive Regularization (MCR). Specifically, we formulate OOD detection as a semi-unbalanced optimal transport (SemiUOT) problem and derive pseudo-labels by solving it. The DSA module dynamically converts SemiUOT into a classical optimal transport (OT) formulation. Unlike existing OT-based methods, DSA provides theoretically grounded and more accurate pseudo OOD scores while avoiding the direct computation of the transport plan. Meanwhile, the MCR module enhances feature discrimination through contrastive learning, thereby improving overall performance. Empirical results demonstrate the superiority of SDM. Additionally, we conduct extensive analytical experiments to elucidate the properties of each component.
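To make the OT-based pseudo-labeling idea concrete, below is a minimal, hypothetical sketch of the generic entropic-OT baseline that such methods build on: unlabeled features are transported to in-distribution class prototypes plus a virtual OOD bin, and the mass sent to the bin is read off as a pseudo OOD score. This is not the paper's DSA module (which, per the abstract, avoids computing the transport plan directly); all names and parameters (`prototypes`, `ood_mass`, `epsilon`, the constant rejection cost) are assumptions for illustration.

```python
# Hypothetical illustration only: a generic entropic-OT (Sinkhorn) pseudo-labeling
# baseline for OOD scoring. The paper's DSA module avoids computing the transport
# plan directly; prototypes, ood_mass, and epsilon here are assumed placeholders.
import numpy as np

def sinkhorn(cost, a, b, epsilon=0.05, n_iters=200):
    """Entropic OT: returns a transport plan P with row marginals a and column marginals b."""
    K = np.exp(-cost / epsilon)          # Gibbs kernel
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iters):
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]

def ot_pseudo_ood_scores(features, prototypes, ood_mass=0.3, epsilon=0.05):
    """features: (n, d) L2-normalized embeddings of unlabeled samples.
    prototypes: (k, d) L2-normalized in-distribution class prototypes."""
    n, k = features.shape[0], prototypes.shape[0]
    # Cost = 1 - cosine similarity to each prototype; a virtual OOD bin gets a constant cost.
    cost_id = 1.0 - features @ prototypes.T      # (n, k)
    cost_ood = np.full((n, 1), 0.5)              # assumed constant rejection cost
    cost = np.hstack([cost_id, cost_ood])        # (n, k + 1)
    a = np.full(n, 1.0 / n)                      # uniform mass over unlabeled samples
    b = np.full(k + 1, (1.0 - ood_mass) / k)     # balanced mass over ID classes
    b[-1] = ood_mass                             # prior mass assigned to the OOD bin
    P = sinkhorn(cost, a, b, epsilon)
    # Pseudo OOD score: fraction of each sample's mass transported to the OOD bin.
    return P[:, -1] / P.sum(axis=1)
```

A usage note: with row marginals fixed to uniform and only the OOD-bin column of interest, the score reduces to a softmax-like quantity over prototype costs at small `epsilon`; the paper's claimed contribution is obtaining more accurate scores without materializing `P` at all.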
Primary Area: other topics in machine learning (i.e., none of the above)
Submission Number: 12309