Abstract: Open-Set Semi-Supervised Learning (OS-SSL) refers to the task of learning classifiers from both labeled and unlabeled instances, where the unlabeled data may contain instances associated with unseen labels, dubbed Out-Of-Distribution (OOD) instances. The main idea of OS-SSL is to detect and filter out the OOD instances so as to avoid their negative effects on classifier training. Existing methods treat the detection of OOD instances and the classification of seen labels as two separate tasks, and achieve OS-SSL by attaching two task-specific heads to a shared backbone. However, by observing the gradients of the shared backbone parameters, we found that this design can cause a conflict between the two task-specific heads. To solve this problem, we develop a novel OS-SSL method, namely Closed LOop NEtworks (Clone), which employs two independent networks to detect OOD instances and to classify seen labels, respectively, and forms a feedback loop between them. Clone can be trained efficiently in an end-to-end manner. We empirically compare Clone with existing OS-SSL methods on several benchmark datasets, and the experimental results indicate that Clone outperforms the baseline methods in most settings.
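The gradient conflict the abstract refers to can be probed empirically. The following is a minimal, hypothetical sketch (not the authors' implementation) of how one might measure it in PyTorch: compute the backbone gradients induced separately by each task head and compare their directions via cosine similarity. The model, loss choices, and data here are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical shared-backbone model with two task-specific heads,
# mirroring the setup the abstract describes (not the authors' code).
backbone = nn.Sequential(nn.Linear(32, 64), nn.ReLU())
cls_head = nn.Linear(64, 10)   # seen-label classification head
ood_head = nn.Linear(64, 1)    # OOD-detection head

x = torch.randn(8, 32)                 # dummy mini-batch of features
y_cls = torch.randint(0, 10, (8,))     # dummy seen-label targets
y_ood = torch.rand(8, 1)               # dummy in-/out-of-distribution targets

def backbone_grad(loss):
    """Flatten the backbone gradients produced by one task loss."""
    grads = torch.autograd.grad(loss, list(backbone.parameters()),
                                retain_graph=True)
    return torch.cat([g.reshape(-1) for g in grads])

feats = backbone(x)
g_cls = backbone_grad(F.cross_entropy(cls_head(feats), y_cls))
g_ood = backbone_grad(F.binary_cross_entropy_with_logits(ood_head(feats), y_ood))

# A negative cosine similarity indicates that the two heads push the
# shared backbone parameters in conflicting directions.
cos = torch.dot(g_cls, g_ood) / (g_cls.norm() * g_ood.norm() + 1e-12)
print(f"backbone gradient cosine similarity: {cos.item():.3f}")
```

Under this reading, using two independent networks (as in Clone) removes the shared parameters over which such conflicting gradients would otherwise compete.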