Abstract: Semi-supervised Domain Adaptation (SSDA) aims to learn a well-performing model from fully labeled source samples, a small number of labeled target samples, and abundant unlabeled target samples. Because labeled source samples dominate the training data, both the feature extractor and the classifier can be biased towards the source domain. This results in sub-optimal feature extraction for challenging target samples that differ notably from the source domain, and the source-favored classifier can hinder classification performance on the target domain. To this end, we propose a novel Joint Contrastive Learning with Sensitivity (JCLS), which consists of sensitivity-aware feature contrastive learning (SFCL) and class-wise probabilistic contrastive learning (CPCL). Unlike traditional contrastive learning, SFCL pays more attention to sensitive samples when optimizing the feature extractor, thereby enhancing the feature discrimination of unlabeled samples. CPCL performs class-wise contrastive learning in the probabilistic space to encourage the cross-domain classifier to match the true distribution of source and target samples. By combining these two components, JCLS extracts domain-invariant, compact features and obtains a well-performing classifier. Experiments on the DomainNet and Office-Home benchmarks show that our approach achieves state-of-the-art performance.
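The abstract does not specify how sensitivity enters the loss. As a rough illustration only, one way to realize a sensitivity-aware contrastive objective is to re-weight a standard InfoNCE loss per sample; the sketch below assumes PyTorch, and the function name, the weighting scheme, and all tensor shapes are our own assumptions, not the authors' actual SFCL.

```python
import torch
import torch.nn.functional as F

def sensitivity_weighted_contrastive_loss(feats, pos_feats, weights,
                                          temperature=0.07):
    """InfoNCE-style contrastive loss with per-sample weights.

    feats:     (N, D) anchor features
    pos_feats: (N, D) positive-pair features (e.g., a second augmented view)
    weights:   (N,)   hypothetical per-sample sensitivity scores
    """
    # L2-normalize so dot products become cosine similarities.
    feats = F.normalize(feats, dim=1)
    pos_feats = F.normalize(pos_feats, dim=1)

    # Similarity of each anchor to all candidates; the matching index
    # is the positive, all others serve as negatives.
    logits = feats @ pos_feats.t() / temperature            # (N, N)
    targets = torch.arange(feats.size(0), device=feats.device)

    # Per-sample InfoNCE loss, re-weighted so that more "sensitive"
    # (harder) samples contribute more strongly to the gradient.
    per_sample = F.cross_entropy(logits, targets, reduction="none")
    return (weights * per_sample).mean()
```

Under this reading, CPCL would apply an analogous contrast to classifier softmax outputs grouped by class rather than to the features themselves; again, that is an inference from the abstract, not a description of the paper's implementation.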