Unveiling the Power of Shared Spaces: A Gating-Driven Mechanism for Semi-Supervised Domain Adaptation
Keywords: Semi-Supervised Domain Adaptation; Shared Space; Gating-Driven Mechanism
TL;DR: This paper first theoretically reveals the advantages of learning a shared feature space for enhancing transferability, and then develops a framework that learns such a shared space via a gating-driven SSDA enhancement mechanism.
Abstract: Domain adaptation (DA) aims to enhance the generalization ability of models in scenarios where labeled data in the target domain is scarce. Within DA research, semi-supervised domain adaptation (SSDA) can exploit the labeled information in the target domain more effectively than unsupervised domain adaptation (UDA), thus achieving superior transfer performance and attracting widespread attention. Existing SSDA methods implicitly learn a shared feature space while aligning features between domains; however, the underlying mechanisms remain insufficiently explored. To address this issue, this paper first theoretically reveals the advantages of learning a shared feature space for enhancing transferability. Based on these theoretical insights, we develop a framework that learns a shared space, implemented as a gating-driven SSDA enhancement mechanism. Unlike existing methods, it explicitly filters out features that are inconsistent across domains. Extensive experimental results demonstrate that the proposed gating-driven enhancement mechanism brings significant improvements to state-of-the-art SSDA models. Our code is anonymously available at https://anonymous.4open.science/r/ICLR_8979.
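To make the gating idea concrete, the following is a minimal illustrative sketch (not the paper's actual architecture) of how a learned gate could suppress feature dimensions that are inconsistent across domains, projecting source and target features into a common shared subspace. The module name `FeatureGate` and the 512-dimensional feature size are hypothetical assumptions for illustration only.

```python
import torch
import torch.nn as nn


class FeatureGate(nn.Module):
    """Illustrative gating module: produces per-dimension gates in [0, 1]
    that can damp feature dimensions inconsistent across the source and
    target domains (a hypothetical sketch, not the paper's exact design)."""

    def __init__(self, feat_dim: int):
        super().__init__()
        self.gate_net = nn.Sequential(
            nn.Linear(feat_dim, feat_dim),
            nn.Sigmoid(),  # soft gates in [0, 1]
        )

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        gates = self.gate_net(features)  # per-dimension soft mask
        return gates * features          # keep shared dimensions, suppress the rest


if __name__ == "__main__":
    # The same gate is applied to backbone features from both domains,
    # so both are mapped into one shared feature space.
    gate = FeatureGate(feat_dim=512)
    src_feat = torch.randn(8, 512)  # dummy source-domain features
    tgt_feat = torch.randn(8, 512)  # dummy target-domain features
    shared_src, shared_tgt = gate(src_feat), gate(tgt_feat)
    print(shared_src.shape, shared_tgt.shape)
```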
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 8979