Enhanced Unsupervised Domain Adaptation with Dual-Attention Between Classification and Domain Alignment

Published: 01 Jan 2024, Last Modified: 17 Nov 2024, ICASSP 2024, CC BY-SA 4.0
Abstract: Unsupervised Domain Adaptation (UDA) deals with transferring knowledge from labeled source domains to unlabeled target domains. It addresses the challenge of differing distributions across domains, commonly known as domain shift. Numerous methods attempt to align distributions across domains while learning the core task (e.g., classification) on the source domain separately. However, limited research has explored the mutual influence between classification and domain alignment. In this paper, we discuss the conflict between the optimization of domain alignment and that of classification, emphasizing the risk of negative transfer when their optimization directions disagree. For better optimization consistency, these tasks should concentrate on the information their features share. To address this issue, we propose an innovative framework, Dual-attention between classification and Domain Alignment (DuDA). DuDA employs gradient-based saliency maps to generate interpretable attention, concurrently enhancing both classification and domain alignment through a dual-attention mechanism. Experimental results verify that DuDA mitigates negative transfer and demonstrates strong adaptability and promising performance.
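The abstract describes using gradient-based saliency maps as interpretable attention shared between the classifier and the domain-alignment branch. The sketch below is a minimal, hypothetical PyTorch illustration of that general idea, not the authors' implementation: the network sizes, the gradient-reversal discriminator used for alignment, and the way the saliency-based attention is applied to both branches are all assumptions made for illustration.

```python
# Hypothetical sketch: gradient-based saliency attention shared by the
# classification head and an adversarial domain discriminator.
# All architecture choices and loss weights below are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradReverse(torch.autograd.Function):
    """Gradient reversal layer, a common choice for adversarial domain alignment."""
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


def saliency_attention(features, logits, labels):
    """Gradient-based saliency: |d(class score)/d(feature)|, normalized to [0, 1]."""
    score = logits.gather(1, labels.unsqueeze(1)).sum()
    grads = torch.autograd.grad(score, features, create_graph=True)[0]
    sal = grads.abs()
    return sal / (sal.amax(dim=1, keepdim=True) + 1e-8)


# Toy networks and random stand-in data.
feat_net = nn.Sequential(nn.Linear(256, 128), nn.ReLU())
classifier = nn.Linear(128, 10)
domain_disc = nn.Linear(128, 2)
opt = torch.optim.SGD(
    list(feat_net.parameters()) + list(classifier.parameters()) + list(domain_disc.parameters()),
    lr=1e-2,
)

x_src, y_src = torch.randn(32, 256), torch.randint(0, 10, (32,))
x_tgt = torch.randn(32, 256)

# Saliency attention derived from the source classifier.
f_src = feat_net(x_src)
attn = saliency_attention(f_src, classifier(f_src), y_src)

# Classification on attention-weighted source features.
cls_loss = F.cross_entropy(classifier(f_src * attn), y_src)

# Adversarial domain alignment on the same attended source features and raw
# target features (source-derived attention reused here as a simple stand-in).
f_tgt = feat_net(x_tgt)
dom_in = torch.cat([f_src * attn, f_tgt], dim=0)
dom_labels = torch.cat([torch.zeros(32, dtype=torch.long), torch.ones(32, dtype=torch.long)])
dom_loss = F.cross_entropy(domain_disc(GradReverse.apply(dom_in, 1.0)), dom_labels)

loss = cls_loss + dom_loss
opt.zero_grad()
loss.backward()
opt.step()
```

Because the same saliency-based attention modulates the features seen by both the classifier and the discriminator, both objectives focus on the same informative feature dimensions, which is the optimization-consistency intuition the abstract points to.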