Domain Adaptation with Cauchy-Schwarz Divergence

Published: 26 Apr 2024, Last Modified: 15 Jul 2024 · UAI 2024 poster · CC BY 4.0
Keywords: Domain Adaptation, Transfer Learning, Cauchy-Schwarz Divergence
TL;DR: We introduce Cauchy-Schwarz (CS) divergence to the problem of unsupervised domain adaptation (UDA) and show both theoretical and practical advantages.
Abstract: Domain adaptation aims to use training data from one or multiple source domains to learn a hypothesis that generalizes to a different, but related, target domain. As such, a reliable measure for evaluating the discrepancy of both marginal and conditional distributions is crucial. We introduce the Cauchy-Schwarz (CS) divergence to the problem of unsupervised domain adaptation (UDA). The CS divergence offers a theoretically tighter generalization error bound than the popular Kullback-Leibler divergence. This holds for the general case of supervised learning, including multi-class classification and regression. Furthermore, we illustrate that the CS divergence enables a simple estimator of the discrepancy between both the marginal and conditional distributions of the source and target domains in the representation space, without requiring any distributional assumptions. We provide multiple examples to illustrate how the CS divergence can be conveniently used in both distance metric-based and adversarial training-based UDA frameworks, resulting in compelling performance. The code of our paper is available at \url{https://github.com/ywzcode/CS-adv}.
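As a minimal sketch of the kind of estimator the abstract refers to (not the authors' exact implementation; see the linked repository for that), the empirical CS divergence between two sample sets can be computed from Gaussian Gram matrices via D_CS(p, q) = -log( (∫pq)^2 / (∫p^2 ∫q^2) ). The function names, the fixed bandwidth sigma, and the feature shapes below are illustrative assumptions.

```python
# Sketch: kernel-based empirical Cauchy-Schwarz divergence between two sample sets.
# Assumes a Gaussian kernel with a hand-picked bandwidth `sigma`; in practice the
# bandwidth would be tuned or set by a heuristic (e.g., median pairwise distance).
import torch


def gaussian_gram(a: torch.Tensor, b: torch.Tensor, sigma: float) -> torch.Tensor:
    """Gram matrix K[i, j] = exp(-||a_i - b_j||^2 / (2 * sigma^2))."""
    sq_dists = torch.cdist(a, b) ** 2
    return torch.exp(-sq_dists / (2.0 * sigma ** 2))


def cs_divergence(x: torch.Tensor, y: torch.Tensor, sigma: float = 1.0) -> torch.Tensor:
    """Empirical CS divergence between samples x ~ p (n, d) and y ~ q (m, d).

    D_CS = -2 * log(mean k(x_i, y_j)) + log(mean k(x_i, x_j)) + log(mean k(y_i, y_j)),
    which is non-negative by the Cauchy-Schwarz inequality and zero iff p = q.
    """
    k_xx = gaussian_gram(x, x, sigma).mean()  # estimates ∫ p^2
    k_yy = gaussian_gram(y, y, sigma).mean()  # estimates ∫ q^2
    k_xy = gaussian_gram(x, y, sigma).mean()  # estimates ∫ p q
    return -2.0 * torch.log(k_xy) + torch.log(k_xx) + torch.log(k_yy)


# Usage: hypothetical source/target features from a shared encoder.
src_feat = torch.randn(128, 64)
tgt_feat = torch.randn(128, 64) + 0.5
print(cs_divergence(src_feat, tgt_feat).item())
```

In a metric-based UDA setup, a term like this would typically be added to the task loss to align source and target features; in an adversarial setup it would instead inform the discriminator objective.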
List Of Authors: Yin, Wenzhe and Yu, Shujian and Lin, Yicong and Liu, Jie and Sonke, Jan-Jakob and Gavves, Efstratios
Code Url: https://github.com/ywzcode/CS-adv
Submission Number: 504