Minimal-Entropy Correlation Alignment for Unsupervised Deep Domain Adaptation
Pietro Morerio, Jacopo Cavazza, Vittorio Murino
Feb 15, 2018 (modified: Feb 19, 2018) — ICLR 2018 Conference Blind Submission
Abstract: In this work, we address the problem of unsupervised domain adaptation with a novel deep learning approach that leverages our finding that entropy minimization is induced by the optimal alignment of second-order statistics between source and target domains. We formally demonstrate this hypothesis and, aiming at achieving an optimal alignment in practical cases, we adopt a more principled strategy which, unlike current Euclidean approaches, performs the alignment along geodesics. Our pipeline can be implemented by adding to the standard classification loss (on the labeled source domain) a source-to-target regularizer that is weighted in an unsupervised and data-driven fashion. We provide extensive experiments to assess the superiority of our framework on standard domain and modality adaptation benchmarks.
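The abstract describes a loss with two ingredients: a correlation-alignment regularizer computed along geodesics between the source and target second-order statistics (rather than their Euclidean distance), and the entropy of target predictions, which the paper links to optimal alignment. A minimal NumPy sketch of these two quantities is below; the function names, the ridge term, and the normalization constant are illustrative assumptions, not the authors' exact implementation (the log-Euclidean distance shown is one common geodesic metric on covariance matrices).

```python
import numpy as np

def covariance(X, ridge=1e-3):
    # Second-order statistics of a feature batch (n_samples, n_features);
    # a small ridge keeps the matrix positive definite (assumed detail).
    Xc = X - X.mean(axis=0, keepdims=True)
    return Xc.T @ Xc / (X.shape[0] - 1) + ridge * np.eye(X.shape[1])

def spd_log(C):
    # Matrix logarithm of a symmetric positive-definite matrix via eigh.
    w, V = np.linalg.eigh(C)
    return (V * np.log(w)) @ V.T

def geodesic_coral_loss(Xs, Xt):
    # Log-Euclidean (geodesic) distance between source and target
    # covariances, in place of the Euclidean Frobenius distance of
    # standard correlation alignment; the 1/(4 d^2) scale mirrors the
    # usual CORAL normalization and is an assumption here.
    d = Xs.shape[1]
    D = spd_log(covariance(Xs)) - spd_log(covariance(Xt))
    return np.linalg.norm(D, 'fro') ** 2 / (4 * d ** 2)

def target_entropy(probs, eps=1e-8):
    # Mean Shannon entropy of target-domain class predictions;
    # the paper uses entropy to weight the regularizer in an
    # unsupervised, data-driven fashion.
    return -np.mean(np.sum(probs * np.log(probs + eps), axis=1))
```

In training, the total objective would combine the source classification loss with `geodesic_coral_loss` on intermediate features, with the trade-off weight selected via the target entropy rather than by cross-validation on target labels (which are unavailable in the unsupervised setting).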
TL;DR: A new unsupervised deep domain adaptation technique which efficiently unifies correlation alignment and entropy minimization
Keywords: unsupervised domain adaptation, entropy minimization, image classification, deep transfer learning