AdaME: Adaptive learning of multisource adaptation ensembles

05 Oct 2022, 00:13 (modified: 14 Nov 2022, 18:00) · NeurIPS 2022 Workshop DistShift Poster
Keywords: distribution shift, ensemble, multisource domain adaptation
Abstract: We present a new adaptive algorithm for building multisource domain adaptation ensembles of neural networks. Since standard convex combination ensembles cannot succeed in this scenario, we present a learnable domain-weighted combination and new learning guarantees based on the deep boosting algorithm. We introduce and analyze a new algorithm, ADAME, for this scenario and show that it benefits from favorable theoretical guarantees, is risk-averse, and reduces the worst-case mismatch between the inference and training distributions. We also report the results of several experiments demonstrating its performance on the FMoW-WILDS dataset.
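To illustrate the idea of a domain-weighted combination as opposed to a fixed convex combination, here is a minimal sketch: instead of assigning one fixed weight per source predictor, the mixing weights depend on each input (e.g. via a small learned domain model), so the ensemble can adapt to the inference distribution. All function names and shapes below are illustrative assumptions, not ADAME's actual algorithm or guarantees.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def domain_weighted_predict(preds, domain_logits):
    """Combine K source predictors with per-input weights (illustrative).

    preds: (K, N, C) class probabilities from K source models on N inputs.
    domain_logits: (N, K) unnormalized per-input scores over the K sources,
        e.g. the output of a small domain model learned jointly with the
        ensemble (a hypothetical component, not the paper's construction).
    Returns: (N, C) combined class probabilities.
    """
    w = softmax(domain_logits)             # (N, K), each row sums to 1
    return np.einsum('nk,knc->nc', w, preds)

# Toy usage: two source models, three inputs, two classes.
rng = np.random.default_rng(0)
preds = softmax(rng.normal(size=(2, 3, 2)))   # per-model probabilities
logits = rng.normal(size=(3, 2))              # per-input domain scores
out = domain_weighted_predict(preds, logits)  # (3, 2), rows sum to 1
```

Because each row of the weight matrix is a convex combination, the output is still a valid probability distribution per input; the difference from a standard ensemble is only that the weights vary with the input rather than being fixed across the whole test distribution.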